| Column | Type | Range / Values |
|:--------------|:-----------------------|:-------------------------------------------|
| modelId | string | length 5 to 139 |
| author | string | length 2 to 42 |
| last_modified | timestamp[us, tz=UTC] | 2020-02-15 11:33:14 to 2025-09-07 18:30:29 |
| downloads | int64 | 0 to 223M |
| likes | int64 | 0 to 11.7k |
| library_name | string (classes) | 544 distinct values |
| tags | list | length 1 to 4.05k |
| pipeline_tag | string (classes) | 55 distinct values |
| createdAt | timestamp[us, tz=UTC] | 2022-03-02 23:29:04 to 2025-09-07 18:30:28 |
| card | string | length 11 to 1.01M |
mradermacher/MN-12B-Lyra-v4-GGUF
mradermacher
2024-11-14T09:27:09Z
92
2
transformers
[ "transformers", "gguf", "en", "base_model:Sao10K/MN-12B-Lyra-v4", "base_model:quantized:Sao10K/MN-12B-Lyra-v4", "license:cc-by-nc-4.0", "endpoints_compatible", "region:us", "conversational" ]
null
2024-11-14T08:41:22Z
--- base_model: Sao10K/MN-12B-Lyra-v4 language: - en library_name: transformers license: cc-by-nc-4.0 quantized_by: mradermacher --- ## About <!-- ### quantize_version: 2 --> <!-- ### output_tensor_quantised: 1 --> <!-- ### convert_type: hf --> <!-- ### vocab_type: --> <!-- ### tags: --> static quants of https://huggingface.co/Sao10K/MN-12B-Lyra-v4 <!-- provided-files --> weighted/imatrix quants are available at https://huggingface.co/mradermacher/MN-12B-Lyra-v4-i1-GGUF ## Usage If you are unsure how to use GGUF files, refer to one of [TheBloke's READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for more details, including on how to concatenate multi-part files. ## Provided Quants (sorted by size, not necessarily quality. IQ-quants are often preferable over similar sized non-IQ quants) | Link | Type | Size/GB | Notes | |:-----|:-----|--------:|:------| | [GGUF](https://huggingface.co/mradermacher/MN-12B-Lyra-v4-GGUF/resolve/main/MN-12B-Lyra-v4.Q2_K.gguf) | Q2_K | 4.9 | | | [GGUF](https://huggingface.co/mradermacher/MN-12B-Lyra-v4-GGUF/resolve/main/MN-12B-Lyra-v4.Q3_K_S.gguf) | Q3_K_S | 5.6 | | | [GGUF](https://huggingface.co/mradermacher/MN-12B-Lyra-v4-GGUF/resolve/main/MN-12B-Lyra-v4.Q3_K_M.gguf) | Q3_K_M | 6.2 | lower quality | | [GGUF](https://huggingface.co/mradermacher/MN-12B-Lyra-v4-GGUF/resolve/main/MN-12B-Lyra-v4.Q3_K_L.gguf) | Q3_K_L | 6.7 | | | [GGUF](https://huggingface.co/mradermacher/MN-12B-Lyra-v4-GGUF/resolve/main/MN-12B-Lyra-v4.IQ4_XS.gguf) | IQ4_XS | 6.9 | | | [GGUF](https://huggingface.co/mradermacher/MN-12B-Lyra-v4-GGUF/resolve/main/MN-12B-Lyra-v4.Q4_K_S.gguf) | Q4_K_S | 7.2 | fast, recommended | | [GGUF](https://huggingface.co/mradermacher/MN-12B-Lyra-v4-GGUF/resolve/main/MN-12B-Lyra-v4.Q4_K_M.gguf) | Q4_K_M | 7.6 | fast, recommended | | [GGUF](https://huggingface.co/mradermacher/MN-12B-Lyra-v4-GGUF/resolve/main/MN-12B-Lyra-v4.Q5_K_S.gguf) | Q5_K_S | 8.6 | | | [GGUF](https://huggingface.co/mradermacher/MN-12B-Lyra-v4-GGUF/resolve/main/MN-12B-Lyra-v4.Q5_K_M.gguf) | Q5_K_M | 8.8 | | | [GGUF](https://huggingface.co/mradermacher/MN-12B-Lyra-v4-GGUF/resolve/main/MN-12B-Lyra-v4.Q6_K.gguf) | Q6_K | 10.2 | very good quality | | [GGUF](https://huggingface.co/mradermacher/MN-12B-Lyra-v4-GGUF/resolve/main/MN-12B-Lyra-v4.Q8_0.gguf) | Q8_0 | 13.1 | fast, best quality | Here is a handy graph by ikawrakow comparing some lower-quality quant types (lower is better): ![image.png](https://www.nethype.de/huggingface_embed/quantpplgraph.png) And here are Artefact2's thoughts on the matter: https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9 ## FAQ / Model Request See https://huggingface.co/mradermacher/model_requests for some answers to questions you might have and/or if you want some other model quantized. ## Thanks I thank my company, [nethype GmbH](https://www.nethype.de/), for letting me use its servers and providing upgrades to my workstation to enable this work in my free time. <!-- end -->
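Not part of the original card, but as a concrete starting point: a minimal sketch of fetching one of the quants listed above and running it locally with `llama-cpp-python`. The chosen quant file comes from the table; the context size, GPU-offload setting, and prompt are illustrative assumptions, not recommendations from this card.

```python
# Minimal sketch: download a quant from this repo and generate text locally.
# Assumes `pip install huggingface_hub llama-cpp-python`; the settings below
# are illustrative, not recommendations from the card.
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

# Fetch one of the files listed in the "Provided Quants" table.
gguf_path = hf_hub_download(
    repo_id="mradermacher/MN-12B-Lyra-v4-GGUF",
    filename="MN-12B-Lyra-v4.Q4_K_M.gguf",
)

# Load the model; n_ctx and n_gpu_layers depend on your hardware.
llm = Llama(model_path=gguf_path, n_ctx=4096, n_gpu_layers=-1)

output = llm("Write a short story about a lighthouse keeper.", max_tokens=256)
print(output["choices"][0]["text"])
```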
MuwKiUser/trialmodel0-merged-16bit
MuwKiUser
2024-11-14T09:18:58Z
120
0
transformers
[ "transformers", "safetensors", "llama", "text-generation", "unsloth", "arxiv:1910.09700", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2024-11-14T09:17:46Z
--- library_name: transformers tags: - unsloth --- # Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
michaelowenliu/ddpm-celebahq-finetuned-butterflies-2epochs
michaelowenliu
2024-11-14T09:09:55Z
46
0
diffusers
[ "diffusers", "safetensors", "pytorch", "unconditional-image-generation", "diffusion-models-class", "license:mit", "diffusers:DDPMPipeline", "region:us" ]
unconditional-image-generation
2024-11-14T09:06:02Z
--- license: mit tags: - pytorch - diffusers - unconditional-image-generation - diffusion-models-class --- # Example Fine-Tuned Model for Unit 2 of the [Diffusion Models Class 🧨](https://github.com/huggingface/diffusion-models-class) Describe your model here ## Usage ```python from diffusers import DDPMPipeline pipeline = DDPMPipeline.from_pretrained('michaelowenliu/ddpm-celebahq-finetuned-butterflies-2epochs') image = pipeline().images[0] image ```
liusincere/Abbie
liusincere
2024-11-14T09:08:25Z
89
0
diffusers
[ "diffusers", "flux", "lora", "text-to-image", "en", "base_model:black-forest-labs/FLUX.1-dev", "base_model:adapter:black-forest-labs/FLUX.1-dev", "license:other", "region:us" ]
text-to-image
2024-11-14T07:38:36Z
--- license: other license_name: flux-1-dev-non-commercial-license license_link: https://huggingface.co/black-forest-labs/FLUX.1-dev/blob/main/LICENSE.md language: - en base_model: - black-forest-labs/FLUX.1-dev pipeline_tag: text-to-image tags: - flux - diffusers - lora ---
mradermacher/QandoraExp-7B-Persona-GGUF
mradermacher
2024-11-14T08:58:00Z
14
1
transformers
[ "transformers", "gguf", "mergekit", "merge", "en", "base_model:bunnycore/QandoraExp-7B-Persona", "base_model:quantized:bunnycore/QandoraExp-7B-Persona", "endpoints_compatible", "region:us", "conversational" ]
null
2024-11-13T05:43:28Z
--- base_model: bunnycore/QandoraExp-7B-Persona language: - en library_name: transformers quantized_by: mradermacher tags: - mergekit - merge --- ## About <!-- ### quantize_version: 2 --> <!-- ### output_tensor_quantised: 1 --> <!-- ### convert_type: hf --> <!-- ### vocab_type: --> <!-- ### tags: --> static quants of https://huggingface.co/bunnycore/QandoraExp-7B-Persona <!-- provided-files --> weighted/imatrix quants are available at https://huggingface.co/mradermacher/QandoraExp-7B-Persona-i1-GGUF ## Usage If you are unsure how to use GGUF files, refer to one of [TheBloke's READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for more details, including on how to concatenate multi-part files. ## Provided Quants (sorted by size, not necessarily quality. IQ-quants are often preferable over similar sized non-IQ quants) | Link | Type | Size/GB | Notes | |:-----|:-----|--------:|:------| | [GGUF](https://huggingface.co/mradermacher/QandoraExp-7B-Persona-GGUF/resolve/main/QandoraExp-7B-Persona.Q2_K.gguf) | Q2_K | 3.1 | | | [GGUF](https://huggingface.co/mradermacher/QandoraExp-7B-Persona-GGUF/resolve/main/QandoraExp-7B-Persona.Q3_K_S.gguf) | Q3_K_S | 3.6 | | | [GGUF](https://huggingface.co/mradermacher/QandoraExp-7B-Persona-GGUF/resolve/main/QandoraExp-7B-Persona.Q3_K_M.gguf) | Q3_K_M | 3.9 | lower quality | | [GGUF](https://huggingface.co/mradermacher/QandoraExp-7B-Persona-GGUF/resolve/main/QandoraExp-7B-Persona.Q3_K_L.gguf) | Q3_K_L | 4.2 | | | [GGUF](https://huggingface.co/mradermacher/QandoraExp-7B-Persona-GGUF/resolve/main/QandoraExp-7B-Persona.IQ4_XS.gguf) | IQ4_XS | 4.4 | | | [GGUF](https://huggingface.co/mradermacher/QandoraExp-7B-Persona-GGUF/resolve/main/QandoraExp-7B-Persona.Q4_K_S.gguf) | Q4_K_S | 4.6 | fast, recommended | | [GGUF](https://huggingface.co/mradermacher/QandoraExp-7B-Persona-GGUF/resolve/main/QandoraExp-7B-Persona.Q4_K_M.gguf) | Q4_K_M | 4.8 | fast, recommended | | [GGUF](https://huggingface.co/mradermacher/QandoraExp-7B-Persona-GGUF/resolve/main/QandoraExp-7B-Persona.Q5_K_S.gguf) | Q5_K_S | 5.4 | | | [GGUF](https://huggingface.co/mradermacher/QandoraExp-7B-Persona-GGUF/resolve/main/QandoraExp-7B-Persona.Q5_K_M.gguf) | Q5_K_M | 5.5 | | | [GGUF](https://huggingface.co/mradermacher/QandoraExp-7B-Persona-GGUF/resolve/main/QandoraExp-7B-Persona.Q6_K.gguf) | Q6_K | 6.4 | very good quality | | [GGUF](https://huggingface.co/mradermacher/QandoraExp-7B-Persona-GGUF/resolve/main/QandoraExp-7B-Persona.Q8_0.gguf) | Q8_0 | 8.2 | fast, best quality | Here is a handy graph by ikawrakow comparing some lower-quality quant types (lower is better): ![image.png](https://www.nethype.de/huggingface_embed/quantpplgraph.png) And here are Artefact2's thoughts on the matter: https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9 ## FAQ / Model Request See https://huggingface.co/mradermacher/model_requests for some answers to questions you might have and/or if you want some other model quantized. ## Thanks I thank my company, [nethype GmbH](https://www.nethype.de/), for letting me use its servers and providing upgrades to my workstation to enable this work in my free time. <!-- end -->
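Again not part of the original card: since this repo is tagged "conversational", a minimal chat-style sketch with `llama-cpp-python` may be more natural here. The chosen quant and sampling budget are illustrative assumptions; the chat API relies on the chat template embedded in the GGUF metadata.

```python
# Minimal chat-style sketch for this repo's quants, assuming
# `pip install huggingface_hub llama-cpp-python`; the chosen quant and
# token budget are illustrative assumptions, not card recommendations.
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

gguf_path = hf_hub_download(
    repo_id="mradermacher/QandoraExp-7B-Persona-GGUF",
    filename="QandoraExp-7B-Persona.Q4_K_S.gguf",
)

llm = Llama(model_path=gguf_path, n_ctx=4096)

# The chat API applies the chat template stored in the GGUF metadata.
reply = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Introduce yourself in one sentence."}],
    max_tokens=128,
)
print(reply["choices"][0]["message"]["content"])
```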
nikhil2902/Qwen_2vl_cat4
nikhil2902
2024-11-14T08:55:16Z
12
0
transformers
[ "transformers", "safetensors", "qwen2_vl", "image-text-to-text", "llama-factory", "conversational", "arxiv:1910.09700", "text-generation-inference", "endpoints_compatible", "region:us" ]
image-text-to-text
2024-11-10T23:27:23Z
--- library_name: transformers tags: - llama-factory --- # Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
adriansanz/intent_analysis_2labels
adriansanz
2024-11-14T08:48:45Z
114
0
transformers
[ "transformers", "tensorboard", "safetensors", "xlm-roberta", "text-classification", "generated_from_trainer", "base_model:FacebookAI/xlm-roberta-base", "base_model:finetune:FacebookAI/xlm-roberta-base", "license:mit", "autotrain_compatible", "endpoints_compatible", "region:us" ]
text-classification
2024-11-14T08:47:40Z
--- library_name: transformers license: mit base_model: xlm-roberta-base tags: - generated_from_trainer metrics: - accuracy - precision - recall - f1 model-index: - name: intent_analysis results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # intent_analysis This model is a fine-tuned version of [xlm-roberta-base](https://huggingface.co/xlm-roberta-base) on the None dataset. It achieves the following results on the evaluation set: - Loss: 0.0083 - Accuracy: 0.9990 - Precision: 0.9990 - Recall: 0.9990 - F1: 0.9990 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments - lr_scheduler_type: cosine - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 3 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | F1 | |:-------------:|:-----:|:----:|:---------------:|:--------:|:---------:|:------:|:------:| | No log | 1.0 | 76 | 0.0166 | 0.9971 | 0.9971 | 0.9971 | 0.9971 | | No log | 2.0 | 152 | 0.0074 | 0.9990 | 0.9990 | 0.9990 | 0.9990 | | No log | 3.0 | 228 | 0.0083 | 0.9990 | 0.9990 | 0.9990 | 0.9990 | ### Framework versions - Transformers 4.46.2 - Pytorch 2.5.0+cu121 - Datasets 3.1.0 - Tokenizers 0.20.3
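The card above lists training details but no inference snippet; a minimal sketch with the standard transformers pipeline follows. The example sentence is illustrative; the two label names come from the model's own config rather than being assumed here.

```python
# Minimal inference sketch for this fine-tuned XLM-RoBERTa classifier,
# assuming `pip install transformers torch`; the input sentence is
# illustrative and the labels are whatever the trained config defines.
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="adriansanz/intent_analysis_2labels",
)

print(classifier("Quiero pedir cita para renovar el DNI."))
# -> [{'label': ..., 'score': ...}] with one of the model's two labels
```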
Masakazu1999/model
Masakazu1999
2024-11-14T08:47:34Z
33
0
transformers
[ "transformers", "safetensors", "qwen2", "text-generation", "text-generation-inference", "unsloth", "trl", "sft", "conversational", "en", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "4-bit", "bitsandbytes", "region:us" ]
text-generation
2024-11-14T08:41:57Z
--- base_model: unsloth/qwen2.5-32b-instruct-bnb-4bit tags: - text-generation-inference - transformers - unsloth - qwen2 - trl - sft license: apache-2.0 language: - en --- # Uploaded model - **Developed by:** Masakazu1999 - **License:** apache-2.0 - **Finetuned from model :** unsloth/qwen2.5-32b-instruct-bnb-4bit This qwen2 model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Huggingface's TRL library. [<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
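The repo is tagged text-generation with 4-bit bitsandbytes weights, so a plain transformers load should pick up the stored quantization config; a minimal sketch, assuming a CUDA GPU (the prompt is illustrative).

```python
# Minimal sketch: load this 4-bit (bitsandbytes) checkpoint with plain
# transformers. Assumes `pip install transformers bitsandbytes accelerate`
# and a CUDA GPU; the chat message below is illustrative.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Masakazu1999/model"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

messages = [{"role": "user", "content": "Summarize what Unsloth is in one sentence."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=64)
# Decode only the newly generated tokens.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```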
noaebbot/llama3.2-3B-insights_16bit-resolution
noaebbot
2024-11-14T08:43:08Z
122
0
transformers
[ "transformers", "safetensors", "llama", "text-generation", "text-generation-inference", "unsloth", "trl", "sft", "conversational", "en", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:us" ]
text-generation
2024-11-14T08:42:17Z
--- base_model: unsloth/llama-3.2-3b-instruct-bnb-4bit tags: - text-generation-inference - transformers - unsloth - llama - trl - sft license: apache-2.0 language: - en --- # Uploaded model - **Developed by:** noaebbot - **License:** apache-2.0 - **Finetuned from model :** unsloth/llama-3.2-3b-instruct-bnb-4bit This llama model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Huggingface's TRL library. [<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
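Since this checkpoint was produced with Unsloth, it can also be loaded through Unsloth's fast inference path; a minimal sketch, assuming `pip install unsloth` and a CUDA GPU (the prompt and sequence length are illustrative assumptions).

```python
# Minimal sketch: load this Unsloth-finetuned checkpoint via Unsloth's
# fast inference path. Assumes `pip install unsloth` and a CUDA GPU;
# the prompt and max_seq_length below are illustrative.
from unsloth import FastLanguageModel

model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="noaebbot/llama3.2-3B-insights_16bit-resolution",
    max_seq_length=2048,
    load_in_4bit=False,  # the repo stores 16-bit weights
)
FastLanguageModel.for_inference(model)  # enable Unsloth's faster decoding

messages = [{"role": "user", "content": "Give me one insight about time management."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```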
abdfikih/llama3.2-8b-train-py
abdfikih
2024-11-14T08:39:01Z
121
0
transformers
[ "transformers", "safetensors", "llama", "text-generation", "text-generation-inference", "unsloth", "trl", "sft", "conversational", "en", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:us" ]
text-generation
2024-11-14T08:37:15Z
--- base_model: unsloth/llama-3.2-3b-instruct-bnb-4bit tags: - text-generation-inference - transformers - unsloth - llama - trl - sft license: apache-2.0 language: - en --- # Uploaded model - **Developed by:** abdfikih - **License:** apache-2.0 - **Finetuned from model :** unsloth/llama-3.2-3b-instruct-bnb-4bit This llama model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Huggingface's TRL library. [<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
Arthur-Tsai/your_model_name
Arthur-Tsai
2024-11-14T08:38:44Z
46
0
transformers
[ "transformers", "safetensors", "custom-model", "arxiv:1910.09700", "endpoints_compatible", "region:us" ]
null
2024-11-14T08:36:16Z
--- library_name: transformers tags: [] --- # Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
FractalGPT/RuQwen2.5-3B-Instruct-AWQ
FractalGPT
2024-11-14T08:37:04Z
307
5
transformers
[ "transformers", "safetensors", "qwen2", "text-generation", "conversational", "ru", "en", "base_model:Qwen/Qwen2.5-3B-Instruct-AWQ", "base_model:quantized:Qwen/Qwen2.5-3B-Instruct-AWQ", "license:apache-2.0", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "4-bit", "awq", "region:us" ]
text-generation
2024-11-11T23:32:02Z
--- base_model: - Qwen/Qwen2.5-3B-Instruct-AWQ language: - ru - en library_name: transformers license: apache-2.0 pipeline_tag: text-generation --- ## FractalGPT/RuQwen2.5-3B-Instruct-AWQ --- ### Our products - [FractalGPT](https://fractalgpt.ru) ### Model Overview - **RuQwen2.5-3B-Instruct-AWQ** by FractalGPT is a language model tailored to deliver high-quality Russian language output. Building upon the Qwen2.5 series, it is optimized for Russian-language tasks while retaining broad multilingual support. - **Improved Russian Language Quality**: Adaptations have significantly enhanced the fluency, accuracy, and coherence of Russian text generation, making it an excellent choice for Russian-language applications. ### Model Specifications - **Type**: Instruction-tuned Causal Language Model - **Training Stages**: Pretraining & Instruction Tuning - **Architecture**: Transformer with RoPE, SwiGLU, RMSNorm, and Attention QKV bias - **Layers**: 36 - **Attention Heads (GQA)**: 16 for Q and 2 for KV - **Context Length**: Supports a full context of 131,072 tokens and generation of up to 8,192 tokens - **Quantization**: AWQ 4-bit - **Base model**: Qwen/Qwen2.5-3B-Instruct-AWQ ### Requirements The code for Qwen2.5 is included in the latest Hugging Face transformers, and we advise you to use the latest version of transformers. * [**Run example in Colab**](https://colab.research.google.com/drive/1gQmvUVUmnepi6i2As67bHoscnRl6uwKY?usp=sharing) ``` pip install autoawq -q pip install --upgrade torch -q pip install --upgrade transformers -q ``` With transformers<4.37.0, you will encounter the following error: ``` KeyError: 'qwen2' ``` Also check out our [AWQ documentation](https://qwen.readthedocs.io/en/latest/quantization/awq.html) for a more detailed usage guide. With pytorch<2.4.0, you will encounter the following error: ``` AttributeError: module 'torch.library' has no attribute 'register_fake' ``` ### Quickstart We use a special RuQwen2ForCausalLM class to work with this model: ```python from transformers import Qwen2ForCausalLM, AutoConfig, AutoTokenizer import torch class RuQwen2ForCausalLM(Qwen2ForCausalLM): def __init__(self, config): super().__init__(config) if hasattr(self, "lm_head") and isinstance(self.lm_head, torch.nn.Linear): if self.lm_head.bias is None: self.config.add_bias_to_lm_head = True self._add_bias_to_lm_head() def _add_bias_to_lm_head(self): """Adds a bias to lm_head if it does not already have one.""" old_lm_head = self.lm_head # lm_head with bias self.lm_head = torch.nn.Linear( old_lm_head.in_features, old_lm_head.out_features, dtype=self.model.dtype, bias=True, ) with torch.no_grad(): self.lm_head.weight = old_lm_head.weight torch.nn.init.zeros_(self.lm_head.bias) @classmethod def from_pretrained(cls, model_name, *args, **kwargs): # Loads the model together with its configuration model = super().from_pretrained(model_name, *args, **kwargs) if hasattr(model.config, "add_bias_to_lm_head") and not model.config.add_bias_to_lm_head: model._add_bias_to_lm_head() return model def save_pretrained(self, save_directory, *args, **kwargs): self.config.add_bias_to_lm_head = self.lm_head.bias is not None super().save_pretrained(save_directory, *args, **kwargs) ``` The following code snippet shows how to load the tokenizer and model and generate content with apply_chat_template.
```python def generate(messages): input_ids = tokenizer.apply_chat_template(messages, tokenize=True, add_generation_prompt=True, return_tensors="pt").to(model.device) output = model.generate(input_ids, max_new_tokens=1024, do_sample=False, temperature=None, top_k=None, top_p=None) generated_text = tokenizer.decode(output[0], skip_special_tokens=False)#.split('<|im_start|>assistant')[1] return generated_text model_name = 'FractalGPT/RuQwen2.5-3B-Instruct-AWQ' model = RuQwen2ForCausalLM.from_pretrained(model_name, torch_dtype=torch.float16) tokenizer = AutoTokenizer.from_pretrained(model_name) prompt = "Классификация медицинских терминов" messages = [ {"role": "system", "content": "You are RuQwen, created by FractalGPT. You are a helpful assistant."}, {"role": "user", "content": prompt} ] print(generate(messages)) ``` ### Example of improvements Question: 'Классификация медицинских терминов' The answer to the original model ``` Классификация медицинских терминов играет ключевую роль в медицине и информационных системах. Вот несколько основных подходов к классификации медицинских терминов: 1. **International Classification of Diseases (ICD)**: Это стандартная система классификации заболеваний и причин смерти, разработанная ВОЗ. Она используется во многих странах для классификации медицинских случаев. 2. **Medical Subject Headings (MeSH)**: Это система классификации медицинской литературы, разработанная National Library of Medicine (NLM). MeSH помогает организовать информацию о медицинских исследованиях и статьях. 3. **Systematized Nomenclature of Medicine (SNOMED)**: SNOMED International предоставляет глубоко специализированную систему классификации медицинских терминов. Она предназначена для использования в больницах, клиниках и других медицинских учреждениях. 4. **Unified Medical Language System (UMLS)**: Это расширяемая система классификации медицинских терминов, которая объединяет данные из различных источников, таких как ICD-10, MeSH и другие. 5. **Clinical Terms**: Это система классификации медицинских терминов, разработанная для использования в клинических системах управления пациентами (EHR). 6. **Pharmaceutical Product Names (PPNs)**: Это система классификации названий лекарственных препаратов, разработанная для обеспечения уникальности и стандартизации названий. Каждая из этих систем имеет свои особенности и области применения. Например, ICD主要用于疾病和死亡原因的分类,而MeSH主要用于医学文献的分类。SNOMED则提供更详细的医疗术语分类,适用于临床应用。 Также существуют специализированные системы для конкретных областей медицины, такие как терминология для хирургии, терапии, педиатрии и т.д. Классификация медицинских терминов позволяет упорядочить информацию, сделать ее доступной для широкого круга пользователей и обеспечивает стандартизацию терминологии, что важно для точного понимания и эффективного общения между врачами, пациентами и медицинскими работниками. ``` Our model's answer ``` Классификация медицинских терминов играет ключевую роль в медицине и информационных системах. Вот несколько основных подходов к классификации медицинских терминов: 1. **International Classification of Diseases (ICD)**: Это стандартная система классификации заболеваний и причин смерти, разработанная ВОЗ. Она используется во многих странах для классификации медицинских случаев. 2. **Medical Subject Headings (MeSH)**: Это система классификации медицинской литературы, разработанная National Library of Medicine (NLM). MeSH помогает организовать информацию о медицинских исследованиях и статьях. 3. 
**Systematized Nomenclature of Medicine (SNOMED)**: SNOMED International предоставляет глубоко специализированную систему классификации медицинских терминов. Он включает термины для диагностики, лечения, тестирования и других аспектов медицинского обслуживания. 4. **Unified Medical Language System (UMLS)**: Это система метаданных, которая объединяет терминологические данные из различных источников, таких как ICD-10, MeSH и другие системы классификации. 5. **Clinical Terms**: Это терминология, используемая в клинической практике, которая может включать термины из SNOMED, ICD и других систем. 6. **Pharmacopoeias**: Эти документы обычно содержат классификацию лекарственных препаратов и их применения. Каждая из этих систем имеет свои особенности и области применения. Например, ICD используется для классификации заболеваний и причин смерти, в то время как MeSH и SNOMED используются для классификации медицинской информации и терминологии соответственно. Также существуют более специализированные системы классификации, такие как терминология для конкретных областей медицины (например, терминология для хирургии или терминология для педиатрии). Классификация медицинских терминов позволяет упорядочить информацию, сделать ее доступной для широкого круга пользователей и обеспечивает стандартизацию терминологии, что важно для эффективного общения между врачами, пациентами и другими участниками медицинского процесса. ```
RichardErkhov/unsloth_-_SmolLM2-360M-gguf
RichardErkhov
2024-11-14T08:34:11Z
5
0
null
[ "gguf", "endpoints_compatible", "region:us" ]
null
2024-11-14T08:05:53Z
Quantization made by Richard Erkhov. [Github](https://github.com/RichardErkhov) [Discord](https://discord.gg/pvy7H8DZMG) [Request more models](https://github.com/RichardErkhov/quant_request) SmolLM2-360M - GGUF - Model creator: https://huggingface.co/unsloth/ - Original model: https://huggingface.co/unsloth/SmolLM2-360M/ | Name | Quant method | Size | | ---- | ---- | ---- | | [SmolLM2-360M.Q2_K.gguf](https://huggingface.co/RichardErkhov/unsloth_-_SmolLM2-360M-gguf/blob/main/SmolLM2-360M.Q2_K.gguf) | Q2_K | 0.2GB | | [SmolLM2-360M.Q3_K_S.gguf](https://huggingface.co/RichardErkhov/unsloth_-_SmolLM2-360M-gguf/blob/main/SmolLM2-360M.Q3_K_S.gguf) | Q3_K_S | 0.2GB | | [SmolLM2-360M.Q3_K.gguf](https://huggingface.co/RichardErkhov/unsloth_-_SmolLM2-360M-gguf/blob/main/SmolLM2-360M.Q3_K.gguf) | Q3_K | 0.22GB | | [SmolLM2-360M.Q3_K_M.gguf](https://huggingface.co/RichardErkhov/unsloth_-_SmolLM2-360M-gguf/blob/main/SmolLM2-360M.Q3_K_M.gguf) | Q3_K_M | 0.22GB | | [SmolLM2-360M.Q3_K_L.gguf](https://huggingface.co/RichardErkhov/unsloth_-_SmolLM2-360M-gguf/blob/main/SmolLM2-360M.Q3_K_L.gguf) | Q3_K_L | 0.23GB | | [SmolLM2-360M.IQ4_XS.gguf](https://huggingface.co/RichardErkhov/unsloth_-_SmolLM2-360M-gguf/blob/main/SmolLM2-360M.IQ4_XS.gguf) | IQ4_XS | 0.21GB | | [SmolLM2-360M.Q4_0.gguf](https://huggingface.co/RichardErkhov/unsloth_-_SmolLM2-360M-gguf/blob/main/SmolLM2-360M.Q4_0.gguf) | Q4_0 | 0.21GB | | [SmolLM2-360M.IQ4_NL.gguf](https://huggingface.co/RichardErkhov/unsloth_-_SmolLM2-360M-gguf/blob/main/SmolLM2-360M.IQ4_NL.gguf) | IQ4_NL | 0.21GB | | [SmolLM2-360M.Q4_K_S.gguf](https://huggingface.co/RichardErkhov/unsloth_-_SmolLM2-360M-gguf/blob/main/SmolLM2-360M.Q4_K_S.gguf) | Q4_K_S | 0.24GB | | [SmolLM2-360M.Q4_K.gguf](https://huggingface.co/RichardErkhov/unsloth_-_SmolLM2-360M-gguf/blob/main/SmolLM2-360M.Q4_K.gguf) | Q4_K | 0.25GB | | [SmolLM2-360M.Q4_K_M.gguf](https://huggingface.co/RichardErkhov/unsloth_-_SmolLM2-360M-gguf/blob/main/SmolLM2-360M.Q4_K_M.gguf) | Q4_K_M | 0.25GB | | [SmolLM2-360M.Q4_1.gguf](https://huggingface.co/RichardErkhov/unsloth_-_SmolLM2-360M-gguf/blob/main/SmolLM2-360M.Q4_1.gguf) | Q4_1 | 0.23GB | | [SmolLM2-360M.Q5_0.gguf](https://huggingface.co/RichardErkhov/unsloth_-_SmolLM2-360M-gguf/blob/main/SmolLM2-360M.Q5_0.gguf) | Q5_0 | 0.25GB | | [SmolLM2-360M.Q5_K_S.gguf](https://huggingface.co/RichardErkhov/unsloth_-_SmolLM2-360M-gguf/blob/main/SmolLM2-360M.Q5_K_S.gguf) | Q5_K_S | 0.26GB | | [SmolLM2-360M.Q5_K.gguf](https://huggingface.co/RichardErkhov/unsloth_-_SmolLM2-360M-gguf/blob/main/SmolLM2-360M.Q5_K.gguf) | Q5_K | 0.27GB | | [SmolLM2-360M.Q5_K_M.gguf](https://huggingface.co/RichardErkhov/unsloth_-_SmolLM2-360M-gguf/blob/main/SmolLM2-360M.Q5_K_M.gguf) | Q5_K_M | 0.27GB | | [SmolLM2-360M.Q5_1.gguf](https://huggingface.co/RichardErkhov/unsloth_-_SmolLM2-360M-gguf/blob/main/SmolLM2-360M.Q5_1.gguf) | Q5_1 | 0.27GB | | [SmolLM2-360M.Q6_K.gguf](https://huggingface.co/RichardErkhov/unsloth_-_SmolLM2-360M-gguf/blob/main/SmolLM2-360M.Q6_K.gguf) | Q6_K | 0.34GB | | [SmolLM2-360M.Q8_0.gguf](https://huggingface.co/RichardErkhov/unsloth_-_SmolLM2-360M-gguf/blob/main/SmolLM2-360M.Q8_0.gguf) | Q8_0 | 0.36GB | Original model description: --- base_model: HuggingFaceTB/SmolLM2-360M language: - en library_name: transformers license: apache-2.0 tags: - llama - unsloth - transformers --- # Finetune SmolLM2, Llama 3.2, Gemma 2, Mistral 2-5x faster with 70% less memory via Unsloth! 
We have a free Google Colab Tesla T4 notebook for Llama 3.2 (3B) here: https://colab.research.google.com/drive/1Ys44kVvmeZtnICzWz0xgpRnrIOjZAuxp?usp=sharing [<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/Discord%20button.png" width="200"/>](https://discord.gg/unsloth) [<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth) # unsloth/SmolLM2-360M For more details on the model, please go to Hugging Face's original [model card](https://huggingface.co/HuggingFaceTB/SmolLM2-360M). ## ✨ Finetune for Free All notebooks are **beginner friendly**! Add your dataset, click "Run All", and you'll get a 2x faster finetuned model that can be exported to GGUF or vLLM, or uploaded to Hugging Face. | Unsloth supports | Free Notebooks | Performance | Memory use | |-----------------|--------------------------------------------------------------------------------------------------------------------------|-------------|----------| | **Llama-3.2 (3B)** | [▶️ Start on Colab](https://colab.research.google.com/drive/1Ys44kVvmeZtnICzWz0xgpRnrIOjZAuxp?usp=sharing) | 2.4x faster | 58% less | | **Llama-3.2 (11B vision)** | [▶️ Start on Colab](https://colab.research.google.com/drive/1Ys44kVvmeZtnICzWz0xgpRnrIOjZAuxp?usp=sharing) | 2.4x faster | 58% less | | **Llama-3.1 (8B)** | [▶️ Start on Colab](https://colab.research.google.com/drive/1Ys44kVvmeZtnICzWz0xgpRnrIOjZAuxp?usp=sharing) | 2.4x faster | 58% less | | **Phi-3.5 (mini)** | [▶️ Start on Colab](https://colab.research.google.com/drive/1lN6hPQveB_mHSnTOYifygFcrO8C1bxq4?usp=sharing) | 2x faster | 50% less | | **Gemma 2 (9B)** | [▶️ Start on Colab](https://colab.research.google.com/drive/1vIrqH5uYDQwsJ4-OO3DErvuv4pBgVwk4?usp=sharing) | 2.4x faster | 58% less | | **Mistral (7B)** | [▶️ Start on Colab](https://colab.research.google.com/drive/1Dyauq4kTZoLewQ1cApceUQVNcnnNTzg_?usp=sharing) | 2.2x faster | 62% less | | **DPO - Zephyr** | [▶️ Start on Colab](https://colab.research.google.com/drive/15vttTpzzVXv_tJwEk-hIcQ0S9FcEWvwP?usp=sharing) | 1.9x faster | 19% less | - This [conversational notebook](https://colab.research.google.com/drive/1Aau3lgPzeZKQ-98h69CCu1UJcvIBLmy2?usp=sharing) is useful for ShareGPT ChatML / Vicuna templates. - This [text completion notebook](https://colab.research.google.com/drive/1ef-tab5bhkvWmBOObepl1WgJvfvSzn5Q?usp=sharing) is for raw text. This [DPO notebook](https://colab.research.google.com/drive/15vttTpzzVXv_tJwEk-hIcQ0S9FcEWvwP?usp=sharing) replicates Zephyr. - \* Kaggle has 2x T4s, but we use 1. Due to overhead, 1x T4 is 5x faster. ## Special Thanks A huge thank you to the Hugging Face team for creating and releasing these models. ## Model Summary SmolLM2 is a family of compact language models available in three sizes: 135M, 360M, and 1.7B parameters. They are capable of solving a wide range of tasks while being lightweight enough to run on-device. The 1.7B variant demonstrates significant advances over its predecessor SmolLM1-1.7B, particularly in instruction following, knowledge, reasoning, and mathematics. It was trained on 11 trillion tokens using a diverse dataset combination: FineWeb-Edu, DCLM, The Stack, along with new mathematics and coding datasets that we curated and will release soon. We developed the instruct version through supervised fine-tuning (SFT) using a combination of public datasets and our own curated datasets.
We then applied Direct Preference Optimization (DPO) using [UltraFeedback](https://huggingface.co/datasets/HuggingFaceH4/ultrafeedback_binarized). The instruct model additionally supports tasks such as text rewriting, summarization and function calling thanks to datasets developed by [Argilla](https://huggingface.co/argilla) such as [Synth-APIGen-v0.1](https://huggingface.co/datasets/argilla/Synth-APIGen-v0.1). # SmolLM2 ![image/png](https://cdn-uploads.huggingface.co/production/uploads/61c141342aac764ce1654e43/y45hIMNREW7w_XpHYB_0q.png)
jimmylam6666/RoleplayV2E6-Q8_0-GGUF
jimmylam6666
2024-11-14T08:30:40Z
7
0
null
[ "gguf", "llama-cpp", "gguf-my-repo", "base_model:roy12715/RoleplayV2E6", "base_model:quantized:roy12715/RoleplayV2E6", "endpoints_compatible", "region:us", "conversational" ]
null
2024-11-14T08:30:05Z
--- base_model: roy12715/RoleplayV2E6 tags: - llama-cpp - gguf-my-repo --- # jimmylam6666/RoleplayV2E6-Q8_0-GGUF This model was converted to GGUF format from [`roy12715/RoleplayV2E6`](https://huggingface.co/roy12715/RoleplayV2E6) using llama.cpp via ggml.ai's [GGUF-my-repo](https://huggingface.co/spaces/ggml-org/gguf-my-repo) space. Refer to the [original model card](https://huggingface.co/roy12715/RoleplayV2E6) for more details on the model. ## Use with llama.cpp Install llama.cpp through brew (works on Mac and Linux) ```bash brew install llama.cpp ``` Invoke the llama.cpp server or the CLI. ### CLI: ```bash llama-cli --hf-repo jimmylam6666/RoleplayV2E6-Q8_0-GGUF --hf-file roleplayv2e6-q8_0.gguf -p "The meaning to life and the universe is" ``` ### Server: ```bash llama-server --hf-repo jimmylam6666/RoleplayV2E6-Q8_0-GGUF --hf-file roleplayv2e6-q8_0.gguf -c 2048 ``` Note: You can also use this checkpoint directly through the [usage steps](https://github.com/ggerganov/llama.cpp?tab=readme-ov-file#usage) listed in the Llama.cpp repo. Step 1: Clone llama.cpp from GitHub. ``` git clone https://github.com/ggerganov/llama.cpp ``` Step 2: Move into the llama.cpp folder and build it with the `LLAMA_CURL=1` flag along with other hardware-specific flags (for ex: LLAMA_CUDA=1 for Nvidia GPUs on Linux). ``` cd llama.cpp && LLAMA_CURL=1 make ``` Step 3: Run inference through the main binary. ``` ./llama-cli --hf-repo jimmylam6666/RoleplayV2E6-Q8_0-GGUF --hf-file roleplayv2e6-q8_0.gguf -p "The meaning to life and the universe is" ``` or ``` ./llama-server --hf-repo jimmylam6666/RoleplayV2E6-Q8_0-GGUF --hf-file roleplayv2e6-q8_0.gguf -c 2048 ```
hezarai/distilbert-fa-mask-filling
hezarai
2024-11-14T08:26:49Z
5
0
hezar
[ "hezar", "fill-mask", "fa", "base_model:hezarai/distilbert-base-fa", "base_model:finetune:hezarai/distilbert-base-fa", "region:us" ]
fill-mask
2024-01-01T16:42:36Z
--- language: - fa library_name: hezar tags: - fill-mask - hezar pipeline_tag: fill-mask base_model: - hezarai/distilbert-base-fa --- A DistilBERT model for mask filling. The base DistilBERT model is initialized from https://huggingface.co/HooshvareLab/distilbert-fa-zwnj-base. ## Usage ``` pip install hezar ``` ```python from hezar.models import Model model = Model.load("hezarai/distilbert-fa-mask-filling") inputs = ["ایشالا از [MASK] شروع میکنم"] outputs = model.predict(inputs) print(outputs) ```
hezarai/gpt2-base-fa
hezarai
2024-11-14T08:26:40Z
25
2
hezar
[ "hezar", "text-generation", "fa", "region:us" ]
text-generation
2023-10-22T10:30:03Z
--- language: - fa library_name: hezar tags: - text-generation - hezar pipeline_tag: text-generation --- The original [ParsGPT2](https://huggingface.co/HooshvareLab/gpt2-fa) model ported to Hezar for compatibility. ## Usage ``` pip install hezar ``` ```python from hezar.models import Model model = Model.load("hezarai/gpt2-base-fa") text = " در یک اتفاق شگفت انگیز، پژوهشگران" outputs = model.predict(text) print(outputs) ```
hezarai/vit-gpt2-fa-image-captioning-flickr30k
hezarai
2024-11-14T08:26:36Z
7
0
hezar
[ "hezar", "image-to-text", "fa", "dataset:hezarai/flickr30k-fa", "region:us" ]
image-to-text
2023-09-28T16:01:21Z
--- language: - fa library_name: hezar tags: - image-to-text - hezar metrics: - wer pipeline_tag: image-to-text datasets: - hezarai/flickr30k-fa --- A Persian image captioning model constructed from a ViT + GPT2 architecture trained on [flickr30k-fa](https://www.kaggle.com/datasets/sajjadayobi360/flickrfa) (created by Sajjad Ayoubi). The encoder (ViT) was initialized from https://huggingface.co/google/vit-base-patch16-224 and the decoder (GPT2) was initialized from https://huggingface.co/HooshvareLab/gpt2-fa . ## Usage ``` pip install hezar ``` ```python from hezar.models import Model model = Model.load("hezarai/vit-gpt2-fa-image-captioning-flickr30k") captions = model.predict("example_image.jpg") print(captions) ```
hezarai/trocr-base-fa-v1
hezarai
2024-11-14T08:26:31Z
20
1
hezar
[ "hezar", "image-to-text", "fa", "region:us" ]
image-to-text
2023-09-12T20:35:56Z
--- language: - fa library_name: hezar tags: - image-to-text - hezar pipeline_tag: image-to-text --- A TrOCR model for Persian. This is just a preview to verify the model works in Hezar. This model will be trained on more and better datasets soon! #### Example ```python from hezar.models import Model model = Model.load("hezarai/trocr-base-fa-v1") text = model.predict(["image.jpg"]) print(text) ```
hezarai/bert-fa-ner-arman
hezarai
2024-11-14T08:26:23Z
35
0
hezar
[ "hezar", "token-classification", "fa", "dataset:hezarai/arman-ner", "base_model:hezarai/bert-base-fa", "base_model:finetune:hezarai/bert-base-fa", "region:us" ]
token-classification
2023-08-04T12:04:23Z
--- language: - fa library_name: hezar tags: - token-classification - hezar datasets: - hezarai/arman-ner pipeline_tag: token-classification base_model: - hezarai/bert-base-fa --- A ParsBERT model trained on the [Arman-NER](https://huggingface.co/datasets/hezarai/arman-ner) dataset for Named Entity Recognition.
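The other Hezar cards above each include a quick usage snippet; for completeness, a matching sketch for this model, assuming the same `Model.load` / `predict` interface (`pip install hezar`). The example sentence is illustrative.

```python
# Minimal usage sketch, following the same Hezar API shown on the other
# hezarai model cards (assumes `pip install hezar`); the input sentence
# is illustrative.
from hezar.models import Model

model = Model.load("hezarai/bert-fa-ner-arman")
inputs = ["هوش مصنوعی در ایران"]
outputs = model.predict(inputs)
print(outputs)
```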
hezarai/bert-fa-sentiment-dksf
hezarai
2024-11-14T08:26:10Z
147
0
hezar
[ "hezar", "text-classification", "fa", "base_model:hezarai/bert-base-fa", "base_model:finetune:hezarai/bert-base-fa", "region:us" ]
text-classification
2023-03-12T07:19:00Z
--- language: - fa library_name: hezar tags: - text-classification - hezar pipeline_tag: text-classification base_model: - hezarai/bert-base-fa ---
hezarai/distilbert-fa-sentiment-dksf
hezarai
2024-11-14T08:22:29Z
12
0
hezar
[ "hezar", "text-classification", "fa", "base_model:hezarai/distilbert-base-fa", "base_model:finetune:hezarai/distilbert-base-fa", "region:us" ]
text-classification
2023-02-02T11:30:32Z
--- language: - fa library_name: hezar tags: - text-classification - hezar pipeline_tag: text-classification base_model: - hezarai/distilbert-base-fa ---
Ahanaas/HermesWithYou
Ahanaas
2024-11-14T08:13:07Z
8
0
null
[ "safetensors", "llama", "en", "base_model:NousResearch/Hermes-3-Llama-3.1-8B", "base_model:finetune:NousResearch/Hermes-3-Llama-3.1-8B", "license:mit", "region:us" ]
null
2024-11-14T07:15:45Z
--- license: mit language: - en base_model: - NousResearch/Hermes-3-Llama-3.1-8B --- ## Inference ```py !git clone https://github.com/huggingface/transformers.git %cd transformers !git checkout <commit_id_for_4.47.0.dev0> !pip install . !pip install -q accelerate==0.34.2 bitsandbytes==0.44.1 peft==0.13.1 ``` #### Importing libs ```py import os import torch from datasets import load_dataset from transformers import ( AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig, pipeline, logging, ) ``` #### Bits&Bytes Config ```py use_4bit = True # Compute dtype for 4-bit base models bnb_4bit_compute_dtype = "float16" # Quantization type (fp4 or nf4) compute_dtype = getattr(torch, bnb_4bit_compute_dtype) use_nested_quant = False bnb_4bit_quant_type = "nf4" bnb_config = BitsAndBytesConfig( load_in_4bit=use_4bit, bnb_4bit_quant_type=bnb_4bit_quant_type, bnb_4bit_compute_dtype=compute_dtype, bnb_4bit_use_double_quant=use_nested_quant, ) ``` #### Loading Model ```py # Load base model model_name = 'Ahanaas/HermesWithYou' model = AutoModelForCausalLM.from_pretrained( model_name, quantization_config=bnb_config, device_map=0 ) ``` #### Loading Tokenizer ```py # Load tokenizer tokenizer = AutoTokenizer.from_pretrained(model_name, trust_remote_code=True) tokenizer.pad_token = tokenizer.eos_token tokenizer.padding_side = "right" ``` # Predictions ```py # Run a text generation pipeline with the model system_prompt = '''''' prompt = '''''' pipe = pipeline( task="text-generation", model=model, tokenizer=tokenizer, max_new_tokens=128, # Increase this to allow for longer outputs temperature=0.5, # Encourages more varied outputs top_k=50, # Limits to the top 50 tokens do_sample=True, # Enables sampling return_full_text=True, ) result = pipe(f"<|im_start|>system\n {system_prompt}\n<|im_end|>\n<|im_start|>user\n{prompt}\n<|im_end|>\n<|im_start|>assistant\n") # print(result[0]['generated_text']) generated_text = result[0]['generated_text'] # Print the extracted response text print(generated_text) ```
creatoryoon/trJangguToTimpani_batch4_chk100to1500_fp16
creatoryoon
2024-11-14T08:03:49Z
35
0
diffusers
[ "diffusers", "tensorboard", "safetensors", "diffusers:StableDiffusionInstructPix2PixPipeline", "region:us" ]
null
2024-11-13T06:10:57Z
# Model list # Description of each model's training data # trJangguToTimpani: a first trial with janggu only — handmade (change A to B), 20 janggu examples # tr0: handmade (change A to B) recognition data (initial data) — janggu, yanggeum, etc. # tr: handmade (change A to B) + reco (change Joseon to middle age) # tr2: reco (change Joseon to middle age) # tr3: handmade (change Joseon to middle age) + reco (change Joseon to middle age) # tr4: reco (apply custom artistic style '!asacdls1jo673213@#*') # tr5: handmade (apply custom artistic style '!asacdls1jo673213@#*') + reco (apply custom artistic style '!asacdls1jo673213@#*') # tr6: handmade (change A to B) # tr7: handmade (apply custom artistic style '!asacdls1jo673213@#*') # tr8: handmade (change A to B) + reco (apply custom artistic style '!asacdls1jo673213@#*') # tr9: handmade (apply custom artistic style '!asacdls1jo673213@#*') + input-adjusted reco (apply custom artistic style '!asacdls1jo673213@#*') # tr10: input-adjusted reco (apply custom artistic style '!asacdls1jo673213@#*') # tr11: handmade (change A to B) + input-adjusted reco (apply custom artistic style '!asacdls1jo673213@#*') # tr12: handmade (apply custom artistic style '!asacdls1jo673213@#*') + reviewed and input-adjusted reco (apply custom artistic style '!asacdls1jo673213@#*') # tr13: reviewed and input-adjusted reco (apply custom artistic style '!asacdls1jo673213@#*') # tr14: handmade (change A to B) + reviewed and input-adjusted reco (apply custom artistic style '!asacdls1jo673213@#*') ''' trJangguToTimpani_batch4_chk100to1500_fp16 ** note: for this model the checkpoints start at 100, step by 100, and end at 1500 creatoryoon/tr0_batch8_chk200to2000_mixed_precision_fp16 ** note: for this model the checkpoints start at 200, step by 200, and end at 2000 creatoryoon/tr1_batch64_15000_mixed_precision_fp16 creatoryoon/tr2_batch16_15000_mixed_precision_bf16 creatoryoon/tr3_batch16_15000_mixed_precision_bf16 creatoryoon/tr4_batch64_15000_mixed_precision_fp16 creatoryoon/tr5_batch16_15000_mixed_precision_bf16 creatoryoon/tr5_batch64_15000_mixed_precision_fp16 creatoryoon/tr6_batch16_15000_mixed_precision_bf16 creatoryoon/tr7_batch16_15000_mixed_precision_fp32 creatoryoon/tr8_batch16_15000_mixed_precision_bf16 creatoryoon/tr9_batch16_15000_mixed_precision_fp32 creatoryoon/tr10_batch16_15000_mixed_precision_bf16 creatoryoon/tr11_batch16_15000_mixed_precision_bf16 creatoryoon/tr12_batch16_15000_mixed_precision_fp32 creatoryoon/tr13_batch16_15000_mixed_precision_fp32 creatoryoon/tr14_batch16_15000_mixed_precision_bf16 '''
creatoryoon/tr14_batch16_15000_mixed_precision_bf16
creatoryoon
2024-11-14T08:03:24Z
28
0
diffusers
[ "diffusers", "tensorboard", "safetensors", "diffusers:StableDiffusionInstructPix2PixPipeline", "region:us" ]
null
2024-11-12T21:47:32Z
# Model list
# Description of each model's training data
# trJangguToTimpani: a quick first trial with janggu only; handmade (change A to B), 20 janggu images only
# tr0: handmade (change A to B) recognition data (initial data): janggu, yanggeum, etc.
# tr1: handmade (change A to B) + reco (change Joseon to middle age)
# tr2: reco (change Joseon to middle age)
# tr3: handmade (change Joseon to middle age) + reco (change Joseon to middle age)
# tr4: reco (apply custom artistic style '!asacdls1jo673213@#*')
# tr5: handmade (apply custom artistic style '!asacdls1jo673213@#*') + reco (apply custom artistic style '!asacdls1jo673213@#*')
# tr6: handmade (change A to B)
# tr7: handmade (apply custom artistic style '!asacdls1jo673213@#*')
# tr8: handmade (change A to B) + reco (apply custom artistic style '!asacdls1jo673213@#*')
# tr9: handmade (apply custom artistic style '!asacdls1jo673213@#*') + input-adjusted reco (apply custom artistic style '!asacdls1jo673213@#*')
# tr10: input-adjusted reco (apply custom artistic style '!asacdls1jo673213@#*')
# tr11: handmade (change A to B) + input-adjusted reco (apply custom artistic style '!asacdls1jo673213@#*')
# tr12: handmade (apply custom artistic style '!asacdls1jo673213@#*') + reviewed and input-adjusted reco (apply custom artistic style '!asacdls1jo673213@#*')
# tr13: reviewed and input-adjusted reco (apply custom artistic style '!asacdls1jo673213@#*')
# tr14: handmade (change A to B) + reviewed and input-adjusted reco (apply custom artistic style '!asacdls1jo673213@#*')
'''
trJangguToTimpani_batch4_chk100to1500_fp16
** Note: checkpoints start at 100, step by 100, and end at 1500.
creatoryoon/tr0_batch8_chk200to2000_mixed_precision_fp16
** Note: checkpoints start at 200, step by 200, and end at 2000.
creatoryoon/tr1_batch64_15000_mixed_precision_fp16
creatoryoon/tr2_batch16_15000_mixed_precision_bf16
creatoryoon/tr3_batch16_15000_mixed_precision_bf16
creatoryoon/tr4_batch64_15000_mixed_precision_fp16
creatoryoon/tr5_batch16_15000_mixed_precision_bf16
creatoryoon/tr5_batch64_15000_mixed_precision_fp16
creatoryoon/tr6_batch16_15000_mixed_precision_bf16
creatoryoon/tr7_batch16_15000_mixed_precision_fp32
creatoryoon/tr8_batch16_15000_mixed_precision_bf16
creatoryoon/tr9_batch16_15000_mixed_precision_fp32
creatoryoon/tr10_batch16_15000_mixed_precision_bf16
creatoryoon/tr11_batch16_15000_mixed_precision_bf16
creatoryoon/tr12_batch16_15000_mixed_precision_fp32
creatoryoon/tr13_batch16_15000_mixed_precision_fp32
creatoryoon/tr14_batch16_15000_mixed_precision_bf16
'''
GoldenLlama/krx_sg_qwen2.5_7b_it_v1
GoldenLlama
2024-11-14T08:02:28Z
8
0
null
[ "safetensors", "qwen2", "krx", "unsloth", "trl", "sft", "text-generation", "conversational", "ko", "en", "base_model:unsloth/Qwen2.5-7B-Instruct", "base_model:finetune:unsloth/Qwen2.5-7B-Instruct", "license:apache-2.0", "region:us" ]
text-generation
2024-11-13T00:42:59Z
--- license: apache-2.0 language: - ko - en base_model: - unsloth/Qwen2.5-7B-Instruct pipeline_tag: text-generation tags: - krx - unsloth - trl - sft ---
Whalejay/bert-sw_over_other_x2
Whalejay
2024-11-14T07:56:05Z
115
0
transformers
[ "transformers", "safetensors", "distilbert", "question-answering", "arxiv:1910.09700", "endpoints_compatible", "region:us" ]
question-answering
2024-11-14T07:55:38Z
--- library_name: transformers tags: [] --- # Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
BlinkDL/rwkv-6-misc
BlinkDL
2024-11-14T07:54:44Z
0
20
null
[ "base_model:BlinkDL/rwkv-6-world", "base_model:finetune:BlinkDL/rwkv-6-world", "license:apache-2.0", "region:us" ]
null
2024-04-30T05:19:47Z
--- license: apache-2.0 base_model: BlinkDL/rwkv-6-world ---

Here are some finetuned RWKV-6 models :)

Base model: https://huggingface.co/BlinkDL/rwkv-6-world

https://github.com/BlinkDL/RWKV-LM

https://www.rwkv.com/

* chnnovel: finetuned on Chinese (chn) novels + some instruction data
* jpn: finetuned on Japanese (jpn) data + some instruction data

What's next:

* general chat-tuned
* Arabic
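A minimal inference sketch with the `rwkv` pip package; the checkpoint name below is a placeholder, so point it at an actual .pth file downloaded from this repo (extension omitted, per the package's convention):

```python
from rwkv.model import RWKV
from rwkv.utils import PIPELINE, PIPELINE_ARGS

# Placeholder path: the checkpoint name here is an assumption -- use a real
# file from this repo. The strategy string follows the rwkv package docs.
model = RWKV(model="rwkv-6-chnnovel", strategy="cuda fp16")
pipeline = PIPELINE(model, "rwkv_vocab_v20230424")  # world-model vocabulary

args = PIPELINE_ARGS(temperature=1.0, top_p=0.7)
print(pipeline.generate("User: Hello!\n\nAssistant:", token_count=128, args=args))
```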
nicolofelicioni/pythia-1b-sft-hh-normal-13
nicolofelicioni
2024-11-14T07:47:38Z
120
0
transformers
[ "transformers", "safetensors", "gpt_neox", "text-generation", "trl", "dpo", "arxiv:1910.09700", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2024-11-14T07:44:14Z
--- library_name: transformers tags: - trl - dpo --- # Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
som111/modelxx
som111
2024-11-14T07:46:40Z
27
0
transformers
[ "transformers", "gguf", "llama", "text-generation-inference", "unsloth", "en", "dataset:ruslanmv/ai-medical-chatbot", "license:apache-2.0", "endpoints_compatible", "region:us", "conversational" ]
null
2024-11-14T07:26:22Z
--- base_model: unsloth/llama-3.2-3b-instruct-bnb-4bit tags: - text-generation-inference - transformers - unsloth - llama - gguf license: apache-2.0 language: - en datasets: - ruslanmv/ai-medical-chatbot --- # Uploaded model - **Developed by:** som111 - **License:** apache-2.0 - **Finetuned from model :** unsloth/llama-3.2-3b-instruct-bnb-4bit This llama model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Huggingface's TRL library. [<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
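Since the weights are exported as GGUF, one way to run them locally is `llama-cpp-python`; a minimal sketch, where the filename pattern is an assumption (check the repo's file list for the exact quant name):

```python
from llama_cpp import Llama

# Downloads a matching GGUF file from the Hub and loads it; the glob below
# is an assumption -- adjust it to the exact quant file you want.
llm = Llama.from_pretrained(
    repo_id="som111/modelxx",
    filename="*.gguf",
    n_ctx=4096,
)

out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "What are common symptoms of the flu?"}],
    max_tokens=256,
)
print(out["choices"][0]["message"]["content"])
```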
linhqyy/jazzy-st-1411
linhqyy
2024-11-14T07:25:00Z
16
0
diffusers
[ "diffusers", "flux", "flux-diffusers", "text-to-image", "simpletuner", "safe-for-work", "lora", "template:sd-lora", "standard", "base_model:black-forest-labs/FLUX.1-dev", "base_model:adapter:black-forest-labs/FLUX.1-dev", "license:other", "region:us" ]
text-to-image
2024-11-14T03:59:56Z
--- license: other base_model: "black-forest-labs/FLUX.1-dev" tags: - flux - flux-diffusers - text-to-image - diffusers - simpletuner - safe-for-work - lora - template:sd-lora - standard inference: true widget: - text: 'unconditional (blank prompt)' parameters: negative_prompt: 'blurry, cropped, ugly' output: url: ./assets/image_0_0.png - text: 'A girl in light blue sits at the bar counter, holding an ice-cold wine glass and drinking alone on top of the Eiffel Tower, with a night view outside the window.. It features a close-up shot of her sitting by herself. She has long hair, wears glasses, faces away from the camera, and is wearing white shoes, black pants, a gray jacket, and a green scarf. with bright colors and a Paris night background featuring the Eiffel Tower. The composition is elegant, with the woman sitting on a high stool' parameters: negative_prompt: 'blurry, cropped, ugly' output: url: ./assets/image_1_0.png --- # jazzy-st-1411 This is a standard PEFT LoRA derived from [black-forest-labs/FLUX.1-dev](https://huggingface.co/black-forest-labs/FLUX.1-dev). The main validation prompt used during training was: ``` A girl in light blue sits at the bar counter, holding an ice-cold wine glass and drinking alone on top of the Eiffel Tower, with a night view outside the window.. It features a close-up shot of her sitting by herself. She has long hair, wears glasses, faces away from the camera, and is wearing white shoes, black pants, a gray jacket, and a green scarf. with bright colors and a Paris night background featuring the Eiffel Tower. The composition is elegant, with the woman sitting on a high stool ``` ## Validation settings - CFG: `3.0` - CFG Rescale: `0.0` - Steps: `20` - Sampler: `None` - Seed: `42` - Resolution: `1024x1024` Note: The validation settings are not necessarily the same as the [training settings](#training-settings). You can find some example images in the following gallery: <Gallery /> The text encoder **was not** trained. You may reuse the base model text encoder for inference. 
## Training settings

- Training epochs: 5
- Training steps: 5500
- Learning rate: 0.0004
- Max grad norm: 2.0
- Effective batch size: 1
- Micro-batch size: 1
- Gradient accumulation steps: 1
- Number of GPUs: 1
- Prediction type: flow-matching (extra parameters=['shift=3', 'flux_guidance_value=1.0', 'flux_lora_target=all'])
- Rescaled betas zero SNR: False
- Optimizer: adamw_bf16
- Precision: Pure BF16
- Quantised: No
- Xformers: Not used
- LoRA Rank: 32
- LoRA Alpha: 32.0
- LoRA Dropout: 0.1
- LoRA initialisation style: default

## Datasets

### jazzy-512
- Repeats: 10
- Total number of images: 28
- Total number of aspect buckets: 2
- Resolution: 0.262144 megapixels
- Cropped: False
- Crop style: None
- Crop aspect: None
- Used for regularisation data: No

### jazzy-768
- Repeats: 10
- Total number of images: 28
- Total number of aspect buckets: 1
- Resolution: 0.589824 megapixels
- Cropped: False
- Crop style: None
- Crop aspect: None
- Used for regularisation data: No

### jazzy-1024
- Repeats: 10
- Total number of images: 28
- Total number of aspect buckets: 1
- Resolution: 1.048576 megapixels
- Cropped: False
- Crop style: None
- Crop aspect: None
- Used for regularisation data: No

## Inference

```python
import torch
from diffusers import DiffusionPipeline

model_id = 'black-forest-labs/FLUX.1-dev'
adapter_id = 'linhqyy/jazzy-st-1411'
pipeline = DiffusionPipeline.from_pretrained(model_id, torch_dtype=torch.bfloat16)  # loading directly in bf16
pipeline.load_lora_weights(adapter_id)

prompt = "A girl in light blue sits at the bar counter, holding an ice-cold wine glass and drinking alone on top of the Eiffel Tower, with a night view outside the window.. It features a close-up shot of her sitting by herself. She has long hair, wears glasses, faces away from the camera, and is wearing white shoes, black pants, a gray jacket, and a green scarf. with bright colors and a Paris night background featuring the Eiffel Tower. The composition is elegant, with the woman sitting on a high stool"

## Optional: quantise the model to save on vram.
## Note: The model was not quantised during training, so it is not necessary to quantise it during inference time.
#from optimum.quanto import quantize, freeze, qint8
#quantize(pipeline.transformer, weights=qint8)
#freeze(pipeline.transformer)

pipeline.to('cuda' if torch.cuda.is_available() else 'mps' if torch.backends.mps.is_available() else 'cpu')  # the pipeline is already in its target precision level

image = pipeline(
    prompt=prompt,
    num_inference_steps=20,
    generator=torch.Generator(device='cuda' if torch.cuda.is_available() else 'mps' if torch.backends.mps.is_available() else 'cpu').manual_seed(1641421826),
    width=1024,
    height=1024,
    guidance_scale=3.0,
).images[0]
image.save("output.png", format="PNG")
```
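If you want to trade adapter flexibility for a little speed, the LoRA can also be fused into the base weights after loading; this is standard diffusers API rather than anything specific to this adapter:

```python
# Fuse the adapter into the transformer weights, then discard the LoRA state.
# lora_scale=1.0 keeps the adapter at full strength.
pipeline.fuse_lora(lora_scale=1.0)
pipeline.unload_lora_weights()
```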
MayBashendy/Arabic_FineTuningAraBERT_AugV4-trial2_k35_task1_organization_fold0
MayBashendy
2024-11-14T07:20:49Z
162
0
transformers
[ "transformers", "safetensors", "bert", "text-classification", "generated_from_trainer", "base_model:aubmindlab/bert-base-arabertv02", "base_model:finetune:aubmindlab/bert-base-arabertv02", "autotrain_compatible", "endpoints_compatible", "region:us" ]
text-classification
2024-11-14T06:33:17Z
--- library_name: transformers base_model: aubmindlab/bert-base-arabertv02 tags: - generated_from_trainer model-index: - name: Arabic_FineTuningAraBERT_AugV4-trial2_k35_task1_organization_fold0 results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # Arabic_FineTuningAraBERT_AugV4-trial2_k35_task1_organization_fold0 This model is a fine-tuned version of [aubmindlab/bert-base-arabertv02](https://huggingface.co/aubmindlab/bert-base-arabertv02) on the None dataset. It achieves the following results on the evaluation set: - Loss: 0.8527 - Qwk: 0.7626 - Mse: 0.8527 - Rmse: 0.9234 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 10 ### Training results | Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse | |:-------------:|:------:|:----:|:---------------:|:-------:|:------:|:------:| | No log | 0.0030 | 2 | 4.7593 | -0.0132 | 4.7593 | 2.1816 | | No log | 0.0059 | 4 | 2.4852 | 0.0904 | 2.4852 | 1.5764 | | No log | 0.0089 | 6 | 1.5858 | 0.2434 | 1.5858 | 1.2593 | | No log | 0.0118 | 8 | 1.5146 | 0.0742 | 1.5146 | 1.2307 | | No log | 0.0148 | 10 | 1.6104 | 0.1639 | 1.6104 | 1.2690 | | No log | 0.0178 | 12 | 1.7833 | 0.1873 | 1.7833 | 1.3354 | | No log | 0.0207 | 14 | 2.0225 | 0.1873 | 2.0225 | 1.4222 | | No log | 0.0237 | 16 | 2.0302 | 0.1873 | 2.0302 | 1.4249 | | No log | 0.0266 | 18 | 2.0927 | 0.1075 | 2.0927 | 1.4466 | | No log | 0.0296 | 20 | 2.0956 | 0.1873 | 2.0956 | 1.4476 | | No log | 0.0325 | 22 | 1.7700 | 0.1893 | 1.7700 | 1.3304 | | No log | 0.0355 | 24 | 1.6574 | 0.2184 | 1.6574 | 1.2874 | | No log | 0.0385 | 26 | 1.6173 | 0.2435 | 1.6173 | 1.2717 | | No log | 0.0414 | 28 | 2.0235 | 0.1075 | 2.0235 | 1.4225 | | No log | 0.0444 | 30 | 2.1730 | 0.3402 | 2.1730 | 1.4741 | | No log | 0.0473 | 32 | 1.8696 | 0.3333 | 1.8696 | 1.3673 | | No log | 0.0503 | 34 | 1.4195 | 0.3243 | 1.4195 | 1.1914 | | No log | 0.0533 | 36 | 1.2090 | 0.3527 | 1.2090 | 1.0995 | | No log | 0.0562 | 38 | 1.2326 | 0.4282 | 1.2326 | 1.1102 | | No log | 0.0592 | 40 | 1.5126 | 0.3531 | 1.5126 | 1.2299 | | No log | 0.0621 | 42 | 2.1875 | 0.4783 | 2.1875 | 1.4790 | | No log | 0.0651 | 44 | 2.8127 | 0.3670 | 2.8127 | 1.6771 | | No log | 0.0680 | 46 | 2.7083 | 0.2553 | 2.7083 | 1.6457 | | No log | 0.0710 | 48 | 2.2321 | 0.2873 | 2.2321 | 1.4940 | | No log | 0.0740 | 50 | 1.6187 | 0.2416 | 1.6187 | 1.2723 | | No log | 0.0769 | 52 | 1.5202 | 0.3478 | 1.5202 | 1.2330 | | No log | 0.0799 | 54 | 1.7783 | 0.3870 | 1.7783 | 1.3335 | | No log | 0.0828 | 56 | 1.8897 | 0.3824 | 1.8897 | 1.3747 | | No log | 0.0858 | 58 | 1.8786 | 0.3274 | 1.8786 | 1.3706 | | No log | 0.0888 | 60 | 1.6867 | 0.3787 | 1.6867 | 1.2987 | | No log | 0.0917 | 62 | 1.4902 | 0.3478 | 1.4902 | 1.2207 | | No log | 0.0947 | 64 | 1.5438 | 0.3478 | 1.5438 | 1.2425 | | No log | 0.0976 | 66 | 1.8817 | 0.2145 | 1.8817 | 1.3718 | | No log | 0.1006 | 68 | 2.0589 | 0.3182 | 2.0589 | 1.4349 | | No log | 0.1036 | 70 | 2.2693 | 0.3029 | 2.2693 | 1.5064 | | No log | 0.1065 | 72 | 2.3478 | 0.3029 | 2.3478 | 1.5323 | | No log | 
0.1095 | 74 | 2.2408 | 0.4596 | 2.2408 | 1.4969 | | No log | 0.1124 | 76 | 2.2201 | 0.4796 | 2.2201 | 1.4900 | | No log | 0.1154 | 78 | 2.5718 | 0.4431 | 2.5718 | 1.6037 | | No log | 0.1183 | 80 | 2.5926 | 0.4974 | 2.5926 | 1.6102 | | No log | 0.1213 | 82 | 2.1065 | 0.4796 | 2.1065 | 1.4514 | | No log | 0.1243 | 84 | 1.5341 | 0.6211 | 1.5341 | 1.2386 | | No log | 0.1272 | 86 | 1.3667 | 0.6812 | 1.3667 | 1.1691 | | No log | 0.1302 | 88 | 1.4645 | 0.6211 | 1.4645 | 1.2102 | | No log | 0.1331 | 90 | 1.7898 | 0.5385 | 1.7898 | 1.3378 | | No log | 0.1361 | 92 | 2.3346 | 0.4793 | 2.3346 | 1.5280 | | No log | 0.1391 | 94 | 2.7006 | 0.1794 | 2.7006 | 1.6433 | | No log | 0.1420 | 96 | 2.4862 | 0.1180 | 2.4862 | 1.5768 | | No log | 0.1450 | 98 | 1.9567 | 0.3119 | 1.9567 | 1.3988 | | No log | 0.1479 | 100 | 1.7088 | 0.5743 | 1.7088 | 1.3072 | | No log | 0.1509 | 102 | 1.7416 | 0.5804 | 1.7416 | 1.3197 | | No log | 0.1538 | 104 | 1.9250 | 0.4991 | 1.9250 | 1.3875 | | No log | 0.1568 | 106 | 1.8031 | 0.5200 | 1.8031 | 1.3428 | | No log | 0.1598 | 108 | 2.0587 | 0.4987 | 2.0587 | 1.4348 | | No log | 0.1627 | 110 | 2.3832 | 0.4234 | 2.3832 | 1.5438 | | No log | 0.1657 | 112 | 2.3690 | 0.4987 | 2.3690 | 1.5391 | | No log | 0.1686 | 114 | 2.0164 | 0.5177 | 2.0164 | 1.4200 | | No log | 0.1716 | 116 | 2.0101 | 0.5177 | 2.0101 | 1.4178 | | No log | 0.1746 | 118 | 2.2453 | 0.5177 | 2.2453 | 1.4984 | | No log | 0.1775 | 120 | 2.5961 | 0.3921 | 2.5961 | 1.6112 | | No log | 0.1805 | 122 | 2.8159 | 0.3280 | 2.8159 | 1.6781 | | No log | 0.1834 | 124 | 2.4890 | 0.4247 | 2.4890 | 1.5777 | | No log | 0.1864 | 126 | 2.1573 | 0.3977 | 2.1573 | 1.4688 | | No log | 0.1893 | 128 | 1.9296 | 0.5417 | 1.9296 | 1.3891 | | No log | 0.1923 | 130 | 1.8589 | 0.5200 | 1.8589 | 1.3634 | | No log | 0.1953 | 132 | 2.0585 | 0.5177 | 2.0585 | 1.4347 | | No log | 0.1982 | 134 | 2.0733 | 0.5177 | 2.0733 | 1.4399 | | No log | 0.2012 | 136 | 2.3734 | 0.4099 | 2.3734 | 1.5406 | | No log | 0.2041 | 138 | 2.4009 | 0.4099 | 2.4009 | 1.5495 | | No log | 0.2071 | 140 | 2.2114 | 0.4234 | 2.2114 | 1.4871 | | No log | 0.2101 | 142 | 2.2440 | 0.4043 | 2.2440 | 1.4980 | | No log | 0.2130 | 144 | 2.2808 | 0.4043 | 2.2808 | 1.5102 | | No log | 0.2160 | 146 | 2.5396 | 0.3527 | 2.5396 | 1.5936 | | No log | 0.2189 | 148 | 2.5210 | 0.3527 | 2.5210 | 1.5878 | | No log | 0.2219 | 150 | 2.4003 | 0.4297 | 2.4003 | 1.5493 | | No log | 0.2249 | 152 | 2.0669 | 0.4793 | 2.0669 | 1.4377 | | No log | 0.2278 | 154 | 1.9158 | 0.4793 | 1.9158 | 1.3841 | | No log | 0.2308 | 156 | 2.0965 | 0.4210 | 2.0965 | 1.4479 | | No log | 0.2337 | 158 | 2.3127 | 0.3527 | 2.3127 | 1.5208 | | No log | 0.2367 | 160 | 2.0917 | 0.4011 | 2.0917 | 1.4463 | | No log | 0.2396 | 162 | 2.1554 | 0.4043 | 2.1554 | 1.4681 | | No log | 0.2426 | 164 | 2.2684 | 0.4256 | 2.2684 | 1.5061 | | No log | 0.2456 | 166 | 2.1182 | 0.4625 | 2.1182 | 1.4554 | | No log | 0.2485 | 168 | 2.0176 | 0.4625 | 2.0176 | 1.4204 | | No log | 0.2515 | 170 | 1.7531 | 0.5965 | 1.7531 | 1.3240 | | No log | 0.2544 | 172 | 1.5850 | 0.5965 | 1.5850 | 1.2590 | | No log | 0.2574 | 174 | 1.5109 | 0.6303 | 1.5109 | 1.2292 | | No log | 0.2604 | 176 | 1.6340 | 0.5556 | 1.6340 | 1.2783 | | No log | 0.2633 | 178 | 1.8895 | 0.5177 | 1.8895 | 1.3746 | | No log | 0.2663 | 180 | 2.0193 | 0.4799 | 2.0193 | 1.4210 | | No log | 0.2692 | 182 | 1.8525 | 0.5366 | 1.8525 | 1.3611 | | No log | 0.2722 | 184 | 1.4371 | 0.5638 | 1.4371 | 1.1988 | | No log | 0.2751 | 186 | 1.3255 | 0.5638 | 1.3255 | 1.1513 | | No log | 0.2781 | 188 | 1.5223 | 0.5600 | 
1.5223 | 1.2338 | | No log | 0.2811 | 190 | 1.8304 | 0.5341 | 1.8304 | 1.3529 | | No log | 0.2840 | 192 | 2.2135 | 0.5158 | 2.2135 | 1.4878 | | No log | 0.2870 | 194 | 2.3865 | 0.4324 | 2.3865 | 1.5448 | | No log | 0.2899 | 196 | 2.0958 | 0.5679 | 2.0958 | 1.4477 | | No log | 0.2929 | 198 | 2.0300 | 0.5679 | 2.0300 | 1.4248 | | No log | 0.2959 | 200 | 2.0011 | 0.5679 | 2.0011 | 1.4146 | | No log | 0.2988 | 202 | 2.3881 | 0.4324 | 2.3881 | 1.5453 | | No log | 0.3018 | 204 | 2.4854 | 0.4324 | 2.4854 | 1.5765 | | No log | 0.3047 | 206 | 1.9500 | 0.6201 | 1.9500 | 1.3964 | | No log | 0.3077 | 208 | 1.1513 | 0.6331 | 1.1513 | 1.0730 | | No log | 0.3107 | 210 | 0.9155 | 0.7631 | 0.9155 | 0.9568 | | No log | 0.3136 | 212 | 1.0646 | 0.6543 | 1.0646 | 1.0318 | | No log | 0.3166 | 214 | 1.4993 | 0.5965 | 1.4993 | 1.2245 | | No log | 0.3195 | 216 | 2.3008 | 0.4324 | 2.3008 | 1.5168 | | No log | 0.3225 | 218 | 2.7024 | 0.4324 | 2.7024 | 1.6439 | | No log | 0.3254 | 220 | 2.4805 | 0.4664 | 2.4805 | 1.5750 | | No log | 0.3284 | 222 | 2.1059 | 0.5470 | 2.1059 | 1.4512 | | No log | 0.3314 | 224 | 1.9829 | 0.5823 | 1.9829 | 1.4082 | | No log | 0.3343 | 226 | 2.0696 | 0.5823 | 2.0696 | 1.4386 | | No log | 0.3373 | 228 | 1.9646 | 0.5510 | 1.9646 | 1.4016 | | No log | 0.3402 | 230 | 1.6215 | 0.6356 | 1.6215 | 1.2734 | | No log | 0.3432 | 232 | 1.6405 | 0.6356 | 1.6405 | 1.2808 | | No log | 0.3462 | 234 | 1.8482 | 0.5510 | 1.8482 | 1.3595 | | No log | 0.3491 | 236 | 1.9110 | 0.4983 | 1.9110 | 1.3824 | | No log | 0.3521 | 238 | 1.6427 | 0.5188 | 1.6427 | 1.2817 | | No log | 0.3550 | 240 | 1.5026 | 0.5638 | 1.5026 | 1.2258 | | No log | 0.3580 | 242 | 1.4896 | 0.5638 | 1.4896 | 1.2205 | | No log | 0.3609 | 244 | 1.7848 | 0.5166 | 1.7848 | 1.3360 | | No log | 0.3639 | 246 | 2.0768 | 0.4983 | 2.0768 | 1.4411 | | No log | 0.3669 | 248 | 2.1792 | 0.4983 | 2.1792 | 1.4762 | | No log | 0.3698 | 250 | 2.1126 | 0.5166 | 2.1126 | 1.4535 | | No log | 0.3728 | 252 | 1.8611 | 0.5889 | 1.8611 | 1.3642 | | No log | 0.3757 | 254 | 1.5523 | 0.6071 | 1.5523 | 1.2459 | | No log | 0.3787 | 256 | 1.6026 | 0.6071 | 1.6026 | 1.2659 | | No log | 0.3817 | 258 | 1.9501 | 0.5882 | 1.9501 | 1.3965 | | No log | 0.3846 | 260 | 2.1583 | 0.5158 | 2.1583 | 1.4691 | | No log | 0.3876 | 262 | 1.9369 | 0.5158 | 1.9369 | 1.3917 | | No log | 0.3905 | 264 | 1.6539 | 0.4583 | 1.6539 | 1.2860 | | No log | 0.3935 | 266 | 1.5456 | 0.4571 | 1.5456 | 1.2432 | | No log | 0.3964 | 268 | 1.7069 | 0.4348 | 1.7069 | 1.3065 | | No log | 0.3994 | 270 | 2.1274 | 0.4431 | 2.1274 | 1.4586 | | No log | 0.4024 | 272 | 2.4146 | 0.4247 | 2.4146 | 1.5539 | | No log | 0.4053 | 274 | 2.4327 | 0.4431 | 2.4327 | 1.5597 | | No log | 0.4083 | 276 | 2.0935 | 0.5158 | 2.0935 | 1.4469 | | No log | 0.4112 | 278 | 1.8193 | 0.5358 | 1.8193 | 1.3488 | | No log | 0.4142 | 280 | 1.6251 | 0.5548 | 1.6251 | 1.2748 | | No log | 0.4172 | 282 | 1.7854 | 0.5548 | 1.7854 | 1.3362 | | No log | 0.4201 | 284 | 2.3406 | 0.5326 | 2.3406 | 1.5299 | | No log | 0.4231 | 286 | 2.8120 | 0.3453 | 2.8120 | 1.6769 | | No log | 0.4260 | 288 | 2.6418 | 0.4469 | 2.6418 | 1.6254 | | No log | 0.4290 | 290 | 2.2237 | 0.4983 | 2.2237 | 1.4912 | | No log | 0.4320 | 292 | 2.1114 | 0.5358 | 2.1114 | 1.4531 | | No log | 0.4349 | 294 | 2.0559 | 0.5376 | 2.0559 | 1.4338 | | No log | 0.4379 | 296 | 1.8434 | 0.6000 | 1.8434 | 1.3577 | | No log | 0.4408 | 298 | 1.7514 | 0.6000 | 1.7514 | 1.3234 | | No log | 0.4438 | 300 | 1.9229 | 0.5573 | 1.9229 | 1.3867 | | No log | 0.4467 | 302 | 1.7978 | 0.5573 | 1.7978 | 1.3408 | | 
No log | 0.4497 | 304 | 1.5417 | 0.5404 | 1.5417 | 1.2417 | | No log | 0.4527 | 306 | 1.6721 | 0.5573 | 1.6721 | 1.2931 | | No log | 0.4556 | 308 | 1.8258 | 0.4991 | 1.8258 | 1.3512 | | No log | 0.4586 | 310 | 2.1502 | 0.4796 | 2.1502 | 1.4664 | | No log | 0.4615 | 312 | 2.5850 | 0.3921 | 2.5850 | 1.6078 | | No log | 0.4645 | 314 | 2.5081 | 0.3921 | 2.5081 | 1.5837 | | No log | 0.4675 | 316 | 2.0185 | 0.4210 | 2.0185 | 1.4208 | | No log | 0.4704 | 318 | 1.5217 | 0.6157 | 1.5217 | 1.2336 | | No log | 0.4734 | 320 | 1.5176 | 0.6157 | 1.5176 | 1.2319 | | No log | 0.4763 | 322 | 1.8374 | 0.3738 | 1.8374 | 1.3555 | | No log | 0.4793 | 324 | 2.5369 | 0.4247 | 2.5369 | 1.5928 | | No log | 0.4822 | 326 | 3.0166 | 0.4149 | 3.0166 | 1.7368 | | No log | 0.4852 | 328 | 2.9030 | 0.4149 | 2.9030 | 1.7038 | | No log | 0.4882 | 330 | 2.3033 | 0.3812 | 2.3033 | 1.5177 | | No log | 0.4911 | 332 | 1.5720 | 0.5011 | 1.5720 | 1.2538 | | No log | 0.4941 | 334 | 1.2217 | 0.5047 | 1.2217 | 1.1053 | | No log | 0.4970 | 336 | 1.1634 | 0.4806 | 1.1634 | 1.0786 | | No log | 0.5 | 338 | 1.3476 | 0.6839 | 1.3476 | 1.1609 | | No log | 0.5030 | 340 | 1.5272 | 0.5927 | 1.5272 | 1.2358 | | No log | 0.5059 | 342 | 1.8521 | 0.4995 | 1.8521 | 1.3609 | | No log | 0.5089 | 344 | 1.8133 | 0.5177 | 1.8133 | 1.3466 | | No log | 0.5118 | 346 | 1.4281 | 0.6008 | 1.4281 | 1.1950 | | No log | 0.5148 | 348 | 1.1405 | 0.6265 | 1.1405 | 1.0679 | | No log | 0.5178 | 350 | 1.1643 | 0.6356 | 1.1643 | 1.0790 | | No log | 0.5207 | 352 | 1.4326 | 0.6435 | 1.4326 | 1.1969 | | No log | 0.5237 | 354 | 1.8958 | 0.6311 | 1.8958 | 1.3769 | | No log | 0.5266 | 356 | 2.0355 | 0.6375 | 2.0355 | 1.4267 | | No log | 0.5296 | 358 | 1.8620 | 0.5800 | 1.8620 | 1.3645 | | No log | 0.5325 | 360 | 1.7379 | 0.5686 | 1.7379 | 1.3183 | | No log | 0.5355 | 362 | 1.4849 | 0.5744 | 1.4849 | 1.2186 | | No log | 0.5385 | 364 | 1.4820 | 0.5744 | 1.4820 | 1.2174 | | No log | 0.5414 | 366 | 1.7132 | 0.5686 | 1.7132 | 1.3089 | | No log | 0.5444 | 368 | 1.8751 | 0.5164 | 1.8751 | 1.3693 | | No log | 0.5473 | 370 | 1.9427 | 0.5164 | 1.9427 | 1.3938 | | No log | 0.5503 | 372 | 1.9238 | 0.5489 | 1.9238 | 1.3870 | | No log | 0.5533 | 374 | 1.9908 | 0.4823 | 1.9908 | 1.4110 | | No log | 0.5562 | 376 | 1.7852 | 0.6376 | 1.7852 | 1.3361 | | No log | 0.5592 | 378 | 1.4042 | 0.6934 | 1.4042 | 1.1850 | | No log | 0.5621 | 380 | 1.3339 | 0.7014 | 1.3339 | 1.1549 | | No log | 0.5651 | 382 | 1.5014 | 0.6812 | 1.5014 | 1.2253 | | No log | 0.5680 | 384 | 1.7604 | 0.5532 | 1.7604 | 1.3268 | | No log | 0.5710 | 386 | 1.9863 | 0.4988 | 1.9863 | 1.4094 | | No log | 0.5740 | 388 | 2.0023 | 0.4441 | 2.0023 | 1.4150 | | No log | 0.5769 | 390 | 1.8323 | 0.4995 | 1.8323 | 1.3536 | | No log | 0.5799 | 392 | 1.5319 | 0.5927 | 1.5319 | 1.2377 | | No log | 0.5828 | 394 | 1.2857 | 0.5705 | 1.2857 | 1.1339 | | No log | 0.5858 | 396 | 1.2748 | 0.5705 | 1.2748 | 1.1291 | | No log | 0.5888 | 398 | 1.2447 | 0.6323 | 1.2447 | 1.1156 | | No log | 0.5917 | 400 | 1.3810 | 0.6211 | 1.3810 | 1.1752 | | No log | 0.5947 | 402 | 1.7534 | 0.6376 | 1.7534 | 1.3242 | | No log | 0.5976 | 404 | 2.1789 | 0.5291 | 2.1789 | 1.4761 | | No log | 0.6006 | 406 | 2.3193 | 0.4324 | 2.3193 | 1.5229 | | No log | 0.6036 | 408 | 2.0079 | 0.5349 | 2.0079 | 1.4170 | | No log | 0.6065 | 410 | 1.4768 | 0.6339 | 1.4768 | 1.2152 | | No log | 0.6095 | 412 | 0.9421 | 0.8094 | 0.9421 | 0.9706 | | No log | 0.6124 | 414 | 0.7710 | 0.7432 | 0.7710 | 0.8781 | | No log | 0.6154 | 416 | 0.8228 | 0.7906 | 0.8228 | 0.9071 | | No log | 0.6183 | 418 | 
1.1203 | 0.7515 | 1.1203 | 1.0584 | | No log | 0.6213 | 420 | 1.7202 | 0.5349 | 1.7202 | 1.3116 | | No log | 0.6243 | 422 | 2.0589 | 0.4625 | 2.0589 | 1.4349 | | No log | 0.6272 | 424 | 1.9750 | 0.4625 | 1.9750 | 1.4053 | | No log | 0.6302 | 426 | 1.7002 | 0.5737 | 1.7002 | 1.3039 | | No log | 0.6331 | 428 | 1.4584 | 0.6053 | 1.4584 | 1.2076 | | No log | 0.6361 | 430 | 1.4064 | 0.6053 | 1.4064 | 1.1859 | | No log | 0.6391 | 432 | 1.5996 | 0.6008 | 1.5996 | 1.2647 | | No log | 0.6420 | 434 | 2.0072 | 0.5341 | 2.0072 | 1.4168 | | No log | 0.6450 | 436 | 2.2470 | 0.4624 | 2.2470 | 1.4990 | | No log | 0.6479 | 438 | 2.1866 | 0.4802 | 2.1866 | 1.4787 | | No log | 0.6509 | 440 | 1.9062 | 0.5158 | 1.9062 | 1.3807 | | No log | 0.6538 | 442 | 1.5809 | 0.5965 | 1.5809 | 1.2573 | | No log | 0.6568 | 444 | 1.5531 | 0.5965 | 1.5531 | 1.2462 | | No log | 0.6598 | 446 | 1.7454 | 0.5366 | 1.7454 | 1.3211 | | No log | 0.6627 | 448 | 1.7889 | 0.4987 | 1.7889 | 1.3375 | | No log | 0.6657 | 450 | 1.8162 | 0.4991 | 1.8161 | 1.3476 | | No log | 0.6686 | 452 | 1.6058 | 0.5200 | 1.6058 | 1.2672 | | No log | 0.6716 | 454 | 1.4890 | 0.6536 | 1.4890 | 1.2203 | | No log | 0.6746 | 456 | 1.3941 | 0.6157 | 1.3941 | 1.1807 | | No log | 0.6775 | 458 | 1.1485 | 0.5935 | 1.1485 | 1.0717 | | No log | 0.6805 | 460 | 1.0123 | 0.6982 | 1.0123 | 1.0061 | | No log | 0.6834 | 462 | 1.0624 | 0.6693 | 1.0624 | 1.0307 | | No log | 0.6864 | 464 | 1.2489 | 0.6678 | 1.2489 | 1.1175 | | No log | 0.6893 | 466 | 1.5782 | 0.6435 | 1.5782 | 1.2563 | | No log | 0.6923 | 468 | 1.5432 | 0.6435 | 1.5432 | 1.2422 | | No log | 0.6953 | 470 | 1.3239 | 0.6732 | 1.3239 | 1.1506 | | No log | 0.6982 | 472 | 1.1434 | 0.6678 | 1.1434 | 1.0693 | | No log | 0.7012 | 474 | 1.0185 | 0.7322 | 1.0185 | 1.0092 | | No log | 0.7041 | 476 | 0.9671 | 0.7898 | 0.9671 | 0.9834 | | No log | 0.7071 | 478 | 1.1360 | 0.6746 | 1.1360 | 1.0658 | | No log | 0.7101 | 480 | 1.4188 | 0.6333 | 1.4188 | 1.1911 | | No log | 0.7130 | 482 | 1.6883 | 0.5489 | 1.6883 | 1.2994 | | No log | 0.7160 | 484 | 1.6613 | 0.5385 | 1.6613 | 1.2889 | | No log | 0.7189 | 486 | 1.3402 | 0.7014 | 1.3402 | 1.1577 | | No log | 0.7219 | 488 | 1.1825 | 0.6897 | 1.1825 | 1.0874 | | No log | 0.7249 | 490 | 1.0660 | 0.6897 | 1.0660 | 1.0325 | | No log | 0.7278 | 492 | 1.1635 | 0.7216 | 1.1635 | 1.0787 | | No log | 0.7308 | 494 | 1.2338 | 0.6414 | 1.2338 | 1.1108 | | No log | 0.7337 | 496 | 1.2031 | 0.6414 | 1.2031 | 1.0969 | | No log | 0.7367 | 498 | 1.1220 | 0.6678 | 1.1220 | 1.0593 | | 0.5124 | 0.7396 | 500 | 1.1471 | 0.6678 | 1.1471 | 1.0710 | | 0.5124 | 0.7426 | 502 | 1.1933 | 0.6678 | 1.1933 | 1.0924 | | 0.5124 | 0.7456 | 504 | 1.3590 | 0.6678 | 1.3590 | 1.1658 | | 0.5124 | 0.7485 | 506 | 1.5283 | 0.6938 | 1.5283 | 1.2363 | | 0.5124 | 0.7515 | 508 | 1.4056 | 0.6938 | 1.4056 | 1.1856 | | 0.5124 | 0.7544 | 510 | 1.2773 | 0.6497 | 1.2773 | 1.1302 | | 0.5124 | 0.7574 | 512 | 1.1138 | 0.6447 | 1.1138 | 1.0554 | | 0.5124 | 0.7604 | 514 | 1.0728 | 0.6497 | 1.0728 | 1.0357 | | 0.5124 | 0.7633 | 516 | 1.2504 | 0.6678 | 1.2504 | 1.1182 | | 0.5124 | 0.7663 | 518 | 1.3152 | 0.6678 | 1.3152 | 1.1468 | | 0.5124 | 0.7692 | 520 | 1.3246 | 0.7129 | 1.3246 | 1.1509 | | 0.5124 | 0.7722 | 522 | 1.2160 | 0.7129 | 1.2160 | 1.1027 | | 0.5124 | 0.7751 | 524 | 1.3105 | 0.7129 | 1.3105 | 1.1448 | | 0.5124 | 0.7781 | 526 | 1.4726 | 0.7129 | 1.4726 | 1.2135 | | 0.5124 | 0.7811 | 528 | 1.3328 | 0.7129 | 1.3328 | 1.1545 | | 0.5124 | 0.7840 | 530 | 1.1356 | 0.6693 | 1.1356 | 1.0656 | | 0.5124 | 0.7870 | 532 | 1.1833 | 0.6678 | 
1.1833 | 1.0878 | | 0.5124 | 0.7899 | 534 | 1.3837 | 0.6616 | 1.3837 | 1.1763 | | 0.5124 | 0.7929 | 536 | 1.7438 | 0.6667 | 1.7438 | 1.3205 | | 0.5124 | 0.7959 | 538 | 1.7484 | 0.6508 | 1.7484 | 1.3223 | | 0.5124 | 0.7988 | 540 | 1.4713 | 0.6253 | 1.4713 | 1.2130 | | 0.5124 | 0.8018 | 542 | 1.2209 | 0.6746 | 1.2209 | 1.1049 | | 0.5124 | 0.8047 | 544 | 1.1763 | 0.6746 | 1.1763 | 1.0846 | | 0.5124 | 0.8077 | 546 | 1.1800 | 0.6890 | 1.1800 | 1.0863 | | 0.5124 | 0.8107 | 548 | 1.3306 | 0.6536 | 1.3306 | 1.1535 | | 0.5124 | 0.8136 | 550 | 1.6643 | 0.5200 | 1.6643 | 1.2901 | | 0.5124 | 0.8166 | 552 | 1.8104 | 0.4987 | 1.8104 | 1.3455 | | 0.5124 | 0.8195 | 554 | 1.6862 | 0.5385 | 1.6862 | 1.2985 | | 0.5124 | 0.8225 | 556 | 1.3954 | 0.6053 | 1.3954 | 1.1813 | | 0.5124 | 0.8254 | 558 | 1.1097 | 0.6763 | 1.1097 | 1.0534 | | 0.5124 | 0.8284 | 560 | 1.0157 | 0.6763 | 1.0157 | 1.0078 | | 0.5124 | 0.8314 | 562 | 1.1742 | 0.7305 | 1.1742 | 1.0836 | | 0.5124 | 0.8343 | 564 | 1.5945 | 0.6296 | 1.5945 | 1.2627 | | 0.5124 | 0.8373 | 566 | 1.7991 | 0.5700 | 1.7991 | 1.3413 | | 0.5124 | 0.8402 | 568 | 1.7228 | 0.5600 | 1.7228 | 1.3126 | | 0.5124 | 0.8432 | 570 | 1.7575 | 0.5396 | 1.7575 | 1.3257 | | 0.5124 | 0.8462 | 572 | 1.6217 | 0.5600 | 1.6217 | 1.2734 | | 0.5124 | 0.8491 | 574 | 1.4997 | 0.5927 | 1.4997 | 1.2246 | | 0.5124 | 0.8521 | 576 | 1.6235 | 0.5804 | 1.6235 | 1.2742 | | 0.5124 | 0.8550 | 578 | 1.7046 | 0.5804 | 1.7046 | 1.3056 | | 0.5124 | 0.8580 | 580 | 1.5357 | 0.6253 | 1.5357 | 1.2392 | | 0.5124 | 0.8609 | 582 | 1.2044 | 0.6260 | 1.2044 | 1.0974 | | 0.5124 | 0.8639 | 584 | 1.2175 | 0.6435 | 1.2175 | 1.1034 | | 0.5124 | 0.8669 | 586 | 1.3953 | 0.6836 | 1.3953 | 1.1812 | | 0.5124 | 0.8698 | 588 | 1.6457 | 0.6661 | 1.6457 | 1.2828 | | 0.5124 | 0.8728 | 590 | 1.5577 | 0.5804 | 1.5577 | 1.2481 | | 0.5124 | 0.8757 | 592 | 1.3116 | 0.6157 | 1.3116 | 1.1452 | | 0.5124 | 0.8787 | 594 | 1.0623 | 0.6982 | 1.0623 | 1.0307 | | 0.5124 | 0.8817 | 596 | 1.0351 | 0.6982 | 1.0351 | 1.0174 | | 0.5124 | 0.8846 | 598 | 1.1916 | 0.6476 | 1.1916 | 1.0916 | | 0.5124 | 0.8876 | 600 | 1.4156 | 0.6356 | 1.4156 | 1.1898 | | 0.5124 | 0.8905 | 602 | 1.3344 | 0.6356 | 1.3344 | 1.1552 | | 0.5124 | 0.8935 | 604 | 1.2729 | 0.6356 | 1.2729 | 1.1282 | | 0.5124 | 0.8964 | 606 | 1.3387 | 0.6727 | 1.3387 | 1.1570 | | 0.5124 | 0.8994 | 608 | 1.4370 | 0.6727 | 1.4370 | 1.1987 | | 0.5124 | 0.9024 | 610 | 1.4993 | 0.6673 | 1.4993 | 1.2245 | | 0.5124 | 0.9053 | 612 | 1.5312 | 0.6161 | 1.5312 | 1.2374 | | 0.5124 | 0.9083 | 614 | 1.5403 | 0.5581 | 1.5403 | 1.2411 | | 0.5124 | 0.9112 | 616 | 1.5774 | 0.5581 | 1.5774 | 1.2560 | | 0.5124 | 0.9142 | 618 | 1.4491 | 0.6161 | 1.4491 | 1.2038 | | 0.5124 | 0.9172 | 620 | 1.3558 | 0.6356 | 1.3558 | 1.1644 | | 0.5124 | 0.9201 | 622 | 1.5727 | 0.6933 | 1.5727 | 1.2541 | | 0.5124 | 0.9231 | 624 | 1.9185 | 0.6990 | 1.9185 | 1.3851 | | 0.5124 | 0.9260 | 626 | 1.8117 | 0.7280 | 1.8117 | 1.3460 | | 0.5124 | 0.9290 | 628 | 1.4073 | 0.6356 | 1.4073 | 1.1863 | | 0.5124 | 0.9320 | 630 | 1.1589 | 0.6476 | 1.1589 | 1.0765 | | 0.5124 | 0.9349 | 632 | 1.0186 | 0.6982 | 1.0186 | 1.0093 | | 0.5124 | 0.9379 | 634 | 1.0923 | 0.6982 | 1.0923 | 1.0451 | | 0.5124 | 0.9408 | 636 | 1.3583 | 0.6616 | 1.3583 | 1.1655 | | 0.5124 | 0.9438 | 638 | 1.6299 | 0.6161 | 1.6299 | 1.2767 | | 0.5124 | 0.9467 | 640 | 1.7971 | 0.6376 | 1.7971 | 1.3406 | | 0.5124 | 0.9497 | 642 | 1.7640 | 0.6552 | 1.7640 | 1.3282 | | 0.5124 | 0.9527 | 644 | 1.3877 | 0.6121 | 1.3877 | 1.1780 | | 0.5124 | 0.9556 | 646 | 1.0197 | 0.6684 | 1.0197 | 1.0098 | | 
0.5124 | 0.9586 | 648 | 0.8821 | 0.7332 | 0.8821 | 0.9392 | | 0.5124 | 0.9615 | 650 | 0.8877 | 0.7332 | 0.8877 | 0.9422 | | 0.5124 | 0.9645 | 652 | 1.0239 | 0.6982 | 1.0239 | 1.0119 | | 0.5124 | 0.9675 | 654 | 1.3656 | 0.6394 | 1.3656 | 1.1686 | | 0.5124 | 0.9704 | 656 | 1.5068 | 0.6339 | 1.5068 | 1.2275 | | 0.5124 | 0.9734 | 658 | 1.5125 | 0.6382 | 1.5125 | 1.2298 | | 0.5124 | 0.9763 | 660 | 1.3579 | 0.6121 | 1.3579 | 1.1653 | | 0.5124 | 0.9793 | 662 | 1.4305 | 0.5933 | 1.4305 | 1.1960 | | 0.5124 | 0.9822 | 664 | 1.3148 | 0.5973 | 1.3148 | 1.1467 | | 0.5124 | 0.9852 | 666 | 1.1350 | 0.6818 | 1.1350 | 1.0654 | | 0.5124 | 0.9882 | 668 | 1.1327 | 0.6818 | 1.1327 | 1.0643 | | 0.5124 | 0.9911 | 670 | 1.2600 | 0.6934 | 1.2600 | 1.1225 | | 0.5124 | 0.9941 | 672 | 1.2645 | 0.6934 | 1.2645 | 1.1245 | | 0.5124 | 0.9970 | 674 | 1.2645 | 0.6356 | 1.2645 | 1.1245 | | 0.5124 | 1.0 | 676 | 1.3788 | 0.6610 | 1.3788 | 1.1742 | | 0.5124 | 1.0030 | 678 | 1.4922 | 0.6552 | 1.4922 | 1.2216 | | 0.5124 | 1.0059 | 680 | 1.4587 | 0.6071 | 1.4587 | 1.2078 | | 0.5124 | 1.0089 | 682 | 1.3493 | 0.6303 | 1.3493 | 1.1616 | | 0.5124 | 1.0118 | 684 | 1.3922 | 0.5777 | 1.3922 | 1.1799 | | 0.5124 | 1.0148 | 686 | 1.4398 | 0.6161 | 1.4398 | 1.1999 | | 0.5124 | 1.0178 | 688 | 1.2817 | 0.6015 | 1.2817 | 1.1321 | | 0.5124 | 1.0207 | 690 | 1.1599 | 0.6061 | 1.1599 | 1.0770 | | 0.5124 | 1.0237 | 692 | 1.1406 | 0.6543 | 1.1406 | 1.0680 | | 0.5124 | 1.0266 | 694 | 1.1262 | 0.6543 | 1.1262 | 1.0612 | | 0.5124 | 1.0296 | 696 | 1.1291 | 0.6543 | 1.1291 | 1.0626 | | 0.5124 | 1.0325 | 698 | 1.1602 | 0.6543 | 1.1602 | 1.0771 | | 0.5124 | 1.0355 | 700 | 1.1044 | 0.6543 | 1.1044 | 1.0509 | | 0.5124 | 1.0385 | 702 | 1.2154 | 0.6061 | 1.2154 | 1.1025 | | 0.5124 | 1.0414 | 704 | 1.5019 | 0.6207 | 1.5019 | 1.2255 | | 0.5124 | 1.0444 | 706 | 1.6556 | 0.5994 | 1.6556 | 1.2867 | | 0.5124 | 1.0473 | 708 | 1.6783 | 0.5679 | 1.6783 | 1.2955 | | 0.5124 | 1.0503 | 710 | 1.4250 | 0.5366 | 1.4250 | 1.1937 | | 0.5124 | 1.0533 | 712 | 1.1856 | 0.6323 | 1.1856 | 1.0888 | | 0.5124 | 1.0562 | 714 | 1.0413 | 0.6839 | 1.0413 | 1.0205 | | 0.5124 | 1.0592 | 716 | 1.1108 | 0.6839 | 1.1108 | 1.0539 | | 0.5124 | 1.0621 | 718 | 1.3563 | 0.5449 | 1.3563 | 1.1646 | | 0.5124 | 1.0651 | 720 | 1.4666 | 0.5449 | 1.4666 | 1.2110 | | 0.5124 | 1.0680 | 722 | 1.4211 | 0.5449 | 1.4211 | 1.1921 | | 0.5124 | 1.0710 | 724 | 1.2445 | 0.6323 | 1.2445 | 1.1156 | | 0.5124 | 1.0740 | 726 | 1.1570 | 0.6111 | 1.1570 | 1.0757 | | 0.5124 | 1.0769 | 728 | 1.1705 | 0.6476 | 1.1705 | 1.0819 | | 0.5124 | 1.0799 | 730 | 1.3534 | 0.6545 | 1.3534 | 1.1634 | | 0.5124 | 1.0828 | 732 | 1.6599 | 0.6769 | 1.6599 | 1.2884 | | 0.5124 | 1.0858 | 734 | 1.7883 | 0.6552 | 1.7883 | 1.3373 | | 0.5124 | 1.0888 | 736 | 1.5746 | 0.5769 | 1.5746 | 1.2548 | | 0.5124 | 1.0917 | 738 | 1.1983 | 0.6839 | 1.1983 | 1.0947 | | 0.5124 | 1.0947 | 740 | 0.9295 | 0.7316 | 0.9295 | 0.9641 | | 0.5124 | 1.0976 | 742 | 0.8892 | 0.7316 | 0.8892 | 0.9430 | | 0.5124 | 1.1006 | 744 | 0.9784 | 0.6982 | 0.9784 | 0.9891 | | 0.5124 | 1.1036 | 746 | 1.0991 | 0.7310 | 1.0991 | 1.0484 | | 0.5124 | 1.1065 | 748 | 1.1218 | 0.7310 | 1.1218 | 1.0592 | | 0.5124 | 1.1095 | 750 | 0.9994 | 0.6982 | 0.9994 | 0.9997 | | 0.5124 | 1.1124 | 752 | 0.9723 | 0.6982 | 0.9723 | 0.9861 | | 0.5124 | 1.1154 | 754 | 1.0213 | 0.6982 | 1.0213 | 1.0106 | | 0.5124 | 1.1183 | 756 | 1.1969 | 0.6972 | 1.1969 | 1.0940 | | 0.5124 | 1.1213 | 758 | 1.4115 | 0.6860 | 1.4115 | 1.1881 | | 0.5124 | 1.1243 | 760 | 1.4379 | 0.6860 | 1.4379 | 1.1991 | | 0.5124 | 1.1272 | 762 | 
1.2880 | 0.6860 | 1.2880 | 1.1349 | | 0.5124 | 1.1302 | 764 | 1.0399 | 0.7322 | 1.0399 | 1.0197 | | 0.5124 | 1.1331 | 766 | 0.9220 | 0.7316 | 0.9220 | 0.9602 | | 0.5124 | 1.1361 | 768 | 0.8880 | 0.7937 | 0.8880 | 0.9423 | | 0.5124 | 1.1391 | 770 | 0.9822 | 0.7322 | 0.9822 | 0.9911 | | 0.5124 | 1.1420 | 772 | 1.2068 | 0.6860 | 1.2068 | 1.0986 | | 0.5124 | 1.1450 | 774 | 1.4540 | 0.6860 | 1.4540 | 1.2058 | | 0.5124 | 1.1479 | 776 | 1.4219 | 0.6545 | 1.4219 | 1.1925 | | 0.5124 | 1.1509 | 778 | 1.1883 | 0.6610 | 1.1883 | 1.0901 | | 0.5124 | 1.1538 | 780 | 1.1370 | 0.6323 | 1.1370 | 1.0663 | | 0.5124 | 1.1568 | 782 | 1.1483 | 0.6323 | 1.1483 | 1.0716 | | 0.5124 | 1.1598 | 784 | 1.0509 | 0.7519 | 1.0509 | 1.0252 | | 0.5124 | 1.1627 | 786 | 0.9433 | 0.7519 | 0.9433 | 0.9712 | | 0.5124 | 1.1657 | 788 | 0.9737 | 0.7519 | 0.9737 | 0.9868 | | 0.5124 | 1.1686 | 790 | 1.1458 | 0.6975 | 1.1458 | 1.0704 | | 0.5124 | 1.1716 | 792 | 1.2280 | 0.7211 | 1.2280 | 1.1082 | | 0.5124 | 1.1746 | 794 | 1.1313 | 0.6975 | 1.1313 | 1.0636 | | 0.5124 | 1.1775 | 796 | 0.9603 | 0.6975 | 0.9603 | 0.9799 | | 0.5124 | 1.1805 | 798 | 0.8791 | 0.6975 | 0.8791 | 0.9376 | | 0.5124 | 1.1834 | 800 | 0.9197 | 0.7614 | 0.9197 | 0.9590 | | 0.5124 | 1.1864 | 802 | 1.0550 | 0.7417 | 1.0550 | 1.0271 | | 0.5124 | 1.1893 | 804 | 1.1647 | 0.7123 | 1.1647 | 1.0792 | | 0.5124 | 1.1923 | 806 | 1.0629 | 0.7688 | 1.0629 | 1.0310 | | 0.5124 | 1.1953 | 808 | 0.9649 | 0.7598 | 0.9649 | 0.9823 | | 0.5124 | 1.1982 | 810 | 0.8524 | 0.7696 | 0.8524 | 0.9232 | | 0.5124 | 1.2012 | 812 | 0.8621 | 0.7793 | 0.8621 | 0.9285 | | 0.5124 | 1.2041 | 814 | 1.0182 | 0.7305 | 1.0182 | 1.0091 | | 0.5124 | 1.2071 | 816 | 1.2705 | 0.7123 | 1.2705 | 1.1272 | | 0.5124 | 1.2101 | 818 | 1.3855 | 0.7123 | 1.3855 | 1.1771 | | 0.5124 | 1.2130 | 820 | 1.3684 | 0.7211 | 1.3684 | 1.1698 | | 0.5124 | 1.2160 | 822 | 1.2992 | 0.7305 | 1.2992 | 1.1398 | | 0.5124 | 1.2189 | 824 | 1.2448 | 0.7305 | 1.2448 | 1.1157 | | 0.5124 | 1.2219 | 826 | 1.2517 | 0.7305 | 1.2517 | 1.1188 | | 0.5124 | 1.2249 | 828 | 1.2850 | 0.7123 | 1.2850 | 1.1336 | | 0.5124 | 1.2278 | 830 | 1.3320 | 0.7123 | 1.3320 | 1.1541 | | 0.5124 | 1.2308 | 832 | 1.5386 | 0.7042 | 1.5386 | 1.2404 | | 0.5124 | 1.2337 | 834 | 1.6960 | 0.6247 | 1.6960 | 1.3023 | | 0.5124 | 1.2367 | 836 | 1.5693 | 0.5926 | 1.5693 | 1.2527 | | 0.5124 | 1.2396 | 838 | 1.2803 | 0.7211 | 1.2803 | 1.1315 | | 0.5124 | 1.2426 | 840 | 1.0329 | 0.7211 | 1.0329 | 1.0163 | | 0.5124 | 1.2456 | 842 | 0.8464 | 0.72 | 0.8464 | 0.9200 | | 0.5124 | 1.2485 | 844 | 0.7644 | 0.72 | 0.7644 | 0.8743 | | 0.5124 | 1.2515 | 846 | 0.8378 | 0.7216 | 0.8378 | 0.9153 | | 0.5124 | 1.2544 | 848 | 1.0355 | 0.7305 | 1.0355 | 1.0176 | | 0.5124 | 1.2574 | 850 | 1.3306 | 0.6485 | 1.3306 | 1.1535 | | 0.5124 | 1.2604 | 852 | 1.5482 | 0.6786 | 1.5482 | 1.2443 | | 0.5124 | 1.2633 | 854 | 1.5133 | 0.6485 | 1.5133 | 1.2302 | | 0.5124 | 1.2663 | 856 | 1.3109 | 0.6545 | 1.3109 | 1.1449 | | 0.5124 | 1.2692 | 858 | 1.0248 | 0.7840 | 1.0248 | 1.0123 | | 0.5124 | 1.2722 | 860 | 0.8619 | 0.7844 | 0.8619 | 0.9284 | | 0.5124 | 1.2751 | 862 | 0.7735 | 0.7844 | 0.7735 | 0.8795 | | 0.5124 | 1.2781 | 864 | 0.7926 | 0.7844 | 0.7926 | 0.8903 | | 0.5124 | 1.2811 | 866 | 0.9015 | 0.7519 | 0.9015 | 0.9495 | | 0.5124 | 1.2840 | 868 | 1.0408 | 0.7840 | 1.0408 | 1.0202 | | 0.5124 | 1.2870 | 870 | 1.2394 | 0.6545 | 1.2394 | 1.1133 | | 0.5124 | 1.2899 | 872 | 1.3263 | 0.6485 | 1.3263 | 1.1517 | | 0.5124 | 1.2929 | 874 | 1.1878 | 0.7720 | 1.1878 | 1.0899 | | 0.5124 | 1.2959 | 876 | 0.9641 | 0.8138 | 0.9641 
| 0.9819 | | 0.5124 | 1.2988 | 878 | 0.7789 | 0.8412 | 0.7789 | 0.8826 | | 0.5124 | 1.3018 | 880 | 0.7364 | 0.8283 | 0.7364 | 0.8582 | | 0.5124 | 1.3047 | 882 | 0.8419 | 0.8138 | 0.8419 | 0.9175 | | 0.5124 | 1.3077 | 884 | 1.1174 | 0.8138 | 1.1174 | 1.0571 | | 0.5124 | 1.3107 | 886 | 1.4156 | 0.6257 | 1.4156 | 1.1898 | | 0.5124 | 1.3136 | 888 | 1.4840 | 0.6046 | 1.4840 | 1.2182 | | 0.5124 | 1.3166 | 890 | 1.3482 | 0.6610 | 1.3482 | 1.1611 | | 0.5124 | 1.3195 | 892 | 1.0679 | 0.7515 | 1.0679 | 1.0334 | | 0.5124 | 1.3225 | 894 | 0.8469 | 0.7823 | 0.8469 | 0.9202 | | 0.5124 | 1.3254 | 896 | 0.7514 | 0.7823 | 0.7514 | 0.8668 | | 0.5124 | 1.3284 | 898 | 0.7512 | 0.7823 | 0.7512 | 0.8667 | | 0.5124 | 1.3314 | 900 | 0.9032 | 0.8015 | 0.9032 | 0.9504 | | 0.5124 | 1.3343 | 902 | 1.0844 | 0.7603 | 1.0844 | 1.0413 | | 0.5124 | 1.3373 | 904 | 1.3467 | 0.6071 | 1.3467 | 1.1605 | | 0.5124 | 1.3402 | 906 | 1.4380 | 0.5889 | 1.4380 | 1.1992 | | 0.5124 | 1.3432 | 908 | 1.3756 | 0.5804 | 1.3756 | 1.1728 | | 0.5124 | 1.3462 | 910 | 1.2479 | 0.7188 | 1.2479 | 1.1171 | | 0.5124 | 1.3491 | 912 | 1.2447 | 0.7188 | 1.2447 | 1.1157 | | 0.5124 | 1.3521 | 914 | 1.2992 | 0.6316 | 1.2992 | 1.1398 | | 0.5124 | 1.3550 | 916 | 1.3122 | 0.6536 | 1.3122 | 1.1455 | | 0.5124 | 1.3580 | 918 | 1.2349 | 0.7188 | 1.2349 | 1.1113 | | 0.5124 | 1.3609 | 920 | 1.0607 | 0.7188 | 1.0607 | 1.0299 | | 0.5124 | 1.3639 | 922 | 0.9777 | 0.8138 | 0.9777 | 0.9888 | | 0.5124 | 1.3669 | 924 | 1.0128 | 0.8138 | 1.0128 | 1.0064 | | 0.5124 | 1.3698 | 926 | 1.2155 | 0.7338 | 1.2155 | 1.1025 | | 0.5124 | 1.3728 | 928 | 1.5746 | 0.7115 | 1.5746 | 1.2548 | | 0.5124 | 1.3757 | 930 | 1.7375 | 0.6751 | 1.7375 | 1.3182 | | 0.5124 | 1.3787 | 932 | 1.6504 | 0.6751 | 1.6504 | 1.2847 | | 0.5124 | 1.3817 | 934 | 1.3282 | 0.6552 | 1.3282 | 1.1525 | | 0.5124 | 1.3846 | 936 | 1.1306 | 0.7412 | 1.1306 | 1.0633 | | 0.5124 | 1.3876 | 938 | 0.9086 | 0.8283 | 0.9086 | 0.9532 | | 0.5124 | 1.3905 | 940 | 0.8630 | 0.7823 | 0.8630 | 0.9290 | | 0.5124 | 1.3935 | 942 | 0.9429 | 0.8019 | 0.9429 | 0.9711 | | 0.5124 | 1.3964 | 944 | 1.1244 | 0.6839 | 1.1244 | 1.0604 | | 0.5124 | 1.3994 | 946 | 1.3577 | 0.6969 | 1.3577 | 1.1652 | | 0.5124 | 1.4024 | 948 | 1.6525 | 0.5417 | 1.6525 | 1.2855 | | 0.5124 | 1.4053 | 950 | 1.7282 | 0.4995 | 1.7282 | 1.3146 | | 0.5124 | 1.4083 | 952 | 1.5969 | 0.5661 | 1.5969 | 1.2637 | | 0.5124 | 1.4112 | 954 | 1.4857 | 0.6609 | 1.4857 | 1.2189 | | 0.5124 | 1.4142 | 956 | 1.2584 | 0.6839 | 1.2584 | 1.1218 | | 0.5124 | 1.4172 | 958 | 1.1679 | 0.7519 | 1.1679 | 1.0807 | | 0.5124 | 1.4201 | 960 | 1.1933 | 0.6839 | 1.1933 | 1.0924 | | 0.5124 | 1.4231 | 962 | 1.2640 | 0.6323 | 1.2640 | 1.1243 | | 0.5124 | 1.4260 | 964 | 1.3787 | 0.6680 | 1.3787 | 1.1742 | | 0.5124 | 1.4290 | 966 | 1.6585 | 0.5919 | 1.6585 | 1.2878 | | 0.5124 | 1.4320 | 968 | 1.7420 | 0.5919 | 1.7420 | 1.3198 | | 0.5124 | 1.4349 | 970 | 1.6647 | 0.6108 | 1.6647 | 1.2902 | | 0.5124 | 1.4379 | 972 | 1.6923 | 0.6108 | 1.6923 | 1.3009 | | 0.5124 | 1.4408 | 974 | 1.4807 | 0.6008 | 1.4807 | 1.2168 | | 0.5124 | 1.4438 | 976 | 1.3347 | 0.7188 | 1.3347 | 1.1553 | | 0.5124 | 1.4467 | 978 | 1.3076 | 0.7188 | 1.3076 | 1.1435 | | 0.5124 | 1.4497 | 980 | 1.2835 | 0.7188 | 1.2835 | 1.1329 | | 0.5124 | 1.4527 | 982 | 1.1488 | 0.7840 | 1.1488 | 1.0718 | | 0.5124 | 1.4556 | 984 | 1.1533 | 0.7188 | 1.1533 | 1.0739 | | 0.5124 | 1.4586 | 986 | 1.3488 | 0.6929 | 1.3488 | 1.1614 | | 0.5124 | 1.4615 | 988 | 1.6310 | 0.6538 | 1.6310 | 1.2771 | | 0.5124 | 1.4645 | 990 | 1.6573 | 0.6478 | 1.6573 | 1.2874 | | 0.5124 
| 1.4675 | 992 | 1.4338 | 0.6929 | 1.4338 | 1.1974 |
| 0.5124 | 1.4704 | 994 | 1.2248 | 0.6551 | 1.2248 | 1.1067 |
| 0.5124 | 1.4734 | 996 | 1.1526 | 0.6551 | 1.1526 | 1.0736 |
| 0.5124 | 1.4763 | 998 | 1.2478 | 0.6610 | 1.2478 | 1.1170 |
| 0.2027 | 1.4793 | 1000 | 1.2770 | 0.6610 | 1.2770 | 1.1300 |
| 0.2027 | 1.4822 | 1002 | 1.2318 | 0.6610 | 1.2318 | 1.1099 |
| 0.2027 | 1.4852 | 1004 | 1.0398 | 0.7157 | 1.0398 | 1.0197 |
| 0.2027 | 1.4882 | 1006 | 0.8259 | 0.8164 | 0.8259 | 0.9088 |
| 0.2027 | 1.4911 | 1008 | 0.7525 | 0.8164 | 0.7525 | 0.8675 |
| 0.2027 | 1.4941 | 1010 | 0.8021 | 0.7696 | 0.8021 | 0.8956 |
| 0.2027 | 1.4970 | 1012 | 0.9255 | 0.7594 | 0.9255 | 0.9620 |
| 0.2027 | 1.5 | 1014 | 1.0262 | 0.7507 | 1.0262 | 1.0130 |
| 0.2027 | 1.5030 | 1016 | 1.0360 | 0.7507 | 1.0360 | 1.0179 |
| 0.2027 | 1.5059 | 1018 | 1.0668 | 0.7434 | 1.0668 | 1.0329 |
| 0.2027 | 1.5089 | 1020 | 0.9393 | 0.6952 | 0.9393 | 0.9692 |
| 0.2027 | 1.5118 | 1022 | 0.7957 | 0.7983 | 0.7957 | 0.8920 |
| 0.2027 | 1.5148 | 1024 | 0.7440 | 0.7427 | 0.7440 | 0.8625 |
| 0.2027 | 1.5178 | 1026 | 0.7250 | 0.7427 | 0.7250 | 0.8515 |
| 0.2027 | 1.5207 | 1028 | 0.8198 | 0.7427 | 0.8198 | 0.9054 |
| 0.2027 | 1.5237 | 1030 | 1.0231 | 0.7216 | 1.0231 | 1.0115 |
| 0.2027 | 1.5266 | 1032 | 1.2797 | 0.6610 | 1.2797 | 1.1313 |
| 0.2027 | 1.5296 | 1034 | 1.3924 | 0.6610 | 1.3924 | 1.1800 |
| 0.2027 | 1.5325 | 1036 | 1.3601 | 0.6610 | 1.3601 | 1.1662 |
| 0.2027 | 1.5355 | 1038 | 1.2504 | 0.6680 | 1.2504 | 1.1182 |
| 0.2027 | 1.5385 | 1040 | 1.1420 | 0.7519 | 1.1420 | 1.0686 |
| 0.2027 | 1.5414 | 1042 | 1.1325 | 0.7519 | 1.1325 | 1.0642 |
| 0.2027 | 1.5444 | 1044 | 1.2034 | 0.6839 | 1.2034 | 1.0970 |
| 0.2027 | 1.5473 | 1046 | 1.2858 | 0.7188 | 1.2858 | 1.1339 |
| 0.2027 | 1.5503 | 1048 | 1.2784 | 0.7188 | 1.2784 | 1.1307 |
| 0.2027 | 1.5533 | 1050 | 1.2252 | 0.7188 | 1.2252 | 1.1069 |
| 0.2027 | 1.5562 | 1052 | 1.0635 | 0.7840 | 1.0635 | 1.0313 |
| 0.2027 | 1.5592 | 1054 | 0.8680 | 0.72 | 0.8680 | 0.9317 |
| 0.2027 | 1.5621 | 1056 | 0.7908 | 0.72 | 0.7908 | 0.8893 |
| 0.2027 | 1.5651 | 1058 | 0.8120 | 0.7520 | 0.8120 | 0.9011 |
| 0.2027 | 1.5680 | 1060 | 0.8354 | 0.7134 | 0.8354 | 0.9140 |
| 0.2027 | 1.5710 | 1062 | 0.9254 | 0.7216 | 0.9254 | 0.9620 |
| 0.2027 | 1.5740 | 1064 | 1.0193 | 0.7216 | 1.0193 | 1.0096 |
| 0.2027 | 1.5769 | 1066 | 1.0557 | 0.6616 | 1.0557 | 1.0274 |
| 0.2027 | 1.5799 | 1068 | 0.9851 | 0.7134 | 0.9851 | 0.9925 |
| 0.2027 | 1.5828 | 1070 | 0.9778 | 0.6824 | 0.9778 | 0.9888 |
| 0.2027 | 1.5858 | 1072 | 1.0820 | 0.6897 | 1.0820 | 1.0402 |
| 0.2027 | 1.5888 | 1074 | 1.2212 | 0.6680 | 1.2212 | 1.1051 |
| 0.2027 | 1.5917 | 1076 | 1.1884 | 0.6680 | 1.1884 | 1.0901 |
| 0.2027 | 1.5947 | 1078 | 1.0435 | 0.7188 | 1.0435 | 1.0215 |
| 0.2027 | 1.5976 | 1080 | 1.0055 | 0.6839 | 1.0055 | 1.0027 |
| 0.2027 | 1.6006 | 1082 | 1.1141 | 0.6680 | 1.1141 | 1.0555 |
| 0.2027 | 1.6036 | 1084 | 1.2438 | 0.6680 | 1.2438 | 1.1152 |
| 0.2027 | 1.6065 | 1086 | 1.2313 | 0.6610 | 1.2313 | 1.1096 |
| 0.2027 | 1.6095 | 1088 | 1.1180 | 0.6610 | 1.1180 | 1.0573 |
| 0.2027 | 1.6124 | 1090 | 1.0677 | 0.7211 | 1.0677 | 1.0333 |
| 0.2027 | 1.6154 | 1092 | 1.0659 | 0.7211 | 1.0659 | 1.0324 |
| 0.2027 | 1.6183 | 1094 | 1.0080 | 0.7305 | 1.0080 | 1.0040 |
| 0.2027 | 1.6213 | 1096 | 1.0406 | 0.7305 | 1.0406 | 1.0201 |
| 0.2027 | 1.6243 | 1098 | 1.1992 | 0.7211 | 1.1992 | 1.0951 |
| 0.2027 | 1.6272 | 1100 | 1.4250 | 0.6610 | 1.4250 | 1.1937 |
| 0.2027 | 1.6302 | 1102 | 1.4228 | 0.6610 | 1.4228 | 1.1928 |
| 0.2027 | 1.6331 | 1104 | 1.3426 | 0.7211 | 1.3426 | 1.1587 |
| 0.2027 | 1.6361 | 1106 | 1.2521 | 0.7211 | 1.2521 | 1.1190 |
| 0.2027 | 1.6391 | 1108 | 1.1786 | 0.7305 | 1.1786 | 1.0856 |
| 0.2027 | 1.6420 | 1110 | 1.1456 | 0.7840 | 1.1456 | 1.0703 |
| 0.2027 | 1.6450 | 1112 | 1.2251 | 0.7840 | 1.2251 | 1.1068 |
| 0.2027 | 1.6479 | 1114 | 1.4526 | 0.6349 | 1.4526 | 1.2052 |
| 0.2027 | 1.6509 | 1116 | 1.6923 | 0.5573 | 1.6923 | 1.3009 |
| 0.2027 | 1.6538 | 1118 | 1.6747 | 0.5600 | 1.6747 | 1.2941 |
| 0.2027 | 1.6568 | 1120 | 1.4693 | 0.6257 | 1.4693 | 1.2122 |
| 0.2027 | 1.6598 | 1122 | 1.1755 | 0.7188 | 1.1755 | 1.0842 |
| 0.2027 | 1.6627 | 1124 | 0.9689 | 0.72 | 0.9689 | 0.9843 |
| 0.2027 | 1.6657 | 1126 | 0.9462 | 0.72 | 0.9462 | 0.9727 |
| 0.2027 | 1.6686 | 1128 | 1.0151 | 0.7519 | 1.0151 | 1.0075 |
| 0.2027 | 1.6716 | 1130 | 1.0811 | 0.7724 | 1.0811 | 1.0397 |
| 0.2027 | 1.6746 | 1132 | 1.0912 | 0.7724 | 1.0912 | 1.0446 |
| 0.2027 | 1.6775 | 1134 | 1.0524 | 0.7216 | 1.0524 | 1.0259 |
| 0.2027 | 1.6805 | 1136 | 1.0760 | 0.7129 | 1.0760 | 1.0373 |
| 0.2027 | 1.6834 | 1138 | 1.2634 | 0.6610 | 1.2634 | 1.1240 |
| 0.2027 | 1.6864 | 1140 | 1.4169 | 0.6407 | 1.4169 | 1.1903 |
| 0.2027 | 1.6893 | 1142 | 1.3728 | 0.6610 | 1.3728 | 1.1717 |
| 0.2027 | 1.6923 | 1144 | 1.2512 | 0.7188 | 1.2512 | 1.1186 |
| 0.2027 | 1.6953 | 1146 | 1.1020 | 0.7840 | 1.1020 | 1.0498 |
| 0.2027 | 1.6982 | 1148 | 0.9609 | 0.7840 | 0.9609 | 0.9803 |
| 0.2027 | 1.7012 | 1150 | 0.8586 | 0.7520 | 0.8586 | 0.9266 |
| 0.2027 | 1.7041 | 1152 | 0.8447 | 0.6945 | 0.8447 | 0.9191 |
| 0.2027 | 1.7071 | 1154 | 0.9643 | 0.7129 | 0.9643 | 0.9820 |
| 0.2027 | 1.7101 | 1156 | 1.1496 | 0.7129 | 1.1496 | 1.0722 |
| 0.2027 | 1.7130 | 1158 | 1.1525 | 0.7420 | 1.1525 | 1.0735 |
| 0.2027 | 1.7160 | 1160 | 1.0140 | 0.7602 | 1.0140 | 1.0070 |
| 0.2027 | 1.7189 | 1162 | 0.8382 | 0.7437 | 0.8382 | 0.9156 |
| 0.2027 | 1.7219 | 1164 | 0.7141 | 0.7879 | 0.7141 | 0.8451 |
| 0.2027 | 1.7249 | 1166 | 0.7293 | 0.7620 | 0.7293 | 0.8540 |
| 0.2027 | 1.7278 | 1168 | 0.8555 | 0.7983 | 0.8555 | 0.9249 |
| 0.2027 | 1.7308 | 1170 | 1.0679 | 0.7216 | 1.0679 | 1.0334 |
| 0.2027 | 1.7337 | 1172 | 1.2894 | 0.6551 | 1.2894 | 1.1355 |
| 0.2027 | 1.7367 | 1174 | 1.3010 | 0.6303 | 1.3010 | 1.1406 |
| 0.2027 | 1.7396 | 1176 | 1.1568 | 0.6616 | 1.1568 | 1.0755 |
| 0.2027 | 1.7426 | 1178 | 0.9267 | 0.7983 | 0.9267 | 0.9626 |
| 0.2027 | 1.7456 | 1180 | 0.7456 | 0.7620 | 0.7456 | 0.8635 |
| 0.2027 | 1.7485 | 1182 | 0.7090 | 0.7620 | 0.7090 | 0.8420 |
| 0.2027 | 1.7515 | 1184 | 0.7289 | 0.7717 | 0.7289 | 0.8537 |
| 0.2027 | 1.7544 | 1186 | 0.8032 | 0.7983 | 0.8032 | 0.8962 |
| 0.2027 | 1.7574 | 1188 | 1.0024 | 0.7724 | 1.0024 | 1.0012 |
| 0.2027 | 1.7604 | 1190 | 1.1263 | 0.7216 | 1.1263 | 1.0613 |
| 0.2027 | 1.7633 | 1192 | 1.1630 | 0.7216 | 1.1630 | 1.0784 |
| 0.2027 | 1.7663 | 1194 | 1.1260 | 0.7724 | 1.1260 | 1.0611 |
| 0.2027 | 1.7692 | 1196 | 0.9999 | 0.7520 | 0.9999 | 0.9999 |
| 0.2027 | 1.7722 | 1198 | 0.8949 | 0.7520 | 0.8949 | 0.9460 |
| 0.2027 | 1.7751 | 1200 | 0.8393 | 0.72 | 0.8393 | 0.9161 |
| 0.2027 | 1.7781 | 1202 | 0.7766 | 0.7427 | 0.7766 | 0.8813 |
| 0.2027 | 1.7811 | 1204 | 0.8097 | 0.7422 | 0.8097 | 0.8998 |
| 0.2027 | 1.7840 | 1206 | 0.9306 | 0.7724 | 0.9306 | 0.9647 |
| 0.2027 | 1.7870 | 1208 | 1.0580 | 0.7129 | 1.0580 | 1.0286 |
| 0.2027 | 1.7899 | 1210 | 1.0144 | 0.7053 | 1.0144 | 1.0072 |
| 0.2027 | 1.7929 | 1212 | 1.0080 | 0.7053 | 1.0080 | 1.0040 |
| 0.2027 | 1.7959 | 1214 | 0.9547 | 0.7134 | 0.9547 | 0.9771 |
| 0.2027 | 1.7988 | 1216 | 0.8649 | 0.7618 | 0.8649 | 0.9300 |
| 0.2027 | 1.8018 | 1218 | 0.8642 | 0.7618 | 0.8642 | 0.9296 |
| 0.2027 | 1.8047 | 1220 | 0.9496 | 0.7724 | 0.9496 | 0.9745 |
| 0.2027 | 1.8077 | 1222 | 1.0637 | 0.7724 | 1.0637 | 1.0314 |
| 0.2027 | 1.8107 | 1224 | 1.0438 | 0.7413 | 1.0438 | 1.0217 |
| 0.2027 | 1.8136 | 1226 | 1.0562 | 0.7519 | 1.0562 | 1.0277 |
| 0.2027 | 1.8166 | 1228 | 1.0480 | 0.7519 | 1.0480 | 1.0237 |
| 0.2027 | 1.8195 | 1230 | 1.0601 | 0.7519 | 1.0601 | 1.0296 |
| 0.2027 | 1.8225 | 1232 | 0.9394 | 0.7413 | 0.9394 | 0.9692 |
| 0.2027 | 1.8254 | 1234 | 0.8760 | 0.7413 | 0.8760 | 0.9360 |
| 0.2027 | 1.8284 | 1236 | 0.8723 | 0.7413 | 0.8723 | 0.9340 |
| 0.2027 | 1.8314 | 1238 | 0.9070 | 0.7724 | 0.9070 | 0.9524 |
| 0.2027 | 1.8343 | 1240 | 0.9346 | 0.7724 | 0.9346 | 0.9667 |
| 0.2027 | 1.8373 | 1242 | 0.9246 | 0.7724 | 0.9246 | 0.9616 |
| 0.2027 | 1.8402 | 1244 | 1.0184 | 0.7216 | 1.0184 | 1.0092 |
| 0.2027 | 1.8432 | 1246 | 1.0145 | 0.7216 | 1.0145 | 1.0072 |
| 0.2027 | 1.8462 | 1248 | 0.9746 | 0.7216 | 0.9746 | 0.9872 |
| 0.2027 | 1.8491 | 1250 | 1.0259 | 0.7216 | 1.0259 | 1.0129 |
| 0.2027 | 1.8521 | 1252 | 1.1335 | 0.7216 | 1.1335 | 1.0646 |
| 0.2027 | 1.8550 | 1254 | 1.1524 | 0.7216 | 1.1524 | 1.0735 |
| 0.2027 | 1.8580 | 1256 | 1.1742 | 0.7216 | 1.1742 | 1.0836 |
| 0.2027 | 1.8609 | 1258 | 1.0785 | 0.7216 | 1.0785 | 1.0385 |
| 0.2027 | 1.8639 | 1260 | 0.9516 | 0.7724 | 0.9516 | 0.9755 |
| 0.2027 | 1.8669 | 1262 | 0.8954 | 0.7724 | 0.8954 | 0.9462 |
| 0.2027 | 1.8698 | 1264 | 0.8350 | 0.7724 | 0.8350 | 0.9138 |
| 0.2027 | 1.8728 | 1266 | 0.8384 | 0.7724 | 0.8384 | 0.9157 |
| 0.2027 | 1.8757 | 1268 | 0.9365 | 0.7724 | 0.9365 | 0.9677 |
| 0.2027 | 1.8787 | 1270 | 1.0440 | 0.7216 | 1.0440 | 1.0218 |
| 0.2027 | 1.8817 | 1272 | 1.1089 | 0.7216 | 1.1089 | 1.0531 |
| 0.2027 | 1.8846 | 1274 | 1.0823 | 0.7216 | 1.0823 | 1.0403 |
| 0.2027 | 1.8876 | 1276 | 1.0238 | 0.7724 | 1.0238 | 1.0118 |
| 0.2027 | 1.8905 | 1278 | 0.9472 | 0.7724 | 0.9472 | 0.9733 |
| 0.2027 | 1.8935 | 1280 | 0.8534 | 0.7729 | 0.8534 | 0.9238 |
| 0.2027 | 1.8964 | 1282 | 0.8127 | 0.7729 | 0.8127 | 0.9015 |
| 0.2027 | 1.8994 | 1284 | 0.8792 | 0.7844 | 0.8792 | 0.9377 |
| 0.2027 | 1.9024 | 1286 | 0.9333 | 0.7519 | 0.9333 | 0.9661 |
| 0.2027 | 1.9053 | 1288 | 1.0706 | 0.7840 | 1.0706 | 1.0347 |
| 0.2027 | 1.9083 | 1290 | 1.1552 | 0.7840 | 1.1552 | 1.0748 |
| 0.2027 | 1.9112 | 1292 | 1.1153 | 0.7305 | 1.1153 | 1.0561 |
| 0.2027 | 1.9142 | 1294 | 0.9649 | 0.7840 | 0.9649 | 0.9823 |
| 0.2027 | 1.9172 | 1296 | 0.7974 | 0.7844 | 0.7974 | 0.8930 |
| 0.2027 | 1.9201 | 1298 | 0.7322 | 0.8019 | 0.7322 | 0.8557 |
| 0.2027 | 1.9231 | 1300 | 0.7319 | 0.8019 | 0.7319 | 0.8555 |
| 0.2027 | 1.9260 | 1302 | 0.7839 | 0.7793 | 0.7839 | 0.8854 |
| 0.2027 | 1.9290 | 1304 | 0.7875 | 0.8283 | 0.7875 | 0.8874 |
| 0.2027 | 1.9320 | 1306 | 0.8742 | 0.7793 | 0.8742 | 0.9350 |
| 0.2027 | 1.9349 | 1308 | 1.0828 | 0.7421 | 1.0828 | 1.0406 |
| 0.2027 | 1.9379 | 1310 | 1.2391 | 0.6610 | 1.2391 | 1.1132 |
| 0.2027 | 1.9408 | 1312 | 1.2314 | 0.6680 | 1.2314 | 1.1097 |
| 0.2027 | 1.9438 | 1314 | 1.2424 | 0.6680 | 1.2424 | 1.1146 |
| 0.2027 | 1.9467 | 1316 | 1.2892 | 0.7188 | 1.2892 | 1.1354 |
| 0.2027 | 1.9497 | 1318 | 1.2065 | 0.7188 | 1.2065 | 1.0984 |
| 0.2027 | 1.9527 | 1320 | 1.0568 | 0.7194 | 1.0568 | 1.0280 |
| 0.2027 | 1.9556 | 1322 | 0.9709 | 0.7106 | 0.9709 | 0.9853 |
| 0.2027 | 1.9586 | 1324 | 0.9310 | 0.7729 | 0.9310 | 0.9649 |
| 0.2027 | 1.9615 | 1326 | 0.9968 | 0.7417 | 0.9968 | 0.9984 |
| 0.2027 | 1.9645 | 1328 | 1.1721 | 0.6551 | 1.1721 | 1.0827 |
| 0.2027 | 1.9675 | 1330 | 1.2560 | 0.6551 | 1.2560 | 1.1207 |
| 0.2027 | 1.9704 | 1332 | 1.3105 | 0.6551 | 1.3105 | 1.1448 |
| 0.2027 | 1.9734 | 1334 | 1.3047 | 0.6551 | 1.3047 | 1.1422 |
| 0.2027 | 1.9763 | 1336 | 1.1227 | 0.6551 | 1.1227 | 1.0596 |
| 0.2027 | 1.9793 | 1338 | 0.8940 | 0.7606 | 0.8940 | 0.9455 |
| 0.2027 | 1.9822 | 1340 | 0.7891 | 0.8054 | 0.7891 | 0.8883 |
| 0.2027 | 1.9852 | 1342 | 0.7859 | 0.8054 | 0.7859 | 0.8865 |
| 0.2027 | 1.9882 | 1344 | 0.7966 | 0.8164 | 0.7966 | 0.8926 |
| 0.2027 | 1.9911 | 1346 | 0.8101 | 0.8164 | 0.8101 | 0.9001 |
| 0.2027 | 1.9941 | 1348 | 0.8712 | 0.8283 | 0.8712 | 0.9334 |
| 0.2027 | 1.9970 | 1350 | 0.9879 | 0.8283 | 0.9879 | 0.9939 |
| 0.2027 | 2.0 | 1352 | 1.1058 | 0.6934 | 1.1058 | 1.0516 |
| 0.2027 | 2.0030 | 1354 | 1.0945 | 0.7511 | 1.0945 | 1.0462 |
| 0.2027 | 2.0059 | 1356 | 1.0005 | 0.7898 | 1.0005 | 1.0003 |
| 0.2027 | 2.0089 | 1358 | 0.9025 | 0.8283 | 0.9025 | 0.9500 |
| 0.2027 | 2.0118 | 1360 | 0.8913 | 0.8283 | 0.8913 | 0.9441 |
| 0.2027 | 2.0148 | 1362 | 0.9784 | 0.8138 | 0.9784 | 0.9892 |
| 0.2027 | 2.0178 | 1364 | 1.0355 | 0.8138 | 1.0355 | 1.0176 |
| 0.2027 | 2.0207 | 1366 | 1.1199 | 0.7515 | 1.1199 | 1.0583 |
| 0.2027 | 2.0237 | 1368 | 1.2363 | 0.6680 | 1.2363 | 1.1119 |
| 0.2027 | 2.0266 | 1370 | 1.2404 | 0.6680 | 1.2404 | 1.1137 |
| 0.2027 | 2.0296 | 1372 | 1.1814 | 0.7188 | 1.1814 | 1.0869 |
| 0.2027 | 2.0325 | 1374 | 1.0447 | 0.7840 | 1.0447 | 1.0221 |
| 0.2027 | 2.0355 | 1376 | 0.9452 | 0.7844 | 0.9452 | 0.9722 |
| 0.2027 | 2.0385 | 1378 | 0.8490 | 0.7729 | 0.8490 | 0.9214 |
| 0.2027 | 2.0414 | 1380 | 0.8198 | 0.7729 | 0.8198 | 0.9054 |
| 0.2027 | 2.0444 | 1382 | 0.8613 | 0.7729 | 0.8613 | 0.9280 |
| 0.2027 | 2.0473 | 1384 | 1.0077 | 0.7305 | 1.0077 | 1.0038 |
| 0.2027 | 2.0503 | 1386 | 1.1931 | 0.7095 | 1.1931 | 1.0923 |
| 0.2027 | 2.0533 | 1388 | 1.2200 | 0.7095 | 1.2200 | 1.1046 |
| 0.2027 | 2.0562 | 1390 | 1.1135 | 0.7216 | 1.1135 | 1.0552 |
| 0.2027 | 2.0592 | 1392 | 0.9553 | 0.7724 | 0.9553 | 0.9774 |
| 0.2027 | 2.0621 | 1394 | 0.9035 | 0.7724 | 0.9035 | 0.9505 |
| 0.2027 | 2.0651 | 1396 | 0.8121 | 0.7112 | 0.8121 | 0.9012 |
| 0.2027 | 2.0680 | 1398 | 0.7745 | 0.7427 | 0.7745 | 0.8801 |
| 0.2027 | 2.0710 | 1400 | 0.8406 | 0.7618 | 0.8406 | 0.9168 |
| 0.2027 | 2.0740 | 1402 | 1.0117 | 0.7216 | 1.0117 | 1.0058 |
| 0.2027 | 2.0769 | 1404 | 1.2557 | 0.7129 | 1.2557 | 1.1206 |
| 0.2027 | 2.0799 | 1406 | 1.3077 | 0.6934 | 1.3077 | 1.1435 |
| 0.2027 | 2.0828 | 1408 | 1.1886 | 0.7129 | 1.1886 | 1.0902 |
| 0.2027 | 2.0858 | 1410 | 0.9789 | 0.7134 | 0.9789 | 0.9894 |
| 0.2027 | 2.0888 | 1412 | 0.8780 | 0.7618 | 0.8780 | 0.9370 |
| 0.2027 | 2.0917 | 1414 | 0.7670 | 0.7717 | 0.7670 | 0.8758 |
| 0.2027 | 2.0947 | 1416 | 0.7382 | 0.7717 | 0.7382 | 0.8592 |
| 0.2027 | 2.0976 | 1418 | 0.8082 | 0.7713 | 0.8082 | 0.8990 |
| 0.2027 | 2.1006 | 1420 | 0.9503 | 0.7618 | 0.9503 | 0.9748 |
| 0.2027 | 2.1036 | 1422 | 1.0865 | 0.7216 | 1.0865 | 1.0423 |
| 0.2027 | 2.1065 | 1424 | 1.2271 | 0.6680 | 1.2271 | 1.1077 |
| 0.2027 | 2.1095 | 1426 | 1.2042 | 0.7101 | 1.2042 | 1.0974 |
| 0.2027 | 2.1124 | 1428 | 1.0951 | 0.7724 | 1.0951 | 1.0465 |
| 0.2027 | 2.1154 | 1430 | 1.0425 | 0.7724 | 1.0425 | 1.0210 |
| 0.2027 | 2.1183 | 1432 | 0.9952 | 0.7618 | 0.9952 | 0.9976 |
| 0.2027 | 2.1213 | 1434 | 0.9337 | 0.7618 | 0.9337 | 0.9663 |
| 0.2027 | 2.1243 | 1436 | 0.9933 | 0.7618 | 0.9933 | 0.9967 |
| 0.2027 | 2.1272 | 1438 | 1.0419 | 0.7618 | 1.0419 | 1.0208 |
| 0.2027 | 2.1302 | 1440 | 1.0181 | 0.7618 | 1.0181 | 1.0090 |
| 0.2027 | 2.1331 | 1442 | 0.9504 | 0.7618 | 0.9504 | 0.9749 |
| 0.2027 | 2.1361 | 1444 | 0.9980 | 0.7724 | 0.9980 | 0.9990 |
| 0.2027 | 2.1391 | 1446 | 1.0772 | 0.7724 | 1.0772 | 1.0379 |
| 0.2027 | 2.1420 | 1448 | 1.1787 | 0.7724 | 1.1787 | 1.0857 |
| 0.2027 | 2.1450 | 1450 | 1.1875 | 0.7840 | 1.1875 | 1.0897 |
| 0.2027 | 2.1479 | 1452 | 1.1263 | 0.7519 | 1.1263 | 1.0613 |
| 0.2027 | 2.1509 | 1454 | 1.0700 | 0.7519 | 1.0700 | 1.0344 |
| 0.2027 | 2.1538 | 1456 | 1.0276 | 0.7413 | 1.0276 | 1.0137 |
| 0.2027 | 2.1568 | 1458 | 1.0212 | 0.7724 | 1.0212 | 1.0105 |
| 0.2027 | 2.1598 | 1460 | 1.0415 | 0.7724 | 1.0415 | 1.0206 |
| 0.2027 | 2.1627 | 1462 | 1.0731 | 0.7216 | 1.0731 | 1.0359 |
| 0.2027 | 2.1657 | 1464 | 1.1670 | 0.7216 | 1.1670 | 1.0803 |
| 0.2027 | 2.1686 | 1466 | 1.1642 | 0.7216 | 1.1642 | 1.0790 |
| 0.2027 | 2.1716 | 1468 | 1.1779 | 0.7216 | 1.1779 | 1.0853 |
| 0.2027 | 2.1746 | 1470 | 1.0652 | 0.7216 | 1.0652 | 1.0321 |
| 0.2027 | 2.1775 | 1472 | 1.0099 | 0.7724 | 1.0099 | 1.0050 |
| 0.2027 | 2.1805 | 1474 | 1.0008 | 0.7724 | 1.0008 | 1.0004 |
| 0.2027 | 2.1834 | 1476 | 0.9941 | 0.7724 | 0.9941 | 0.9970 |
| 0.2027 | 2.1864 | 1478 | 1.0495 | 0.7840 | 1.0495 | 1.0245 |
| 0.2027 | 2.1893 | 1480 | 1.0901 | 0.7840 | 1.0901 | 1.0441 |
| 0.2027 | 2.1923 | 1482 | 1.1383 | 0.7840 | 1.1383 | 1.0669 |
| 0.2027 | 2.1953 | 1484 | 1.1134 | 0.7626 | 1.1134 | 1.0552 |
| 0.2027 | 2.1982 | 1486 | 1.0597 | 0.7520 | 1.0597 | 1.0294 |
| 0.2027 | 2.2012 | 1488 | 0.9854 | 0.7520 | 0.9854 | 0.9927 |
| 0.2027 | 2.2041 | 1490 | 0.9767 | 0.7520 | 0.9767 | 0.9883 |
| 0.2027 | 2.2071 | 1492 | 0.9569 | 0.7020 | 0.9569 | 0.9782 |
| 0.2027 | 2.2101 | 1494 | 0.9386 | 0.7020 | 0.9386 | 0.9688 |
| 0.2027 | 2.2130 | 1496 | 0.9788 | 0.7520 | 0.9788 | 0.9893 |
| 0.2027 | 2.2160 | 1498 | 0.9178 | 0.7520 | 0.9178 | 0.9580 |
| 0.1429 | 2.2189 | 1500 | 0.8509 | 0.7422 | 0.8509 | 0.9224 |
| 0.1429 | 2.2219 | 1502 | 0.7499 | 0.7422 | 0.7499 | 0.8659 |
| 0.1429 | 2.2249 | 1504 | 0.6799 | 0.7879 | 0.6799 | 0.8245 |
| 0.1429 | 2.2278 | 1506 | 0.6976 | 0.7879 | 0.6976 | 0.8352 |
| 0.1429 | 2.2308 | 1508 | 0.7965 | 0.7983 | 0.7965 | 0.8925 |
| 0.1429 | 2.2337 | 1510 | 0.9768 | 0.6945 | 0.9768 | 0.9883 |
| 0.1429 | 2.2367 | 1512 | 1.1678 | 0.7129 | 1.1678 | 1.0806 |
| 0.1429 | 2.2396 | 1514 | 1.3075 | 0.6407 | 1.3075 | 1.1434 |
| 0.1429 | 2.2426 | 1516 | 1.2889 | 0.6884 | 1.2889 | 1.1353 |
| 0.1429 | 2.2456 | 1518 | 1.1692 | 0.7840 | 1.1692 | 1.0813 |
| 0.1429 | 2.2485 | 1520 | 1.0580 | 0.7519 | 1.0580 | 1.0286 |
| 0.1429 | 2.2515 | 1522 | 0.9749 | 0.7296 | 0.9749 | 0.9874 |
| 0.1429 | 2.2544 | 1524 | 0.9682 | 0.7626 | 0.9682 | 0.9840 |
| 0.1429 | 2.2574 | 1526 | 1.0404 | 0.7626 | 1.0404 | 1.0200 |
| 0.1429 | 2.2604 | 1528 | 1.1190 | 0.7101 | 1.1190 | 1.0578 |
| 0.1429 | 2.2633 | 1530 | 1.1386 | 0.7101 | 1.1386 | 1.0671 |
| 0.1429 | 2.2663 | 1532 | 1.1567 | 0.7101 | 1.1567 | 1.0755 |
| 0.1429 | 2.2692 | 1534 | 1.2360 | 0.7211 | 1.2360 | 1.1118 |
| 0.1429 | 2.2722 | 1536 | 1.2356 | 0.7305 | 1.2356 | 1.1116 |
| 0.1429 | 2.2751 | 1538 | 1.1011 | 0.7101 | 1.1011 | 1.0493 |
| 0.1429 | 2.2781 | 1540 | 0.9899 | 0.7520 | 0.9899 | 0.9949 |
| 0.1429 | 2.2811 | 1542 | 0.9165 | 0.6403 | 0.9165 | 0.9573 |
| 0.1429 | 2.2840 | 1544 | 0.8874 | 0.6403 | 0.8874 | 0.9420 |
| 0.1429 | 2.2870 | 1546 | 0.9359 | 0.6403 | 0.9359 | 0.9674 |
| 0.1429 | 2.2899 | 1548 | 1.0542 | 0.7296 | 1.0542 | 1.0267 |
| 0.1429 | 2.2929 | 1550 | 1.1002 | 0.7626 | 1.1002 | 1.0489 |
| 0.1429 | 2.2959 | 1552 | 1.0319 | 0.7296 | 1.0319 | 1.0158 |
| 0.1429 | 2.2988 | 1554 | 0.9245 | 0.7296 | 0.9245 | 0.9615 |
| 0.1429 | 2.3018 | 1556 | 0.8937 | 0.7296 | 0.8937 | 0.9454 |
| 0.1429 | 2.3047 | 1558 | 0.8407 | 0.7296 | 0.8407 | 0.9169 |
| 0.1429 | 2.3077 | 1560 | 0.7828 | 0.7733 | 0.7828 | 0.8847 |
| 0.1429 | 2.3107 | 1562 | 0.7910 | 0.7525 | 0.7910 | 0.8894 |
| 0.1429 | 2.3136 | 1564 | 0.8851 | 0.7296 | 0.8851 | 0.9408 |
| 0.1429 | 2.3166 | 1566 | 1.1154 | 0.7305 | 1.1154 | 1.0561 |
| 0.1429 | 2.3195 | 1568 | 1.2786 | 0.7230 | 1.2786 | 1.1307 |
| 0.1429 | 2.3225 | 1570 | 1.2614 | 0.7305 | 1.2614 | 1.1231 |
| 0.1429 | 2.3254 | 1572 | 1.1293 | 0.7840 | 1.1293 | 1.0627 |
| 0.1429 | 2.3284 | 1574 | 0.9546 | 0.7413 | 0.9546 | 0.9770 |
| 0.1429 | 2.3314 | 1576 | 0.8477 | 0.8019 | 0.8477 | 0.9207 |
| 0.1429 | 2.3343 | 1578 | 0.7778 | 0.7823 | 0.7778 | 0.8820 |
| 0.1429 | 2.3373 | 1580 | 0.7632 | 0.7717 | 0.7632 | 0.8736 |
| 0.1429 | 2.3402 | 1582 | 0.8270 | 0.8019 | 0.8270 | 0.9094 |
| 0.1429 | 2.3432 | 1584 | 0.9009 | 0.8283 | 0.9009 | 0.9492 |
| 0.1429 | 2.3462 | 1586 | 1.0544 | 0.7216 | 1.0544 | 1.0269 |
| 0.1429 | 2.3491 | 1588 | 1.1467 | 0.7216 | 1.1467 | 1.0708 |
| 0.1429 | 2.3521 | 1590 | 1.1455 | 0.7216 | 1.1455 | 1.0703 |
| 0.1429 | 2.3550 | 1592 | 1.0188 | 0.7724 | 1.0188 | 1.0093 |
| 0.1429 | 2.3580 | 1594 | 0.8845 | 0.7413 | 0.8845 | 0.9405 |
| 0.1429 | 2.3609 | 1596 | 0.8601 | 0.7729 | 0.8601 | 0.9274 |
| 0.1429 | 2.3639 | 1598 | 0.8620 | 0.8015 | 0.8620 | 0.9284 |
| 0.1429 | 2.3669 | 1600 | 0.8829 | 0.8015 | 0.8829 | 0.9396 |
| 0.1429 | 2.3698 | 1602 | 0.9048 | 0.7724 | 0.9048 | 0.9512 |
| 0.1429 | 2.3728 | 1604 | 0.9217 | 0.7216 | 0.9217 | 0.9600 |
| 0.1429 | 2.3757 | 1606 | 0.8948 | 0.7216 | 0.8948 | 0.9459 |
| 0.1429 | 2.3787 | 1608 | 0.8529 | 0.8015 | 0.8529 | 0.9235 |
| 0.1429 | 2.3817 | 1610 | 0.8610 | 0.7724 | 0.8610 | 0.9279 |
| 0.1429 | 2.3846 | 1612 | 0.9292 | 0.7724 | 0.9292 | 0.9640 |
| 0.1429 | 2.3876 | 1614 | 0.9651 | 0.7840 | 0.9651 | 0.9824 |
| 0.1429 | 2.3905 | 1616 | 1.0820 | 0.7840 | 1.0820 | 1.0402 |
| 0.1429 | 2.3935 | 1618 | 1.1566 | 0.7840 | 1.1566 | 1.0755 |
| 0.1429 | 2.3964 | 1620 | 1.1392 | 0.7840 | 1.1392 | 1.0673 |
| 0.1429 | 2.3994 | 1622 | 1.0587 | 0.7840 | 1.0587 | 1.0290 |
| 0.1429 | 2.4024 | 1624 | 0.9914 | 0.7840 | 0.9914 | 0.9957 |
| 0.1429 | 2.4053 | 1626 | 0.9850 | 0.7840 | 0.9850 | 0.9925 |
| 0.1429 | 2.4083 | 1628 | 1.0074 | 0.7840 | 1.0074 | 1.0037 |
| 0.1429 | 2.4112 | 1630 | 0.9917 | 0.7840 | 0.9917 | 0.9958 |
| 0.1429 | 2.4142 | 1632 | 0.9385 | 0.7840 | 0.9385 | 0.9687 |
| 0.1429 | 2.4172 | 1634 | 0.9730 | 0.7840 | 0.9730 | 0.9864 |
| 0.1429 | 2.4201 | 1636 | 1.0876 | 0.7840 | 1.0876 | 1.0429 |
| 0.1429 | 2.4231 | 1638 | 1.1681 | 0.7188 | 1.1681 | 1.0808 |
| 0.1429 | 2.4260 | 1640 | 1.2506 | 0.6680 | 1.2506 | 1.1183 |
| 0.1429 | 2.4290 | 1642 | 1.2324 | 0.6680 | 1.2324 | 1.1101 |
| 0.1429 | 2.4320 | 1644 | 1.1494 | 0.7305 | 1.1494 | 1.0721 |
| 0.1429 | 2.4349 | 1646 | 1.0692 | 0.7305 | 1.0692 | 1.0340 |
| 0.1429 | 2.4379 | 1648 | 1.0912 | 0.7211 | 1.0912 | 1.0446 |
| 0.1429 | 2.4408 | 1650 | 1.0712 | 0.7305 | 1.0712 | 1.0350 |
| 0.1429 | 2.4438 | 1652 | 1.0516 | 0.7216 | 1.0516 | 1.0255 |
| 0.1429 | 2.4467 | 1654 | 1.0886 | 0.7840 | 1.0886 | 1.0434 |
| 0.1429 | 2.4497 | 1656 | 1.1326 | 0.7840 | 1.1326 | 1.0642 |
| 0.1429 | 2.4527 | 1658 | 1.1204 | 0.7840 | 1.1204 | 1.0585 |
| 0.1429 | 2.4556 | 1660 | 1.0569 | 0.7840 | 1.0569 | 1.0281 |
| 0.1429 | 2.4586 | 1662 | 0.9871 | 0.7724 | 0.9871 | 0.9936 |
| 0.1429 | 2.4615 | 1664 | 0.9690 | 0.7724 | 0.9690 | 0.9844 |
| 0.1429 | 2.4645 | 1666 | 1.0108 | 0.7216 | 1.0108 | 1.0054 |
| 0.1429 | 2.4675 | 1668 | 1.0558 | 0.7216 | 1.0558 | 1.0275 |
| 0.1429 | 2.4704 | 1670 | 1.0887 | 0.7516 | 1.0887 | 1.0434 |
| 0.1429 | 2.4734 | 1672 | 1.0309 | 0.7516 | 1.0309 | 1.0153 |
| 0.1429 | 2.4763 | 1674 | 0.8772 | 0.7516 | 0.8772 | 0.9366 |
| 0.1429 | 2.4793 | 1676 | 0.8144 | 0.8164 | 0.8144 | 0.9025 |
| 0.1429 | 2.4822 | 1678 | 0.8580 | 0.8015 | 0.8580 | 0.9263 |
| 0.1429 | 2.4852 | 1680 | 1.0259 | 0.7216 | 1.0259 | 1.0129 |
| 0.1429 | 2.4882 | 1682 | 1.1710 | 0.7129 | 1.1710 | 1.0821 |
| 0.1429 | 2.4911 | 1684 | 1.1935 | 0.7129 | 1.1935 | 1.0925 |
| 0.1429 | 2.4941 | 1686 | 1.1651 | 0.7216 | 1.1651 | 1.0794 |
| 0.1429 | 2.4970 | 1688 | 1.0233 | 0.7516 | 1.0233 | 1.0116 |
| 0.1429 | 2.5 | 1690 | 0.8537 | 0.8283 | 0.8537 | 0.9240 |
| 0.1429 | 2.5030 | 1692 | 0.8046 | 0.7717 | 0.8046 | 0.8970 |
| 0.1429 | 2.5059 | 1694 | 0.8393 | 0.8019 | 0.8393 | 0.9161 |
| 0.1429 | 2.5089 | 1696 | 0.9609 | 0.7793 | 0.9609 | 0.9802 |
| 0.1429 | 2.5118 | 1698 | 1.0597 | 0.7516 | 1.0597 | 1.0294 |
| 0.1429 | 2.5148 | 1700 | 1.1579 | 0.6616 | 1.1579 | 1.0761 |
| 0.1429 | 2.5178 | 1702 | 1.1391 | 0.6616 | 1.1391 | 1.0673 |
| 0.1429 | 2.5207 | 1704 | 0.9957 | 0.7793 | 0.9957 | 0.9978 |
| 0.1429 | 2.5237 | 1706 | 0.8332 | 0.8019 | 0.8332 | 0.9128 |
| 0.1429 | 2.5266 | 1708 | 0.7666 | 0.7906 | 0.7666 | 0.8755 |
| 0.1429 | 2.5296 | 1710 | 0.7299 | 0.7906 | 0.7299 | 0.8543 |
| 0.1429 | 2.5325 | 1712 | 0.7774 | 0.7906 | 0.7774 | 0.8817 |
| 0.1429 | 2.5355 | 1714 | 0.8416 | 0.8283 | 0.8416 | 0.9174 |
| 0.1429 | 2.5385 | 1716 | 0.9281 | 0.7793 | 0.9281 | 0.9634 |
| 0.1429 | 2.5414 | 1718 | 1.0225 | 0.7692 | 1.0225 | 1.0112 |
| 0.1429 | 2.5444 | 1720 | 1.1036 | 0.7511 | 1.1036 | 1.0505 |
| 0.1429 | 2.5473 | 1722 | 1.1008 | 0.7511 | 1.1008 | 1.0492 |
| 0.1429 | 2.5503 | 1724 | 0.9959 | 0.7421 | 0.9959 | 0.9980 |
| 0.1429 | 2.5533 | 1726 | 0.9075 | 0.7793 | 0.9075 | 0.9526 |
| 0.1429 | 2.5562 | 1728 | 0.8658 | 0.7793 | 0.8658 | 0.9305 |
| 0.1429 | 2.5592 | 1730 | 0.8619 | 0.7793 | 0.8619 | 0.9284 |
| 0.1429 | 2.5621 | 1732 | 0.8915 | 0.7793 | 0.8915 | 0.9442 |
| 0.1429 | 2.5651 | 1734 | 0.9928 | 0.7614 | 0.9928 | 0.9964 |
| 0.1429 | 2.5680 | 1736 | 1.0190 | 0.7305 | 1.0190 | 1.0095 |
| 0.1429 | 2.5710 | 1738 | 1.0376 | 0.7305 | 1.0376 | 1.0186 |
| 0.1429 | 2.5740 | 1740 | 1.0662 | 0.7840 | 1.0662 | 1.0325 |
| 0.1429 | 2.5769 | 1742 | 0.9729 | 0.7840 | 0.9729 | 0.9864 |
| 0.1429 | 2.5799 | 1744 | 0.8807 | 0.8412 | 0.8807 | 0.9385 |
| 0.1429 | 2.5828 | 1746 | 0.8821 | 0.8412 | 0.8821 | 0.9392 |
| 0.1429 | 2.5858 | 1748 | 0.8849 | 0.8412 | 0.8849 | 0.9407 |
| 0.1429 | 2.5888 | 1750 | 0.7951 | 0.8412 | 0.7951 | 0.8917 |
| 0.1429 | 2.5917 | 1752 | 0.7814 | 0.8412 | 0.7814 | 0.8840 |
| 0.1429 | 2.5947 | 1754 | 0.8674 | 0.8412 | 0.8674 | 0.9313 |
| 0.1429 | 2.5976 | 1756 | 0.9161 | 0.8138 | 0.9161 | 0.9571 |
| 0.1429 | 2.6006 | 1758 | 0.9612 | 0.7840 | 0.9612 | 0.9804 |
| 0.1429 | 2.6036 | 1760 | 0.9555 | 0.7840 | 0.9555 | 0.9775 |
| 0.1429 | 2.6065 | 1762 | 0.9541 | 0.7840 | 0.9541 | 0.9768 |
| 0.1429 | 2.6095 | 1764 | 0.9440 | 0.7840 | 0.9440 | 0.9716 |
| 0.1429 | 2.6124 | 1766 | 0.8729 | 0.8138 | 0.8729 | 0.9343 |
| 0.1429 | 2.6154 | 1768 | 0.8254 | 0.8283 | 0.8254 | 0.9085 |
| 0.1429 | 2.6183 | 1770 | 0.8261 | 0.8283 | 0.8261 | 0.9089 |
| 0.1429 | 2.6213 | 1772 | 0.8465 | 0.7840 | 0.8465 | 0.9200 |
| 0.1429 | 2.6243 | 1774 | 0.8766 | 0.7840 | 0.8766 | 0.9363 |
| 0.1429 | 2.6272 | 1776 | 0.9487 | 0.7840 | 0.9487 | 0.9740 |
| 0.1429 | 2.6302 | 1778 | 1.0681 | 0.7840 | 1.0681 | 1.0335 |
| 0.1429 | 2.6331 | 1780 | 1.1402 | 0.7840 | 1.1402 | 1.0678 |
| 0.1429 | 2.6361 | 1782 | 1.1035 | 0.7840 | 1.1035 | 1.0505 |
| 0.1429 | 2.6391 | 1784 | 0.9980 | 0.7840 | 0.9980 | 0.9990 |
| 0.1429 | 2.6420 | 1786 | 0.9062 | 0.7840 | 0.9062 | 0.9520 |
| 0.1429 | 2.6450 | 1788 | 0.8377 | 0.7626 | 0.8377 | 0.9153 |
| 0.1429 | 2.6479 | 1790 | 0.8331 | 0.7626 | 0.8331 | 0.9127 |
| 0.1429 | 2.6509 | 1792 | 0.8957 | 0.7626 | 0.8957 | 0.9464 |
| 0.1429 | 2.6538 | 1794 | 0.9214 | 0.7626 | 0.9214 | 0.9599 |
| 0.1429 | 2.6568 | 1796 | 0.8936 | 0.7626 | 0.8936 | 0.9453 |
| 0.1429 | 2.6598 | 1798 | 0.9284 | 0.7840 | 0.9284 | 0.9636 |
| 0.1429 | 2.6627 | 1800 | 0.8973 | 0.7626 | 0.8973 | 0.9472 |
| 0.1429 | 2.6657 | 1802 | 0.8917 | 0.7626 | 0.8917 | 0.9443 |
| 0.1429 | 2.6686 | 1804 | 0.9295 | 0.7626 | 0.9295 | 0.9641 |
| 0.1429 | 2.6716 | 1806 | 0.9484 | 0.7840 | 0.9484 | 0.9739 |
| 0.1429 | 2.6746 | 1808 | 0.8985 | 0.7296 | 0.8985 | 0.9479 |
| 0.1429 | 2.6775 | 1810 | 0.9287 | 0.7519 | 0.9287 | 0.9637 |
| 0.1429 | 2.6805 | 1812 | 0.9848 | 0.7840 | 0.9848 | 0.9924 |
| 0.1429 | 2.6834 | 1814 | 0.9454 | 0.7840 | 0.9454 | 0.9723 |
| 0.1429 | 2.6864 | 1816 | 0.8605 | 0.7296 | 0.8605 | 0.9276 |
| 0.1429 | 2.6893 | 1818 | 0.8060 | 0.72 | 0.8060 | 0.8978 |
| 0.1429 | 2.6923 | 1820 | 0.7593 | 0.7418 | 0.7593 | 0.8714 |
| 0.1429 | 2.6953 | 1822 | 0.7535 | 0.7729 | 0.7535 | 0.8680 |
| 0.1429 | 2.6982 | 1824 | 0.8054 | 0.7020 | 0.8054 | 0.8975 |
| 0.1429 | 2.7012 | 1826 | 0.9041 | 0.7216 | 0.9041 | 0.9508 |
| 0.1429 | 2.7041 | 1828 | 1.1066 | 0.7216 | 1.1066 | 1.0519 |
| 0.1429 | 2.7071 | 1830 | 1.2403 | 0.7148 | 1.2403 | 1.1137 |
| 0.1429 | 2.7101 | 1832 | 1.1971 | 0.7211 | 1.1971 | 1.0941 |
| 0.1429 | 2.7130 | 1834 | 1.0272 | 0.7305 | 1.0272 | 1.0135 |
| 0.1429 | 2.7160 | 1836 | 0.8910 | 0.7840 | 0.8910 | 0.9439 |
| 0.1429 | 2.7189 | 1838 | 0.8488 | 0.7519 | 0.8488 | 0.9213 |
| 0.1429 | 2.7219 | 1840 | 0.7875 | 0.7519 | 0.7875 | 0.8874 |
| 0.1429 | 2.7249 | 1842 | 0.7793 | 0.7519 | 0.7793 | 0.8828 |
| 0.1429 | 2.7278 | 1844 | 0.8416 | 0.7519 | 0.8416 | 0.9174 |
| 0.1429 | 2.7308 | 1846 | 0.9337 | 0.7840 | 0.9337 | 0.9663 |
| 0.1429 | 2.7337 | 1848 | 1.1363 | 0.7840 | 1.1363 | 1.0660 |
| 0.1429 | 2.7367 | 1850 | 1.2267 | 0.6610 | 1.2267 | 1.1075 |
| 0.1429 | 2.7396 | 1852 | 1.1463 | 0.7305 | 1.1463 | 1.0706 |
| 0.1429 | 2.7426 | 1854 | 0.9590 | 0.7840 | 0.9590 | 0.9793 |
| 0.1429 | 2.7456 | 1856 | 0.7667 | 0.7844 | 0.7667 | 0.8756 |
| 0.1429 | 2.7485 | 1858 | 0.6720 | 0.8219 | 0.6720 | 0.8198 |
| 0.1429 | 2.7515 | 1860 | 0.6396 | 0.8023 | 0.6396 | 0.7998 |
| 0.1429 | 2.7544 | 1862 | 0.6449 | 0.8475 | 0.6449 | 0.8030 |
| 0.1429 | 2.7574 | 1864 | 0.6921 | 0.8475 | 0.6921 | 0.8319 |
| 0.1429 | 2.7604 | 1866 | 0.7797 | 0.7793 | 0.7797 | 0.8830 |
| 0.1429 | 2.7633 | 1868 | 0.9503 | 0.7216 | 0.9503 | 0.9748 |
| 0.1429 | 2.7663 | 1870 | 1.1123 | 0.7305 | 1.1123 | 1.0547 |
| 0.1429 | 2.7692 | 1872 | 1.2489 | 0.7305 | 1.2489 | 1.1176 |
| 0.1429 | 2.7722 | 1874 | 1.2702 | 0.7305 | 1.2702 | 1.1270 |
| 0.1429 | 2.7751 | 1876 | 1.1560 | 0.7840 | 1.1560 | 1.0752 |
| 0.1429 | 2.7781 | 1878 | 1.0071 | 0.7519 | 1.0071 | 1.0035 |
| 0.1429 | 2.7811 | 1880 | 0.9395 | 0.7413 | 0.9395 | 0.9693 |
| 0.1429 | 2.7840 | 1882 | 0.9518 | 0.7413 | 0.9518 | 0.9756 |
| 0.1429 | 2.7870 | 1884 | 0.9892 | 0.7724 | 0.9892 | 0.9946 |
| 0.1429 | 2.7899 | 1886 | 1.0118 | 0.7724 | 1.0118 | 1.0059 |
| 0.1429 | 2.7929 | 1888 | 0.9239 | 0.8015 | 0.9239 | 0.9612 |
| 0.1429 | 2.7959 | 1890 | 0.8930 | 0.8015 | 0.8930 | 0.9450 |
| 0.1429 | 2.7988 | 1892 | 0.8113 | 0.8015 | 0.8113 | 0.9007 |
| 0.1429 | 2.8018 | 1894 | 0.8179 | 0.7516 | 0.8179 | 0.9044 |
| 0.1429 | 2.8047 | 1896 | 0.8370 | 0.7516 | 0.8370 | 0.9149 |
| 0.1429 | 2.8077 | 1898 | 0.8714 | 0.7516 | 0.8714 | 0.9335 |
| 0.1429 | 2.8107 | 1900 | 0.9191 | 0.7216 | 0.9191 | 0.9587 |
| 0.1429 | 2.8136 | 1902 | 0.9380 | 0.7216 | 0.9380 | 0.9685 |
| 0.1429 | 2.8166 | 1904 | 0.9808 | 0.7216 | 0.9808 | 0.9903 |
| 0.1429 | 2.8195 | 1906 | 1.0158 | 0.7216 | 1.0158 | 1.0079 |
| 0.1429 | 2.8225 | 1908 | 1.0067 | 0.7724 | 1.0067 | 1.0034 |
| 0.1429 | 2.8254 | 1910 | 0.8932 | 0.7724 | 0.8932 | 0.9451 |
| 0.1429 | 2.8284 | 1912 | 0.8099 | 0.7724 | 0.8099 | 0.9000 |
| 0.1429 | 2.8314 | 1914 | 0.7824 | 0.7724 | 0.7824 | 0.8845 |
| 0.1429 | 2.8343 | 1916 | 0.8441 | 0.7724 | 0.8441 | 0.9188 |
| 0.1429 | 2.8373 | 1918 | 0.9730 | 0.7724 | 0.9730 | 0.9864 |
| 0.1429 | 2.8402 | 1920 | 1.0125 | 0.7216 | 1.0125 | 1.0062 |
| 0.1429 | 2.8432 | 1922 | 0.9742 | 0.7216 | 0.9742 | 0.9870 |
| 0.1429 | 2.8462 | 1924 | 0.9109 | 0.7724 | 0.9109 | 0.9544 |
| 0.1429 | 2.8491 | 1926 | 0.9532 | 0.7724 | 0.9532 | 0.9763 |
| 0.1429 | 2.8521 | 1928 | 1.0112 | 0.7724 | 1.0112 | 1.0056 |
| 0.1429 | 2.8550 | 1930 | 1.0149 | 0.7724 | 1.0149 | 1.0074 |
| 0.1429 | 2.8580 | 1932 | 0.9550 | 0.7724 | 0.9550 | 0.9773 |
| 0.1429 | 2.8609 | 1934 | 0.8822 | 0.7413 | 0.8822 | 0.9392 |
| 0.1429 | 2.8639 | 1936 | 0.9046 | 0.7724 | 0.9046 | 0.9511 |
| 0.1429 | 2.8669 | 1938 | 1.0342 | 0.7724 | 1.0342 | 1.0169 |
| 0.1429 | 2.8698 | 1940 | 1.1980 | 0.7216 | 1.1980 | 1.0945 |
| 0.1429 | 2.8728 | 1942 | 1.2083 | 0.7305 | 1.2083 | 1.0992 |
| 0.1429 | 2.8757 | 1944 | 1.0976 | 0.7724 | 1.0976 | 1.0477 |
| 0.1429 | 2.8787 | 1946 | 0.9175 | 0.7724 | 0.9175 | 0.9578 |
| 0.1429 | 2.8817 | 1948 | 0.7737 | 0.72 | 0.7737 | 0.8796 |
| 0.1429 | 2.8846 | 1950 | 0.7425 | 0.7112 | 0.7425 | 0.8617 |
| 0.1429 | 2.8876 | 1952 | 0.8026 | 0.7724 | 0.8026 | 0.8959 |
| 0.1429 | 2.8905 | 1954 | 0.9301 | 0.7724 | 0.9301 | 0.9644 |
| 0.1429 | 2.8935 | 1956 | 1.0652 | 0.7724 | 1.0652 | 1.0321 |
| 0.1429 | 2.8964 | 1958 | 1.0490 | 0.7724 | 1.0490 | 1.0242 |
| 0.1429 | 2.8994 | 1960 | 0.9689 | 0.7724 | 0.9689 | 0.9843 |
| 0.1429 | 2.9024 | 1962 | 0.8176 | 0.7413 | 0.8176 | 0.9042 |
| 0.1429 | 2.9053 | 1964 | 0.6502 | 0.7717 | 0.6502 | 0.8064 |
| 0.1429 | 2.9083 | 1966 | 0.5673 | 0.7620 | 0.5673 | 0.7532 |
| 0.1429 | 2.9112 | 1968 | 0.5577 | 0.7620 | 0.5577 | 0.7468 |
| 0.1429 | 2.9142 | 1970 | 0.5944 | 0.7717 | 0.5944 | 0.7710 |
| 0.1429 | 2.9172 | 1972 | 0.6882 | 0.7717 | 0.6882 | 0.8296 |
| 0.1429 | 2.9201 | 1974 | 0.8182 | 0.7724 | 0.8182 | 0.9046 |
| 0.1429 | 2.9231 | 1976 | 0.9034 | 0.7724 | 0.9034 | 0.9505 |
| 0.1429 | 2.9260 | 1978 | 0.8788 | 0.7724 | 0.8788 | 0.9375 |
| 0.1429 | 2.9290 | 1980 | 0.8148 | 0.7724 | 0.8148 | 0.9026 |
| 0.1429 | 2.9320 | 1982 | 0.7775 | 0.7724 | 0.7775 | 0.8817 |
| 0.1429 | 2.9349 | 1984 | 0.7848 | 0.7724 | 0.7848 | 0.8859 |
| 0.1429 | 2.9379 | 1986 | 0.8229 | 0.7724 | 0.8229 | 0.9072 |
| 0.1429 | 2.9408 | 1988 | 0.8945 | 0.7724 | 0.8945 | 0.9458 |
| 0.1429 | 2.9438 | 1990 | 0.9073 | 0.7724 | 0.9073 | 0.9525 |
| 0.1429 | 2.9467 | 1992 | 0.9499 | 0.7724 | 0.9499 | 0.9746 |
| 0.1429 | 2.9497 | 1994 | 0.9183 | 0.7724 | 0.9183 | 0.9583 |
| 0.1429 | 2.9527 | 1996 | 0.9060 | 0.7724 | 0.9060 | 0.9519 |
| 0.1429 | 2.9556 | 1998 | 0.8116 | 0.7724 | 0.8116 | 0.9009 |
| 0.1162 | 2.9586 | 2000 | 0.7766 | 0.7724 | 0.7766 | 0.8812 |
| 0.1162 | 2.9615 | 2002 | 0.7710 | 0.7724 | 0.7710 | 0.8781 |
| 0.1162 | 2.9645 | 2004 | 0.7723 | 0.7724 | 0.7723 | 0.8788 |
| 0.1162 | 2.9675 | 2006 | 0.8075 | 0.7724 | 0.8075 | 0.8986 |
| 0.1162 | 2.9704 | 2008 | 0.8376 | 0.7724 | 0.8376 | 0.9152 |
| 0.1162 | 2.9734 | 2010 | 0.8857 | 0.7724 | 0.8857 | 0.9411 |
| 0.1162 | 2.9763 | 2012 | 0.9272 | 0.7724 | 0.9272 | 0.9629 |
| 0.1162 | 2.9793 | 2014 | 0.8780 | 0.7724 | 0.8780 | 0.9370 |
| 0.1162 | 2.9822 | 2016 | 0.8113 | 0.7724 | 0.8113 | 0.9007 |
| 0.1162 | 2.9852 | 2018 | 0.7864 | 0.7724 | 0.7864 | 0.8868 |
| 0.1162 | 2.9882 | 2020 | 0.7396 | 0.7413 | 0.7396 | 0.8600 |
| 0.1162 | 2.9911 | 2022 | 0.7368 | 0.7413 | 0.7368 | 0.8584 |
| 0.1162 | 2.9941 | 2024 | 0.7659 | 0.7724 | 0.7659 | 0.8752 |
| 0.1162 | 2.9970 | 2026 | 0.7277 | 0.7724 | 0.7277 | 0.8530 |
| 0.1162 | 3.0 | 2028 | 0.7530 | 0.7724 | 0.7530 | 0.8678 |
| 0.1162 | 3.0030 | 2030 | 0.7409 | 0.7724 | 0.7409 | 0.8608 |
| 0.1162 | 3.0059 | 2032 | 0.7084 | 0.7724 | 0.7084 | 0.8416 |
| 0.1162 | 3.0089 | 2034 | 0.6341 | 0.7823 | 0.6341 | 0.7963 |
| 0.1162 | 3.0118 | 2036 | 0.6135 | 0.7717 | 0.6135 | 0.7833 |
| 0.1162 | 3.0148 | 2038 | 0.6579 | 0.8094 | 0.6579 | 0.8111 |
| 0.1162 | 3.0178 | 2040 | 0.7603 | 0.7520 | 0.7603 | 0.8720 |
| 0.1162 | 3.0207 | 2042 | 0.8973 | 0.7724 | 0.8973 | 0.9472 |
| 0.1162 | 3.0237 | 2044 | 0.9976 | 0.7724 | 0.9976 | 0.9988 |
| 0.1162 | 3.0266 | 2046 | 1.0832 | 0.7724 | 1.0832 | 1.0408 |
| 0.1162 | 3.0296 | 2048 | 1.1045 | 0.7724 | 1.1045 | 1.0510 |
| 0.1162 | 3.0325 | 2050 | 1.0299 | 0.7724 | 1.0299 | 1.0148 |
| 0.1162 | 3.0355 | 2052 | 0.9105 | 0.7724 | 0.9105 | 0.9542 |
| 0.1162 | 3.0385 | 2054 | 0.7824 | 0.72 | 0.7824 | 0.8845 |
| 0.1162 | 3.0414 | 2056 | 0.7448 | 0.72 | 0.7448 | 0.8630 |
| 0.1162 | 3.0444 | 2058 | 0.7912 | 0.7724 | 0.7912 | 0.8895 |
| 0.1162 | 3.0473 | 2060 | 0.8342 | 0.7724 | 0.8342 | 0.9133 |
| 0.1162 | 3.0503 | 2062 | 0.8239 | 0.7724 | 0.8239 | 0.9077 |
| 0.1162 | 3.0533 | 2064 | 0.7561 | 0.7618 | 0.7561 | 0.8696 |
| 0.1162 | 3.0562 | 2066 | 0.6718 | 0.7983 | 0.6718 | 0.8196 |
| 0.1162 | 3.0592 | 2068 | 0.6402 | 0.7879 | 0.6402 | 0.8001 |
| 0.1162 | 3.0621 | 2070 | 0.6778 | 0.7983 | 0.6778 | 0.8233 |
| 0.1162 | 3.0651 | 2072 | 0.7639 | 0.7724 | 0.7639 | 0.8740 |
| 0.1162 | 3.0680 | 2074 | 0.7944 | 0.7724 | 0.7944 | 0.8913 |
| 0.1162 | 3.0710 | 2076 | 0.7609 | 0.7724 | 0.7609 | 0.8723 |
| 0.1162 | 3.0740 | 2078 | 0.7314 | 0.7520 | 0.7314 | 0.8552 |
| 0.1162 | 3.0769 | 2080 | 0.7551 | 0.7520 | 0.7551 | 0.8690 |
| 0.1162 | 3.0799 | 2082 | 0.7911 | 0.7724 | 0.7911 | 0.8895 |
| 0.1162 | 3.0828 | 2084 | 0.7572 | 0.72 | 0.7572 | 0.8702 |
| 0.1162 | 3.0858 | 2086 | 0.7157 | 0.7418 | 0.7157 | 0.8460 |
| 0.1162 | 3.0888 | 2088 | 0.7232 | 0.7418 | 0.7232 | 0.8504 |
| 0.1162 | 3.0917 | 2090 | 0.7886 | 0.7631 | 0.7886 | 0.8880 |
| 0.1162 | 3.0947 | 2092 | 0.8901 | 0.7724 | 0.8901 | 0.9435 |
| 0.1162 | 3.0976 | 2094 | 0.9089 | 0.7724 | 0.9089 | 0.9534 |
| 0.1162 | 3.1006 | 2096 | 0.9423 | 0.7305 | 0.9423 | 0.9707 |
| 0.1162 | 3.1036 | 2098 | 0.9445 | 0.7840 | 0.9445 | 0.9719 |
| 0.1162 | 3.1065 | 2100 | 0.9041 | 0.7840 | 0.9041 | 0.9509 |
| 0.1162 | 3.1095 | 2102 | 0.8881 | 0.7724 | 0.8881 | 0.9424 |
| 0.1162 | 3.1124 | 2104 | 0.8297 | 0.7724 | 0.8297 | 0.9109 |
| 0.1162 | 3.1154 | 2106 | 0.8237 | 0.7724 | 0.8237 | 0.9076 |
| 0.1162 | 3.1183 | 2108 | 0.8509 | 0.7724 | 0.8509 | 0.9224 |
| 0.1162 | 3.1213 | 2110 | 0.9520 | 0.7724 | 0.9520 | 0.9757 |
| 0.1162 | 3.1243 | 2112 | 1.0214 | 0.7724 | 1.0214 | 1.0107 |
| 0.1162 | 3.1272 | 2114 | 1.0221 | 0.7724 | 1.0221 | 1.0110 |
| 0.1162 | 3.1302 | 2116 | 1.0414 | 0.7724 | 1.0414 | 1.0205 |
| 0.1162 | 3.1331 | 2118 | 1.0552 | 0.7840 | 1.0552 | 1.0273 |
| 0.1162 | 3.1361 | 2120 | 1.1041 | 0.7840 | 1.1041 | 1.0508 |
| 0.1162 | 3.1391 | 2122 | 1.1316 | 0.7840 | 1.1316 | 1.0637 |
| 0.1162 | 3.1420 | 2124 | 1.1518 | 0.7188 | 1.1518 | 1.0732 |
| 0.1162 | 3.1450 | 2126 | 1.0601 | 0.7840 | 1.0601 | 1.0296 |
| 0.1162 | 3.1479 | 2128 | 0.9684 | 0.7724 | 0.9684 | 0.9841 |
| 0.1162 | 3.1509 | 2130 | 0.8869 | 0.7724 | 0.8869 | 0.9417 |
| 0.1162 | 3.1538 | 2132 | 0.8017 | 0.7520 | 0.8017 | 0.8954 |
| 0.1162 | 3.1568 | 2134 | 0.7347 | 0.7819 | 0.7347 | 0.8572 |
| 0.1162 | 3.1598 | 2136 | 0.7199 | 0.7819 | 0.7199 | 0.8485 |
| 0.1162 | 3.1627 | 2138 | 0.7722 | 0.7819 | 0.7722 | 0.8787 |
| 0.1162 | 3.1657 | 2140 | 0.9118 | 0.7724 | 0.9118 | 0.9549 |
| 0.1162 | 3.1686 | 2142 | 1.0244 | 0.7216 | 1.0244 | 1.0122 |
| 0.1162 | 3.1716 | 2144 | 1.0311 | 0.7216 | 1.0311 | 1.0154 |
| 0.1162 | 3.1746 | 2146 | 0.9899 | 0.7724 | 0.9899 | 0.9949 |
| 0.1162 | 3.1775 | 2148 | 0.9933 | 0.7724 | 0.9933 | 0.9966 |
| 0.1162 | 3.1805 | 2150 | 1.0403 | 0.7305 | 1.0403 | 1.0200 |
| 0.1162 | 3.1834 | 2152 | 1.1078 | 0.7305 | 1.1078 | 1.0525 |
| 0.1162 | 3.1864 | 2154 | 1.1671 | 0.7095 | 1.1671 | 1.0803 |
| 0.1162 | 3.1893 | 2156 | 1.2117 | 0.7095 | 1.2117 | 1.1008 |
| 0.1162 | 3.1923 | 2158 | 1.1408 | 0.7095 | 1.1408 | 1.0681 |
| 0.1162 | 3.1953 | 2160 | 0.9917 | 0.7724 | 0.9917 | 0.9959 |
| 0.1162 | 3.1982 | 2162 | 0.9095 | 0.7724 | 0.9095 | 0.9537 |
| 0.1162 | 3.2012 | 2164 | 0.9150 | 0.7724 | 0.9150 | 0.9566 |
| 0.1162 | 3.2041 | 2166 | 0.9236 | 0.7724 | 0.9236 | 0.9610 |
| 0.1162 | 3.2071 | 2168 | 0.9520 | 0.7840 | 0.9520 | 0.9757 |
| 0.1162 | 3.2101 | 2170 | 0.9959 | 0.7840 | 0.9959 | 0.9979 |
| 0.1162 | 3.2130 | 2172 | 1.0747 | 0.7621 | 1.0747 | 1.0367 |
| 0.1162 | 3.2160 | 2174 | 1.0785 | 0.7840 | 1.0785 | 1.0385 |
| 0.1162 | 3.2189 | 2176 | 0.9945 | 0.7840 | 0.9945 | 0.9973 |
| 0.1162 | 3.2219 | 2178 | 0.8835 | 0.7729 | 0.8835 | 0.9399 |
| 0.1162 | 3.2249 | 2180 | 0.8047 | 0.7729 | 0.8047 | 0.8970 |
| 0.1162 | 3.2278 | 2182 | 0.7977 | 0.7729 | 0.7977 | 0.8931 |
| 0.1162 | 3.2308 | 2184 | 0.8718 | 0.8015 | 0.8718 | 0.9337 |
| 0.1162 | 3.2337 | 2186 | 0.9767 | 0.7724 | 0.9767 | 0.9883 |
| 0.1162 | 3.2367 | 2188 | 1.1090 | 0.7095 | 1.1090 | 1.0531 |
| 0.1162 | 3.2396 | 2190 | 1.1811 | 0.7095 | 1.1811 | 1.0868 |
| 0.1162 | 3.2426 | 2192 | 1.2044 | 0.7095 | 1.2044 | 1.0974 |
| 0.1162 | 3.2456 | 2194 | 1.2650 | 0.7621 | 1.2650 | 1.1247 |
| 0.1162 | 3.2485 | 2196 | 1.2323 | 0.7621 | 1.2323 | 1.1101 |
| 0.1162 | 3.2515 | 2198 | 1.0917 | 0.7194 | 1.0917 | 1.0449 |
| 0.1162 | 3.2544 | 2200 | 0.9305 | 0.7413 | 0.9305 | 0.9646 |
| 0.1162 | 3.2574 | 2202 | 0.8309 | 0.7729 | 0.8309 | 0.9115 |
| 0.1162 | 3.2604 | 2204 | 0.8142 | 0.8019 | 0.8142 | 0.9023 |
| 0.1162 | 3.2633 | 2206 | 0.8400 | 0.8283 | 0.8400 | 0.9165 |
| 0.1162 | 3.2663 | 2208 | 0.8883 | 0.8283 | 0.8883 | 0.9425 |
| 0.1162 | 3.2692 | 2210 | 0.9934 | 0.7724 | 0.9934 | 0.9967 |
| 0.1162 | 3.2722 | 2212 | 1.0140 | 0.7724 | 1.0140 | 1.0070 |
| 0.1162 | 3.2751 | 2214 | 1.0551 | 0.7724 | 1.0551 | 1.0272 |
| 0.1162 | 3.2781 | 2216 | 0.9842 | 0.7724 | 0.9842 | 0.9921 |
| 0.1162 | 3.2811 | 2218 | 0.8601 | 0.8283 | 0.8601 | 0.9274 |
| 0.1162 | 3.2840 | 2220 | 0.8382 | 0.8283 | 0.8382 | 0.9155 |
| 0.1162 | 3.2870 | 2222 | 0.8617 | 0.8283 | 0.8617 | 0.9283 |
| 0.1162 | 3.2899 | 2224 | 0.8859 | 0.8015 | 0.8859 | 0.9412 |
| 0.1162 | 3.2929 | 2226 | 0.9479 | 0.7724 | 0.9479 | 0.9736 |
| 0.1162 | 3.2959 | 2228 | 1.0165 | 0.7724 | 1.0165 | 1.0082 |
| 0.1162 | 3.2988 | 2230 | 1.0763 | 0.7515 | 1.0763 | 1.0374 |
| 0.1162 | 3.3018 | 2232 | 1.1265 | 0.7515 | 1.1265 | 1.0614 |
| 0.1162 | 3.3047 | 2234 | 1.1280 | 0.6890 | 1.1280 | 1.0621 |
| 0.1162 | 3.3077 | 2236 | 1.0162 | 0.7724 | 1.0162 | 1.0081 |
| 0.1162 | 3.3107 | 2238 | 0.8659 | 0.7413 | 0.8659 | 0.9305 |
| 0.1162 | 3.3136 | 2240 | 0.7813 | 0.72 | 0.7813 | 0.8839 |
| 0.1162 | 3.3166 | 2242 | 0.7626 | 0.7823 | 0.7626 | 0.8733 |
| 0.1162 | 3.3195 | 2244 | 0.8184 | 0.7724 | 0.8184 | 0.9047 |
| 0.1162 | 3.3225 | 2246 | 0.9424 | 0.7724 | 0.9424 | 0.9708 |
| 0.1162 | 3.3254 | 2248 | 1.0574 | 0.7216 | 1.0574 | 1.0283 |
| 0.1162 | 3.3284 | 2250 | 1.0485 | 0.7216 | 1.0485 | 1.0240 |
| 0.1162 | 3.3314 | 2252 | 0.9405 | 0.7724 | 0.9405 | 0.9698 |
| 0.1162 | 3.3343 | 2254 | 0.9177 | 0.7724 | 0.9177 | 0.9579 |
| 0.1162 | 3.3373 | 2256 | 0.9978 | 0.7724 | 0.9978 | 0.9989 |
| 0.1162 | 3.3402 | 2258 | 1.0211 | 0.7724 | 1.0211 | 1.0105 |
| 0.1162 | 3.3432 | 2260 | 1.0102 | 0.7724 | 1.0102 | 1.0051 |
| 0.1162 | 3.3462 | 2262 | 0.8901 | 0.7724 | 0.8901 | 0.9434 |
| 0.1162 | 3.3491 | 2264 | 0.7977 | 0.7724 | 0.7977 | 0.8931 |
| 0.1162 | 3.3521 | 2266 | 0.7835 | 0.7724 | 0.7835 | 0.8852 |
| 0.1162 | 3.3550 | 2268 | 0.8506 | 0.7724 | 0.8506 | 0.9223 |
| 0.1162 | 3.3580 | 2270 | 0.9911 | 0.7724 | 0.9911 | 0.9955 |
| 0.1162 | 3.3609 | 2272 | 1.0787 | 0.7840 | 1.0787 | 1.0386 |
| 0.1162 | 3.3639 | 2274 | 1.0661 | 0.7840 | 1.0661 | 1.0325 |
| 0.1162 | 3.3669 | 2276 | 1.0247 | 0.7724 | 1.0247 | 1.0123 |
| 0.1162 | 3.3698 | 2278 | 0.9857 | 0.7724 | 0.9857 | 0.9928 |
| 0.1162 | 3.3728 | 2280 | 0.8962 | 0.7724 | 0.8962 | 0.9467 |
| 0.1162 | 3.3757 | 2282 | 0.7898 | 0.7413 | 0.7898 | 0.8887 |
| 0.1162 | 3.3787 | 2284 | 0.7466 | 0.7413 | 0.7466 | 0.8641 |
| 0.1162 | 3.3817 | 2286 | 0.7975 | 0.7724 | 0.7975 | 0.8930 |
| 0.1162 | 3.3846 | 2288 | 0.9353 | 0.7216 | 0.9353 | 0.9671 |
| 0.1162 | 3.3876 | 2290 | 1.1304 | 0.7216 | 1.1304 | 1.0632 |
| 0.1162 | 3.3905 | 2292 | 1.1809 | 0.7216 | 1.1809 | 1.0867 |
| 0.1162 | 3.3935 | 2294 | 1.1682 | 0.7305 | 1.1682 | 1.0809 |
| 0.1162 | 3.3964 | 2296 | 1.0968 | 0.7305 | 1.0968 | 1.0473 |
| 0.1162 | 3.3994 | 2298 | 0.9606 | 0.7724 | 0.9606 | 0.9801 |
| 0.1162 | 3.4024 | 2300 | 0.8493 | 0.7724 | 0.8493 | 0.9216 |
| 0.1162 | 3.4053 | 2302 | 0.7809 | 0.7413 | 0.7809 | 0.8837 |
| 0.1162 | 3.4083 | 2304 | 0.7608 | 0.7413 | 0.7608 | 0.8723 |
| 0.1162 | 3.4112 | 2306 | 0.8043 | 0.7413 | 0.8043 | 0.8968 |
| 0.1162 | 3.4142 | 2308 | 0.8935 | 0.7724 | 0.8935 | 0.9452 |
| 0.1162 | 3.4172 | 2310 | 1.0699 | 0.7840 | 1.0699 | 1.0343 |
| 0.1162 | 3.4201 | 2312 | 1.1944 | 0.7305 | 1.1944 | 1.0929 |
| 0.1162 | 3.4231 | 2314 | 1.1693 | 0.7305 | 1.1693 | 1.0814 |
| 0.1162 | 3.4260 | 2316 | 1.0631 | 0.7305 | 1.0631 | 1.0311 |
| 0.1162 | 3.4290 | 2318 | 0.9026 | 0.7724 | 0.9026 | 0.9501 |
| 0.1162 | 3.4320 | 2320 | 0.7982 | 0.7618 | 0.7982 | 0.8934 |
| 0.1162 | 3.4349 | 2322 | 0.7758 | 0.7618 | 0.7758 | 0.8808 |
| 0.1162 | 3.4379 | 2324 | 0.8081 | 0.7618 | 0.8081 | 0.8989 |
| 0.1162 | 3.4408 | 2326 | 0.8982 | 0.7618 | 0.8982 | 0.9478 |
| 0.1162 | 3.4438 | 2328 | 0.9911 | 0.7724 | 0.9911 | 0.9955 |
| 0.1162 | 3.4467 | 2330 | 0.9801 | 0.7724 | 0.9801 | 0.9900 |
| 0.1162 | 3.4497 | 2332 | 0.9000 | 0.7618 | 0.9000 | 0.9487 |
| 0.1162 | 3.4527 | 2334 | 0.7839 | 0.7316 | 0.7839 | 0.8854 |
| 0.1162 | 3.4556 | 2336 | 0.7508 | 0.7427 | 0.7508 | 0.8665 |
| 0.1162 | 3.4586 | 2338 | 0.7855 | 0.7427 | 0.7855 | 0.8863 |
| 0.1162 | 3.4615 | 2340 | 0.8798 | 0.7413 | 0.8798 | 0.9380 |
| 0.1162 | 3.4645 | 2342 | 0.9026 | 0.7413 | 0.9026 | 0.9500 |
| 0.1162 | 3.4675 | 2344 | 0.8616 | 0.72 | 0.8616 | 0.9282 |
| 0.1162 | 3.4704 | 2346 | 0.8102 | 0.72 | 0.8102 | 0.9001 |
| 0.1162 | 3.4734 | 2348 | 0.8029 | 0.7525 | 0.8029 | 0.8961 |
| 0.1162 | 3.4763 | 2350 | 0.7815 | 0.7525 | 0.7815 | 0.8840 |
| 0.1162 | 3.4793 | 2352 | 0.7968 | 0.7525 | 0.7968 | 0.8927 |
| 0.1162 | 3.4822 | 2354 | 0.8540 | 0.7819 | 0.8540 | 0.9241 |
| 0.1162 | 3.4852 | 2356 | 0.9520 | 0.7516 | 0.9520 | 0.9757 |
| 0.1162 | 3.4882 | 2358 | 0.9549 | 0.7516 | 0.9549 | 0.9772 |
| 0.1162 | 3.4911 | 2360 | 0.9086 | 0.7516 | 0.9086 | 0.9532 |
| 0.1162 | 3.4941 | 2362 | 0.8341 | 0.7327 | 0.8341 | 0.9133 |
| 0.1162 | 3.4970 | 2364 | 0.7672 | 0.7983 | 0.7672 | 0.8759 |
| 0.1162 | 3.5 | 2366 | 0.7149 | 0.8232 | 0.7149 | 0.8455 |
| 0.1162 | 3.5030 | 2368 | 0.7119 | 0.7986 | 0.7119 | 0.8437 |
| 0.1162 | 3.5059 | 2370 | 0.7458 | 0.7717 | 0.7458 | 0.8636 |
| 0.1162 | 3.5089 | 2372 | 0.7282 | 0.7882 | 0.7282 | 0.8534 |
| 0.1162 | 3.5118 | 2374 | 0.7380 | 0.7882 | 0.7380 | 0.8590 |
| 0.1162 | 3.5148 | 2376 | 0.8078 | 0.7427 | 0.8078 | 0.8988 |
| 0.1162 | 3.5178 | 2378 | 0.8473 | 0.7729 | 0.8473 | 0.9205 |
| 0.1162 | 3.5207 | 2380 | 0.8808 | 0.7724 | 0.8808 | 0.9385 |
| 0.1162 | 3.5237 | 2382 | 0.8302 | 0.7623 | 0.8302 | 0.9111 |
| 0.1162 | 3.5266 | 2384 | 0.7617 | 0.7620 | 0.7617 | 0.8727 |
| 0.1162 | 3.5296 | 2386 | 0.7344 | 0.7882 | 0.7344 | 0.8570 |
| 0.1162 | 3.5325 | 2388 | 0.7336 | 0.7620 | 0.7336 | 0.8565 |
| 0.1162 | 3.5355 | 2390 | 0.7938 | 0.7413 | 0.7938 | 0.8910 |
| 0.1162 | 3.5385 | 2392 | 0.8755 | 0.7840 | 0.8755 | 0.9357 |
| 0.1162 | 3.5414 | 2394 | 0.9611 | 0.7840 | 0.9611 | 0.9803 |
| 0.1162 | 3.5444 | 2396 | 0.9955 | 0.7305 | 0.9955 | 0.9977 |
| 0.1162 | 3.5473 | 2398 | 0.9407 | 0.7305 | 0.9407 | 0.9699 |
| 0.1162 | 3.5503 | 2400 | 0.8313 | 0.7840 | 0.8313 | 0.9118 |
| 0.1162 | 3.5533 | 2402 | 0.7876 | 0.8015 | 0.7876 | 0.8875 |
| 0.1162 | 3.5562 | 2404 | 0.8044 | 0.8138 | 0.8044 | 0.8969 |
| 0.1162 | 3.5592 | 2406 | 0.8877 | 0.7840 | 0.8877 | 0.9422 |
| 0.1162 | 3.5621 | 2408 | 0.9590 | 0.7840 | 0.9590 | 0.9793 |
| 0.1162 | 3.5651 | 2410 | 1.0271 | 0.7840 | 1.0271 | 1.0135 |
| 0.1162 | 3.5680 | 2412 | 1.0207 | 0.7840 | 1.0207 | 1.0103 |
| 0.1162 | 3.5710 | 2414 | 0.9388 | 0.7840 | 0.9388 | 0.9689 |
| 0.1162 | 3.5740 | 2416 | 0.8605 | 0.7840 | 0.8605 | 0.9276 |
| 0.1162 | 3.5769 | 2418 | 0.7621 | 0.8412 | 0.7621 | 0.8730 |
| 0.1162 | 3.5799 | 2420 | 0.6715 | 0.8123 | 0.6715 | 0.8194 |
| 0.1162 | 3.5828 | 2422 | 0.6706 | 0.8123 | 0.6706 | 0.8189 |
| 0.1162 | 3.5858 | 2424 | 0.7484 | 0.8293 | 0.7484 | 0.8651 |
| 0.1162 | 3.5888 | 2426 | 0.8841 | 0.7606 | 0.8841 | 0.9403 |
| 0.1162 | 3.5917 | 2428 | 1.0645 | 0.7421 | 1.0645 | 1.0318 |
| 0.1162 | 3.5947 | 2430 | 1.1243 | 0.6545 | 1.1243 | 1.0603 |
| 0.1162 | 3.5976 | 2432 | 1.1148 | 0.6551 | 1.1148 | 1.0559 |
| 0.1162 | 3.6006 | 2434 | 1.0023 | 0.7239 | 1.0023 | 1.0011 |
| 0.1162 | 3.6036 | 2436 | 0.8786 | 0.8164 | 0.8786 | 0.9373 |
| 0.1162 | 3.6065 | 2438 | 0.8233 | 0.8164 | 0.8233 | 0.9073 |
| 0.1162 | 3.6095 | 2440 | 0.7917 | 0.8164 | 0.7917 | 0.8898 |
| 0.1162 | 3.6124 | 2442 | 0.8218 | 0.8164 | 0.8218 | 0.9065 |
| 0.1162 | 3.6154 | 2444 | 0.9057 | 0.8283 | 0.9057 | 0.9517 |
| 0.1162 | 3.6183 | 2446 | 0.9523 | 0.7840 | 0.9523 | 0.9759 |
| 0.1162 | 3.6213 | 2448 | 0.9302 | 0.7840 | 0.9302 | 0.9645 |
| 0.1162 | 3.6243 | 2450 | 0.8485 | 0.7724 | 0.8485 | 0.9211 |
| 0.1162 | 3.6272 | 2452 | 0.7351 | 0.8283 | 0.7351 | 0.8574 |
| 0.1162 | 3.6302 | 2454 | 0.6785 | 0.8232 | 0.6785 | 0.8237 |
| 0.1162 | 3.6331 | 2456 | 0.6948 | 0.8283 | 0.6948 | 0.8336 |
| 0.1162 | 3.6361 | 2458 | 0.7485 | 0.8138 | 0.7485 | 0.8652 |
| 0.1162 | 3.6391 | 2460 | 0.7742 | 0.7840 | 0.7742 | 0.8799 |
| 0.1162 | 3.6420 | 2462 | 0.8258 | 0.7840 | 0.8258 | 0.9087 |
| 0.1162 | 3.6450 | 2464 | 0.8494 | 0.7840 | 0.8494 | 0.9216 |
| 0.1162 | 3.6479 | 2466 | 0.8609 | 0.7519 | 0.8609 | 0.9278 |
| 0.1162 | 3.6509 | 2468 | 0.8377 | 0.7519 | 0.8377 | 0.9153 |
| 0.1162 | 3.6538 | 2470 | 0.7964 | 0.7844 | 0.7964 | 0.8924 |
| 0.1162 | 3.6568 | 2472 | 0.7550 | 0.8015 | 0.7550 | 0.8689 |
| 0.1162 | 3.6598 | 2474 | 0.7731 | 0.8015 | 0.7731 | 0.8792 |
| 0.1162 | 3.6627 | 2476 | 0.8439 | 0.7516 | 0.8439 | 0.9186 |
| 0.1162 | 3.6657 | 2478 | 0.9745 | 0.7216 | 0.9745 | 0.9872 |
| 0.1162 | 3.6686 | 2480 | 1.1186 | 0.6680 | 1.1186 | 1.0576 |
| 0.1162 | 3.6716 | 2482 | 1.2594 | 0.6469 | 1.2594 | 1.1222 |
| 0.1162 | 3.6746 | 2484 | 1.3117 | 0.6469 | 1.3117 | 1.1453 |
| 0.1162 | 3.6775 | 2486 | 1.2602 | 0.6969 | 1.2602 | 1.1226 |
| 0.1162 | 3.6805 | 2488 | 1.1442 | 0.7188 | 1.1442 | 1.0697 |
| 0.1162 | 3.6834 | 2490 | 1.0308 | 0.6839 | 1.0308 | 1.0153 |
| 0.1162 | 3.6864 | 2492 | 0.9664 | 0.6839 | 0.9664 | 0.9831 |
| 0.1162 | 3.6893 | 2494 | 0.9203 | 0.6617 | 0.9203 | 0.9593 |
| 0.1162 | 3.6923 | 2496 | 0.9104 | 0.6617 | 0.9104 | 0.9542 |
| 0.1162 | 3.6953 | 2498 | 0.9569 | 0.7188 | 0.9569 | 0.9782 |
| 0.0885 | 3.6982 | 2500 | 1.0330 | 0.7188 | 1.0330 | 1.0164 |
| 0.0885 | 3.7012 | 2502 | 1.1183 | 0.6680 | 1.1183 | 1.0575 |
| 0.0885 | 3.7041 | 2504 | 1.1880 | 0.6680 | 1.1880 | 1.0899 |
| 0.0885 | 3.7071 | 2506 | 1.1605 | 0.6680 | 1.1605 | 1.0773 |
| 0.0885 | 3.7101 | 2508 | 1.0374 | 0.6616 | 1.0374 | 1.0185 |
| 0.0885 | 3.7130 | 2510 | 0.8719 | 0.8015 | 0.8719 | 0.9338 |
| 0.0885 | 3.7160 | 2512 | 0.7399 | 0.7879 | 0.7399 | 0.8602 |
| 0.0885 | 3.7189 | 2514 | 0.6869 | 0.8123 | 0.6869 | 0.8288 |
| 0.0885 | 3.7219 | 2516 | 0.6908 | 0.8123 | 0.6908 | 0.8311 |
| 0.0885 | 3.7249 | 2518 | 0.7420 | 0.8123 | 0.7420 | 0.8614 |
| 0.0885 | 3.7278 | 2520 | 0.8474 | 0.8283 | 0.8474 | 0.9206 |
| 0.0885 | 3.7308 | 2522 | 0.9556 | 0.7417 | 0.9556 | 0.9776 |
| 0.0885 | 3.7337 | 2524 | 1.0459 | 0.7515 | 1.0459 | 1.0227 |
| 0.0885 | 3.7367 | 2526 | 1.1312 | 0.6969 | 1.1312 | 1.0636 |
| 0.0885 | 3.7396 | 2528 | 1.1158 | 0.6969 | 1.1158 | 1.0563 |
| 0.0885 | 3.7426 | 2530 | 1.0474 | 0.7515 | 1.0474 | 1.0234 |
| 0.0885 | 3.7456 | 2532 | 0.9789 | 0.7515 | 0.9789 | 0.9894 |
| 0.0885 | 3.7485 | 2534 | 0.8634 | 0.7815 | 0.8634 | 0.9292 |
| 0.0885 | 3.7515 | 2536 | 0.8160 | 0.8215 | 0.8160 | 0.9033 |
| 0.0885 | 3.7544 | 2538 | 0.8419 | 0.7815 | 0.8419 | 0.9176 |
| 0.0885 | 3.7574 | 2540 | 0.8892 | 0.7815 | 0.8892 | 0.9430 |
| 0.0885 | 3.7604 | 2542 | 0.8525 | 0.7815 | 0.8525 | 0.9233 |
| 0.0885 | 3.7633 | 2544 | 0.7840 | 0.8232 | 0.7840 | 0.8855 |
| 0.0885 | 3.7663 | 2546 | 0.7160 | 0.8232 | 0.7160 | 0.8462 |
| 0.0885 | 3.7692 | 2548 | 0.6766 | 0.8232 | 0.6766 | 0.8226 |
| 0.0885 | 3.7722 | 2550 | 0.7092 | 0.8232 | 0.7092 | 0.8421 |
| 0.0885 | 3.7751 | 2552 | 0.7994 | 0.8232 | 0.7994 | 0.8941 |
| 0.0885 | 3.7781 | 2554 | 0.9509 | 0.8283 | 0.9509 | 0.9752 |
| 0.0885 | 3.7811 | 2556 | 1.0959 | 0.6969 | 1.0959 | 1.0468 |
| 0.0885 | 3.7840 | 2558 | 1.1449 | 0.6969 | 1.1449 | 1.0700 |
| 0.0885 | 3.7870 | 2560 | 1.0848 | 0.6969 | 1.0848 | 1.0415 |
| 0.0885 | 3.7899 | 2562 | 0.9958 | 0.7626 | 0.9958 | 0.9979 |
| 0.0885 | 3.7929 | 2564 | 0.9331 | 0.8215 | 0.9331 | 0.9660 |
| 0.0885 | 3.7959 | 2566 | 0.9308 | 0.8215 | 0.9308 | 0.9648 |
| 0.0885 | 3.7988 | 2568 | 0.9152 | 0.8094 | 0.9152 | 0.9567 |
| 0.0885 | 3.8018 | 2570 | 0.8826 | 0.8094 | 0.8826 | 0.9395 |
| 0.0885 | 3.8047 | 2572 | 0.8244 | 0.7983 | 0.8244 | 0.9080 |
| 0.0885 | 3.8077 | 2574 | 0.7960 | 0.7983 | 0.7960 | 0.8922 |
| 0.0885 | 3.8107 | 2576 | 0.7696 | 0.7983 | 0.7696 | 0.8773 |
| 0.0885 | 3.8136 | 2578 | 0.7704 | 0.7983 | 0.7704 | 0.8777 |
| 0.0885 | 3.8166 | 2580 | 0.8027 | 0.8094 | 0.8027 | 0.8960 |
| 0.0885 | 3.8195 | 2582 | 0.8710 | 0.8094 | 0.8710 | 0.9333 |
| 0.0885 | 3.8225 | 2584 | 1.0065 | 0.8215 | 1.0065 | 1.0032 |
| 0.0885 | 3.8254 | 2586 | 1.1050 | 0.7840 | 1.1050 | 1.0512 |
| 0.0885 | 3.8284 | 2588 | 1.1096 | 0.7840 | 1.1096 | 1.0534 |
| 0.0885 | 3.8314 | 2590 | 1.0408 | 0.7626 | 1.0408 | 1.0202 |
| 0.0885 | 3.8343 | 2592 | 1.0348 | 0.7626 | 1.0348 | 1.0173 |
| 0.0885 | 3.8373 | 2594 | 0.9969 | 0.7296 | 0.9969 | 0.9984 |
| 0.0885 | 3.8402 | 2596 | 0.9563 | 0.7296 | 0.9563 | 0.9779 |
| 0.0885 | 3.8432 | 2598 | 0.8955 | 0.7296 | 0.8955 | 0.9463 |
| 0.0885 | 3.8462 | 2600 | 0.8564 | 0.7296 | 0.8564 | 0.9254 |
| 0.0885 | 3.8491 | 2602 | 0.8386 | 0.7631 | 0.8386 | 0.9157 |
| 0.0885 | 3.8521 | 2604 | 0.8248 | 0.7631 | 0.8248 | 0.9082 |
| 0.0885 | 3.8550 | 2606 | 0.8204 | 0.8215 | 0.8204 | 0.9058 |
| 0.0885 | 3.8580 | 2608 | 0.8078 | 0.8094 | 0.8078 | 0.8988 |
| 0.0885 | 3.8609 | 2610 | 0.8545 | 0.8283 | 0.8545 | 0.9244 |
| 0.0885 | 3.8639 | 2612 | 0.8514 | 0.8164 | 0.8514 | 0.9227 |
| 0.0885 | 3.8669 | 2614 | 0.8611 | 0.8164 | 0.8611 | 0.9280 |
| 0.0885 | 3.8698 | 2616 | 0.8278 | 0.8164 | 0.8278 | 0.9098 |
| 0.0885 | 3.8728 | 2618 | 0.7677 | 0.8164 | 0.7677 | 0.8762 |
| 0.0885 | 3.8757 | 2620 | 0.6816 | 0.7906 | 0.6816 | 0.8256 |
| 0.0885 | 3.8787 | 2622 | 0.6433 | 0.7906 | 0.6433 | 0.8021 |
| 0.0885 | 3.8817 | 2624 | 0.6416 | 0.7906 | 0.6416 | 0.8010 |
| 0.0885 | 3.8846 | 2626 | 0.6635 | 0.8164 | 0.6635 | 0.8146 |
| 0.0885 | 3.8876 | 2628 | 0.6655 | 0.8164 | 0.6655 | 0.8158 |
| 0.0885 | 3.8905 | 2630 | 0.6876 | 0.8164 | 0.6876 | 0.8292 |
| 0.0885 | 3.8935 | 2632 | 0.7056 | 0.8054 | 0.7056 | 0.8400 |
| 0.0885 | 3.8964 | 2634 | 0.7274 | 0.8164 | 0.7274 | 0.8529 |
| 0.0885 | 3.8994 | 2636 | 0.7644 | 0.8164 | 0.7644 | 0.8743 |
| 0.0885 | 3.9024 | 2638 | 0.7564 | 0.8164 | 0.7564 | 0.8697 |
| 0.0885 | 3.9053 | 2640 | 0.7532 | 0.8164 | 0.7532 | 0.8679 |
| 0.0885 | 3.9083 | 2642 | 0.7830 | 0.7823 | 0.7830 | 0.8849 |
| 0.0885 | 3.9112 | 2644 | 0.8489 | 0.7519 | 0.8489 | 0.9214 |
| 0.0885 | 3.9142 | 2646 | 0.8663 | 0.7840 | 0.8663 | 0.9307 |
| 0.0885 | 3.9172 | 2648 | 0.8485 | 0.7519 | 0.8485 | 0.9212 |
| 0.0885 | 3.9201 | 2650 | 0.8368 | 0.7933 | 0.8368 | 0.9148 |
| 0.0885 | 3.9231 | 2652 | 0.8475 | 0.8215 | 0.8475 | 0.9206 |
| 0.0885 | 3.9260 | 2654 | 0.8376 | 0.7521 | 0.8376 | 0.9152 |
| 0.0885 | 3.9290 | 2656 | 0.8794 | 0.7612 | 0.8794 | 0.9378 |
| 0.0885 | 3.9320 | 2658 | 0.8860 | 0.7521 | 0.8860 | 0.9413 |
| 0.0885 | 3.9349 | 2660 | 0.8856 | 0.7521 | 0.8856 | 0.9410 |
| 0.0885 | 3.9379 | 2662 | 0.8285 | 0.7983 | 0.8285 | 0.9102 |
| 0.0885 | 3.9408 | 2664 | 0.7708 | 0.7879 | 0.7708 | 0.8780 |
| 0.0885 | 3.9438 | 2666 | 0.7453 | 0.8123 | 0.7453 | 0.8633 |
| 0.0885 | 3.9467 | 2668 | 0.7620 | 0.7879 | 0.7620 | 0.8729 |
| 0.0885 | 3.9497 | 2670 | 0.8345 | 0.8094 | 0.8345 | 0.9135 |
| 0.0885 | 3.9527 | 2672 | 0.9145 | 0.8412 | 0.9145 | 0.9563 |
| 0.0885 | 3.9556 | 2674 | 0.9212 | 0.7840 | 0.9212 | 0.9598 |
| 0.0885 | 3.9586 | 2676 | 0.8813 | 0.8412 | 0.8813 | 0.9388 |
| 0.0885 | 3.9615 | 2678 | 0.8558 | 0.8412 | 0.8558 | 0.9251 |
| 0.0885 | 3.9645 | 2680 | 0.8419 | 0.8412 | 0.8419 | 0.9175 |
| 0.0885 | 3.9675 | 2682 | 0.8122 | 0.8283 | 0.8122 | 0.9012 |
| 0.0885 | 3.9704 | 2684 | 0.7451 | 0.8283 | 0.7451 | 0.8632 |
| 0.0885 | 3.9734 | 2686 | 0.7148 | 0.8164 | 0.7148 | 0.8455 |
| 0.0885 | 3.9763 | 2688 | 0.7121 | 0.8164 | 0.7121 | 0.8438 |
| 0.0885 | 3.9793 | 2690 | 0.6679 | 0.7801 | 0.6679 | 0.8173 |
| 0.0885 | 3.9822 | 2692 | 0.6607 | 0.7801 | 0.6607 | 0.8128 |
| 0.0885 | 3.9852 | 2694 | 0.6788 | 0.8164 | 0.6788 | 0.8239 |
| 0.0885 | 3.9882 | 2696 | 0.6549 | 0.8054 | 0.6549 | 0.8093 |
| 0.0885 | 3.9911 | 2698 | 0.6452 | 0.7801 | 0.6452 | 0.8032 |
| 0.0885 | 3.9941 | 2700 | 0.6499 | 0.8019 | 0.6499 | 0.8062 |
| 0.0885 | 3.9970 | 2702 | 0.6863 | 0.8019 | 0.6863 | 0.8284 |
| 0.0885 | 4.0 | 2704 | 0.7516 | 0.8019 | 0.7516 | 0.8670 |
| 0.0885 | 4.0030 | 2706 | 0.8825 | 0.7898 | 0.8825 | 0.9394 |
| 0.0885 | 4.0059 | 2708 | 1.0210 | 0.7305 | 1.0210 | 1.0104 |
| 0.0885 | 4.0089 | 2710 | 1.0669 | 0.7211 | 1.0669 | 1.0329 |
| 0.0885 | 4.0118 | 2712 | 1.0195 | 0.7305 | 1.0195 | 1.0097 |
| 0.0885 | 4.0148 | 2714 | 0.8931 | 0.7614 | 0.8931 | 0.9451 |
| 0.0885 | 4.0178 | 2716 | 0.7839 | 0.8019 | 0.7839 | 0.8854 |
| 0.0885 | 4.0207 | 2718 | 0.7228 | 0.8019 | 0.7228 | 0.8502 |
| 0.0885 | 4.0237 | 2720 | 0.6723 | 0.8019 | 0.6723 | 0.8200 |
| 0.0885 | 4.0266 | 2722 | 0.6641 | 0.8019 | 0.6641 | 0.8149 |
| 0.0885 | 4.0296 | 2724 | 0.7220 | 0.8019 | 0.7220 | 0.8497 |
| 0.0885 | 4.0325 | 2726 | 0.7875 | 0.8283 | 0.7875 | 0.8874 |
| 0.0885 | 4.0355 | 2728 | 0.8028 | 0.8283 | 0.8028 | 0.8960 |
| 0.0885 | 4.0385 | 2730 | 0.7817 | 0.8019 | 0.7817 | 0.8841 |
| 0.0885 | 4.0414 | 2732 | 0.7906 | 0.8142 | 0.7906 | 0.8891 |
| 0.0885 | 4.0444 | 2734 | 0.7681 | 0.8142 | 0.7681 | 0.8764 |
| 0.0885 | 4.0473 | 2736 | 0.7831 | 0.7844 | 0.7831 | 0.8849 |
| 0.0885 | 4.0503 | 2738 | 0.8386 | 0.7844 | 0.8386 | 0.9157 |
| 0.0885 | 4.0533 | 2740 | 0.8623 | 0.8138 | 0.8623 | 0.9286 |
| 0.0885 | 4.0562 | 2742 | 0.8448 | 0.8138 | 0.8448 | 0.9191 |
| 0.0885 | 4.0592 | 2744 | 0.8607 | 0.7614 | 0.8607 | 0.9278 |
| 0.0885 | 4.0621 | 2746 | 0.9512 | 0.7614 | 0.9512 | 0.9753 |
| 0.0885 | 4.0651 | 2748 | 0.9939 | 0.7305 | 0.9939 | 0.9970 |
| 0.0885 | 4.0680 | 2750 | 1.0110 | 0.7305 | 1.0110 | 1.0055 |
| 0.0885 | 4.0710 | 2752 | 0.9541 | 0.7614 | 0.9541 | 0.9768 |
| 0.0885 | 4.0740 | 2754 | 0.8537 | 0.7614 | 0.8537 | 0.9240 |
| 0.0885 | 4.0769 | 2756 | 0.7991 | 0.8015 | 0.7991 | 0.8939 |
| 0.0885 | 4.0799 | 2758 | 0.7752 | 0.7902 | 0.7752 | 0.8805 |
| 0.0885 | 4.0828 | 2760 | 0.7956 | 0.7902 | 0.7956 | 0.8920 |
| 0.0885 | 4.0858 | 2762 | 0.8095 | 0.7902 | 0.8095 | 0.8997 |
| 0.0885 | 4.0888 | 2764 | 0.8569 | 0.8015 | 0.8569 | 0.9257 |
| 0.0885 | 4.0917 | 2766 | 0.9113 | 0.8015 | 0.9113 | 0.9546 |
| 0.0885 | 4.0947 | 2768 | 0.9176 | 0.8015 | 0.9176 | 0.9579 |
| 0.0885 | 4.0976 | 2770 | 0.8903 | 0.8015 | 0.8903 | 0.9435 |
| 0.0885 | 4.1006 | 2772 | 0.8267 | 0.8015 | 0.8267 | 0.9092 |
| 0.0885 | 4.1036 | 2774 | 0.7457 | 0.7906 | 0.7457 | 0.8636 |
| 0.0885 | 4.1065 | 2776 | 0.7138 | 0.7906 | 0.7138 | 0.8449 |
| 0.0885 | 4.1095 | 2778 | 0.7193 | 0.8164 | 0.7193 | 0.8481 |
| 0.0885 | 4.1124 | 2780 | 0.7645 | 0.8164 | 0.7645 | 0.8744 |
| 0.0885 | 4.1154 | 2782 | 0.7870 | 0.8283 | 0.7870 | 0.8871 |
| 0.0885 | 4.1183 | 2784 | 0.7842 | 0.8283 | 0.7842 | 0.8855 |
| 0.0885 | 4.1213 | 2786 | 0.8253 | 0.8412 | 0.8253 | 0.9084 |
| 0.0885 | 4.1243 | 2788 | 0.8502 | 0.8412 | 0.8502 | 0.9221 |
| 0.0885 | 4.1272 | 2790 | 0.8469 | 0.8412 | 0.8469 | 0.9203 |
| 0.0885 | 4.1302 | 2792 | 0.8051 | 0.8412 | 0.8051 | 0.8973 |
| 0.0885 | 4.1331 | 2794 | 0.7345 | 0.8283 | 0.7345 | 0.8570 |
| 0.0885 | 4.1361 | 2796 | 0.7118 | 0.7879 | 0.7118 | 0.8437 |
| 0.0885 | 4.1391 | 2798 | 0.7189 | 0.7879 | 0.7189 | 0.8479 |
| 0.0885 | 4.1420 | 2800 | 0.7179 | 0.7879 | 0.7179 | 0.8473 |
| 0.0885 | 4.1450 | 2802 | 0.7469 | 0.8054 | 0.7469 | 0.8642 |
| 0.0885 | 4.1479 | 2804 | 0.7927 | 0.7696 | 0.7927 | 0.8903 |
| 0.0885 | 4.1509 | 2806 | 0.8330 | 0.7696 | 0.8330 | 0.9127 |
| 0.0885 | 4.1538 | 2808 | 0.8098 | 0.7696 | 0.8098 | 0.8999 |
| 0.0885 | 4.1568 | 2810 | 0.7713 | 0.8164 | 0.7713 | 0.8782 |
| 0.0885 | 4.1598 | 2812 | 0.7507 | 0.8164 | 0.7507 | 0.8664 |
| 0.0885 | 4.1627 | 2814 | 0.7337 | 0.8164 | 0.7337 | 0.8566 |
| 0.0885 | 4.1657 | 2816 | 0.7705 | 0.8164 | 0.7705 | 0.8778 |
| 0.0885 | 4.1686 | 2818 | 0.7901 | 0.8283 | 0.7901 | 0.8889 |
| 0.0885 | 4.1716 | 2820 | 0.7466 | 0.8019 | 0.7466 | 0.8640 |
| 0.0885 | 4.1746 | 2822 | 0.7246 | 0.8019 | 0.7246 | 0.8513 |
| 0.0885 | 4.1775 | 2824 | 0.7383 | 0.8019 | 0.7383 | 0.8593 |
| 0.0885 | 4.1805 | 2826 | 0.8025 | 0.8019 | 0.8025 | 0.8959 |
| 0.0885 | 4.1834 | 2828 | 0.8216 | 0.8283 | 0.8216 | 0.9064 |
| 0.0885 | 4.1864 | 2830 | 0.8619 | 0.8283 | 0.8619 | 0.9284 |
| 0.0885 | 4.1893 | 2832 | 0.9254 | 0.8015 | 0.9254 | 0.9620 |
| 0.0885 | 4.1923 | 2834 | 0.9146 | 0.8015 | 0.9146 | 0.9563 |
| 0.0885 | 4.1953 | 2836 | 0.8704 | 0.8283 | 0.8704 | 0.9329 |
| 0.0885 | 4.1982 | 2838 | 0.8724 | 0.8283 | 0.8724 | 0.9340 |
| 0.0885 | 4.2012 | 2840 | 0.8967 | 0.8015 | 0.8967 | 0.9470 |
| 0.0885 | 4.2041 | 2842 | 0.9445 | 0.8015 | 0.9445 | 0.9719 |
| 0.0885 | 4.2071 | 2844 | 1.0046 | 0.8015 | 1.0046 | 1.0023 |
| 0.0885 | 4.2101 | 2846 | 0.9801 | 0.8015 | 0.9801 | 0.9900 |
| 0.0885 | 4.2130 | 2848 | 0.9507 | 0.8015 | 0.9507 | 0.9750 |
| 0.0885 | 4.2160 | 2850 | 0.9987 | 0.8015 | 0.9987 | 0.9994 |
| 0.0885 | 4.2189 | 2852 | 1.0870 | 0.7216 | 1.0870 | 1.0426 |
| 0.0885 | 4.2219 | 2854 | 1.1142 | 0.7014 | 1.1142 | 1.0556 |
| 0.0885 | 4.2249 | 2856 | 1.0806 | 0.7014 | 1.0806 | 1.0395 |
| 0.0885 | 4.2278 | 2858 | 1.0324 | 0.8015 | 1.0324 | 1.0161 |
| 0.0885 | 4.2308 | 2860 | 1.0053 | 0.8015 | 1.0053 | 1.0027 |
| 0.0885 | 4.2337 | 2862 | 0.9924 | 0.8015 | 0.9924 | 0.9962 |
| 0.0885 | 4.2367 | 2864 | 0.9589 | 0.8015 | 0.9589 | 0.9792 |
| 0.0885 | 4.2396 | 2866 | 0.9491 | 0.8015 | 0.9491 | 0.9742 |
| 0.0885 | 4.2426 | 2868 | 0.9248 | 0.8015 | 0.9248 | 0.9617 |
| 0.0885 | 4.2456 | 2870 | 0.9410 | 0.8015 | 0.9410 | 0.9701 |
| 0.0885 | 4.2485 | 2872 | 0.9798 | 0.8015 | 0.9798 | 0.9898 |
| 0.0885 | 4.2515 | 2874 | 1.0081 | 0.8015 | 1.0081 | 1.0040 |
| 0.0885 | 4.2544 | 2876 | 0.9915 | 0.8015 | 0.9915 | 0.9957 |
| 0.0885 | 4.2574 | 2878 | 0.9302 | 0.8283 | 0.9302 | 0.9644 |
| 0.0885 | 4.2604 | 2880 | 0.9179 | 0.8283 | 0.9179 | 0.9581 |
| 0.0885 | 4.2633 | 2882 | 0.9137 | 0.8283 | 0.9137 | 0.9559 |
| 0.0885 | 4.2663 | 2884 | 0.8494 | 0.8094 | 0.8494 | 0.9216 |
| 0.0885 | 4.2692 | 2886 | 0.8391 | 0.8283 | 0.8391 | 0.9160 |
| 0.0885 | 4.2722 | 2888 | 0.8235 | 0.8283 | 0.8235 | 0.9075 |
| 0.0885 | 4.2751 | 2890 | 0.7950 | 0.8283 | 0.7950 | 0.8916 |
| 0.0885 | 4.2781 | 2892 | 0.7831 | 0.8283 | 0.7831 | 0.8850 |
| 0.0885 | 4.2811 | 2894 | 0.8027 | 0.8283 | 0.8027 | 0.8959 |
| 0.0885 | 4.2840 | 2896 | 0.7970 | 0.8164 | 0.7970 | 0.8928 |
| 0.0885 | 4.2870 | 2898 | 0.8228 | 0.8283 | 0.8228 | 0.9071 |
| 0.0885 | 4.2899 | 2900 | 0.8859 | 0.7793 | 0.8859 | 0.9412 |
| 0.0885 | 4.2929 | 2902 | 0.9710 | 0.7793 | 0.9710 | 0.9854 |
| 0.0885 | 4.2959 | 2904 | 0.9912 | 0.7793 | 0.9912 | 0.9956 |
| 0.0885 | 4.2988 | 2906 | 0.9454 | 0.8283 | 0.9454 | 0.9723 |
| 0.0885 | 4.3018 | 2908 | 0.8872 | 0.8283 | 0.8872 | 0.9419 |
| 0.0885 | 4.3047 | 2910 | 0.9191 | 0.8283 | 0.9191 | 0.9587 |
| 0.0885 | 4.3077 | 2912 | 0.9335 | 0.8015 | 0.9335 | 0.9662 |
| 0.0885 | 4.3107 | 2914 | 0.9327 | 0.7729 | 0.9327 | 0.9658 |
| 0.0885 | 4.3136 | 2916 | 0.8893 | 0.7729 | 0.8893 | 0.9430 |
| 0.0885 | 4.3166 | 2918 | 0.8827 | 0.7729 | 0.8827 | 0.9395 |
| 0.0885 | 4.3195 | 2920 | 0.8781 | 0.8015 | 0.8781 | 0.9371 |
| 0.0885 | 4.3225 | 2922 | 0.8371 | 0.7729 | 0.8371 | 0.9150 |
| 0.0885 | 4.3254 | 2924 | 0.8277 | 0.8019 | 0.8277 | 0.9098 |
| 0.0885 | 4.3284 | 2926 | 0.8331 | 0.8283 | 0.8331 | 0.9127 |
| 0.0885 | 4.3314 | 2928 | 0.8742 | 0.8015 | 0.8742 | 0.9350 |
| 0.0885 | 4.3343 | 2930 | 0.8513 | 0.8283 | 0.8513 | 0.9227 |
| 0.0885 | 4.3373 | 2932 | 0.8078 | 0.8019 | 0.8078 | 0.8988 |
| 0.0885 | 4.3402 | 2934 | 0.7793 | 0.8019 | 0.7793 | 0.8828 |
| 0.0885 | 4.3432 | 2936 | 0.8204 | 0.8019 | 0.8204 | 0.9057 |
| 0.0885 | 4.3462 | 2938 | 0.8656 | 0.8019 | 0.8656 | 0.9304 |
| 0.0885 | 4.3491 | 2940 | 0.8505 | 0.8019 | 0.8505 | 0.9222 |
| 0.0885 | 4.3521 | 2942 | 0.8878 | 0.7793 | 0.8878 | 0.9422 |
| 0.0885 | 4.3550 | 2944 | 0.8832 | 0.7793 | 0.8832 | 0.9398 |
| 0.0885 | 4.3580 | 2946 | 0.8370 | 0.8283 | 0.8370 | 0.9149 |
| 0.0885 | 4.3609 | 2948 | 0.7737 | 0.7823 | 0.7737 | 0.8796 |
| 0.0885 | 4.3639 | 2950 | 0.7540 | 0.7823 | 0.7540 | 0.8684 |
| 0.0885 | 4.3669 | 2952 | 0.7774 | 0.8019 | 0.7774 | 0.8817 |
| 0.0885 | 4.3698 | 2954 | 0.8089 | 0.8283 | 0.8089 | 0.8994 |
| 0.0885 | 4.3728 | 2956 | 0.8532 | 0.8283 | 0.8532 | 0.9237 |
| 0.0885 | 4.3757 | 2958 | 0.8780 | 0.8283 | 0.8780 | 0.9370 |
| 0.0885 | 4.3787 | 2960 | 0.8760 | 0.8283 | 0.8760 | 0.9359 |
| 0.0885 | 4.3817 | 2962 | 0.8323 | 0.8019 | 0.8323 | 0.9123 |
| 0.0885 | 4.3846 | 2964 | 0.7602 | 0.7823 | 0.7602 | 0.8719 |
| 0.0885 | 4.3876 | 2966 | 0.7340 | 0.7823 | 0.7340 | 0.8567 |
| 0.0885 | 4.3905 | 2968 | 0.7643 | 0.8019 | 0.7643 | 0.8743 |
| 0.0885 | 4.3935 | 2970 | 0.8237 | 0.8283 | 0.8237 | 0.9076 |
| 0.0885 | 4.3964 | 2972 | 0.8054 | 0.8283 | 0.8054 | 0.8975 |
| 0.0885 | 4.3994 | 2974 | 0.7750 | 0.8283 | 0.7750 | 0.8803 |
| 0.0885 | 4.4024 | 2976 | 0.6878 | 0.8232 | 0.6878 | 0.8294 |
| 0.0885 | 4.4053 | 2978 | 0.6516 | 0.8232 | 0.6516 | 0.8072 |
| 0.0885 | 4.4083 | 2980 | 0.6677 | 0.8232 | 0.6677 | 0.8172 |
| 0.0885 | 4.4112 | 2982 | 0.7252 | 0.8283 | 0.7252 | 0.8516 |
| 0.0885 | 4.4142 | 2984 | 0.7828 | 0.8283 | 0.7828 | 0.8847 |
| 0.0885 | 4.4172 | 2986 | 0.7965 | 0.8283 | 0.7965 | 0.8925 |
| 0.0885 | 4.4201 | 2988 | 0.8017 | 0.8283 | 0.8017 | 0.8954 |
| 0.0885 | 4.4231 | 2990 | 0.7808 | 0.7823 | 0.7808 | 0.8836 |
| 0.0885 | 4.4260 | 2992 | 0.7643 | 0.7823 | 0.7643 | 0.8743 |
| 0.0885 | 4.4290 | 2994 | 0.7518 | 0.7823 | 0.7518 | 0.8670 |
| 0.0885 | 4.4320 | 2996 | 0.7777 | 0.8283 | 0.7777 | 0.8819 |
| 0.0885 | 4.4349 | 2998 | 0.7409 | 0.8094 | 0.7409 | 0.8608 |
| 0.0797 | 4.4379 | 3000 | 0.7457 | 0.8094 | 0.7457 | 0.8636 |
| 0.0797 | 4.4408 | 3002 | 0.8192 | 0.7793 | 0.8192 | 0.9051 |
| 0.0797 | 4.4438 | 3004 | 0.8634 | 0.7793 | 0.8634 | 0.9292 |
| 0.0797 | 4.4467 | 3006 | 0.8528 | 0.7793 | 0.8528 | 0.9235 |
| 0.0797 | 4.4497 | 3008 | 0.7951 | 0.7612 | 0.7951 | 0.8917 |
| 0.0797 | 4.4527 | 3010 | 0.7565 | 0.7612 | 0.7565 | 0.8698 |
| 0.0797 | 4.4556 | 3012 | 0.7938 | 0.7612 | 0.7938 | 0.8910 |
| 0.0797 | 4.4586 | 3014 | 0.8397 | 0.7612 | 0.8397 | 0.9164 |
| 0.0797 | 4.4615 | 3016 | 0.8769 | 0.7612 | 0.8769 | 0.9364 |
| 0.0797 | 4.4645 | 3018 | 0.8708 | 0.7612 | 0.8708 | 0.9331 |
| 0.0797 | 4.4675 | 3020 | 0.8035 | 0.8094 | 0.8035 | 0.8964 |
| 0.0797 | 4.4704 | 3022 | 0.7821 | 0.8094 | 0.7821 | 0.8844 |
| 0.0797 | 4.4734 | 3024 | 0.7968 | 0.8094 | 0.7968 | 0.8926 |
| 0.0797 | 4.4763 | 3026 | 0.8147 | 0.8094 | 0.8147 | 0.9026 |
| 0.0797 | 4.4793 | 3028 | 0.8756 | 0.8283 | 0.8756 | 0.9357 |
| 0.0797 | 4.4822 | 3030 | 0.9157 | 0.8412 | 0.9157 | 0.9569 |
| 0.0797 | 4.4852 | 3032 | 0.9027 | 0.8412 | 0.9027 | 0.9501 |
| 0.0797 | 4.4882 | 3034 | 0.8795 | 0.8412 | 0.8795 | 0.9378 |
| 0.0797 | 4.4911 | 3036 | 0.8586 | 0.8412 | 0.8586 | 0.9266 |
| 0.0797 | 4.4941 | 3038 | 0.7932 | 0.8215 | 0.7932 | 0.8906 |
| 0.0797 | 4.4970 | 3040 | 0.7845 | 0.8215 | 0.7845 | 0.8857 |
| 0.0797 | 4.5 | 3042 | 0.8183 | 0.8215 | 0.8183 | 0.9046 |
| 0.0797 | 4.5030 | 3044 | 0.8803 | 0.8412 | 0.8803 | 0.9383 |
| 0.0797 | 4.5059 | 3046 | 0.9282 | 0.7898 | 0.9282 | 0.9634 |
| 0.0797 | 4.5089 | 3048 | 0.9113 | 0.7898 | 0.9113 | 0.9546 |
| 0.0797 | 4.5118 | 3050 | 0.8748 | 0.8215 | 0.8748 | 0.9353 |
| 0.0797 | 4.5148 | 3052 | 0.7787 | 0.8094 | 0.7787 | 0.8824 |
| 0.0797 | 4.5178 | 3054 | 0.7232 | 0.7983 | 0.7232 | 0.8504 |
| 0.0797 | 4.5207 | 3056 | 0.7256 | 0.7983 | 0.7256 | 0.8518 |
| 0.0797 | 4.5237 | 3058 | 0.7828 | 0.8094 | 0.7828 | 0.8848 |
| 0.0797 | 4.5266 | 3060 | 0.8348 | 0.8215 | 0.8348 | 0.9137 |
| 0.0797 | 4.5296 | 3062 | 0.9156 | 0.8412 | 0.9156 | 0.9568 |
| 0.0797 | 4.5325 | 3064 | 0.9578 | 0.8412 | 0.9578 | 0.9787 |
| 0.0797 | 4.5355 | 3066 | 0.9692 | 0.8412 | 0.9692 | 0.9845 |
| 0.0797 | 4.5385 | 3068 | 0.8994 | 0.8412 | 0.8994 | 0.9484 |
| 0.0797 | 4.5414 | 3070 | 0.8304 | 0.8215 | 0.8304 | 0.9112 |
| 0.0797 | 4.5444 | 3072 | 0.8102 | 0.8094 | 0.8102 | 0.9001 |
| 0.0797 | 4.5473 | 3074 | 0.8101 | 0.8094 | 0.8101 | 0.9000 |
| 0.0797 | 4.5503 | 3076 | 0.8191 | 0.8094 | 0.8191 | 0.9050 |
| 0.0797 | 4.5533 | 3078 | 0.8667 | 0.7793 | 0.8667 | 0.9310 |
| 0.0797 | 4.5562 | 3080 | 0.9117 | 0.7793 | 0.9117 | 0.9548 |
| 0.0797 | 4.5592 | 3082 | 0.9315 | 0.7793 | 0.9315 | 0.9651 |
| 0.0797 | 4.5621 | 3084 | 0.9769 | 0.7793 | 0.9769 | 0.9884 |
| 0.0797 | 4.5651 | 3086 | 1.0543 | 0.7322 | 1.0543 | 1.0268 |
| 0.0797 | 4.5680 | 3088 | 1.0290 | 0.7322 | 1.0290 | 1.0144 |
| 0.0797 | 4.5710 | 3090 | 0.9366 | 0.8412 | 0.9366 | 0.9678 |
| 0.0797 | 4.5740 | 3092 | 0.8106 | 0.8283 | 0.8106 | 0.9003 |
| 0.0797 | 4.5769 | 3094 | 0.7040 | 0.7983 | 0.7040 | 0.8391 |
| 0.0797 | 4.5799 | 3096 | 0.6520 | 0.7620 | 0.6520 | 0.8075 |
| 0.0797 | 4.5828 | 3098 | 0.6567 | 0.7879 | 0.6567 | 0.8104 |
| 0.0797 | 4.5858 | 3100 | 0.7111 | 0.8054 | 0.7111 | 0.8433 |
| 0.0797 | 4.5888 | 3102 | 0.8145 | 0.7696 | 0.8145 | 0.9025 |
| 0.0797 | 4.5917 | 3104 | 0.9400 | 0.7793 | 0.9400 | 0.9695 |
| 0.0797 | 4.5947 | 3106 | 0.9893 | 0.7793 | 0.9893 | 0.9946 |
| 0.0797 | 4.5976 | 3108 | 0.9428 | 0.7793 | 0.9428 | 0.9710 |
| 0.0797 | 4.6006 | 3110 | 0.8684 | 0.7606 | 0.8684 | 0.9319 |
| 0.0797 | 4.6036 | 3112 | 0.7977 | 0.8054 | 0.7977 | 0.8931 |
| 0.0797 | 4.6065 | 3114 | 0.7356 | 0.8054 | 0.7356 | 0.8577 |
| 0.0797 | 4.6095 | 3116 | 0.6802 | 0.8054 | 0.6802 | 0.8248 |
| 0.0797 | 4.6124 | 3118 | 0.6790 | 0.7801 | 0.6790 | 0.8240 |
| 0.0797 | 4.6154 | 3120 | 0.7298 | 0.8019 | 0.7298 | 0.8543 |
| 0.0797 | 4.6183 | 3122 | 0.8334 | 0.8283 | 0.8334 | 0.9129 |
| 0.0797 | 4.6213 | 3124 | 0.9604 | 0.7515 | 0.9604 | 0.9800 |
| 0.0797 | 4.6243 | 3126 | 1.0417 | 0.7515 | 1.0417 | 1.0206 |
| 0.0797 | 4.6272 | 3128 | 1.0447 | 0.7515 | 1.0447 | 1.0221 |
| 0.0797 | 4.6302 | 3130 | 0.9810 | 0.7515 | 0.9810 | 0.9904 |
| 0.0797 | 4.6331 | 3132 | 0.8789 | 0.8283 | 0.8789 | 0.9375 |
| 0.0797 | 4.6361 | 3134 | 0.8384 | 0.8094 | 0.8384 | 0.9156 |
| 0.0797 | 4.6391 | 3136 | 0.8069 | 0.8094 | 0.8069 | 0.8983 |
| 0.0797 | 4.6420 | 3138 | 0.8077 | 0.8094 | 0.8077 | 0.8987 |
| 0.0797 | 4.6450 | 3140 | 0.8101 | 0.8094 | 0.8101 | 0.9001 |
| 0.0797 | 4.6479 | 3142 | 0.8283 | 0.8094 | 0.8283 | 0.9101 |
| 0.0797 | 4.6509 | 3144 | 0.8437 | 0.8094 | 0.8437 | 0.9185 |
| 0.0797 | 4.6538 | 3146 | 0.8813 | 0.7819 | 0.8813 | 0.9388 |
| 0.0797 | 4.6568 | 3148 | 0.9299 | 0.8015 | 0.9299 | 0.9643 |
| 0.0797 | 4.6598 | 3150 | 1.0162 | 0.7216 | 1.0162 | 1.0081 |
| 0.0797 | 4.6627 | 3152 | 1.0450 | 0.7216 | 1.0450 | 1.0222 |
| 0.0797 | 4.6657 | 3154 | 1.0453 | 0.7216 | 1.0453 | 1.0224 |
| 0.0797 | 4.6686 | 3156 | 1.0148 | 0.7216 | 1.0148 | 1.0074 |
| 0.0797 | 4.6716 | 3158 | 0.9320 | 0.8015 | 0.9320 | 0.9654 |
| 0.0797 | 4.6746 | 3160 | 0.9025 | 0.8015 | 0.9025 | 0.9500 |
| 0.0797 | 4.6775 | 3162 | 0.9225 | 0.8015 | 0.9225 | 0.9605 |
| 0.0797 | 4.6805 | 3164 | 0.9644 | 0.8015 | 0.9644 | 0.9821 |
| 0.0797 | 4.6834 | 3166 | 0.9837 | 0.7515 | 0.9837 | 0.9918 |
| 0.0797 | 4.6864 | 3168 | 0.9621 | 0.7417 | 0.9621 | 0.9809 |
| 0.0797 | 4.6893 | 3170 | 0.9003 | 0.8015 | 0.9003 | 0.9489 |
| 0.0797 | 4.6923 | 3172 | 0.8695 | 0.8015 | 0.8695 | 0.9324 |
| 0.0797 | 4.6953 | 3174 | 0.8423 | 0.8015 | 0.8423 | 0.9178 |
| 0.0797 | 4.6982 | 3176 | 0.7826 | 0.8283 | 0.7826 | 0.8847 |
| 0.0797 | 4.7012 | 3178 | 0.7318 | 0.7879 | 0.7318 | 0.8555 |
| 0.0797 | 4.7041 | 3180 | 0.7153 | 0.7879 | 0.7153 | 0.8457 |
| 0.0797 | 4.7071 | 3182 | 0.7471 | 0.7983 | 0.7471 | 0.8643 |
| 0.0797 | 4.7101 | 3184 | 0.7697 | 0.8283 | 0.7697 | 0.8773 |
| 0.0797 | 4.7130 | 3186 | 0.8364 | 0.8015 | 0.8364 | 0.9146 |
| 0.0797 | 4.7160 | 3188 | 0.8426 | 0.8015 | 0.8426 | 0.9179 |
| 0.0797 | 4.7189 | 3190 | 0.7921 | 0.7819 | 0.7921 | 0.8900 |
| 0.0797 | 4.7219 | 3192 | 0.7342 | 0.8094 | 0.7342 | 0.8569 |
| 0.0797 | 4.7249 | 3194 | 0.6740 | 0.7879 | 0.6740 | 0.8209 |
| 0.0797 | 4.7278 | 3196 | 0.6784 | 0.7717 | 0.6784 | 0.8236 |
| 0.0797 | 4.7308 | 3198 | 0.7165 | 0.7983 | 0.7165 | 0.8464 |
| 0.0797 | 4.7337 | 3200 | 0.7533 | 0.8094 | 0.7533 | 0.8679 |
| 0.0797 | 4.7367 | 3202 | 0.8311 | 0.7819 | 0.8311 | 0.9117 |
| 0.0797 | 4.7396 | 3204 | 0.9254 | 0.8015 | 0.9254 | 0.9620 |
| 0.0797 | 4.7426 | 3206 | 0.9457 | 0.8138 | 0.9457 | 0.9725 |
| 0.0797 | 4.7456 | 3208 | 0.9025 | 0.8015 | 0.9025 | 0.9500 |
| 0.0797 | 4.7485 | 3210 | 0.8522 | 0.7819 | 0.8522 | 0.9231 |
| 0.0797 | 4.7515 | 3212 | 0.7925 | 0.7525 | 0.7925 | 0.8902 |
| 0.0797 | 4.7544 | 3214 | 0.7600 | 0.7823 | 0.7600 | 0.8718 |
| 0.0797 | 4.7574 | 3216 | 0.7581 | 0.7823 | 0.7581 | 0.8707 |
| 0.0797 | 4.7604 | 3218 | 0.7851 | 0.8094 | 0.7851 | 0.8861 |
| 0.0797 | 4.7633 | 3220 | 0.8404 | 0.8283 | 0.8404 | 0.9168 |
| 0.0797 | 4.7663 | 3222 | 0.9354 | 0.7417 | 0.9354 | 0.9672 |
| 0.0797 | 4.7692 | 3224 | 0.9965 | 0.7014 | 0.9965 | 0.9983 |
| 0.0797 | 4.7722 | 3226 | 0.9795 | 0.7014 | 0.9795 | 0.9897 |
| 0.0797 | 4.7751 | 3228 | 0.9025 | 0.8015 | 0.9025 | 0.9500 |
| 0.0797 | 4.7781 | 3230 | 0.8355 | 0.8283 | 0.8355 | 0.9141 |
| 0.0797 | 4.7811 | 3232 | 0.7573 | 0.8094 | 0.7573 | 0.8702 |
| 0.0797 | 4.7840 | 3234 | 0.7326 | 0.7983 | 0.7326 | 0.8559 |
| 0.0797 | 4.7870 | 3236 | 0.7588 | 0.8094 | 0.7588 | 0.8711 |
| 0.0797 | 4.7899 | 3238 | 0.8404 | 0.8283 | 0.8404 | 0.9167 |
| 0.0797 | 4.7929 | 3240 | 0.9569 | 0.8138 | 0.9569 | 0.9782 |
| 0.0797 | 4.7959 | 3242 | 1.0913 | 0.6680 | 1.0913 | 1.0446 |
| 0.0797 | 4.7988 | 3244 | 1.1308 | 0.6680 | 1.1308 | 1.0634 |
| 0.0797 | 4.8018 | 3246 | 1.0939 | 0.6680 | 1.0939 | 1.0459 |
| 0.0797 | 4.8047 | 3248 | 1.0477 | 0.6680 | 1.0477 | 1.0236 |
| 0.0797 | 4.8077 | 3250 | 1.0028 | 0.7515 | 1.0028 | 1.0014 | | 0.0797 | 4.8107 | 3252 | 0.9255 | 0.8138 | 0.9255 | 0.9620 | | 0.0797 | 4.8136 | 3254 | 0.8572 | 0.8015 | 0.8572 | 0.9259 | | 0.0797 | 4.8166 | 3256 | 0.7884 | 0.8164 | 0.7884 | 0.8879 | | 0.0797 | 4.8195 | 3258 | 0.7909 | 0.8164 | 0.7909 | 0.8893 | | 0.0797 | 4.8225 | 3260 | 0.8267 | 0.7902 | 0.8267 | 0.9092 | | 0.0797 | 4.8254 | 3262 | 0.8922 | 0.8138 | 0.8922 | 0.9446 | | 0.0797 | 4.8284 | 3264 | 0.9809 | 0.7840 | 0.9809 | 0.9904 | | 0.0797 | 4.8314 | 3266 | 1.0652 | 0.7840 | 1.0652 | 1.0321 | | 0.0797 | 4.8343 | 3268 | 1.1022 | 0.7188 | 1.1022 | 1.0499 | | 0.0797 | 4.8373 | 3270 | 1.0773 | 0.7188 | 1.0773 | 1.0379 | | 0.0797 | 4.8402 | 3272 | 0.9842 | 0.7840 | 0.9842 | 0.9920 | | 0.0797 | 4.8432 | 3274 | 0.8680 | 0.7519 | 0.8680 | 0.9317 | | 0.0797 | 4.8462 | 3276 | 0.7949 | 0.7844 | 0.7949 | 0.8916 | | 0.0797 | 4.8491 | 3278 | 0.7671 | 0.7729 | 0.7671 | 0.8759 | | 0.0797 | 4.8521 | 3280 | 0.7536 | 0.7623 | 0.7536 | 0.8681 | | 0.0797 | 4.8550 | 3282 | 0.7413 | 0.7623 | 0.7413 | 0.8610 | | 0.0797 | 4.8580 | 3284 | 0.7838 | 0.7844 | 0.7838 | 0.8853 | | 0.0797 | 4.8609 | 3286 | 0.8715 | 0.8138 | 0.8715 | 0.9336 | | 0.0797 | 4.8639 | 3288 | 0.9430 | 0.7840 | 0.9430 | 0.9711 | | 0.0797 | 4.8669 | 3290 | 0.9565 | 0.7840 | 0.9565 | 0.9780 | | 0.0797 | 4.8698 | 3292 | 0.9069 | 0.7840 | 0.9069 | 0.9523 | | 0.0797 | 4.8728 | 3294 | 0.8937 | 0.7840 | 0.8937 | 0.9454 | | 0.0797 | 4.8757 | 3296 | 0.8644 | 0.7840 | 0.8644 | 0.9297 | | 0.0797 | 4.8787 | 3298 | 0.7877 | 0.7819 | 0.7877 | 0.8875 | | 0.0797 | 4.8817 | 3300 | 0.7161 | 0.8094 | 0.7161 | 0.8463 | | 0.0797 | 4.8846 | 3302 | 0.6990 | 0.7823 | 0.6990 | 0.8361 | | 0.0797 | 4.8876 | 3304 | 0.7158 | 0.7823 | 0.7158 | 0.8460 | | 0.0797 | 4.8905 | 3306 | 0.7331 | 0.8094 | 0.7331 | 0.8562 | | 0.0797 | 4.8935 | 3308 | 0.7884 | 0.8283 | 0.7884 | 0.8879 | | 0.0797 | 4.8964 | 3310 | 0.8647 | 0.8015 | 0.8647 | 0.9299 | | 0.0797 | 4.8994 | 3312 | 0.8988 | 0.7840 | 0.8988 | 0.9480 | | 0.0797 | 4.9024 | 3314 | 0.8907 | 0.8138 | 0.8907 | 0.9438 | | 0.0797 | 4.9053 | 3316 | 0.8433 | 0.8019 | 0.8433 | 0.9183 | | 0.0797 | 4.9083 | 3318 | 0.7596 | 0.7823 | 0.7596 | 0.8716 | | 0.0797 | 4.9112 | 3320 | 0.7000 | 0.7717 | 0.7000 | 0.8367 | | 0.0797 | 4.9142 | 3322 | 0.6735 | 0.7717 | 0.6735 | 0.8207 | | 0.0797 | 4.9172 | 3324 | 0.6993 | 0.7717 | 0.6993 | 0.8362 | | 0.0797 | 4.9201 | 3326 | 0.7337 | 0.7717 | 0.7337 | 0.8565 | | 0.0797 | 4.9231 | 3328 | 0.7695 | 0.8164 | 0.7695 | 0.8772 | | 0.0797 | 4.9260 | 3330 | 0.7626 | 0.8164 | 0.7626 | 0.8733 | | 0.0797 | 4.9290 | 3332 | 0.7835 | 0.8164 | 0.7835 | 0.8852 | | 0.0797 | 4.9320 | 3334 | 0.8056 | 0.8164 | 0.8056 | 0.8975 | | 0.0797 | 4.9349 | 3336 | 0.7890 | 0.8164 | 0.7890 | 0.8882 | | 0.0797 | 4.9379 | 3338 | 0.7618 | 0.8164 | 0.7618 | 0.8728 | | 0.0797 | 4.9408 | 3340 | 0.7694 | 0.8164 | 0.7694 | 0.8772 | | 0.0797 | 4.9438 | 3342 | 0.8020 | 0.8164 | 0.8020 | 0.8956 | | 0.0797 | 4.9467 | 3344 | 0.8327 | 0.8164 | 0.8327 | 0.9125 | | 0.0797 | 4.9497 | 3346 | 0.8514 | 0.7413 | 0.8514 | 0.9227 | | 0.0797 | 4.9527 | 3348 | 0.8785 | 0.7413 | 0.8785 | 0.9373 | | 0.0797 | 4.9556 | 3350 | 0.9118 | 0.7519 | 0.9118 | 0.9549 | | 0.0797 | 4.9586 | 3352 | 0.8972 | 0.7413 | 0.8972 | 0.9472 | | 0.0797 | 4.9615 | 3354 | 0.8706 | 0.7724 | 0.8706 | 0.9330 | | 0.0797 | 4.9645 | 3356 | 0.8644 | 0.7724 | 0.8644 | 0.9298 | | 0.0797 | 4.9675 | 3358 | 0.8291 | 0.7618 | 0.8291 | 0.9105 | | 0.0797 | 4.9704 | 3360 | 0.7928 | 0.8164 | 0.7928 | 0.8904 | | 0.0797 | 4.9734 | 3362 | 
0.7735 | 0.8164 | 0.7735 | 0.8795 | | 0.0797 | 4.9763 | 3364 | 0.7463 | 0.8164 | 0.7463 | 0.8639 | | 0.0797 | 4.9793 | 3366 | 0.7638 | 0.8164 | 0.7638 | 0.8740 | | 0.0797 | 4.9822 | 3368 | 0.8052 | 0.8164 | 0.8052 | 0.8973 | | 0.0797 | 4.9852 | 3370 | 0.8635 | 0.7902 | 0.8635 | 0.9292 | | 0.0797 | 4.9882 | 3372 | 0.9202 | 0.7724 | 0.9202 | 0.9593 | | 0.0797 | 4.9911 | 3374 | 0.9644 | 0.7840 | 0.9644 | 0.9820 | | 0.0797 | 4.9941 | 3376 | 0.9751 | 0.7840 | 0.9751 | 0.9875 | | 0.0797 | 4.9970 | 3378 | 0.9321 | 0.7840 | 0.9321 | 0.9655 | | 0.0797 | 5.0 | 3380 | 0.8508 | 0.7623 | 0.8508 | 0.9224 | | 0.0797 | 5.0030 | 3382 | 0.7783 | 0.7906 | 0.7783 | 0.8822 | | 0.0797 | 5.0059 | 3384 | 0.7016 | 0.7906 | 0.7016 | 0.8376 | | 0.0797 | 5.0089 | 3386 | 0.6338 | 0.7717 | 0.6338 | 0.7961 | | 0.0797 | 5.0118 | 3388 | 0.6175 | 0.7620 | 0.6175 | 0.7858 | | 0.0797 | 5.0148 | 3390 | 0.6416 | 0.7801 | 0.6416 | 0.8010 | | 0.0797 | 5.0178 | 3392 | 0.7056 | 0.8164 | 0.7056 | 0.8400 | | 0.0797 | 5.0207 | 3394 | 0.8227 | 0.8164 | 0.8227 | 0.9070 | | 0.0797 | 5.0237 | 3396 | 0.9171 | 0.7426 | 0.9171 | 0.9577 | | 0.0797 | 5.0266 | 3398 | 0.9738 | 0.6940 | 0.9738 | 0.9868 | | 0.0797 | 5.0296 | 3400 | 0.9680 | 0.7515 | 0.9680 | 0.9839 | | 0.0797 | 5.0325 | 3402 | 0.9004 | 0.7618 | 0.9004 | 0.9489 | | 0.0797 | 5.0355 | 3404 | 0.8291 | 0.7316 | 0.8291 | 0.9106 | | 0.0797 | 5.0385 | 3406 | 0.8184 | 0.7413 | 0.8184 | 0.9046 | | 0.0797 | 5.0414 | 3408 | 0.8140 | 0.7413 | 0.8140 | 0.9022 | | 0.0797 | 5.0444 | 3410 | 0.8128 | 0.7413 | 0.8128 | 0.9015 | | 0.0797 | 5.0473 | 3412 | 0.8421 | 0.7724 | 0.8421 | 0.9177 | | 0.0797 | 5.0503 | 3414 | 0.8818 | 0.7840 | 0.8818 | 0.9390 | | 0.0797 | 5.0533 | 3416 | 0.8949 | 0.7840 | 0.8949 | 0.9460 | | 0.0797 | 5.0562 | 3418 | 0.9503 | 0.7840 | 0.9503 | 0.9748 | | 0.0797 | 5.0592 | 3420 | 0.9647 | 0.7305 | 0.9647 | 0.9822 | | 0.0797 | 5.0621 | 3422 | 0.9247 | 0.7216 | 0.9247 | 0.9616 | | 0.0797 | 5.0651 | 3424 | 0.8563 | 0.7618 | 0.8563 | 0.9254 | | 0.0797 | 5.0680 | 3426 | 0.8365 | 0.7618 | 0.8365 | 0.9146 | | 0.0797 | 5.0710 | 3428 | 0.8017 | 0.7618 | 0.8017 | 0.8954 | | 0.0797 | 5.0740 | 3430 | 0.8188 | 0.7618 | 0.8188 | 0.9049 | | 0.0797 | 5.0769 | 3432 | 0.8456 | 0.7840 | 0.8456 | 0.9195 | | 0.0797 | 5.0799 | 3434 | 0.9030 | 0.7519 | 0.9030 | 0.9503 | | 0.0797 | 5.0828 | 3436 | 0.9507 | 0.7519 | 0.9507 | 0.9750 | | 0.0797 | 5.0858 | 3438 | 0.9936 | 0.7840 | 0.9936 | 0.9968 | | 0.0797 | 5.0888 | 3440 | 0.9979 | 0.7840 | 0.9979 | 0.9990 | | 0.0797 | 5.0917 | 3442 | 0.9735 | 0.7519 | 0.9735 | 0.9867 | | 0.0797 | 5.0947 | 3444 | 0.9881 | 0.7840 | 0.9881 | 0.9940 | | 0.0797 | 5.0976 | 3446 | 1.0506 | 0.7621 | 1.0506 | 1.0250 | | 0.0797 | 5.1006 | 3448 | 1.0685 | 0.7621 | 1.0685 | 1.0337 | | 0.0797 | 5.1036 | 3450 | 1.0462 | 0.7621 | 1.0462 | 1.0229 | | 0.0797 | 5.1065 | 3452 | 1.0392 | 0.7621 | 1.0392 | 1.0194 | | 0.0797 | 5.1095 | 3454 | 0.9713 | 0.7840 | 0.9713 | 0.9855 | | 0.0797 | 5.1124 | 3456 | 0.9199 | 0.7840 | 0.9199 | 0.9591 | | 0.0797 | 5.1154 | 3458 | 0.8870 | 0.7626 | 0.8870 | 0.9418 | | 0.0797 | 5.1183 | 3460 | 0.8634 | 0.7626 | 0.8634 | 0.9292 | | 0.0797 | 5.1213 | 3462 | 0.8286 | 0.7520 | 0.8286 | 0.9103 | | 0.0797 | 5.1243 | 3464 | 0.8677 | 0.7626 | 0.8677 | 0.9315 | | 0.0797 | 5.1272 | 3466 | 0.9625 | 0.7840 | 0.9625 | 0.9811 | | 0.0797 | 5.1302 | 3468 | 1.0001 | 0.7840 | 1.0001 | 1.0001 | | 0.0797 | 5.1331 | 3470 | 0.9647 | 0.7840 | 0.9647 | 0.9822 | | 0.0797 | 5.1361 | 3472 | 0.8770 | 0.7520 | 0.8770 | 0.9365 | | 0.0797 | 5.1391 | 3474 | 0.7879 | 0.7520 | 0.7879 | 
0.8876 | | 0.0797 | 5.1420 | 3476 | 0.7401 | 0.7337 | 0.7401 | 0.8603 | | 0.0797 | 5.1450 | 3478 | 0.7503 | 0.7616 | 0.7503 | 0.8662 | | 0.0797 | 5.1479 | 3480 | 0.7811 | 0.7713 | 0.7811 | 0.8838 | | 0.0797 | 5.1509 | 3482 | 0.8408 | 0.7520 | 0.8408 | 0.9169 | | 0.0797 | 5.1538 | 3484 | 0.9380 | 0.7724 | 0.9380 | 0.9685 | | 0.0797 | 5.1568 | 3486 | 0.9983 | 0.7216 | 0.9983 | 0.9992 | | 0.0797 | 5.1598 | 3488 | 0.9857 | 0.7216 | 0.9857 | 0.9928 | | 0.0797 | 5.1627 | 3490 | 0.9139 | 0.8015 | 0.9139 | 0.9560 | | 0.0797 | 5.1657 | 3492 | 0.8396 | 0.7902 | 0.8396 | 0.9163 | | 0.0797 | 5.1686 | 3494 | 0.7844 | 0.7713 | 0.7844 | 0.8856 | | 0.0797 | 5.1716 | 3496 | 0.7673 | 0.7713 | 0.7673 | 0.8760 | | 0.0797 | 5.1746 | 3498 | 0.7993 | 0.7902 | 0.7993 | 0.8940 | | 0.0729 | 5.1775 | 3500 | 0.8580 | 0.7724 | 0.8580 | 0.9263 | | 0.0729 | 5.1805 | 3502 | 0.8718 | 0.7724 | 0.8718 | 0.9337 | | 0.0729 | 5.1834 | 3504 | 0.8409 | 0.7724 | 0.8409 | 0.9170 | | 0.0729 | 5.1864 | 3506 | 0.8036 | 0.7819 | 0.8036 | 0.8964 | | 0.0729 | 5.1893 | 3508 | 0.8000 | 0.7713 | 0.8000 | 0.8944 | | 0.0729 | 5.1923 | 3510 | 0.8335 | 0.7724 | 0.8335 | 0.9130 | | 0.0729 | 5.1953 | 3512 | 0.8857 | 0.7724 | 0.8857 | 0.9411 | | 0.0729 | 5.1982 | 3514 | 0.8863 | 0.7724 | 0.8863 | 0.9414 | | 0.0729 | 5.2012 | 3516 | 0.9182 | 0.7724 | 0.9182 | 0.9582 | | 0.0729 | 5.2041 | 3518 | 0.9120 | 0.7724 | 0.9120 | 0.9550 | | 0.0729 | 5.2071 | 3520 | 0.8537 | 0.7618 | 0.8537 | 0.9240 | | 0.0729 | 5.2101 | 3522 | 0.8442 | 0.7618 | 0.8442 | 0.9188 | | 0.0729 | 5.2130 | 3524 | 0.8557 | 0.7724 | 0.8557 | 0.9250 | | 0.0729 | 5.2160 | 3526 | 0.8916 | 0.7724 | 0.8916 | 0.9442 | | 0.0729 | 5.2189 | 3528 | 0.9189 | 0.7724 | 0.9189 | 0.9586 | | 0.0729 | 5.2219 | 3530 | 0.9315 | 0.7724 | 0.9315 | 0.9651 | | 0.0729 | 5.2249 | 3532 | 0.8827 | 0.7618 | 0.8827 | 0.9395 | | 0.0729 | 5.2278 | 3534 | 0.8433 | 0.7618 | 0.8433 | 0.9183 | | 0.0729 | 5.2308 | 3536 | 0.8072 | 0.7618 | 0.8072 | 0.8984 | | 0.0729 | 5.2337 | 3538 | 0.7755 | 0.7902 | 0.7755 | 0.8806 | | 0.0729 | 5.2367 | 3540 | 0.7664 | 0.7902 | 0.7664 | 0.8754 | | 0.0729 | 5.2396 | 3542 | 0.7976 | 0.7618 | 0.7976 | 0.8931 | | 0.0729 | 5.2426 | 3544 | 0.8564 | 0.7618 | 0.8564 | 0.9254 | | 0.0729 | 5.2456 | 3546 | 0.9030 | 0.7618 | 0.9030 | 0.9502 | | 0.0729 | 5.2485 | 3548 | 0.9374 | 0.7618 | 0.9374 | 0.9682 | | 0.0729 | 5.2515 | 3550 | 0.9030 | 0.7618 | 0.9030 | 0.9503 | | 0.0729 | 5.2544 | 3552 | 0.8456 | 0.7618 | 0.8456 | 0.9196 | | 0.0729 | 5.2574 | 3554 | 0.8013 | 0.7902 | 0.8013 | 0.8952 | | 0.0729 | 5.2604 | 3556 | 0.8156 | 0.7902 | 0.8156 | 0.9031 | | 0.0729 | 5.2633 | 3558 | 0.8410 | 0.7618 | 0.8410 | 0.9171 | | 0.0729 | 5.2663 | 3560 | 0.8319 | 0.7902 | 0.8319 | 0.9121 | | 0.0729 | 5.2692 | 3562 | 0.8376 | 0.7618 | 0.8376 | 0.9152 | | 0.0729 | 5.2722 | 3564 | 0.8401 | 0.7519 | 0.8401 | 0.9166 | | 0.0729 | 5.2751 | 3566 | 0.8302 | 0.7296 | 0.8302 | 0.9112 | | 0.0729 | 5.2781 | 3568 | 0.8641 | 0.7519 | 0.8641 | 0.9295 | | 0.0729 | 5.2811 | 3570 | 0.9360 | 0.7519 | 0.9360 | 0.9675 | | 0.0729 | 5.2840 | 3572 | 1.0311 | 0.7519 | 1.0311 | 1.0154 | | 0.0729 | 5.2870 | 3574 | 1.1162 | 0.7188 | 1.1162 | 1.0565 | | 0.0729 | 5.2899 | 3576 | 1.1290 | 0.7188 | 1.1290 | 1.0625 | | 0.0729 | 5.2929 | 3578 | 1.0821 | 0.7840 | 1.0821 | 1.0402 | | 0.0729 | 5.2959 | 3580 | 1.0303 | 0.7840 | 1.0303 | 1.0150 | | 0.0729 | 5.2988 | 3582 | 0.9852 | 0.7840 | 0.9852 | 0.9926 | | 0.0729 | 5.3018 | 3584 | 0.9101 | 0.8015 | 0.9101 | 0.9540 | | 0.0729 | 5.3047 | 3586 | 0.8275 | 0.7983 | 0.8275 | 0.9097 | | 0.0729 | 5.3077 
| 3588 | 0.7903 | 0.7983 | 0.7903 | 0.8890 | | 0.0729 | 5.3107 | 3590 | 0.7626 | 0.7983 | 0.7626 | 0.8733 | | 0.0729 | 5.3136 | 3592 | 0.7643 | 0.7983 | 0.7643 | 0.8742 | | 0.0729 | 5.3166 | 3594 | 0.8050 | 0.8164 | 0.8050 | 0.8972 | | 0.0729 | 5.3195 | 3596 | 0.8844 | 0.7902 | 0.8844 | 0.9404 | | 0.0729 | 5.3225 | 3598 | 0.9726 | 0.7840 | 0.9726 | 0.9862 | | 0.0729 | 5.3254 | 3600 | 1.0236 | 0.7305 | 1.0236 | 1.0117 | | 0.0729 | 5.3284 | 3602 | 1.0094 | 0.7840 | 1.0094 | 1.0047 | | 0.0729 | 5.3314 | 3604 | 0.9310 | 0.7840 | 0.9310 | 0.9649 | | 0.0729 | 5.3343 | 3606 | 0.8657 | 0.7840 | 0.8657 | 0.9304 | | 0.0729 | 5.3373 | 3608 | 0.8299 | 0.7840 | 0.8299 | 0.9110 | | 0.0729 | 5.3402 | 3610 | 0.8399 | 0.7840 | 0.8399 | 0.9165 | | 0.0729 | 5.3432 | 3612 | 0.8454 | 0.7840 | 0.8454 | 0.9195 | | 0.0729 | 5.3462 | 3614 | 0.8262 | 0.7519 | 0.8262 | 0.9089 | | 0.0729 | 5.3491 | 3616 | 0.8348 | 0.7840 | 0.8348 | 0.9137 | | 0.0729 | 5.3521 | 3618 | 0.8672 | 0.7840 | 0.8672 | 0.9312 | | 0.0729 | 5.3550 | 3620 | 0.9134 | 0.7840 | 0.9134 | 0.9557 | | 0.0729 | 5.3580 | 3622 | 0.9925 | 0.7840 | 0.9925 | 0.9963 | | 0.0729 | 5.3609 | 3624 | 1.0549 | 0.7188 | 1.0549 | 1.0271 | | 0.0729 | 5.3639 | 3626 | 1.0776 | 0.7188 | 1.0776 | 1.0381 | | 0.0729 | 5.3669 | 3628 | 1.0745 | 0.7188 | 1.0745 | 1.0366 | | 0.0729 | 5.3698 | 3630 | 1.0338 | 0.7188 | 1.0338 | 1.0167 | | 0.0729 | 5.3728 | 3632 | 0.9743 | 0.7724 | 0.9743 | 0.9871 | | 0.0729 | 5.3757 | 3634 | 0.9124 | 0.7413 | 0.9124 | 0.9552 | | 0.0729 | 5.3787 | 3636 | 0.9034 | 0.7413 | 0.9034 | 0.9505 | | 0.0729 | 5.3817 | 3638 | 0.9447 | 0.7413 | 0.9447 | 0.9719 | | 0.0729 | 5.3846 | 3640 | 0.9774 | 0.7840 | 0.9774 | 0.9887 | | 0.0729 | 5.3876 | 3642 | 1.0130 | 0.7724 | 1.0130 | 1.0065 | | 0.0729 | 5.3905 | 3644 | 1.0409 | 0.7724 | 1.0409 | 1.0203 | | 0.0729 | 5.3935 | 3646 | 1.0244 | 0.7216 | 1.0244 | 1.0121 | | 0.0729 | 5.3964 | 3648 | 0.9700 | 0.7216 | 0.9700 | 0.9849 | | 0.0729 | 5.3994 | 3650 | 0.9572 | 0.7724 | 0.9572 | 0.9783 | | 0.0729 | 5.4024 | 3652 | 0.9200 | 0.7724 | 0.9200 | 0.9591 | | 0.0729 | 5.4053 | 3654 | 0.8985 | 0.7724 | 0.8985 | 0.9479 | | 0.0729 | 5.4083 | 3656 | 0.9255 | 0.7724 | 0.9255 | 0.9620 | | 0.0729 | 5.4112 | 3658 | 0.9594 | 0.7724 | 0.9594 | 0.9795 | | 0.0729 | 5.4142 | 3660 | 1.0013 | 0.7724 | 1.0013 | 1.0006 | | 0.0729 | 5.4172 | 3662 | 1.0250 | 0.7724 | 1.0250 | 1.0124 | | 0.0729 | 5.4201 | 3664 | 0.9970 | 0.7724 | 0.9970 | 0.9985 | | 0.0729 | 5.4231 | 3666 | 0.9493 | 0.7724 | 0.9493 | 0.9743 | | 0.0729 | 5.4260 | 3668 | 0.9316 | 0.7724 | 0.9316 | 0.9652 | | 0.0729 | 5.4290 | 3670 | 0.9488 | 0.7724 | 0.9488 | 0.9740 | | 0.0729 | 5.4320 | 3672 | 0.9211 | 0.7413 | 0.9211 | 0.9597 | | 0.0729 | 5.4349 | 3674 | 0.9128 | 0.7413 | 0.9128 | 0.9554 | | 0.0729 | 5.4379 | 3676 | 0.8855 | 0.7413 | 0.8855 | 0.9410 | | 0.0729 | 5.4408 | 3678 | 0.8444 | 0.7413 | 0.8444 | 0.9189 | | 0.0729 | 5.4438 | 3680 | 0.8298 | 0.7413 | 0.8298 | 0.9110 | | 0.0729 | 5.4467 | 3682 | 0.8495 | 0.7724 | 0.8495 | 0.9217 | | 0.0729 | 5.4497 | 3684 | 0.8905 | 0.7724 | 0.8905 | 0.9436 | | 0.0729 | 5.4527 | 3686 | 0.9412 | 0.7724 | 0.9412 | 0.9702 | | 0.0729 | 5.4556 | 3688 | 0.9933 | 0.7724 | 0.9933 | 0.9967 | | 0.0729 | 5.4586 | 3690 | 1.0112 | 0.7101 | 1.0112 | 1.0056 | | 0.0729 | 5.4615 | 3692 | 0.9683 | 0.7724 | 0.9683 | 0.9840 | | 0.0729 | 5.4645 | 3694 | 0.8887 | 0.7413 | 0.8887 | 0.9427 | | 0.0729 | 5.4675 | 3696 | 0.8022 | 0.8019 | 0.8022 | 0.8956 | | 0.0729 | 5.4704 | 3698 | 0.7701 | 0.7717 | 0.7701 | 0.8776 | | 0.0729 | 5.4734 | 3700 | 0.7594 | 0.7717 | 
0.7594 | 0.8714 | | 0.0729 | 5.4763 | 3702 | 0.7811 | 0.7906 | 0.7811 | 0.8838 | | 0.0729 | 5.4793 | 3704 | 0.7993 | 0.7906 | 0.7993 | 0.8941 | | 0.0729 | 5.4822 | 3706 | 0.8533 | 0.8283 | 0.8533 | 0.9237 | | 0.0729 | 5.4852 | 3708 | 0.9178 | 0.8015 | 0.9178 | 0.9580 | | 0.0729 | 5.4882 | 3710 | 0.9853 | 0.7216 | 0.9853 | 0.9926 | | 0.0729 | 5.4911 | 3712 | 1.0007 | 0.7216 | 1.0007 | 1.0004 | | 0.0729 | 5.4941 | 3714 | 0.9916 | 0.7216 | 0.9916 | 0.9958 | | 0.0729 | 5.4970 | 3716 | 0.9264 | 0.7724 | 0.9264 | 0.9625 | | 0.0729 | 5.5 | 3718 | 0.8786 | 0.8015 | 0.8786 | 0.9373 | | 0.0729 | 5.5030 | 3720 | 0.8418 | 0.8019 | 0.8418 | 0.9175 | | 0.0729 | 5.5059 | 3722 | 0.8289 | 0.8019 | 0.8289 | 0.9105 | | 0.0729 | 5.5089 | 3724 | 0.7966 | 0.7823 | 0.7966 | 0.8925 | | 0.0729 | 5.5118 | 3726 | 0.7708 | 0.7823 | 0.7708 | 0.8780 | | 0.0729 | 5.5148 | 3728 | 0.7675 | 0.7823 | 0.7675 | 0.8761 | | 0.0729 | 5.5178 | 3730 | 0.7645 | 0.7823 | 0.7645 | 0.8744 | | 0.0729 | 5.5207 | 3732 | 0.8095 | 0.7525 | 0.8095 | 0.8997 | | 0.0729 | 5.5237 | 3734 | 0.8399 | 0.7413 | 0.8399 | 0.9164 | | 0.0729 | 5.5266 | 3736 | 0.9047 | 0.7724 | 0.9047 | 0.9512 | | 0.0729 | 5.5296 | 3738 | 0.9951 | 0.7840 | 0.9951 | 0.9976 | | 0.0729 | 5.5325 | 3740 | 1.0166 | 0.7840 | 1.0166 | 1.0083 | | 0.0729 | 5.5355 | 3742 | 0.9814 | 0.7840 | 0.9814 | 0.9907 | | 0.0729 | 5.5385 | 3744 | 0.9224 | 0.7840 | 0.9224 | 0.9604 | | 0.0729 | 5.5414 | 3746 | 0.8725 | 0.7413 | 0.8725 | 0.9341 | | 0.0729 | 5.5444 | 3748 | 0.8395 | 0.7413 | 0.8395 | 0.9162 | | 0.0729 | 5.5473 | 3750 | 0.8219 | 0.72 | 0.8219 | 0.9066 | | 0.0729 | 5.5503 | 3752 | 0.8421 | 0.7413 | 0.8421 | 0.9176 | | 0.0729 | 5.5533 | 3754 | 0.8522 | 0.7413 | 0.8522 | 0.9231 | | 0.0729 | 5.5562 | 3756 | 0.8455 | 0.72 | 0.8455 | 0.9195 | | 0.0729 | 5.5592 | 3758 | 0.8631 | 0.7413 | 0.8631 | 0.9290 | | 0.0729 | 5.5621 | 3760 | 0.8563 | 0.7413 | 0.8563 | 0.9254 | | 0.0729 | 5.5651 | 3762 | 0.8206 | 0.72 | 0.8206 | 0.9059 | | 0.0729 | 5.5680 | 3764 | 0.8031 | 0.7823 | 0.8031 | 0.8961 | | 0.0729 | 5.5710 | 3766 | 0.8135 | 0.7525 | 0.8135 | 0.9019 | | 0.0729 | 5.5740 | 3768 | 0.8051 | 0.7525 | 0.8051 | 0.8973 | | 0.0729 | 5.5769 | 3770 | 0.8042 | 0.7525 | 0.8042 | 0.8968 | | 0.0729 | 5.5799 | 3772 | 0.8397 | 0.7413 | 0.8397 | 0.9163 | | 0.0729 | 5.5828 | 3774 | 0.8898 | 0.7413 | 0.8898 | 0.9433 | | 0.0729 | 5.5858 | 3776 | 0.9152 | 0.7413 | 0.9152 | 0.9567 | | 0.0729 | 5.5888 | 3778 | 0.9219 | 0.7519 | 0.9219 | 0.9601 | | 0.0729 | 5.5917 | 3780 | 0.8990 | 0.7413 | 0.8990 | 0.9481 | | 0.0729 | 5.5947 | 3782 | 0.8616 | 0.7413 | 0.8616 | 0.9282 | | 0.0729 | 5.5976 | 3784 | 0.8648 | 0.7413 | 0.8648 | 0.9300 | | 0.0729 | 5.6006 | 3786 | 0.8304 | 0.7413 | 0.8304 | 0.9113 | | 0.0729 | 5.6036 | 3788 | 0.7936 | 0.7413 | 0.7936 | 0.8908 | | 0.0729 | 5.6065 | 3790 | 0.7983 | 0.7413 | 0.7983 | 0.8935 | | 0.0729 | 5.6095 | 3792 | 0.8204 | 0.7519 | 0.8204 | 0.9057 | | 0.0729 | 5.6124 | 3794 | 0.8397 | 0.7519 | 0.8397 | 0.9164 | | 0.0729 | 5.6154 | 3796 | 0.8339 | 0.7413 | 0.8339 | 0.9132 | | 0.0729 | 5.6183 | 3798 | 0.8093 | 0.7729 | 0.8093 | 0.8996 | | 0.0729 | 5.6213 | 3800 | 0.7703 | 0.8019 | 0.7703 | 0.8777 | | 0.0729 | 5.6243 | 3802 | 0.7544 | 0.8283 | 0.7544 | 0.8686 | | 0.0729 | 5.6272 | 3804 | 0.7747 | 0.8283 | 0.7747 | 0.8802 | | 0.0729 | 5.6302 | 3806 | 0.8086 | 0.8283 | 0.8086 | 0.8992 | | 0.0729 | 5.6331 | 3808 | 0.8343 | 0.8283 | 0.8343 | 0.9134 | | 0.0729 | 5.6361 | 3810 | 0.8292 | 0.8283 | 0.8292 | 0.9106 | | 0.0729 | 5.6391 | 3812 | 0.7784 | 0.8283 | 0.7784 | 0.8822 | | 0.0729 | 5.6420 
| 3814 | 0.7655 | 0.8019 | 0.7655 | 0.8749 | | 0.0729 | 5.6450 | 3816 | 0.7767 | 0.8019 | 0.7767 | 0.8813 | | 0.0729 | 5.6479 | 3818 | 0.7758 | 0.8019 | 0.7758 | 0.8808 | | 0.0729 | 5.6509 | 3820 | 0.7745 | 0.8019 | 0.7745 | 0.8800 | | 0.0729 | 5.6538 | 3822 | 0.8042 | 0.8019 | 0.8042 | 0.8968 | | 0.0729 | 5.6568 | 3824 | 0.7862 | 0.8019 | 0.7862 | 0.8867 | | 0.0729 | 5.6598 | 3826 | 0.7381 | 0.8019 | 0.7381 | 0.8591 | | 0.0729 | 5.6627 | 3828 | 0.6855 | 0.7717 | 0.6855 | 0.8280 | | 0.0729 | 5.6657 | 3830 | 0.6682 | 0.7717 | 0.6682 | 0.8174 | | 0.0729 | 5.6686 | 3832 | 0.6976 | 0.7906 | 0.6976 | 0.8352 | | 0.0729 | 5.6716 | 3834 | 0.7755 | 0.8283 | 0.7755 | 0.8806 | | 0.0729 | 5.6746 | 3836 | 0.8700 | 0.8283 | 0.8700 | 0.9328 | | 0.0729 | 5.6775 | 3838 | 0.9186 | 0.7724 | 0.9186 | 0.9584 | | 0.0729 | 5.6805 | 3840 | 0.9151 | 0.7724 | 0.9151 | 0.9566 | | 0.0729 | 5.6834 | 3842 | 0.8875 | 0.8015 | 0.8875 | 0.9421 | | 0.0729 | 5.6864 | 3844 | 0.8607 | 0.8283 | 0.8607 | 0.9277 | | 0.0729 | 5.6893 | 3846 | 0.8227 | 0.7906 | 0.8227 | 0.9070 | | 0.0729 | 5.6923 | 3848 | 0.7912 | 0.7906 | 0.7912 | 0.8895 | | 0.0729 | 5.6953 | 3850 | 0.7916 | 0.7906 | 0.7916 | 0.8897 | | 0.0729 | 5.6982 | 3852 | 0.7992 | 0.7906 | 0.7992 | 0.8940 | | 0.0729 | 5.7012 | 3854 | 0.8389 | 0.8164 | 0.8389 | 0.9159 | | 0.0729 | 5.7041 | 3856 | 0.9005 | 0.8283 | 0.9005 | 0.9489 | | 0.0729 | 5.7071 | 3858 | 0.9551 | 0.7724 | 0.9551 | 0.9773 | | 0.0729 | 5.7101 | 3860 | 0.9572 | 0.7724 | 0.9572 | 0.9784 | | 0.0729 | 5.7130 | 3862 | 0.9207 | 0.7724 | 0.9207 | 0.9595 | | 0.0729 | 5.7160 | 3864 | 0.8760 | 0.8019 | 0.8760 | 0.9359 | | 0.0729 | 5.7189 | 3866 | 0.8334 | 0.8019 | 0.8334 | 0.9129 | | 0.0729 | 5.7219 | 3868 | 0.8203 | 0.8019 | 0.8203 | 0.9057 | | 0.0729 | 5.7249 | 3870 | 0.8116 | 0.8019 | 0.8116 | 0.9009 | | 0.0729 | 5.7278 | 3872 | 0.7744 | 0.8019 | 0.7744 | 0.8800 | | 0.0729 | 5.7308 | 3874 | 0.7858 | 0.8019 | 0.7858 | 0.8865 | | 0.0729 | 5.7337 | 3876 | 0.7848 | 0.8019 | 0.7848 | 0.8859 | | 0.0729 | 5.7367 | 3878 | 0.8025 | 0.8019 | 0.8025 | 0.8958 | | 0.0729 | 5.7396 | 3880 | 0.8159 | 0.8019 | 0.8159 | 0.9033 | | 0.0729 | 5.7426 | 3882 | 0.8059 | 0.8019 | 0.8059 | 0.8977 | | 0.0729 | 5.7456 | 3884 | 0.8190 | 0.8019 | 0.8190 | 0.9050 | | 0.0729 | 5.7485 | 3886 | 0.8319 | 0.8283 | 0.8319 | 0.9121 | | 0.0729 | 5.7515 | 3888 | 0.8475 | 0.8015 | 0.8475 | 0.9206 | | 0.0729 | 5.7544 | 3890 | 0.8264 | 0.8015 | 0.8264 | 0.9091 | | 0.0729 | 5.7574 | 3892 | 0.7787 | 0.7729 | 0.7787 | 0.8824 | | 0.0729 | 5.7604 | 3894 | 0.7594 | 0.7729 | 0.7594 | 0.8714 | | 0.0729 | 5.7633 | 3896 | 0.7801 | 0.7729 | 0.7801 | 0.8832 | | 0.0729 | 5.7663 | 3898 | 0.8330 | 0.8015 | 0.8330 | 0.9127 | | 0.0729 | 5.7692 | 3900 | 0.9293 | 0.7724 | 0.9293 | 0.9640 | | 0.0729 | 5.7722 | 3902 | 0.9803 | 0.7724 | 0.9803 | 0.9901 | | 0.0729 | 5.7751 | 3904 | 0.9688 | 0.7724 | 0.9688 | 0.9843 | | 0.0729 | 5.7781 | 3906 | 0.9122 | 0.8015 | 0.9122 | 0.9551 | | 0.0729 | 5.7811 | 3908 | 0.8935 | 0.8015 | 0.8935 | 0.9452 | | 0.0729 | 5.7840 | 3910 | 0.8688 | 0.8015 | 0.8688 | 0.9321 | | 0.0729 | 5.7870 | 3912 | 0.8413 | 0.7729 | 0.8413 | 0.9172 | | 0.0729 | 5.7899 | 3914 | 0.7963 | 0.7623 | 0.7963 | 0.8923 | | 0.0729 | 5.7929 | 3916 | 0.7876 | 0.7623 | 0.7876 | 0.8875 | | 0.0729 | 5.7959 | 3918 | 0.8222 | 0.7623 | 0.8222 | 0.9068 | | 0.0729 | 5.7988 | 3920 | 0.8519 | 0.7316 | 0.8519 | 0.9230 | | 0.0729 | 5.8018 | 3922 | 0.9165 | 0.7724 | 0.9165 | 0.9573 | | 0.0729 | 5.8047 | 3924 | 0.9573 | 0.7724 | 0.9573 | 0.9784 | | 0.0729 | 5.8077 | 3926 | 0.9844 | 0.7724 | 
0.9844 | 0.9922 | | 0.0729 | 5.8107 | 3928 | 0.9872 | 0.7840 | 0.9872 | 0.9936 | | 0.0729 | 5.8136 | 3930 | 0.9713 | 0.7840 | 0.9713 | 0.9855 | | 0.0729 | 5.8166 | 3932 | 0.9651 | 0.7519 | 0.9651 | 0.9824 | | 0.0729 | 5.8195 | 3934 | 1.0025 | 0.7840 | 1.0025 | 1.0012 | | 0.0729 | 5.8225 | 3936 | 1.0396 | 0.7840 | 1.0396 | 1.0196 | | 0.0729 | 5.8254 | 3938 | 1.0270 | 0.7840 | 1.0270 | 1.0134 | | 0.0729 | 5.8284 | 3940 | 0.9782 | 0.7840 | 0.9782 | 0.9891 | | 0.0729 | 5.8314 | 3942 | 0.9171 | 0.7519 | 0.9171 | 0.9576 | | 0.0729 | 5.8343 | 3944 | 0.8812 | 0.7519 | 0.8812 | 0.9387 | | 0.0729 | 5.8373 | 3946 | 0.8382 | 0.7316 | 0.8382 | 0.9155 | | 0.0729 | 5.8402 | 3948 | 0.8322 | 0.7316 | 0.8322 | 0.9122 | | 0.0729 | 5.8432 | 3950 | 0.8118 | 0.7316 | 0.8118 | 0.9010 | | 0.0729 | 5.8462 | 3952 | 0.8067 | 0.7316 | 0.8067 | 0.8982 | | 0.0729 | 5.8491 | 3954 | 0.8378 | 0.7316 | 0.8378 | 0.9153 | | 0.0729 | 5.8521 | 3956 | 0.9054 | 0.7724 | 0.9054 | 0.9515 | | 0.0729 | 5.8550 | 3958 | 0.9621 | 0.7840 | 0.9621 | 0.9809 | | 0.0729 | 5.8580 | 3960 | 0.9957 | 0.7840 | 0.9957 | 0.9978 | | 0.0729 | 5.8609 | 3962 | 1.0217 | 0.7840 | 1.0217 | 1.0108 | | 0.0729 | 5.8639 | 3964 | 1.0058 | 0.7840 | 1.0058 | 1.0029 | | 0.0729 | 5.8669 | 3966 | 0.9909 | 0.7519 | 0.9909 | 0.9954 | | 0.0729 | 5.8698 | 3968 | 0.9879 | 0.7519 | 0.9879 | 0.9939 | | 0.0729 | 5.8728 | 3970 | 1.0028 | 0.7519 | 1.0028 | 1.0014 | | 0.0729 | 5.8757 | 3972 | 0.9815 | 0.7519 | 0.9815 | 0.9907 | | 0.0729 | 5.8787 | 3974 | 0.9568 | 0.7519 | 0.9568 | 0.9782 | | 0.0729 | 5.8817 | 3976 | 0.9083 | 0.7519 | 0.9083 | 0.9530 | | 0.0729 | 5.8846 | 3978 | 0.8859 | 0.7413 | 0.8859 | 0.9412 | | 0.0729 | 5.8876 | 3980 | 0.8517 | 0.7413 | 0.8517 | 0.9229 | | 0.0729 | 5.8905 | 3982 | 0.8734 | 0.7413 | 0.8734 | 0.9346 | | 0.0729 | 5.8935 | 3984 | 0.9036 | 0.7840 | 0.9036 | 0.9506 | | 0.0729 | 5.8964 | 3986 | 0.9235 | 0.7840 | 0.9235 | 0.9610 | | 0.0729 | 5.8994 | 3988 | 0.9184 | 0.7840 | 0.9184 | 0.9584 | | 0.0729 | 5.9024 | 3990 | 0.8787 | 0.7724 | 0.8787 | 0.9374 | | 0.0729 | 5.9053 | 3992 | 0.8574 | 0.7724 | 0.8574 | 0.9260 | | 0.0729 | 5.9083 | 3994 | 0.8516 | 0.7724 | 0.8516 | 0.9228 | | 0.0729 | 5.9112 | 3996 | 0.8806 | 0.7724 | 0.8806 | 0.9384 | | 0.0729 | 5.9142 | 3998 | 0.9248 | 0.7840 | 0.9248 | 0.9617 | | 0.0618 | 5.9172 | 4000 | 0.9274 | 0.7840 | 0.9274 | 0.9630 | | 0.0618 | 5.9201 | 4002 | 0.8962 | 0.7840 | 0.8962 | 0.9467 | | 0.0618 | 5.9231 | 4004 | 0.8165 | 0.7729 | 0.8165 | 0.9036 | | 0.0618 | 5.9260 | 4006 | 0.7779 | 0.7729 | 0.7779 | 0.8820 | | 0.0618 | 5.9290 | 4008 | 0.7484 | 0.7823 | 0.7484 | 0.8651 | | 0.0618 | 5.9320 | 4010 | 0.7476 | 0.7823 | 0.7476 | 0.8646 | | 0.0618 | 5.9349 | 4012 | 0.7707 | 0.8019 | 0.7707 | 0.8779 | | 0.0618 | 5.9379 | 4014 | 0.8322 | 0.8015 | 0.8322 | 0.9123 | | 0.0618 | 5.9408 | 4016 | 0.8897 | 0.8138 | 0.8897 | 0.9433 | | 0.0618 | 5.9438 | 4018 | 0.9447 | 0.7840 | 0.9447 | 0.9719 | | 0.0618 | 5.9467 | 4020 | 0.9562 | 0.7840 | 0.9562 | 0.9778 | | 0.0618 | 5.9497 | 4022 | 0.9259 | 0.7840 | 0.9259 | 0.9623 | | 0.0618 | 5.9527 | 4024 | 0.8530 | 0.8015 | 0.8530 | 0.9236 | | 0.0618 | 5.9556 | 4026 | 0.7660 | 0.8019 | 0.7660 | 0.8752 | | 0.0618 | 5.9586 | 4028 | 0.6856 | 0.7717 | 0.6856 | 0.8280 | | 0.0618 | 5.9615 | 4030 | 0.6482 | 0.7717 | 0.6482 | 0.8051 | | 0.0618 | 5.9645 | 4032 | 0.6527 | 0.7717 | 0.6527 | 0.8079 | | 0.0618 | 5.9675 | 4034 | 0.6919 | 0.7717 | 0.6919 | 0.8318 | | 0.0618 | 5.9704 | 4036 | 0.7488 | 0.7823 | 0.7488 | 0.8653 | | 0.0618 | 5.9734 | 4038 | 0.8241 | 0.7844 | 0.8241 | 0.9078 | | 0.0618 
| 5.9763 | 4040 | 0.9084 | 0.7840 | 0.9084 | 0.9531 | | 0.0618 | 5.9793 | 4042 | 0.9500 | 0.7840 | 0.9500 | 0.9747 | | 0.0618 | 5.9822 | 4044 | 0.9388 | 0.7840 | 0.9388 | 0.9689 | | 0.0618 | 5.9852 | 4046 | 0.8905 | 0.7840 | 0.8905 | 0.9437 | | 0.0618 | 5.9882 | 4048 | 0.8283 | 0.8283 | 0.8283 | 0.9101 | | 0.0618 | 5.9911 | 4050 | 0.7828 | 0.8164 | 0.7828 | 0.8848 | | 0.0618 | 5.9941 | 4052 | 0.7700 | 0.8164 | 0.7700 | 0.8775 | | 0.0618 | 5.9970 | 4054 | 0.7762 | 0.8164 | 0.7762 | 0.8810 | | 0.0618 | 6.0 | 4056 | 0.7857 | 0.8164 | 0.7857 | 0.8864 | | 0.0618 | 6.0030 | 4058 | 0.7708 | 0.8164 | 0.7708 | 0.8779 | | 0.0618 | 6.0059 | 4060 | 0.7407 | 0.8164 | 0.7407 | 0.8606 | | 0.0618 | 6.0089 | 4062 | 0.7338 | 0.8164 | 0.7338 | 0.8566 | | 0.0618 | 6.0118 | 4064 | 0.7257 | 0.8164 | 0.7257 | 0.8519 | | 0.0618 | 6.0148 | 4066 | 0.6997 | 0.7906 | 0.6997 | 0.8365 | | 0.0618 | 6.0178 | 4068 | 0.6803 | 0.7717 | 0.6803 | 0.8248 | | 0.0618 | 6.0207 | 4070 | 0.6891 | 0.7906 | 0.6891 | 0.8301 | | 0.0618 | 6.0237 | 4072 | 0.7343 | 0.7906 | 0.7343 | 0.8569 | | 0.0618 | 6.0266 | 4074 | 0.7942 | 0.8164 | 0.7942 | 0.8912 | | 0.0618 | 6.0296 | 4076 | 0.8335 | 0.7902 | 0.8335 | 0.9130 | | 0.0618 | 6.0325 | 4078 | 0.8679 | 0.8015 | 0.8679 | 0.9316 | | 0.0618 | 6.0355 | 4080 | 0.8942 | 0.8015 | 0.8942 | 0.9456 | | 0.0618 | 6.0385 | 4082 | 0.8714 | 0.8015 | 0.8714 | 0.9335 | | 0.0618 | 6.0414 | 4084 | 0.8241 | 0.8164 | 0.8241 | 0.9078 | | 0.0618 | 6.0444 | 4086 | 0.8082 | 0.8164 | 0.8082 | 0.8990 | | 0.0618 | 6.0473 | 4088 | 0.7933 | 0.8164 | 0.7933 | 0.8907 | | 0.0618 | 6.0503 | 4090 | 0.8125 | 0.8164 | 0.8125 | 0.9014 | | 0.0618 | 6.0533 | 4092 | 0.8090 | 0.8283 | 0.8090 | 0.8995 | | 0.0618 | 6.0562 | 4094 | 0.8088 | 0.8019 | 0.8088 | 0.8993 | | 0.0618 | 6.0592 | 4096 | 0.8278 | 0.8019 | 0.8278 | 0.9098 | | 0.0618 | 6.0621 | 4098 | 0.8295 | 0.8019 | 0.8295 | 0.9108 | | 0.0618 | 6.0651 | 4100 | 0.8138 | 0.8019 | 0.8138 | 0.9021 | | 0.0618 | 6.0680 | 4102 | 0.7732 | 0.8019 | 0.7732 | 0.8793 | | 0.0618 | 6.0710 | 4104 | 0.7564 | 0.7906 | 0.7564 | 0.8697 | | 0.0618 | 6.0740 | 4106 | 0.7512 | 0.7717 | 0.7512 | 0.8667 | | 0.0618 | 6.0769 | 4108 | 0.7832 | 0.8019 | 0.7832 | 0.8850 | | 0.0618 | 6.0799 | 4110 | 0.8424 | 0.7844 | 0.8424 | 0.9178 | | 0.0618 | 6.0828 | 4112 | 0.9049 | 0.7519 | 0.9049 | 0.9513 | | 0.0618 | 6.0858 | 4114 | 0.9690 | 0.7840 | 0.9690 | 0.9844 | | 0.0618 | 6.0888 | 4116 | 0.9749 | 0.7840 | 0.9749 | 0.9874 | | 0.0618 | 6.0917 | 4118 | 0.9375 | 0.7840 | 0.9375 | 0.9683 | | 0.0618 | 6.0947 | 4120 | 0.8876 | 0.7844 | 0.8876 | 0.9421 | | 0.0618 | 6.0976 | 4122 | 0.8800 | 0.7844 | 0.8800 | 0.9381 | | 0.0618 | 6.1006 | 4124 | 0.8834 | 0.7844 | 0.8834 | 0.9399 | | 0.0618 | 6.1036 | 4126 | 0.8651 | 0.8142 | 0.8651 | 0.9301 | | 0.0618 | 6.1065 | 4128 | 0.8500 | 0.8142 | 0.8500 | 0.9220 | | 0.0618 | 6.1095 | 4130 | 0.8587 | 0.8142 | 0.8587 | 0.9267 | | 0.0618 | 6.1124 | 4132 | 0.8524 | 0.8142 | 0.8523 | 0.9232 | | 0.0618 | 6.1154 | 4134 | 0.8220 | 0.7937 | 0.8220 | 0.9066 | | 0.0618 | 6.1183 | 4136 | 0.8070 | 0.7937 | 0.8070 | 0.8984 | | 0.0618 | 6.1213 | 4138 | 0.8368 | 0.8215 | 0.8368 | 0.9148 | | 0.0618 | 6.1243 | 4140 | 0.9113 | 0.8138 | 0.9113 | 0.9546 | | 0.0618 | 6.1272 | 4142 | 0.9792 | 0.7614 | 0.9792 | 0.9896 | | 0.0618 | 6.1302 | 4144 | 1.0125 | 0.7305 | 1.0125 | 1.0062 | | 0.0618 | 6.1331 | 4146 | 0.9993 | 0.7840 | 0.9993 | 0.9997 | | 0.0618 | 6.1361 | 4148 | 0.9779 | 0.7840 | 0.9779 | 0.9889 | | 0.0618 | 6.1391 | 4150 | 0.9603 | 0.7840 | 0.9603 | 0.9799 | | 0.0618 | 6.1420 | 4152 | 0.9874 | 
0.7840 | 0.9874 | 0.9937 | | 0.0618 | 6.1450 | 4154 | 1.0432 | 0.7840 | 1.0432 | 1.0214 | | 0.0618 | 6.1479 | 4156 | 1.1069 | 0.7840 | 1.1069 | 1.0521 | | 0.0618 | 6.1509 | 4158 | 1.1020 | 0.7840 | 1.1020 | 1.0498 | | 0.0618 | 6.1538 | 4160 | 1.0420 | 0.7840 | 1.0420 | 1.0208 | | 0.0618 | 6.1568 | 4162 | 0.9577 | 0.7519 | 0.9577 | 0.9786 | | 0.0618 | 6.1598 | 4164 | 0.8798 | 0.7844 | 0.8798 | 0.9380 | | 0.0618 | 6.1627 | 4166 | 0.8421 | 0.7844 | 0.8421 | 0.9176 | | 0.0618 | 6.1657 | 4168 | 0.8155 | 0.7631 | 0.8155 | 0.9031 | | 0.0618 | 6.1686 | 4170 | 0.8277 | 0.7844 | 0.8277 | 0.9098 | | 0.0618 | 6.1716 | 4172 | 0.8593 | 0.7844 | 0.8593 | 0.9270 | | 0.0618 | 6.1746 | 4174 | 0.9136 | 0.7844 | 0.9136 | 0.9558 | | 0.0618 | 6.1775 | 4176 | 0.9556 | 0.7519 | 0.9556 | 0.9776 | | 0.0618 | 6.1805 | 4178 | 0.9804 | 0.7840 | 0.9804 | 0.9902 | | 0.0618 | 6.1834 | 4180 | 1.0194 | 0.7840 | 1.0194 | 1.0096 | | 0.0618 | 6.1864 | 4182 | 1.0158 | 0.8138 | 1.0158 | 1.0079 | | 0.0618 | 6.1893 | 4184 | 0.9620 | 0.8138 | 0.9620 | 0.9808 | | 0.0618 | 6.1923 | 4186 | 0.8777 | 0.8283 | 0.8777 | 0.9368 | | 0.0618 | 6.1953 | 4188 | 0.8295 | 0.8283 | 0.8295 | 0.9108 | | 0.0618 | 6.1982 | 4190 | 0.7949 | 0.8164 | 0.7949 | 0.8916 | | 0.0618 | 6.2012 | 4192 | 0.7444 | 0.7717 | 0.7444 | 0.8628 | | 0.0618 | 6.2041 | 4194 | 0.7334 | 0.7717 | 0.7334 | 0.8564 | | 0.0618 | 6.2071 | 4196 | 0.7522 | 0.7717 | 0.7522 | 0.8673 | | 0.0618 | 6.2101 | 4198 | 0.7948 | 0.7906 | 0.7948 | 0.8915 | | 0.0618 | 6.2130 | 4200 | 0.8747 | 0.8283 | 0.8747 | 0.9353 | | 0.0618 | 6.2160 | 4202 | 0.9584 | 0.8138 | 0.9584 | 0.9790 | | 0.0618 | 6.2189 | 4204 | 0.9870 | 0.7614 | 0.9870 | 0.9935 | | 0.0618 | 6.2219 | 4206 | 0.9665 | 0.8138 | 0.9665 | 0.9831 | | 0.0618 | 6.2249 | 4208 | 0.9049 | 0.8412 | 0.9049 | 0.9512 | | 0.0618 | 6.2278 | 4210 | 0.8257 | 0.8164 | 0.8257 | 0.9087 | | 0.0618 | 6.2308 | 4212 | 0.7600 | 0.7906 | 0.7600 | 0.8718 | | 0.0618 | 6.2337 | 4214 | 0.7078 | 0.7717 | 0.7078 | 0.8413 | | 0.0618 | 6.2367 | 4216 | 0.6927 | 0.7717 | 0.6927 | 0.8323 | | 0.0618 | 6.2396 | 4218 | 0.7173 | 0.7717 | 0.7173 | 0.8469 | | 0.0618 | 6.2426 | 4220 | 0.7786 | 0.7717 | 0.7786 | 0.8824 | | 0.0618 | 6.2456 | 4222 | 0.8638 | 0.8412 | 0.8638 | 0.9294 | | 0.0618 | 6.2485 | 4224 | 0.9439 | 0.8138 | 0.9439 | 0.9716 | | 0.0618 | 6.2515 | 4226 | 0.9660 | 0.8138 | 0.9660 | 0.9829 | | 0.0618 | 6.2544 | 4228 | 0.9475 | 0.8138 | 0.9475 | 0.9734 | | 0.0618 | 6.2574 | 4230 | 0.9202 | 0.8138 | 0.9202 | 0.9593 | | 0.0618 | 6.2604 | 4232 | 0.8638 | 0.8015 | 0.8638 | 0.9294 | | 0.0618 | 6.2633 | 4234 | 0.8018 | 0.7906 | 0.8018 | 0.8955 | | 0.0618 | 6.2663 | 4236 | 0.7775 | 0.7906 | 0.7775 | 0.8818 | | 0.0618 | 6.2692 | 4238 | 0.7787 | 0.8164 | 0.7787 | 0.8824 | | 0.0618 | 6.2722 | 4240 | 0.8015 | 0.8164 | 0.8015 | 0.8953 | | 0.0618 | 6.2751 | 4242 | 0.8417 | 0.8164 | 0.8417 | 0.9174 | | 0.0618 | 6.2781 | 4244 | 0.9045 | 0.7902 | 0.9045 | 0.9510 | | 0.0618 | 6.2811 | 4246 | 0.9271 | 0.7902 | 0.9271 | 0.9628 | | 0.0618 | 6.2840 | 4248 | 0.8977 | 0.7902 | 0.8977 | 0.9475 | | 0.0618 | 6.2870 | 4250 | 0.8399 | 0.7906 | 0.8399 | 0.9164 | | 0.0618 | 6.2899 | 4252 | 0.7644 | 0.7717 | 0.7644 | 0.8743 | | 0.0618 | 6.2929 | 4254 | 0.7287 | 0.7717 | 0.7287 | 0.8537 | | 0.0618 | 6.2959 | 4256 | 0.7334 | 0.7717 | 0.7334 | 0.8564 | | 0.0618 | 6.2988 | 4258 | 0.7651 | 0.7717 | 0.7651 | 0.8747 | | 0.0618 | 6.3018 | 4260 | 0.8232 | 0.7525 | 0.8232 | 0.9073 | | 0.0618 | 6.3047 | 4262 | 0.8860 | 0.7844 | 0.8860 | 0.9413 | | 0.0618 | 6.3077 | 4264 | 0.9122 | 0.7844 | 0.9122 | 0.9551 | 
| 0.0618 | 6.3107 | 4266 | 0.9031 | 0.7844 | 0.9031 | 0.9503 | | 0.0618 | 6.3136 | 4268 | 0.8583 | 0.7631 | 0.8583 | 0.9265 | | 0.0618 | 6.3166 | 4270 | 0.8002 | 0.7717 | 0.8002 | 0.8946 | | 0.0618 | 6.3195 | 4272 | 0.7614 | 0.7717 | 0.7614 | 0.8726 | | 0.0618 | 6.3225 | 4274 | 0.7499 | 0.7717 | 0.7499 | 0.8659 | | 0.0618 | 6.3254 | 4276 | 0.7669 | 0.7717 | 0.7669 | 0.8757 | | 0.0618 | 6.3284 | 4278 | 0.8117 | 0.7717 | 0.8117 | 0.9009 | | 0.0618 | 6.3314 | 4280 | 0.8436 | 0.7717 | 0.8436 | 0.9185 | | 0.0618 | 6.3343 | 4282 | 0.8931 | 0.8283 | 0.8931 | 0.9451 | | 0.0618 | 6.3373 | 4284 | 0.9088 | 0.8283 | 0.9088 | 0.9533 | | 0.0618 | 6.3402 | 4286 | 0.8810 | 0.8164 | 0.8810 | 0.9386 | | 0.0618 | 6.3432 | 4288 | 0.8622 | 0.7983 | 0.8622 | 0.9285 | | 0.0618 | 6.3462 | 4290 | 0.8516 | 0.7983 | 0.8516 | 0.9228 | | 0.0618 | 6.3491 | 4292 | 0.8306 | 0.7983 | 0.8306 | 0.9114 | | 0.0618 | 6.3521 | 4294 | 0.7967 | 0.7717 | 0.7967 | 0.8926 | | 0.0618 | 6.3550 | 4296 | 0.7927 | 0.7717 | 0.7927 | 0.8903 | | 0.0618 | 6.3580 | 4298 | 0.8169 | 0.7717 | 0.8169 | 0.9038 | | 0.0618 | 6.3609 | 4300 | 0.8513 | 0.7525 | 0.8513 | 0.9227 | | 0.0618 | 6.3639 | 4302 | 0.8584 | 0.7525 | 0.8584 | 0.9265 | | 0.0618 | 6.3669 | 4304 | 0.8419 | 0.7525 | 0.8419 | 0.9176 | | 0.0618 | 6.3698 | 4306 | 0.8314 | 0.7427 | 0.8314 | 0.9118 | | 0.0618 | 6.3728 | 4308 | 0.8132 | 0.7427 | 0.8132 | 0.9018 | | 0.0618 | 6.3757 | 4310 | 0.8137 | 0.7427 | 0.8137 | 0.9021 | | 0.0618 | 6.3787 | 4312 | 0.8264 | 0.7427 | 0.8264 | 0.9091 | | 0.0618 | 6.3817 | 4314 | 0.8466 | 0.7525 | 0.8466 | 0.9201 | | 0.0618 | 6.3846 | 4316 | 0.8922 | 0.7525 | 0.8922 | 0.9446 | | 0.0618 | 6.3876 | 4318 | 0.9381 | 0.7729 | 0.9381 | 0.9685 | | 0.0618 | 6.3905 | 4320 | 0.9569 | 0.7724 | 0.9569 | 0.9782 | | 0.0618 | 6.3935 | 4322 | 0.9639 | 0.7724 | 0.9639 | 0.9818 | | 0.0618 | 6.3964 | 4324 | 0.9795 | 0.7724 | 0.9795 | 0.9897 | | 0.0618 | 6.3994 | 4326 | 0.9675 | 0.7724 | 0.9675 | 0.9836 | | 0.0618 | 6.4024 | 4328 | 0.9493 | 0.8015 | 0.9493 | 0.9743 | | 0.0618 | 6.4053 | 4330 | 0.9175 | 0.7729 | 0.9175 | 0.9579 | | 0.0618 | 6.4083 | 4332 | 0.8663 | 0.7525 | 0.8663 | 0.9308 | | 0.0618 | 6.4112 | 4334 | 0.8321 | 0.7525 | 0.8321 | 0.9122 | | 0.0618 | 6.4142 | 4336 | 0.8146 | 0.7525 | 0.8146 | 0.9025 | | 0.0618 | 6.4172 | 4338 | 0.8186 | 0.7525 | 0.8186 | 0.9048 | | 0.0618 | 6.4201 | 4340 | 0.8411 | 0.7525 | 0.8411 | 0.9171 | | 0.0618 | 6.4231 | 4342 | 0.8795 | 0.72 | 0.8795 | 0.9378 | | 0.0618 | 6.4260 | 4344 | 0.9002 | 0.72 | 0.9002 | 0.9488 | | 0.0618 | 6.4290 | 4346 | 0.9446 | 0.7724 | 0.9446 | 0.9719 | | 0.0618 | 6.4320 | 4348 | 0.9799 | 0.7840 | 0.9799 | 0.9899 | | 0.0618 | 6.4349 | 4350 | 0.9811 | 0.7724 | 0.9811 | 0.9905 | | 0.0618 | 6.4379 | 4352 | 0.9651 | 0.7724 | 0.9651 | 0.9824 | | 0.0618 | 6.4408 | 4354 | 0.9166 | 0.7724 | 0.9166 | 0.9574 | | 0.0618 | 6.4438 | 4356 | 0.8878 | 0.7520 | 0.8878 | 0.9422 | | 0.0618 | 6.4467 | 4358 | 0.8855 | 0.7520 | 0.8855 | 0.9410 | | 0.0618 | 6.4497 | 4360 | 0.9110 | 0.7520 | 0.9110 | 0.9544 | | 0.0618 | 6.4527 | 4362 | 0.9617 | 0.7724 | 0.9617 | 0.9807 | | 0.0618 | 6.4556 | 4364 | 0.9895 | 0.7724 | 0.9895 | 0.9947 | | 0.0618 | 6.4586 | 4366 | 0.9956 | 0.7724 | 0.9956 | 0.9978 | | 0.0618 | 6.4615 | 4368 | 0.9714 | 0.7724 | 0.9714 | 0.9856 | | 0.0618 | 6.4645 | 4370 | 0.9126 | 0.7520 | 0.9126 | 0.9553 | | 0.0618 | 6.4675 | 4372 | 0.8689 | 0.7819 | 0.8689 | 0.9321 | | 0.0618 | 6.4704 | 4374 | 0.8606 | 0.7525 | 0.8606 | 0.9277 | | 0.0618 | 6.4734 | 4376 | 0.8802 | 0.7525 | 0.8802 | 0.9382 | | 0.0618 | 6.4763 | 4378 | 
0.9276 | 0.7626 | 0.9276 | 0.9631 | | 0.0618 | 6.4793 | 4380 | 0.9632 | 0.7626 | 0.9632 | 0.9814 | | 0.0618 | 6.4822 | 4382 | 0.9936 | 0.7840 | 0.9936 | 0.9968 | | 0.0618 | 6.4852 | 4384 | 0.9775 | 0.7840 | 0.9775 | 0.9887 | | 0.0618 | 6.4882 | 4386 | 0.9372 | 0.8138 | 0.9372 | 0.9681 | | 0.0618 | 6.4911 | 4388 | 0.8688 | 0.7819 | 0.8688 | 0.9321 | | 0.0618 | 6.4941 | 4390 | 0.8390 | 0.7819 | 0.8390 | 0.9160 | | 0.0618 | 6.4970 | 4392 | 0.8079 | 0.7819 | 0.8079 | 0.8989 | | 0.0618 | 6.5 | 4394 | 0.7641 | 0.7713 | 0.7641 | 0.8741 | | 0.0618 | 6.5030 | 4396 | 0.7490 | 0.7713 | 0.7490 | 0.8655 | | 0.0618 | 6.5059 | 4398 | 0.7567 | 0.7713 | 0.7567 | 0.8699 | | 0.0618 | 6.5089 | 4400 | 0.7497 | 0.7713 | 0.7497 | 0.8659 | | 0.0618 | 6.5118 | 4402 | 0.7497 | 0.7713 | 0.7497 | 0.8659 | | 0.0618 | 6.5148 | 4404 | 0.7730 | 0.7713 | 0.7730 | 0.8792 | | 0.0618 | 6.5178 | 4406 | 0.8254 | 0.7902 | 0.8254 | 0.9085 | | 0.0618 | 6.5207 | 4408 | 0.8634 | 0.8015 | 0.8634 | 0.9292 | | 0.0618 | 6.5237 | 4410 | 0.8683 | 0.8015 | 0.8683 | 0.9318 | | 0.0618 | 6.5266 | 4412 | 0.8526 | 0.8015 | 0.8526 | 0.9234 | | 0.0618 | 6.5296 | 4414 | 0.8343 | 0.8015 | 0.8343 | 0.9134 | | 0.0618 | 6.5325 | 4416 | 0.8119 | 0.8015 | 0.8119 | 0.9011 | | 0.0618 | 6.5355 | 4418 | 0.7804 | 0.7525 | 0.7804 | 0.8834 | | 0.0618 | 6.5385 | 4420 | 0.7326 | 0.7427 | 0.7326 | 0.8559 | | 0.0618 | 6.5414 | 4422 | 0.7086 | 0.7427 | 0.7086 | 0.8418 | | 0.0618 | 6.5444 | 4424 | 0.7011 | 0.7427 | 0.7011 | 0.8373 | | 0.0618 | 6.5473 | 4426 | 0.7097 | 0.7427 | 0.7097 | 0.8424 | | 0.0618 | 6.5503 | 4428 | 0.7339 | 0.7427 | 0.7339 | 0.8567 | | 0.0618 | 6.5533 | 4430 | 0.7779 | 0.7427 | 0.7779 | 0.8820 | | 0.0618 | 6.5562 | 4432 | 0.8245 | 0.8015 | 0.8245 | 0.9080 | | 0.0618 | 6.5592 | 4434 | 0.8713 | 0.8138 | 0.8713 | 0.9334 | | 0.0618 | 6.5621 | 4436 | 0.8752 | 0.8015 | 0.8752 | 0.9355 | | 0.0618 | 6.5651 | 4438 | 0.8417 | 0.7902 | 0.8417 | 0.9174 | | 0.0618 | 6.5680 | 4440 | 0.8117 | 0.7902 | 0.8117 | 0.9010 | | 0.0618 | 6.5710 | 4442 | 0.7711 | 0.7713 | 0.7711 | 0.8781 | | 0.0618 | 6.5740 | 4444 | 0.7298 | 0.7427 | 0.7298 | 0.8543 | | 0.0618 | 6.5769 | 4446 | 0.7291 | 0.7427 | 0.7291 | 0.8539 | | 0.0618 | 6.5799 | 4448 | 0.7399 | 0.7427 | 0.7399 | 0.8602 | | 0.0618 | 6.5828 | 4450 | 0.7796 | 0.7713 | 0.7796 | 0.8829 | | 0.0618 | 6.5858 | 4452 | 0.8176 | 0.7902 | 0.8176 | 0.9042 | | 0.0618 | 6.5888 | 4454 | 0.8640 | 0.7902 | 0.8640 | 0.9295 | | 0.0618 | 6.5917 | 4456 | 0.8799 | 0.7902 | 0.8799 | 0.9380 | | 0.0618 | 6.5947 | 4458 | 0.8964 | 0.7902 | 0.8964 | 0.9468 | | 0.0618 | 6.5976 | 4460 | 0.9016 | 0.7902 | 0.9016 | 0.9495 | | 0.0618 | 6.6006 | 4462 | 0.8860 | 0.7902 | 0.8860 | 0.9413 | | 0.0618 | 6.6036 | 4464 | 0.8399 | 0.7713 | 0.8399 | 0.9164 | | 0.0618 | 6.6065 | 4466 | 0.7918 | 0.7713 | 0.7918 | 0.8898 | | 0.0618 | 6.6095 | 4468 | 0.7926 | 0.7713 | 0.7926 | 0.8903 | | 0.0618 | 6.6124 | 4470 | 0.8235 | 0.7713 | 0.8235 | 0.9075 | | 0.0618 | 6.6154 | 4472 | 0.8503 | 0.7713 | 0.8503 | 0.9221 | | 0.0618 | 6.6183 | 4474 | 0.8654 | 0.7713 | 0.8654 | 0.9303 | | 0.0618 | 6.6213 | 4476 | 0.8665 | 0.7626 | 0.8665 | 0.9308 | | 0.0618 | 6.6243 | 4478 | 0.8559 | 0.7626 | 0.8559 | 0.9252 | | 0.0618 | 6.6272 | 4480 | 0.8714 | 0.7626 | 0.8714 | 0.9335 | | 0.0618 | 6.6302 | 4482 | 0.8815 | 0.7626 | 0.8815 | 0.9389 | | 0.0618 | 6.6331 | 4484 | 0.9243 | 0.7840 | 0.9243 | 0.9614 | | 0.0618 | 6.6361 | 4486 | 0.9661 | 0.7840 | 0.9661 | 0.9829 | | 0.0618 | 6.6391 | 4488 | 0.9981 | 0.7840 | 0.9981 | 0.9990 | | 0.0618 | 6.6420 | 4490 | 1.0217 | 0.7840 | 1.0217 | 
1.0108 | | 0.0618 | 6.6450 | 4492 | 1.0027 | 0.7305 | 1.0027 | 1.0014 | | 0.0618 | 6.6479 | 4494 | 0.9528 | 0.7840 | 0.9528 | 0.9761 | | 0.0618 | 6.6509 | 4496 | 0.8722 | 0.7520 | 0.8722 | 0.9339 | | 0.0618 | 6.6538 | 4498 | 0.7963 | 0.7819 | 0.7963 | 0.8923 | | 0.0549 | 6.6568 | 4500 | 0.7404 | 0.7983 | 0.7404 | 0.8604 | | 0.0549 | 6.6598 | 4502 | 0.7243 | 0.7983 | 0.7243 | 0.8510 | | 0.0549 | 6.6627 | 4504 | 0.7438 | 0.7525 | 0.7438 | 0.8624 | | 0.0549 | 6.6657 | 4506 | 0.7860 | 0.7525 | 0.7860 | 0.8866 | | 0.0549 | 6.6686 | 4508 | 0.8387 | 0.7626 | 0.8387 | 0.9158 | | 0.0549 | 6.6716 | 4510 | 0.8908 | 0.7840 | 0.8908 | 0.9438 | | 0.0549 | 6.6746 | 4512 | 0.9419 | 0.7840 | 0.9419 | 0.9705 | | 0.0549 | 6.6775 | 4514 | 0.9409 | 0.7840 | 0.9409 | 0.9700 | | 0.0549 | 6.6805 | 4516 | 0.9023 | 0.7840 | 0.9023 | 0.9499 | | 0.0549 | 6.6834 | 4518 | 0.8555 | 0.7840 | 0.8555 | 0.9250 | | 0.0549 | 6.6864 | 4520 | 0.8063 | 0.7626 | 0.8063 | 0.8979 | | 0.0549 | 6.6893 | 4522 | 0.7897 | 0.7296 | 0.7897 | 0.8887 | | 0.0549 | 6.6923 | 4524 | 0.7775 | 0.7296 | 0.7775 | 0.8818 | | 0.0549 | 6.6953 | 4526 | 0.7620 | 0.7631 | 0.7620 | 0.8730 | | 0.0549 | 6.6982 | 4528 | 0.7485 | 0.7631 | 0.7485 | 0.8651 | | 0.0549 | 6.7012 | 4530 | 0.7307 | 0.7631 | 0.7307 | 0.8548 | | 0.0549 | 6.7041 | 4532 | 0.7056 | 0.7823 | 0.7056 | 0.8400 | | 0.0549 | 6.7071 | 4534 | 0.6903 | 0.7823 | 0.6903 | 0.8308 | | 0.0549 | 6.7101 | 4536 | 0.7052 | 0.8094 | 0.7052 | 0.8397 | | 0.0549 | 6.7130 | 4538 | 0.7270 | 0.8283 | 0.7270 | 0.8527 | | 0.0549 | 6.7160 | 4540 | 0.7480 | 0.8412 | 0.7480 | 0.8649 | | 0.0549 | 6.7189 | 4542 | 0.7647 | 0.8412 | 0.7647 | 0.8745 | | 0.0549 | 6.7219 | 4544 | 0.7613 | 0.8412 | 0.7613 | 0.8725 | | 0.0549 | 6.7249 | 4546 | 0.7743 | 0.8412 | 0.7743 | 0.8800 | | 0.0549 | 6.7278 | 4548 | 0.7732 | 0.8412 | 0.7732 | 0.8793 | | 0.0549 | 6.7308 | 4550 | 0.8008 | 0.8412 | 0.8008 | 0.8949 | | 0.0549 | 6.7337 | 4552 | 0.8441 | 0.8138 | 0.8441 | 0.9188 | | 0.0549 | 6.7367 | 4554 | 0.8645 | 0.8138 | 0.8645 | 0.9298 | | 0.0549 | 6.7396 | 4556 | 0.8600 | 0.8138 | 0.8600 | 0.9274 | | 0.0549 | 6.7426 | 4558 | 0.8315 | 0.8138 | 0.8315 | 0.9119 | | 0.0549 | 6.7456 | 4560 | 0.8161 | 0.8412 | 0.8161 | 0.9034 | | 0.0549 | 6.7485 | 4562 | 0.8246 | 0.8138 | 0.8246 | 0.9081 | | 0.0549 | 6.7515 | 4564 | 0.8455 | 0.7840 | 0.8455 | 0.9195 | | 0.0549 | 6.7544 | 4566 | 0.8351 | 0.8138 | 0.8351 | 0.9139 | | 0.0549 | 6.7574 | 4568 | 0.8266 | 0.8412 | 0.8266 | 0.9092 | | 0.0549 | 6.7604 | 4570 | 0.8379 | 0.8412 | 0.8379 | 0.9154 | | 0.0549 | 6.7633 | 4572 | 0.8597 | 0.8412 | 0.8597 | 0.9272 | | 0.0549 | 6.7663 | 4574 | 0.8581 | 0.8412 | 0.8581 | 0.9263 | | 0.0549 | 6.7692 | 4576 | 0.8819 | 0.8138 | 0.8819 | 0.9391 | | 0.0549 | 6.7722 | 4578 | 0.9142 | 0.7840 | 0.9142 | 0.9561 | | 0.0549 | 6.7751 | 4580 | 0.9281 | 0.7840 | 0.9281 | 0.9634 | | 0.0549 | 6.7781 | 4582 | 0.9276 | 0.7840 | 0.9276 | 0.9631 | | 0.0549 | 6.7811 | 4584 | 0.9251 | 0.7840 | 0.9251 | 0.9618 | | 0.0549 | 6.7840 | 4586 | 0.9118 | 0.7840 | 0.9118 | 0.9549 | | 0.0549 | 6.7870 | 4588 | 0.8836 | 0.7840 | 0.8836 | 0.9400 | | 0.0549 | 6.7899 | 4590 | 0.8736 | 0.7840 | 0.8736 | 0.9346 | | 0.0549 | 6.7929 | 4592 | 0.8614 | 0.7840 | 0.8614 | 0.9281 | | 0.0549 | 6.7959 | 4594 | 0.8519 | 0.7840 | 0.8519 | 0.9230 | | 0.0549 | 6.7988 | 4596 | 0.8339 | 0.7840 | 0.8339 | 0.9132 | | 0.0549 | 6.8018 | 4598 | 0.8378 | 0.7840 | 0.8378 | 0.9153 | | 0.0549 | 6.8047 | 4600 | 0.8696 | 0.7840 | 0.8696 | 0.9325 | | 0.0549 | 6.8077 | 4602 | 0.9205 | 0.7840 | 0.9205 | 0.9594 | | 0.0549 | 6.8107 
| 4604 | 0.9711 | 0.7840 | 0.9711 | 0.9855 | | 0.0549 | 6.8136 | 4606 | 1.0190 | 0.7621 | 1.0190 | 1.0095 | | 0.0549 | 6.8166 | 4608 | 1.0164 | 0.7621 | 1.0164 | 1.0082 | | 0.0549 | 6.8195 | 4610 | 1.0042 | 0.7621 | 1.0042 | 1.0021 | | 0.0549 | 6.8225 | 4612 | 0.9927 | 0.7840 | 0.9927 | 0.9964 | | 0.0549 | 6.8254 | 4614 | 0.9525 | 0.7840 | 0.9525 | 0.9759 | | 0.0549 | 6.8284 | 4616 | 0.9276 | 0.7840 | 0.9276 | 0.9631 | | 0.0549 | 6.8314 | 4618 | 0.8828 | 0.7840 | 0.8828 | 0.9396 | | 0.0549 | 6.8343 | 4620 | 0.8656 | 0.7840 | 0.8656 | 0.9304 | | 0.0549 | 6.8373 | 4622 | 0.8662 | 0.7840 | 0.8662 | 0.9307 | | 0.0549 | 6.8402 | 4624 | 0.8916 | 0.7840 | 0.8916 | 0.9442 | | 0.0549 | 6.8432 | 4626 | 0.9022 | 0.7840 | 0.9022 | 0.9498 | | 0.0549 | 6.8462 | 4628 | 0.8867 | 0.7840 | 0.8867 | 0.9416 | | 0.0549 | 6.8491 | 4630 | 0.8416 | 0.7724 | 0.8416 | 0.9174 | | 0.0549 | 6.8521 | 4632 | 0.7946 | 0.8283 | 0.7946 | 0.8914 | | 0.0549 | 6.8550 | 4634 | 0.7730 | 0.8094 | 0.7730 | 0.8792 | | 0.0549 | 6.8580 | 4636 | 0.7702 | 0.8094 | 0.7702 | 0.8776 | | 0.0549 | 6.8609 | 4638 | 0.7928 | 0.7724 | 0.7928 | 0.8904 | | 0.0549 | 6.8639 | 4640 | 0.8415 | 0.7724 | 0.8415 | 0.9173 | | 0.0549 | 6.8669 | 4642 | 0.9122 | 0.7840 | 0.9122 | 0.9551 | | 0.0549 | 6.8698 | 4644 | 0.9748 | 0.7840 | 0.9748 | 0.9873 | | 0.0549 | 6.8728 | 4646 | 1.0289 | 0.7621 | 1.0289 | 1.0144 | | 0.0549 | 6.8757 | 4648 | 1.0459 | 0.7621 | 1.0459 | 1.0227 | | 0.0549 | 6.8787 | 4650 | 1.0220 | 0.7621 | 1.0220 | 1.0109 | | 0.0549 | 6.8817 | 4652 | 0.9670 | 0.7840 | 0.9670 | 0.9834 | | 0.0549 | 6.8846 | 4654 | 0.9203 | 0.7840 | 0.9203 | 0.9593 | | 0.0549 | 6.8876 | 4656 | 0.8649 | 0.7724 | 0.8649 | 0.9300 | | 0.0549 | 6.8905 | 4658 | 0.7921 | 0.7724 | 0.7921 | 0.8900 | | 0.0549 | 6.8935 | 4660 | 0.7594 | 0.7525 | 0.7594 | 0.8714 | | 0.0549 | 6.8964 | 4662 | 0.7520 | 0.7525 | 0.7520 | 0.8672 | | 0.0549 | 6.8994 | 4664 | 0.7381 | 0.7525 | 0.7381 | 0.8592 | | 0.0549 | 6.9024 | 4666 | 0.7423 | 0.7525 | 0.7423 | 0.8615 | | 0.0549 | 6.9053 | 4668 | 0.7716 | 0.72 | 0.7716 | 0.8784 | | 0.0549 | 6.9083 | 4670 | 0.8169 | 0.7296 | 0.8169 | 0.9038 | | 0.0549 | 6.9112 | 4672 | 0.8611 | 0.7296 | 0.8611 | 0.9280 | | 0.0549 | 6.9142 | 4674 | 0.9014 | 0.7840 | 0.9014 | 0.9494 | | 0.0549 | 6.9172 | 4676 | 0.9400 | 0.7840 | 0.9400 | 0.9695 | | 0.0549 | 6.9201 | 4678 | 0.9517 | 0.7840 | 0.9517 | 0.9755 | | 0.0549 | 6.9231 | 4680 | 0.9722 | 0.7840 | 0.9722 | 0.9860 | | 0.0549 | 6.9260 | 4682 | 0.9580 | 0.7840 | 0.9580 | 0.9788 | | 0.0549 | 6.9290 | 4684 | 0.9146 | 0.7840 | 0.9146 | 0.9563 | | 0.0549 | 6.9320 | 4686 | 0.8571 | 0.7626 | 0.8571 | 0.9258 | | 0.0549 | 6.9349 | 4688 | 0.8050 | 0.7819 | 0.8050 | 0.8972 | | 0.0549 | 6.9379 | 4690 | 0.7811 | 0.7819 | 0.7811 | 0.8838 | | 0.0549 | 6.9408 | 4692 | 0.7714 | 0.7819 | 0.7714 | 0.8783 | | 0.0549 | 6.9438 | 4694 | 0.7736 | 0.7819 | 0.7736 | 0.8795 | | 0.0549 | 6.9467 | 4696 | 0.7760 | 0.7819 | 0.7760 | 0.8809 | | 0.0549 | 6.9497 | 4698 | 0.7668 | 0.7819 | 0.7668 | 0.8757 | | 0.0549 | 6.9527 | 4700 | 0.7751 | 0.8015 | 0.7751 | 0.8804 | | 0.0549 | 6.9556 | 4702 | 0.7947 | 0.8015 | 0.7947 | 0.8915 | | 0.0549 | 6.9586 | 4704 | 0.8089 | 0.8015 | 0.8089 | 0.8994 | | 0.0549 | 6.9615 | 4706 | 0.8234 | 0.8015 | 0.8234 | 0.9074 | | 0.0549 | 6.9645 | 4708 | 0.8125 | 0.8015 | 0.8125 | 0.9014 | | 0.0549 | 6.9675 | 4710 | 0.8033 | 0.8015 | 0.8033 | 0.8963 | | 0.0549 | 6.9704 | 4712 | 0.8225 | 0.8138 | 0.8225 | 0.9069 | | 0.0549 | 6.9734 | 4714 | 0.8542 | 0.8138 | 0.8542 | 0.9242 | | 0.0549 | 6.9763 | 4716 | 0.8661 | 0.7840 | 
0.8661 | 0.9306 | | 0.0549 | 6.9793 | 4718 | 0.8504 | 0.7840 | 0.8504 | 0.9222 | | 0.0549 | 6.9822 | 4720 | 0.8383 | 0.8138 | 0.8383 | 0.9156 | | 0.0549 | 6.9852 | 4722 | 0.8020 | 0.7729 | 0.8020 | 0.8955 | | 0.0549 | 6.9882 | 4724 | 0.7727 | 0.7525 | 0.7727 | 0.8790 | | 0.0549 | 6.9911 | 4726 | 0.7585 | 0.7525 | 0.7585 | 0.8709 | | 0.0549 | 6.9941 | 4728 | 0.7473 | 0.7819 | 0.7473 | 0.8645 | | 0.0549 | 6.9970 | 4730 | 0.7486 | 0.7713 | 0.7486 | 0.8652 | | 0.0549 | 7.0 | 4732 | 0.7637 | 0.7819 | 0.7637 | 0.8739 | | 0.0549 | 7.0030 | 4734 | 0.7936 | 0.8015 | 0.7936 | 0.8909 | | 0.0549 | 7.0059 | 4736 | 0.8245 | 0.8015 | 0.8245 | 0.9080 | | 0.0549 | 7.0089 | 4738 | 0.8737 | 0.7724 | 0.8737 | 0.9347 | | 0.0549 | 7.0118 | 4740 | 0.8947 | 0.7724 | 0.8947 | 0.9459 | | 0.0549 | 7.0148 | 4742 | 0.8758 | 0.7724 | 0.8758 | 0.9358 | | 0.0549 | 7.0178 | 4744 | 0.8438 | 0.7724 | 0.8438 | 0.9186 | | 0.0549 | 7.0207 | 4746 | 0.8171 | 0.8015 | 0.8171 | 0.9040 | | 0.0549 | 7.0237 | 4748 | 0.8074 | 0.8015 | 0.8074 | 0.8985 | | 0.0549 | 7.0266 | 4750 | 0.8108 | 0.7520 | 0.8108 | 0.9004 | | 0.0549 | 7.0296 | 4752 | 0.7981 | 0.7819 | 0.7981 | 0.8934 | | 0.0549 | 7.0325 | 4754 | 0.8114 | 0.7724 | 0.8114 | 0.9008 | | 0.0549 | 7.0355 | 4756 | 0.8371 | 0.7724 | 0.8371 | 0.9149 | | 0.0549 | 7.0385 | 4758 | 0.8665 | 0.7724 | 0.8665 | 0.9309 | | 0.0549 | 7.0414 | 4760 | 0.8820 | 0.7724 | 0.8820 | 0.9392 | | 0.0549 | 7.0444 | 4762 | 0.8763 | 0.7724 | 0.8763 | 0.9361 | | 0.0549 | 7.0473 | 4764 | 0.8711 | 0.7724 | 0.8711 | 0.9333 | | 0.0549 | 7.0503 | 4766 | 0.8389 | 0.8015 | 0.8389 | 0.9159 | | 0.0549 | 7.0533 | 4768 | 0.8040 | 0.8283 | 0.8040 | 0.8967 | | 0.0549 | 7.0562 | 4770 | 0.7820 | 0.8283 | 0.7820 | 0.8843 | | 0.0549 | 7.0592 | 4772 | 0.7627 | 0.8283 | 0.7627 | 0.8733 | | 0.0549 | 7.0621 | 4774 | 0.7294 | 0.7823 | 0.7294 | 0.8540 | | 0.0549 | 7.0651 | 4776 | 0.7017 | 0.7823 | 0.7017 | 0.8377 | | 0.0549 | 7.0680 | 4778 | 0.6896 | 0.7823 | 0.6896 | 0.8304 | | 0.0549 | 7.0710 | 4780 | 0.6974 | 0.7823 | 0.6974 | 0.8351 | | 0.0549 | 7.0740 | 4782 | 0.7210 | 0.7823 | 0.7210 | 0.8491 | | 0.0549 | 7.0769 | 4784 | 0.7562 | 0.8094 | 0.7562 | 0.8696 | | 0.0549 | 7.0799 | 4786 | 0.8090 | 0.8094 | 0.8090 | 0.8994 | | 0.0549 | 7.0828 | 4788 | 0.8711 | 0.7724 | 0.8711 | 0.9333 | | 0.0549 | 7.0858 | 4790 | 0.8960 | 0.7840 | 0.8960 | 0.9466 | | 0.0549 | 7.0888 | 4792 | 0.8849 | 0.7840 | 0.8849 | 0.9407 | | 0.0549 | 7.0917 | 4794 | 0.8823 | 0.7840 | 0.8823 | 0.9393 | | 0.0549 | 7.0947 | 4796 | 0.8602 | 0.7626 | 0.8602 | 0.9275 | | 0.0549 | 7.0976 | 4798 | 0.8329 | 0.7296 | 0.8329 | 0.9126 | | 0.0549 | 7.1006 | 4800 | 0.8058 | 0.7296 | 0.8058 | 0.8977 | | 0.0549 | 7.1036 | 4802 | 0.7875 | 0.72 | 0.7875 | 0.8874 | | 0.0549 | 7.1065 | 4804 | 0.7718 | 0.72 | 0.7718 | 0.8785 | | 0.0549 | 7.1095 | 4806 | 0.7489 | 0.72 | 0.7489 | 0.8654 | | 0.0549 | 7.1124 | 4808 | 0.7513 | 0.7112 | 0.7513 | 0.8668 | | 0.0549 | 7.1154 | 4810 | 0.7549 | 0.7112 | 0.7549 | 0.8688 | | 0.0549 | 7.1183 | 4812 | 0.7747 | 0.7422 | 0.7747 | 0.8801 | | 0.0549 | 7.1213 | 4814 | 0.8141 | 0.7520 | 0.8141 | 0.9023 | | 0.0549 | 7.1243 | 4816 | 0.8626 | 0.7724 | 0.8626 | 0.9288 | | 0.0549 | 7.1272 | 4818 | 0.8817 | 0.7724 | 0.8817 | 0.9390 | | 0.0549 | 7.1302 | 4820 | 0.8743 | 0.7626 | 0.8743 | 0.9350 | | 0.0549 | 7.1331 | 4822 | 0.8820 | 0.7626 | 0.8820 | 0.9391 | | 0.0549 | 7.1361 | 4824 | 0.8843 | 0.7626 | 0.8843 | 0.9404 | | 0.0549 | 7.1391 | 4826 | 0.8642 | 0.7626 | 0.8642 | 0.9296 | | 0.0549 | 7.1420 | 4828 | 0.8493 | 0.7296 | 0.8493 | 0.9216 | | 0.0549 | 7.1450 
| 4830 | 0.8178 | 0.7296 | 0.8178 | 0.9043 |

*(Per-checkpoint validation metrics were logged every 2 training steps from step 4830 through step 6760; the final logged checkpoints are shown below.)*

| 0.0386 | 9.9852 | 6750 | 0.8523 | 0.7626 | 0.8523 | 0.9232 |
| 0.0386 | 9.9882 | 6752 | 0.8525 | 0.7626 | 0.8525 | 0.9233 |
| 0.0386 | 9.9911 | 6754 | 0.8525 | 0.7626 | 0.8525 | 0.9233 |
| 0.0386 | 9.9941 | 6756 | 0.8526 | 0.7626 | 0.8526 | 0.9234 |
| 0.0386 | 9.9970 | 6758 | 0.8527 | 0.7626 | 0.8527 | 0.9234 |
| 0.0386 | 10.0 | 6760 | 0.8527 | 0.7626 | 0.8527 | 0.9234 |


### Framework versions

- Transformers 4.44.2
- Pytorch 2.4.0+cu118
- Datasets 2.21.0
- Tokenizers 0.19.1
RichardErkhov/AdaptLLM_-_law-LLM-13B-gguf
RichardErkhov
2024-11-14T07:14:31Z
9
0
null
[ "gguf", "arxiv:2309.09530", "arxiv:2406.14491", "endpoints_compatible", "region:us" ]
null
2024-11-13T23:03:42Z
Quantization made by Richard Erkhov.

[Github](https://github.com/RichardErkhov)

[Discord](https://discord.gg/pvy7H8DZMG)

[Request more models](https://github.com/RichardErkhov/quant_request)


law-LLM-13B - GGUF
- Model creator: https://huggingface.co/AdaptLLM/
- Original model: https://huggingface.co/AdaptLLM/law-LLM-13B/


| Name | Quant method | Size |
| ---- | ---- | ---- |
| [law-LLM-13B.Q2_K.gguf](https://huggingface.co/RichardErkhov/AdaptLLM_-_law-LLM-13B-gguf/blob/main/law-LLM-13B.Q2_K.gguf) | Q2_K | 4.52GB |
| [law-LLM-13B.Q3_K_S.gguf](https://huggingface.co/RichardErkhov/AdaptLLM_-_law-LLM-13B-gguf/blob/main/law-LLM-13B.Q3_K_S.gguf) | Q3_K_S | 5.27GB |
| [law-LLM-13B.Q3_K.gguf](https://huggingface.co/RichardErkhov/AdaptLLM_-_law-LLM-13B-gguf/blob/main/law-LLM-13B.Q3_K.gguf) | Q3_K | 5.9GB |
| [law-LLM-13B.Q3_K_M.gguf](https://huggingface.co/RichardErkhov/AdaptLLM_-_law-LLM-13B-gguf/blob/main/law-LLM-13B.Q3_K_M.gguf) | Q3_K_M | 5.9GB |
| [law-LLM-13B.Q3_K_L.gguf](https://huggingface.co/RichardErkhov/AdaptLLM_-_law-LLM-13B-gguf/blob/main/law-LLM-13B.Q3_K_L.gguf) | Q3_K_L | 6.45GB |
| [law-LLM-13B.IQ4_XS.gguf](https://huggingface.co/RichardErkhov/AdaptLLM_-_law-LLM-13B-gguf/blob/main/law-LLM-13B.IQ4_XS.gguf) | IQ4_XS | 6.54GB |
| [law-LLM-13B.Q4_0.gguf](https://huggingface.co/RichardErkhov/AdaptLLM_-_law-LLM-13B-gguf/blob/main/law-LLM-13B.Q4_0.gguf) | Q4_0 | 6.86GB |
| [law-LLM-13B.IQ4_NL.gguf](https://huggingface.co/RichardErkhov/AdaptLLM_-_law-LLM-13B-gguf/blob/main/law-LLM-13B.IQ4_NL.gguf) | IQ4_NL | 6.9GB |
| [law-LLM-13B.Q4_K_S.gguf](https://huggingface.co/RichardErkhov/AdaptLLM_-_law-LLM-13B-gguf/blob/main/law-LLM-13B.Q4_K_S.gguf) | Q4_K_S | 6.91GB |
| [law-LLM-13B.Q4_K.gguf](https://huggingface.co/RichardErkhov/AdaptLLM_-_law-LLM-13B-gguf/blob/main/law-LLM-13B.Q4_K.gguf) | Q4_K | 7.33GB |
| [law-LLM-13B.Q4_K_M.gguf](https://huggingface.co/RichardErkhov/AdaptLLM_-_law-LLM-13B-gguf/blob/main/law-LLM-13B.Q4_K_M.gguf) | Q4_K_M | 7.33GB |
| [law-LLM-13B.Q4_1.gguf](https://huggingface.co/RichardErkhov/AdaptLLM_-_law-LLM-13B-gguf/blob/main/law-LLM-13B.Q4_1.gguf) | Q4_1 | 7.61GB |
| [law-LLM-13B.Q5_0.gguf](https://huggingface.co/RichardErkhov/AdaptLLM_-_law-LLM-13B-gguf/blob/main/law-LLM-13B.Q5_0.gguf) | Q5_0 | 8.36GB |
| [law-LLM-13B.Q5_K_S.gguf](https://huggingface.co/RichardErkhov/AdaptLLM_-_law-LLM-13B-gguf/blob/main/law-LLM-13B.Q5_K_S.gguf) | Q5_K_S | 8.36GB |
| [law-LLM-13B.Q5_K.gguf](https://huggingface.co/RichardErkhov/AdaptLLM_-_law-LLM-13B-gguf/blob/main/law-LLM-13B.Q5_K.gguf) | Q5_K | 8.6GB |
| [law-LLM-13B.Q5_K_M.gguf](https://huggingface.co/RichardErkhov/AdaptLLM_-_law-LLM-13B-gguf/blob/main/law-LLM-13B.Q5_K_M.gguf) | Q5_K_M | 8.6GB |
| [law-LLM-13B.Q5_1.gguf](https://huggingface.co/RichardErkhov/AdaptLLM_-_law-LLM-13B-gguf/blob/main/law-LLM-13B.Q5_1.gguf) | Q5_1 | 9.1GB |
| [law-LLM-13B.Q6_K.gguf](https://huggingface.co/RichardErkhov/AdaptLLM_-_law-LLM-13B-gguf/blob/main/law-LLM-13B.Q6_K.gguf) | Q6_K | 9.95GB |
| [law-LLM-13B.Q8_0.gguf](https://huggingface.co/RichardErkhov/AdaptLLM_-_law-LLM-13B-gguf/blob/main/law-LLM-13B.Q8_0.gguf) | Q8_0 | 12.88GB |
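To try one of these files locally, a single quant can be fetched with the `huggingface_hub` package and then loaded in any GGUF-capable runtime such as llama.cpp; this is a minimal sketch, and the Q4_K_M pick below is only an example:

```python
from huggingface_hub import hf_hub_download

# Download a single quant file from this repo (Q4_K_M chosen as an example).
gguf_path = hf_hub_download(
    repo_id="RichardErkhov/AdaptLLM_-_law-LLM-13B-gguf",
    filename="law-LLM-13B.Q4_K_M.gguf",
)
print(gguf_path)  # pass this path to a GGUF runtime, e.g. llama.cpp
```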
via Reading Comprehension](https://huggingface.co/papers/2309.09530). We explore **continued pre-training on domain-specific corpora** for large language models. While this approach enriches LLMs with domain knowledge, it significantly hurts their prompting ability for question answering. Inspired by human learning via reading comprehension, we propose a simple method to **transform large-scale pre-training corpora into reading comprehension texts**, consistently improving prompting performance across tasks in biomedicine, finance, and law domains. **Our 7B model competes with much larger domain-specific models like BloombergGPT-50B**. ### [2024/6/21] 🤗 We release the 2nd version of AdaptLLM at [Instruction-Pretrain](https://huggingface.co/instruction-pretrain), effective for both pre-training from scratch and continual pre-training 🤗 **************************** **Updates** **************************** * 2024/8/29: Updated [guidelines](https://huggingface.co/datasets/AdaptLLM/finance-tasks) on evaluating any 🤗Huggingface models on the domain-specific tasks * 2024/6/22: Released the [benchmarking code](https://github.com/microsoft/LMOps/tree/main/adaptllm) * 2024/6/21: Released the 2nd version of AdaptLLM at [Instruction-Pretrain](https://huggingface.co/instruction-pretrain) * 2024/4/2: Released the [raw data splits (train and test)](https://huggingface.co/datasets/AdaptLLM/ConvFinQA) of all the evaluation datasets * 2024/1/16: Our [research paper](https://huggingface.co/papers/2309.09530) has been accepted by ICLR 2024 * 2023/12/19: Released our [13B base models](https://huggingface.co/AdaptLLM/law-LLM-13B) developed from LLaMA-1-13B * 2023/12/8: Released our [chat models](https://huggingface.co/AdaptLLM/law-chat) developed from LLaMA-2-Chat-7B * 2023/9/18: Released our [paper](https://huggingface.co/papers/2309.09530), [code](https://github.com/microsoft/LMOps), [data](https://huggingface.co/datasets/AdaptLLM/law-tasks), and [base models](https://huggingface.co/AdaptLLM/law-LLM) developed from LLaMA-1-7B ## 1. Domain-Specific Models ### LLaMA-1-7B In our paper, we develop three domain-specific models from LLaMA-1-7B, which are also available in Huggingface: [Biomedicine-LLM](https://huggingface.co/AdaptLLM/medicine-LLM), [Finance-LLM](https://huggingface.co/AdaptLLM/finance-LLM) and [Law-LLM](https://huggingface.co/AdaptLLM/law-LLM), the performances of our AdaptLLM compared to other domain-specific LLMs are: <p align='center'> <img src="https://cdn-uploads.huggingface.co/production/uploads/650801ced5578ef7e20b33d4/6efPwitFgy-pLTzvccdcP.png" width="700"> </p> ### LLaMA-1-13B Moreover, we scale up our base model to LLaMA-1-13B to see if **our method is similarly effective for larger-scale models**, and the results are consistently positive too: [Biomedicine-LLM-13B](https://huggingface.co/AdaptLLM/medicine-LLM-13B), [Finance-LLM-13B](https://huggingface.co/AdaptLLM/finance-LLM-13B) and [Law-LLM-13B](https://huggingface.co/AdaptLLM/law-LLM-13B). ### LLaMA-2-Chat Our method is also effective for aligned models! LLaMA-2-Chat requires a [specific data format](https://huggingface.co/blog/llama2#how-to-prompt-llama-2), and our **reading comprehension can perfectly fit the data format** by transforming the reading comprehension into a multi-turn conversation. 
We have also open-sourced chat models in different domains: [Biomedicine-Chat](https://huggingface.co/AdaptLLM/medicine-chat), [Finance-Chat](https://huggingface.co/AdaptLLM/finance-chat) and [Law-Chat](https://huggingface.co/AdaptLLM/law-chat).

For example, to prompt the law base model:
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained("AdaptLLM/law-LLM-13B")
tokenizer = AutoTokenizer.from_pretrained("AdaptLLM/law-LLM-13B", use_fast=False)

# Put your input here:
user_input = '''Question: Which of the following is false about ex post facto laws?
Options:
- They make criminal an act that was innocent when committed.
- They prescribe greater punishment for an act than was prescribed when it was done.
- They increase the evidence required to convict a person than when the act was done.
- They alter criminal offenses or punishment in a substantially prejudicial manner for the purpose of punishing a person for some past activity.

Please provide your choice first and then provide explanations if possible.'''

# Simply use your input as the prompt for base models
prompt = user_input

inputs = tokenizer(prompt, return_tensors="pt", add_special_tokens=False).input_ids.to(model.device)
outputs = model.generate(input_ids=inputs, max_length=2048)[0]

answer_start = int(inputs.shape[-1])
pred = tokenizer.decode(outputs[answer_start:], skip_special_tokens=True)

print(pred)
```

### LLaMA-3-8B (💡New!)
In our recent research on [Instruction-Pretrain](https://huggingface.co/papers/2406.14491), we developed a context-based instruction synthesizer to augment the raw corpora with instruction-response pairs, **enabling Llama3-8B to be comparable to or even outperform Llama3-70B**: [Finance-Llama3-8B](https://huggingface.co/instruction-pretrain/finance-Llama3-8B), [Biomedicine-Llama3-8B](https://huggingface.co/instruction-pretrain/medicine-Llama3-8B).

## 2. Domain-Specific Tasks

### Pre-templatized Testing Splits
To easily reproduce our prompting results, we have uploaded the filled-in zero/few-shot input instructions and output completions of the test set of each domain-specific task: [biomedicine-tasks](https://huggingface.co/datasets/AdaptLLM/medicine-tasks), [finance-tasks](https://huggingface.co/datasets/AdaptLLM/finance-tasks), and [law-tasks](https://huggingface.co/datasets/AdaptLLM/law-tasks).

Note: those filled-in instructions are specifically tailored for models before alignment and do NOT fit the specific data format required for chat models.

### Evaluating Any Huggingface LMs on Domain-Specific Tasks (💡New!)
You can use the following script to reproduce our results and evaluate any other Huggingface models on domain-specific tasks. Note that the script is NOT applicable to models that require specific prompt templates (e.g., Llama2-chat, Llama3-Instruct).

1). **Set Up Dependencies**
```bash
git clone https://github.com/microsoft/LMOps
cd LMOps/adaptllm
pip install -r requirements.txt
```

2). **Evaluate the Model**
```bash
# Select the domain from ['biomedicine', 'finance', 'law']
DOMAIN='law'

# Specify any Huggingface model name (Not applicable to chat models)
MODEL='AdaptLLM/law-LLM-13B'

# Model parallelization:
# - Set MODEL_PARALLEL=False if the model fits on a single GPU.
#   We observe that LMs smaller than 10B always meet this requirement.
# - Set MODEL_PARALLEL=True if the model is too large and encounters OOM on a single GPU.
MODEL_PARALLEL=True

# Choose the number of GPUs from [1, 2, 4, 8]
N_GPU=2

# Whether to add a BOS token at the beginning of the prompt input:
# - Set to False for AdaptLLM.
# - Set to True for instruction-pretrain models.
# If unsure, we recommend setting it to False, as this is suitable for most LMs.
add_bos_token=False

# Run the evaluation script
bash scripts/inference.sh ${DOMAIN} ${MODEL} ${add_bos_token} ${MODEL_PARALLEL} ${N_GPU}
```

### Raw Datasets
We have also uploaded the raw training and testing splits to facilitate fine-tuning or other uses: [ChemProt](https://huggingface.co/datasets/AdaptLLM/ChemProt), [RCT](https://huggingface.co/datasets/AdaptLLM/RCT), [ConvFinQA](https://huggingface.co/datasets/AdaptLLM/ConvFinQA), [FiQA_SA](https://huggingface.co/datasets/AdaptLLM/FiQA_SA), [Headline](https://huggingface.co/datasets/AdaptLLM/Headline), [NER](https://huggingface.co/datasets/AdaptLLM/NER), [FPB](https://huggingface.co/datasets/AdaptLLM/FPB)

### Domain Knowledge Probing
Our pre-processed knowledge probing datasets are available at: [med_knowledge_prob](https://huggingface.co/datasets/AdaptLLM/med_knowledge_prob) and [law_knowledge_prob](https://huggingface.co/datasets/AdaptLLM/law_knowledge_prob)

## Citation
If you find our work helpful, please cite us:
```bibtex
@inproceedings{
  cheng2024adapting,
  title={Adapting Large Language Models via Reading Comprehension},
  author={Daixuan Cheng and Shaohan Huang and Furu Wei},
  booktitle={The Twelfth International Conference on Learning Representations},
  year={2024},
  url={https://openreview.net/forum?id=y886UXPEZ0}
}
```
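
The GGUF files in the quant table above can also be run locally with any llama.cpp-compatible runtime. Below is a minimal sketch using `huggingface_hub` and `llama-cpp-python`; both libraries and the Q4_K_M file choice are our own assumptions, not something this repo prescribes:

```python
from huggingface_hub import hf_hub_download
from llama_cpp import Llama  # assumption: pip install llama-cpp-python

# Download one quant from this repo; Q4_K_M is a common size/quality trade-off.
gguf_path = hf_hub_download(
    repo_id="RichardErkhov/AdaptLLM_-_law-LLM-13B-gguf",
    filename="law-LLM-13B.Q4_K_M.gguf",
)

# Load the quantized model; n_ctx sets the context window.
llm = Llama(model_path=gguf_path, n_ctx=2048)

# Base-model prompting, mirroring the Transformers example above.
out = llm("Question: What is an ex post facto law?\nAnswer:", max_tokens=256)
print(out["choices"][0]["text"])
```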
omrudra998/fifth
omrudra998
2024-11-14T07:12:47Z
91
0
transformers
[ "transformers", "tensorboard", "safetensors", "llama", "text-generation", "generated_from_trainer", "conversational", "base_model:meta-llama/Llama-3.2-1B-Instruct", "base_model:finetune:meta-llama/Llama-3.2-1B-Instruct", "license:llama3.2", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2024-11-14T07:09:58Z
---
library_name: transformers
license: llama3.2
base_model: meta-llama/Llama-3.2-1B-Instruct
tags:
- generated_from_trainer
model-index:
- name: fifth
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# fifth

This model is a fine-tuned version of [meta-llama/Llama-3.2-1B-Instruct](https://huggingface.co/meta-llama/Llama-3.2-1B-Instruct) on an unknown dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 4
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 32
- optimizer: adamw_torch with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 60
- mixed_precision_training: Native AMP

### Training results

### Framework versions

- Transformers 4.46.2
- Pytorch 2.5.0+cu121
- Datasets 3.1.0
- Tokenizers 0.20.3
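
For anyone trying to reproduce this run, here is a minimal sketch of how the hyperparameters above map onto `transformers.TrainingArguments`. The output directory, dataset, and `Trainer` wiring are not documented in this card, so treat them as placeholders:

```python
from transformers import TrainingArguments

# Mirrors the hyperparameters listed above; output_dir is a placeholder.
training_args = TrainingArguments(
    output_dir="fifth",
    learning_rate=5e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=8,  # 4 * 8 = total train batch size of 32
    optim="adamw_torch",            # betas=(0.9, 0.999) and eps=1e-08 are the defaults
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=60,
    fp16=True,                      # Native AMP mixed precision
    seed=42,
)
```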
welivein/dummy-model
welivein
2024-11-14T07:11:32Z
177
0
transformers
[ "transformers", "safetensors", "camembert", "fill-mask", "arxiv:1910.09700", "autotrain_compatible", "endpoints_compatible", "region:us" ]
fill-mask
2024-11-14T07:11:07Z
---
library_name: transformers
tags: []
---

# Model Card for Model ID

<!-- Provide a quick summary of what the model is/does. -->

## Model Details

### Model Description

<!-- Provide a longer summary of what this model is. -->

This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.

- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]

### Model Sources [optional]

<!-- Provide the basic links for the model. -->

- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]

## Uses

<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->

### Direct Use

<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->

[More Information Needed]

### Downstream Use [optional]

<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->

[More Information Needed]

### Out-of-Scope Use

<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->

[More Information Needed]

## Bias, Risks, and Limitations

<!-- This section is meant to convey both technical and sociotechnical limitations. -->

[More Information Needed]

### Recommendations

<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->

Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.

## How to Get Started with the Model

Use the code below to get started with the model.

[More Information Needed]

## Training Details

### Training Data

<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->

[More Information Needed]

### Training Procedure

<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->

#### Preprocessing [optional]

[More Information Needed]

#### Training Hyperparameters

- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->

#### Speeds, Sizes, Times [optional]

<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->

[More Information Needed]

## Evaluation

<!-- This section describes the evaluation protocols and provides the results. -->

### Testing Data, Factors & Metrics

#### Testing Data

<!-- This should link to a Dataset Card if possible. -->

[More Information Needed]

#### Factors

<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->

[More Information Needed]

#### Metrics

<!-- These are the evaluation metrics being used, ideally with a description of why. -->

[More Information Needed]

### Results

[More Information Needed]

#### Summary

## Model Examination [optional]

<!-- Relevant interpretability work for the model goes here -->

[More Information Needed]

## Environmental Impact

<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->

Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).

- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]

## Technical Specifications [optional]

### Model Architecture and Objective

[More Information Needed]

### Compute Infrastructure

[More Information Needed]

#### Hardware

[More Information Needed]

#### Software

[More Information Needed]

## Citation [optional]

<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->

**BibTeX:**

[More Information Needed]

**APA:**

[More Information Needed]

## Glossary [optional]

<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->

[More Information Needed]

## More Information [optional]

[More Information Needed]

## Model Card Authors [optional]

[More Information Needed]

## Model Card Contact

[More Information Needed]
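
This card documents nothing yet, but the repo tags (`camembert`, `fill-mask`) indicate a CamemBERT-style masked language model. A minimal smoke-test sketch under that assumption (the checkpoint's actual quality and intended use are unknown):

```python
from transformers import pipeline

# Assumption from the repo tags: a CamemBERT fill-mask model, whose mask token is <mask>.
unmasker = pipeline("fill-mask", model="welivein/dummy-model")

for pred in unmasker("Le camembert est <mask> !"):
    print(f"{pred['token_str']!r}: {pred['score']:.3f}")
```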
seongil-dn/e5-base-miracl-250k-answerable-bs128
seongil-dn
2024-11-14T07:10:24Z
6
0
sentence-transformers
[ "sentence-transformers", "safetensors", "xlm-roberta", "sentence-similarity", "feature-extraction", "generated_from_trainer", "dataset_size:204133", "loss:MultipleNegativesRankingLoss", "arxiv:1908.10084", "arxiv:1705.00652", "base_model:intfloat/multilingual-e5-base", "base_model:finetune:intfloat/multilingual-e5-base", "autotrain_compatible", "text-embeddings-inference", "endpoints_compatible", "region:us" ]
sentence-similarity
2024-11-14T07:09:31Z
---
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- generated_from_trainer
- dataset_size:204133
- loss:MultipleNegativesRankingLoss
base_model: intfloat/multilingual-e5-base
widget:
- source_sentence: 'query: 2018 글로벌인텔리전스서밋은 언제 어디서 열렸나요?'
  sentences:
  - 'passage: 서 원장은 이날 서울 웨스틴조선호텔에서 열린 국가안보전략연구원 주최 `2018 글로벌인텔리전스서밋`의 축하 영상메시지를 통해 "두 번의 남북정상회담에 이어 지난주에 미북 정상이 한반도의 완전한 비핵화와 평화정책에 합의했다. 정보가 왜 안보의 버팀목이자 평화의 길잡이인지 정보가 어떻게 세계 평화에 기여하는지를 제대로 보여줬다고 생각한다”고 말했다.'
  - 'passage: 1999년 벨라루스의 야당 인사들은 이 단체를 콤소몰에 빗대어 "루카몰"()이라고 부르기도 하였다. 이는 루카셴코와 콤소몰의 합성어로 루카셴코 대통령의 개인 숭배를 정치적 기반으로 이용하는 청소년 단체를 뜻한다.'
  - 'passage: 멕시코의 학교 교육에서는 스페인어와 영어가 필수이고 외국어로는 독일어, 프랑스어 등이 있다. 이 나라의 UNAM이라는 대학은 등록금이 원화로 단돈 500원에 이를 정도로 싸다.'
- source_sentence: 'query: 도부 철도 오고세 선의 종점은 어디인가요?'
  sentences:
  - 'passage: 《30 ROCK》은 미국 NBC TV의 코미디 시리즈이다. NBC방송사를 통해 방송되는 코미디쇼 《TGS》의 제작진과 그외 방송국 사람들의 이야기를 다룬다. 티나 페이, 트레이시 모건, 제인 크라코스키, 잭 맥브레이어, 스콧 애짓, 앨릭 볼드윈 등 13명의 고정 출연진 외에 매회 뜻밖의 명사들이 출연한다.'
  - 'passage: 1952년 1월 18일에 대한민국 정부는 ‘인접 해양 주권에 관한 대통령 선언(평화선 선언)’을 발표하면서 이 섬을 평화선 안에 포함시켜 보호하도록 했다. 일본 측은 이에 항의하며 대한민국 측에 독도에 대한 한국 영유권을 부정하는 외교 문서를 보냈다. 이후부터 이 섬은 국제 사회에서 분쟁 지역으로 보이기 시작하였다.'
  - 'passage: 도부 철도 오고세 선()은 일본의 철도 노선 하나이다. 도부 철도가 운영한다. 일본 사이타마현 사카도 시에 있는 사카도 역과 군마 현 이루마 군 오고세 정에 있는 오고세 역을 잇는다.'
- source_sentence: 'query: 묵주기도를 1단만 바치자는 평의회가 있었나요?'
  sentences:
  - 'passage: 이 공항은 남부 베트남의 호찌민 시(넓게 보면 동남보()를 거점으로 한다. 호찌민 시 중심부(1군)에서 7km 떨어진 떤빈 군에 위치하고 있으며, 공항에서 시내까지는 택시와 버스(152번)로 오고 갈 수 있다.'
  - 'passage: 어떤 평의회에서는 심지어 묵주기도를 5단이 아닌 1단만 바치자고 했던 경우가 있는가 하면 또한 교본에 조직, 운영에 따른 규칙이 자세히 나열되고 있지만 번역이 어렵게 되어 이해하는 데 곤란했던 일 등 여러 가지 시행착오가 있었으니, 어려운 문제가 생길 때마다 기도하며 연구해서 원만히 운영함으로써 오늘날과 같은 대군단의 조직을 갖추게 된 것이다.'
  - 'passage: 마르더의 차대는 체코제의 38(t) 전차 차대를 사용하여 기계적 신뢰성이 매우 높았으며 화력은 장거리에서 대부분의 경전차와 중전차를 격파하기에 충분한 것이었지만, 마르더 1과 2에서 지적된 수비의 약점들은 거의 그대로 계승되었다.'
- source_sentence: 'query: 아불 하이르는 언제 화레즘을 공격했나요?'
  sentences:
  - 'passage: 1616년(광해군 8년) 음력 4월 1일 형 흥안군과 함께 사간원에서 탄핵을 받았는데, 당시 사간원에서는 흥안군과 경평군이 새로운 궁궐을 짓기 위해 마련된 빈 부지에 일부 사대부들이 집을 지었다는 이유로 소란을 피우며, 당시 부지 내에 있던 사대부들의 집을 무단으로 철거하거나, 종들을 풀어 집들을 뺏었다고 보고하였다. 또 두 형제가 늙은 부녀자를 구타하고, 과부들을 능욕하는 등의 행위도 저질렀다고도 하였다. 또 1618년(광해군 10년)에는 대궐 안의 하인을 무단으로 데려가 폭행하고 감금하는 바람에, 광해군이 직접 추고하라는 전교를 내렸다. 당시 대신들은 여러 차례에 걸쳐 경평군의 삭탈 관직을 청하였으나, 광해군은 이에 대해서는 윤허하지 않았다. 그러나 경평군은 3년 뒤인 1621년(광해군 13년)에 활인서의 별제로 있던 이제인을 무자비하게 구타하는 바람에, 추고를 하라는 전교가 또 내려졌다. 또 이듬해인 1622년(광해군 14년)에는 한 금리와 그 처를 잡아다가 죽기 직전까지 구타를 하여 또 탄핵을 받았다.'
  - 'passage: 위도 상으로는 태양의 복사열을 잘 받는 위치에 있지만, 고지대에 있기 때문에 기온은 연중 온화하고, 관광에는 적합한 기후를 가진다. 호수의 연평균 수온도 16.9℃로 안정되어 있다. 풍화설월의 명소인 따리 4경 중의 하나인 ‘얼하이의 달’로 호수면에 떠오르는 달의 풍치가 뛰어나다고 알려져 있다. 남안의 하관진과 북서쪽에 위치하는 호접천을 묶는 관광선이 운행되고 있다. 선내에서는 부이족의 삼도차를 제공하거나 부이족의 춤을 선보이는 것도 있다. 옛 명칭에는 엽유호, 곤미천, 시얼허(西洱河), 시얼허(西二河) 등으로 나온다. 차마고도의 출발지로 8세기에 남조국은 농업용수나 식용수 확보를 위해 이곳에 수도를 두었다고 추측된다. 또한 주위의 숲은 사슴 사냥의 장소로서 이용되고 있었다고 한다. 그 다음 937년에 개국한 대리국도 얼하이를 생활의 터전으로 이용했다.'
  - 'passage: 1430 - 31년, 아불 하이르는 바카스와 함께 화레즘을 공격해 우르간즈를 점령했다. 그러나 1431년, 우즈베크인들은 도시를 포기하고 후퇴했다. 아불 하이르의 군대는 아스트라한 부근에서 자신에게 대항하는 두명의 칸에게 패배를 안겨줬다. 1435 - 36년 우즈베크 군대는 화레즘을 다시금 공격하고 몇 년 뒤에는 아스트라한을 공격했다. 1446년 초반에 아불 하이르와 그의 군대는 시르다리야 강 유역을 공격해 결국 티무르 왕조의 땅 일부를 빼앗았다. 시그나크는 아불 하이르의 새 수도가 되었고, 트란스옥시아나에 대한 그의 공격은 그 곳을 중심으로 이뤄졌다.'
- source_sentence: 'query: 주계열성의 중심핵 온도는 얼마나 될까요?'
  sentences:
  - 'passage: 수소 핵융합 과정은 온도에 민감하기 때문에 중심핵의 온도가 조금만 올라가도 핵융합의 강도는 막대하게 증가한다. 주계열성의 중심핵 온도는 가장 뜨거운 별과 가장 차가운 별을 비교하면 10배 정도밖에 차이가 나지 않는다. 예를 들면 O형 항성의 경우 중심 온도는 4천만 켈빈이며 어둡고 희미한 적색 왜성의 중심부 온도는 4백만 켈빈 정도에 이른다.'
  - 'passage: 처음 묘소는 충청남도 서산군 보현동(普賢洞) 인좌에 안장되었다가, 뒤에 두번째 부인 순천박씨가 매장된 경기도 양주군 와부면 도곡리 산 97-1 안골(현, 남양주시 와부읍 도곡1리 안골) 금대산(金臺山)의 갓무봉 언덕으로 이장, 합장되었다. 동시에 충청북도 충주에 안장되었던 본처 대구서씨의 묘소도 이장하여 그의 묘소에 합장하였다. 중종 때의 반정공신 박원종의 묘소가 근처에 있다. 완순군 이재완 역시 그의 묘소 근처에 안장되었다. 또한 완순군의 묘 아래에는 그의 양자 이재완의 아들 이달용의 묘가 있다.'
  - 'passage: 독일 형법(Strafgesetzbuch, StGB)은 모든 류의 범죄에 적용 가능한 총칙(Allgemeiner Teil)과 각각의 범죄를 서술해놓은 각칙(Besonderer Teil)으로 구성되어있다. 다른 법률에서도 다른 종류의 범죄들을 서술하고 있기는 하지만, 총칙에서 서술된 원칙은 그런 법률들에도 적용이 가능하다. 대륙법체계를 따르기 때문에 성문화 되어있고, 판사에 의해 이해되고 적용된다. 그렇기 때문에 사안의 사실관계확정에 초점을 두며 무죄추정의 원칙이 적용된다. 특히, 독일은 헌법 103조 2항에서 죄형법정주의를 채택하고 있고, 이를 형법총칙 제1조에서 다시 한번 언급하고 있다. 즉, 형법상의 책임은 의회가 제정한 법률에 근거해야 한다고 밝히고 있다. 의회에서 제정한 법률이 어떤 행위를 범죄화하기 위한 목적으로 인용하지 않는 한, 정부의 시행령이나 규칙으로는 범죄에 대한 책임을 묻기에 충분하지 않다. 형법총칙 제11조 5항에 보면 위법한 행위란 형법상의 구성요건을 실행하는 행위라고 한다. 즉, 범죄가 성립되려면 [범죄의 성립요건]을 만족해야 하며 이에는 구성요건 해당성, 위법성, 유책성(책임)이 있다. 특히 형법총칙 제16조에 ''구성요건적 착오''를 둠으로써 범죄성립 요건을 만족하였더라도 구성요건을 착오한 자는 죄를 면제해주거나 경감해줄 수 있도록 조치하였다. 형벌의 종류에는 자유형, 벌금형, 재산형, 부가형의 4가지가 존재한다. 제2차 세계대전 이후 형법을 부분개정하는 과정에서, 1949년 사형 제도가 폐지되었고, 1951년 내란죄, 간첩죄 등 국가적 법익을 침해하는 범죄에 대한 개정이 이루어졌으며, 1953년 보호관찰부 집행유예 제도와 가석방 제도가 도입되는 등 형법체제가 정비되었다. 1990년 동서독 통일 이후 독일 형법은 몇몇 분야에서 동독 형법과 서독 형법으로 나눠진 이원적 법적용 상황이 형성되었고 이를 제거하기 위해 1994년 5월 동성애 규정이 폐지되었고, 제182조의 죄명을 개정하는 등, 구동독 형법의 구성요건을 수용하는 노력을 보였다.'
pipeline_tag: sentence-similarity
library_name: sentence-transformers
---

# SentenceTransformer based on intfloat/multilingual-e5-base

This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [intfloat/multilingual-e5-base](https://huggingface.co/intfloat/multilingual-e5-base) on the miracl-corpus dataset. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

## Model Details

### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [intfloat/multilingual-e5-base](https://huggingface.co/intfloat/multilingual-e5-base) <!-- at revision d13f1b27baf31030b7fd040960d60d909913633f -->
- **Maximum Sequence Length:** 512 tokens
- **Output Dimensionality:** 768 dimensions
- **Similarity Function:** Cosine Similarity
- **Training Dataset:**
    - miracl-corpus
<!-- - **Language:** Unknown -->
<!-- - **License:** Unknown -->

### Model Sources

- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)

### Full Model Architecture

```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: XLMRobertaModel
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
)
```

## Usage

### Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

```bash
pip install -U sentence-transformers
```

Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("seongil-dn/e5-base-miracl-250k-answerable-bs128")
# Run inference
sentences = [
    'query: 주계열성의 중심핵 온도는 얼마나 될까요?',
    'passage: 수소 핵융합 과정은 온도에 민감하기 때문에 중심핵의 온도가 조금만 올라가도 핵융합의 강도는 막대하게 증가한다. 주계열성의 중심핵 온도는 가장 뜨거운 별과 가장 차가운 별을 비교하면 10배 정도밖에 차이가 나지 않는다. 예를 들면 O형 항성의 경우 중심 온도는 4천만 켈빈이며 어둡고 희미한 적색 왜성의 중심부 온도는 4백만 켈빈 정도에 이른다.',
    'passage: 처음 묘소는 충청남도 서산군 보현동(普賢洞) 인좌에 안장되었다가, 뒤에 두번째 부인 순천박씨가 매장된 경기도 양주군 와부면 도곡리 산 97-1 안골(현, 남양주시 와부읍 도곡1리 안골) 금대산(金臺山)의 갓무봉 언덕으로 이장, 합장되었다. 동시에 충청북도 충주에 안장되었던 본처 대구서씨의 묘소도 이장하여 그의 묘소에 합장하였다. 중종 때의 반정공신 박원종의 묘소가 근처에 있다. 완순군 이재완 역시 그의 묘소 근처에 안장되었다. 또한 완순군의 묘 아래에는 그의 양자 이재완의 아들 이달용의 묘가 있다.',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```

<!-- ### Direct Usage (Transformers)

<details><summary>Click to see the direct usage in Transformers</summary>

</details>
-->

<!-- ### Downstream Usage (Sentence Transformers)

You can finetune this model on your own dataset.

<details><summary>Click to expand</summary>

</details>
-->

<!-- ### Out-of-Scope Use

*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->

<!-- ## Bias, Risks and Limitations

*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->

<!-- ### Recommendations

*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->

## Training Details

### Training Dataset

#### miracl-corpus

* Dataset: miracl-corpus
* Size: 204,133 training samples
* Columns: <code>anchor</code> and <code>positive</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor                                                                              | positive                                                                              |
  |:--------|:------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------|
  | type    | string                                                                              | string                                                                                |
  | details | <ul><li>min: 11 tokens</li><li>mean: 20.61 tokens</li><li>max: 46 tokens</li></ul> | <ul><li>min: 47 tokens</li><li>mean: 141.05 tokens</li><li>max: 512 tokens</li></ul> |
* Samples:
  | anchor | positive |
  |:------------------------------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------|
  | <code>query: 별의 나이는 어떻게 측정하는가?</code> | <code>passage: 별의 나이는 토륨과 다른 성분들에 의해 만들어진 스펙트럼선들의 상대적인 힘을 측정하기 위해 초거대망원경의 자외선 분광기를 사용하여 추측한다. 선의 힘은 여러 가지 다양한 동위원소를 만들어내는데, 그것들로부터 핵우주 연대학을 사용하여 별의 나이를 짐작하는 것이다.</code> |
  | <code>query: 헬크릭층에서 발견되는 드로마에오사우루스과 공룡은 무엇인가요?</code> | <code>passage: 헬크릭층에서 발견되는 수각류로는 티라노사우루스과, 오르니토미무스과, 트로오돈과,조류, 카에나그나티드과 그리고 드로마에오사우루스과의 여러 속들이 있다. 아케로랍토르가 헬크릭층에서 발견되는 유일한 드로마에오사우루스과 공룡인 것으로 보인다. 흔히 발견되는 이빨은 드로마에오사우루스나 사우로르니톨레스테스의 것으로 간주되었으나 지금은 아케로랍토르의 것으로 생각하고 있다. 티라노사우루스과의 공룡으로는 나노티라누스와 티라노사우루스가 있는데, 나노티라누스는 티라노사우루스의 어린 개체일 가능성이 있다.
오르니토미무스과로는 스트루티오미무스와 오르니토미무스가 있으며, 정식으로 보고되지 않은 공룡인 오르코미무스도 헬크릭층에서 발견된 것일 수 있다. 트로오돈과의 공룡으로는 트로오돈과 파로니코돈이 헬크릭층에서 발견된다. 조류로는 아비사우루스, 그리고 브로다비스의 여러 종, 명명되지 않은 세 종류의 헤스페르오로르니스류가 있다. 오비랍토로사우리아는 헬크릭층에서 안주와 렙토린코스 두 종류가 있다.</code> | | <code>query: 남이 장군은 언제 죽었는가?</code> | <code>passage: 1468년 10월 24일 예종이 즉위한 지 한 달 보름만에 유자광이 남이(南怡)가 역모를 꾀하고 있다는 고변으로 시작되어 이시애의 난을 진압하는 데 공을 세웠던 남이(南怡)와 강순을 비롯해 민서, 문효량, 변영수, 변자의, 오치권 등이 역모죄로 죽음을 당한 사건이다.</code> | * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` ### Training Hyperparameters #### Non-Default Hyperparameters - `per_device_train_batch_size`: 96 - `adam_epsilon`: 1e-07 - `warmup_ratio`: 0.1 - `fp16`: True - `dataloader_drop_last`: True - `batch_sampler`: no_duplicates #### All Hyperparameters <details><summary>Click to expand</summary> - `overwrite_output_dir`: False - `do_predict`: False - `eval_strategy`: no - `prediction_loss_only`: True - `per_device_train_batch_size`: 96 - `per_device_eval_batch_size`: 8 - `per_gpu_train_batch_size`: None - `per_gpu_eval_batch_size`: None - `gradient_accumulation_steps`: 1 - `eval_accumulation_steps`: None - `torch_empty_cache_steps`: None - `learning_rate`: 5e-05 - `weight_decay`: 0.0 - `adam_beta1`: 0.9 - `adam_beta2`: 0.999 - `adam_epsilon`: 1e-07 - `max_grad_norm`: 1.0 - `num_train_epochs`: 3 - `max_steps`: -1 - `lr_scheduler_type`: linear - `lr_scheduler_kwargs`: {} - `warmup_ratio`: 0.1 - `warmup_steps`: 0 - `log_level`: passive - `log_level_replica`: warning - `log_on_each_node`: True - `logging_nan_inf_filter`: True - `save_safetensors`: True - `save_on_each_node`: False - `save_only_model`: False - `restore_callback_states_from_checkpoint`: False - `no_cuda`: False - `use_cpu`: False - `use_mps_device`: False - `seed`: 42 - `data_seed`: None - `jit_mode_eval`: False - `use_ipex`: False - `bf16`: False - `fp16`: True - `fp16_opt_level`: O1 - `half_precision_backend`: auto - `bf16_full_eval`: False - `fp16_full_eval`: False - `tf32`: None - `local_rank`: 0 - `ddp_backend`: None - `tpu_num_cores`: None - `tpu_metrics_debug`: False - `debug`: [] - `dataloader_drop_last`: True - `dataloader_num_workers`: 0 - `dataloader_prefetch_factor`: None - `past_index`: -1 - `disable_tqdm`: False - `remove_unused_columns`: True - `label_names`: None - `load_best_model_at_end`: False - `ignore_data_skip`: False - `fsdp`: [] - `fsdp_min_num_params`: 0 - `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False} - `fsdp_transformer_layer_cls_to_wrap`: None - `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None} - `deepspeed`: None - `label_smoothing_factor`: 0.0 - `optim`: adamw_torch - `optim_args`: None - `adafactor`: False - `group_by_length`: False - `length_column_name`: length - `ddp_find_unused_parameters`: None - `ddp_bucket_cap_mb`: None - `ddp_broadcast_buffers`: False - `dataloader_pin_memory`: True - `dataloader_persistent_workers`: False - `skip_memory_metrics`: True - `use_legacy_prediction_loop`: False - `push_to_hub`: False - `resume_from_checkpoint`: None - `hub_model_id`: None - `hub_strategy`: every_save - `hub_private_repo`: False - `hub_always_push`: False - `gradient_checkpointing`: False - `gradient_checkpointing_kwargs`: None - 
`include_inputs_for_metrics`: False - `eval_do_concat_batches`: True - `fp16_backend`: auto - `push_to_hub_model_id`: None - `push_to_hub_organization`: None - `mp_parameters`: - `auto_find_batch_size`: False - `full_determinism`: False - `torchdynamo`: None - `ray_scope`: last - `ddp_timeout`: 1800 - `torch_compile`: False - `torch_compile_backend`: None - `torch_compile_mode`: None - `dispatch_batches`: None - `split_batches`: None - `include_tokens_per_second`: False - `include_num_input_tokens_seen`: False - `neftune_noise_alpha`: None - `optim_target_modules`: None - `batch_eval_metrics`: False - `eval_on_start`: False - `eval_use_gather_object`: False - `batch_sampler`: no_duplicates - `multi_dataset_batch_sampler`: proportional </details> ### Training Logs <details><summary>Click to expand</summary> | Epoch | Step | Training Loss | |:------:|:----:|:-------------:| | 0.0005 | 1 | 2.1885 | | 0.0009 | 2 | 2.2525 | | 0.0014 | 3 | 2.2806 | | 0.0019 | 4 | 2.2304 | | 0.0024 | 5 | 2.3556 | | 0.0028 | 6 | 2.2416 | | 0.0033 | 7 | 2.158 | | 0.0038 | 8 | 2.2726 | | 0.0042 | 9 | 2.1155 | | 0.0047 | 10 | 2.2428 | | 0.0052 | 11 | 2.2109 | | 0.0056 | 12 | 2.403 | | 0.0061 | 13 | 2.2068 | | 0.0066 | 14 | 2.1787 | | 0.0071 | 15 | 2.1531 | | 0.0075 | 16 | 2.2818 | | 0.0080 | 17 | 2.1265 | | 0.0085 | 18 | 2.1879 | | 0.0089 | 19 | 2.1179 | | 0.0094 | 20 | 2.1004 | | 0.0099 | 21 | 2.1954 | | 0.0103 | 22 | 1.9755 | | 0.0108 | 23 | 2.0296 | | 0.0113 | 24 | 2.0232 | | 0.0118 | 25 | 1.9643 | | 0.0122 | 26 | 1.7821 | | 0.0127 | 27 | 1.9111 | | 0.0132 | 28 | 1.8421 | | 0.0136 | 29 | 1.806 | | 0.0141 | 30 | 1.7923 | | 0.0146 | 31 | 1.7095 | | 0.0151 | 32 | 1.5854 | | 0.0155 | 33 | 1.6535 | | 0.0160 | 34 | 1.5348 | | 0.0165 | 35 | 1.5379 | | 0.0169 | 36 | 1.5649 | | 0.0174 | 37 | 1.2713 | | 0.0179 | 38 | 1.344 | | 0.0183 | 39 | 1.1934 | | 0.0188 | 40 | 1.2732 | | 0.0193 | 41 | 1.1936 | | 0.0198 | 42 | 1.2777 | | 0.0202 | 43 | 1.1918 | | 0.0207 | 44 | 1.1907 | | 0.0212 | 45 | 1.0029 | | 0.0216 | 46 | 1.085 | | 0.0221 | 47 | 0.9491 | | 0.0226 | 48 | 0.96 | | 0.0230 | 49 | 0.8735 | | 0.0235 | 50 | 0.8606 | | 0.0240 | 51 | 0.7412 | | 0.0245 | 52 | 0.7611 | | 0.0249 | 53 | 0.6617 | | 0.0254 | 54 | 0.6171 | | 0.0259 | 55 | 0.6364 | | 0.0263 | 56 | 0.64 | | 0.0268 | 57 | 0.556 | | 0.0273 | 58 | 0.5417 | | 0.0278 | 59 | 0.4746 | | 0.0282 | 60 | 0.4184 | | 0.0287 | 61 | 0.543 | | 0.0292 | 62 | 0.3564 | | 0.0296 | 63 | 0.3815 | | 0.0301 | 64 | 0.397 | | 0.0306 | 65 | 0.3263 | | 0.0310 | 66 | 0.3552 | | 0.0315 | 67 | 0.3837 | | 0.0320 | 68 | 0.2362 | | 0.0325 | 69 | 0.1956 | | 0.0329 | 70 | 0.2436 | | 0.0334 | 71 | 0.2661 | | 0.0339 | 72 | 0.2649 | | 0.0343 | 73 | 0.2263 | | 0.0348 | 74 | 0.1218 | | 0.0353 | 75 | 0.2553 | | 0.0357 | 76 | 0.2317 | | 0.0362 | 77 | 0.2347 | | 0.0367 | 78 | 0.1252 | | 0.0372 | 79 | 0.1548 | | 0.0376 | 80 | 0.0933 | | 0.0381 | 81 | 0.1597 | | 0.0386 | 82 | 0.1682 | | 0.0390 | 83 | 0.1506 | | 0.0395 | 84 | 0.1385 | | 0.0400 | 85 | 0.0837 | | 0.0405 | 86 | 0.0685 | | 0.0409 | 87 | 0.1103 | | 0.0414 | 88 | 0.0973 | | 0.0419 | 89 | 0.0964 | | 0.0423 | 90 | 0.0561 | | 0.0428 | 91 | 0.0799 | | 0.0433 | 92 | 0.0693 | | 0.0437 | 93 | 0.1259 | | 0.0442 | 94 | 0.0776 | | 0.0447 | 95 | 0.099 | | 0.0452 | 96 | 0.1126 | | 0.0456 | 97 | 0.1296 | | 0.0461 | 98 | 0.1076 | | 0.0466 | 99 | 0.0879 | | 0.0470 | 100 | 0.0706 | | 0.0475 | 101 | 0.0689 | | 0.0480 | 102 | 0.0389 | | 0.0484 | 103 | 0.0627 | | 0.0489 | 104 | 0.0168 | | 0.0494 | 105 | 0.0499 | | 0.0499 | 106 | 0.0511 | | 0.0503 | 107 | 0.0241 | | 0.0508 
| 108 | 0.0613 | | 0.0513 | 109 | 0.049 | | 0.0517 | 110 | 0.0348 | | 0.0522 | 111 | 0.041 | | 0.0527 | 112 | 0.0554 | | 0.0532 | 113 | 0.0815 | | 0.0536 | 114 | 0.0465 | | 0.0541 | 115 | 0.0309 | | 0.0546 | 116 | 0.0505 | | 0.0550 | 117 | 0.0246 | | 0.0555 | 118 | 0.0641 | | 0.0560 | 119 | 0.0321 | | 0.0564 | 120 | 0.0828 | | 0.0569 | 121 | 0.0674 | | 0.0574 | 122 | 0.0577 | | 0.0579 | 123 | 0.0957 | | 0.0583 | 124 | 0.0436 | | 0.0588 | 125 | 0.0427 | | 0.0593 | 126 | 0.0325 | | 0.0597 | 127 | 0.0376 | | 0.0602 | 128 | 0.0455 | | 0.0607 | 129 | 0.083 | | 0.0611 | 130 | 0.0669 | | 0.0616 | 131 | 0.0458 | | 0.0621 | 132 | 0.0265 | | 0.0626 | 133 | 0.0562 | | 0.0630 | 134 | 0.0357 | | 0.0635 | 135 | 0.0342 | | 0.0640 | 136 | 0.0262 | | 0.0644 | 137 | 0.0265 | | 0.0649 | 138 | 0.0923 | | 0.0654 | 139 | 0.0182 | | 0.0659 | 140 | 0.0374 | | 0.0663 | 141 | 0.016 | | 0.0668 | 142 | 0.059 | | 0.0673 | 143 | 0.0544 | | 0.0677 | 144 | 0.0466 | | 0.0682 | 145 | 0.0614 | | 0.0687 | 146 | 0.0776 | | 0.0691 | 147 | 0.0509 | | 0.0696 | 148 | 0.0259 | | 0.0701 | 149 | 0.0571 | | 0.0706 | 150 | 0.0423 | | 0.0710 | 151 | 0.0368 | | 0.0715 | 152 | 0.0369 | | 0.0720 | 153 | 0.0293 | | 0.0724 | 154 | 0.0391 | | 0.0729 | 155 | 0.0261 | | 0.0734 | 156 | 0.0333 | | 0.0738 | 157 | 0.0364 | | 0.0743 | 158 | 0.0239 | | 0.0748 | 159 | 0.048 | | 0.0753 | 160 | 0.0169 | | 0.0757 | 161 | 0.0525 | | 0.0762 | 162 | 0.0523 | | 0.0767 | 163 | 0.0602 | | 0.0771 | 164 | 0.0844 | | 0.0776 | 165 | 0.0292 | | 0.0781 | 166 | 0.048 | | 0.0786 | 167 | 0.0272 | | 0.0790 | 168 | 0.0435 | | 0.0795 | 169 | 0.0443 | | 0.0800 | 170 | 0.0256 | | 0.0804 | 171 | 0.0653 | | 0.0809 | 172 | 0.0557 | | 0.0814 | 173 | 0.0564 | | 0.0818 | 174 | 0.0239 | | 0.0823 | 175 | 0.0436 | | 0.0828 | 176 | 0.085 | | 0.0833 | 177 | 0.0463 | | 0.0837 | 178 | 0.0179 | | 0.0842 | 179 | 0.034 | | 0.0847 | 180 | 0.0477 | | 0.0851 | 181 | 0.0949 | | 0.0856 | 182 | 0.0351 | | 0.0861 | 183 | 0.0746 | | 0.0865 | 184 | 0.058 | | 0.0870 | 185 | 0.0155 | | 0.0875 | 186 | 0.0104 | | 0.0880 | 187 | 0.0222 | | 0.0884 | 188 | 0.0256 | | 0.0889 | 189 | 0.0235 | | 0.0894 | 190 | 0.0543 | | 0.0898 | 191 | 0.0269 | | 0.0903 | 192 | 0.045 | | 0.0908 | 193 | 0.1137 | | 0.0913 | 194 | 0.0514 | | 0.0917 | 195 | 0.027 | | 0.0922 | 196 | 0.0725 | | 0.0927 | 197 | 0.0223 | | 0.0931 | 198 | 0.0209 | | 0.0936 | 199 | 0.0411 | | 0.0941 | 200 | 0.0418 | | 0.0945 | 201 | 0.0517 | | 0.0950 | 202 | 0.0722 | | 0.0955 | 203 | 0.0687 | | 0.0960 | 204 | 0.0389 | | 0.0964 | 205 | 0.0429 | | 0.0969 | 206 | 0.0643 | | 0.0974 | 207 | 0.0201 | | 0.0978 | 208 | 0.0213 | | 0.0983 | 209 | 0.0505 | | 0.0988 | 210 | 0.0531 | | 0.0992 | 211 | 0.0398 | | 0.0997 | 212 | 0.0373 | | 0.1002 | 213 | 0.0139 | | 0.1007 | 214 | 0.0439 | | 0.1011 | 215 | 0.0114 | | 0.1016 | 216 | 0.0368 | | 0.1021 | 217 | 0.0664 | | 0.1025 | 218 | 0.013 | | 0.1030 | 219 | 0.0684 | | 0.1035 | 220 | 0.0425 | | 0.1040 | 221 | 0.038 | | 0.1044 | 222 | 0.034 | | 0.1049 | 223 | 0.1103 | | 0.1054 | 224 | 0.0309 | | 0.1058 | 225 | 0.064 | | 0.1063 | 226 | 0.0278 | | 0.1068 | 227 | 0.053 | | 0.1072 | 228 | 0.0224 | | 0.1077 | 229 | 0.0428 | | 0.1082 | 230 | 0.0674 | | 0.1087 | 231 | 0.0477 | | 0.1091 | 232 | 0.0577 | | 0.1096 | 233 | 0.0368 | | 0.1101 | 234 | 0.0384 | | 0.1105 | 235 | 0.0519 | | 0.1110 | 236 | 0.0467 | | 0.1115 | 237 | 0.0619 | | 0.1119 | 238 | 0.0245 | | 0.1124 | 239 | 0.0493 | | 0.1129 | 240 | 0.0163 | | 0.1134 | 241 | 0.0759 | | 0.1138 | 242 | 0.0253 | | 0.1143 | 243 | 0.033 | | 0.1148 | 244 | 0.0192 | | 0.1152 | 245 | 
0.0226 | | 0.1157 | 246 | 0.0127 | | 0.1162 | 247 | 0.012 | | 0.1167 | 248 | 0.0501 | | 0.1171 | 249 | 0.0587 | | 0.1176 | 250 | 0.0546 | | 0.1181 | 251 | 0.0437 | | 0.1185 | 252 | 0.0295 | | 0.1190 | 253 | 0.0308 | | 0.1195 | 254 | 0.108 | | 0.1199 | 255 | 0.0213 | | 0.1204 | 256 | 0.0161 | | 0.1209 | 257 | 0.023 | | 0.1214 | 258 | 0.0641 | | 0.1218 | 259 | 0.0634 | | 0.1223 | 260 | 0.0285 | | 0.1228 | 261 | 0.0251 | | 0.1232 | 262 | 0.0577 | | 0.1237 | 263 | 0.0283 | | 0.1242 | 264 | 0.0173 | | 0.1246 | 265 | 0.0268 | | 0.1251 | 266 | 0.0491 | | 0.1256 | 267 | 0.0354 | | 0.1261 | 268 | 0.0477 | | 0.1265 | 269 | 0.0357 | | 0.1270 | 270 | 0.0367 | | 0.1275 | 271 | 0.0209 | | 0.1279 | 272 | 0.0687 | | 0.1284 | 273 | 0.0149 | | 0.1289 | 274 | 0.0527 | | 0.1294 | 275 | 0.0374 | | 0.1298 | 276 | 0.0176 | | 0.1303 | 277 | 0.0315 | | 0.1308 | 278 | 0.0743 | | 0.1312 | 279 | 0.0539 | | 0.1317 | 280 | 0.0484 | | 0.1322 | 281 | 0.0194 | | 0.1326 | 282 | 0.0476 | | 0.1331 | 283 | 0.0197 | | 0.1336 | 284 | 0.0265 | | 0.1341 | 285 | 0.035 | | 0.1345 | 286 | 0.0183 | | 0.1350 | 287 | 0.0303 | | 0.1355 | 288 | 0.0335 | | 0.1359 | 289 | 0.0372 | | 0.1364 | 290 | 0.0135 | | 0.1369 | 291 | 0.0261 | | 0.1373 | 292 | 0.0217 | | 0.1378 | 293 | 0.0412 | | 0.1383 | 294 | 0.0534 | | 0.1388 | 295 | 0.0248 | | 0.1392 | 296 | 0.0902 | | 0.1397 | 297 | 0.0387 | | 0.1402 | 298 | 0.1036 | | 0.1406 | 299 | 0.0731 | | 0.1411 | 300 | 0.0475 | | 0.1416 | 301 | 0.0162 | | 0.1421 | 302 | 0.0307 | | 0.1425 | 303 | 0.0341 | | 0.1430 | 304 | 0.0509 | | 0.1435 | 305 | 0.0246 | | 0.1439 | 306 | 0.0559 | | 0.1444 | 307 | 0.0571 | | 0.1449 | 308 | 0.0167 | | 0.1453 | 309 | 0.0433 | | 0.1458 | 310 | 0.1018 | | 0.1463 | 311 | 0.0261 | | 0.1468 | 312 | 0.1039 | | 0.1472 | 313 | 0.0399 | | 0.1477 | 314 | 0.0976 | | 0.1482 | 315 | 0.0581 | | 0.1486 | 316 | 0.0456 | | 0.1491 | 317 | 0.037 | | 0.1496 | 318 | 0.0325 | | 0.1500 | 319 | 0.0203 | | 0.1505 | 320 | 0.0497 | | 0.1510 | 321 | 0.0279 | | 0.1515 | 322 | 0.0318 | | 0.1519 | 323 | 0.029 | | 0.1524 | 324 | 0.0239 | | 0.1529 | 325 | 0.0395 | | 0.1533 | 326 | 0.0402 | | 0.1538 | 327 | 0.0491 | | 0.1543 | 328 | 0.0154 | | 0.1548 | 329 | 0.0089 | | 0.1552 | 330 | 0.0833 | | 0.1557 | 331 | 0.059 | | 0.1562 | 332 | 0.0251 | | 0.1566 | 333 | 0.0522 | | 0.1571 | 334 | 0.026 | | 0.1576 | 335 | 0.0602 | | 0.1580 | 336 | 0.0408 | | 0.1585 | 337 | 0.0358 | | 0.1590 | 338 | 0.0137 | | 0.1595 | 339 | 0.0167 | | 0.1599 | 340 | 0.0401 | | 0.1604 | 341 | 0.0533 | | 0.1609 | 342 | 0.0543 | | 0.1613 | 343 | 0.013 | | 0.1618 | 344 | 0.0235 | | 0.1623 | 345 | 0.0682 | | 0.1627 | 346 | 0.0579 | | 0.1632 | 347 | 0.017 | | 0.1637 | 348 | 0.01 | | 0.1642 | 349 | 0.0096 | | 0.1646 | 350 | 0.0146 | | 0.1651 | 351 | 0.0727 | | 0.1656 | 352 | 0.0697 | | 0.1660 | 353 | 0.0184 | | 0.1665 | 354 | 0.0349 | | 0.1670 | 355 | 0.0489 | | 0.1675 | 356 | 0.0483 | | 0.1679 | 357 | 0.0587 | | 0.1684 | 358 | 0.0263 | | 0.1689 | 359 | 0.0395 | | 0.1693 | 360 | 0.049 | | 0.1698 | 361 | 0.0494 | | 0.1703 | 362 | 0.0531 | | 0.1707 | 363 | 0.1087 | | 0.1712 | 364 | 0.0186 | | 0.1717 | 365 | 0.0303 | | 0.1722 | 366 | 0.0465 | | 0.1726 | 367 | 0.0362 | | 0.1731 | 368 | 0.0334 | | 0.1736 | 369 | 0.0443 | | 0.1740 | 370 | 0.017 | | 0.1745 | 371 | 0.0263 | | 0.1750 | 372 | 0.0101 | | 0.1754 | 373 | 0.02 | | 0.1759 | 374 | 0.0336 | | 0.1764 | 375 | 0.0663 | | 0.1769 | 376 | 0.0148 | | 0.1773 | 377 | 0.0435 | | 0.1778 | 378 | 0.0479 | | 0.1783 | 379 | 0.0214 | | 0.1787 | 380 | 0.0325 | | 0.1792 | 381 | 0.0125 | | 0.1797 | 382 | 0.0426 | 
| 0.1802 | 383 | 0.0364 | | 0.1806 | 384 | 0.0329 | | 0.1811 | 385 | 0.0155 | | 0.1816 | 386 | 0.0654 | | 0.1820 | 387 | 0.0652 | | 0.1825 | 388 | 0.031 | | 0.1830 | 389 | 0.034 | | 0.1834 | 390 | 0.0273 | | 0.1839 | 391 | 0.0461 | | 0.1844 | 392 | 0.04 | | 0.1849 | 393 | 0.0705 | | 0.1853 | 394 | 0.0725 | | 0.1858 | 395 | 0.0117 | | 0.1863 | 396 | 0.0228 | | 0.1867 | 397 | 0.0537 | | 0.1872 | 398 | 0.0197 | | 0.1877 | 399 | 0.0491 | | 0.1881 | 400 | 0.0626 | | 0.1886 | 401 | 0.0229 | | 0.1891 | 402 | 0.0778 | | 0.1896 | 403 | 0.0327 | | 0.1900 | 404 | 0.0251 | | 0.1905 | 405 | 0.0312 | | 0.1910 | 406 | 0.0547 | | 0.1914 | 407 | 0.0309 | | 0.1919 | 408 | 0.0475 | | 0.1924 | 409 | 0.0486 | | 0.1929 | 410 | 0.0246 | | 0.1933 | 411 | 0.0686 | | 0.1938 | 412 | 0.0397 | | 0.1943 | 413 | 0.0707 | | 0.1947 | 414 | 0.0667 | | 0.1952 | 415 | 0.0327 | | 0.1957 | 416 | 0.0362 | | 0.1961 | 417 | 0.0222 | | 0.1966 | 418 | 0.0481 | | 0.1971 | 419 | 0.0848 | | 0.1976 | 420 | 0.0169 | | 0.1980 | 421 | 0.0542 | | 0.1985 | 422 | 0.0587 | | 0.1990 | 423 | 0.0249 | | 0.1994 | 424 | 0.0432 | | 0.1999 | 425 | 0.0566 | | 0.2004 | 426 | 0.029 | | 0.2008 | 427 | 0.0445 | | 0.2013 | 428 | 0.059 | | 0.2018 | 429 | 0.0163 | | 0.2023 | 430 | 0.0131 | | 0.2027 | 431 | 0.0431 | | 0.2032 | 432 | 0.0643 | | 0.2037 | 433 | 0.0717 | | 0.2041 | 434 | 0.0469 | | 0.2046 | 435 | 0.0518 | | 0.2051 | 436 | 0.0359 | | 0.2056 | 437 | 0.0455 | | 0.2060 | 438 | 0.0414 | | 0.2065 | 439 | 0.0169 | | 0.2070 | 440 | 0.0259 | | 0.2074 | 441 | 0.0607 | | 0.2079 | 442 | 0.0175 | | 0.2084 | 443 | 0.0279 | | 0.2088 | 444 | 0.0319 | | 0.2093 | 445 | 0.0245 | | 0.2098 | 446 | 0.0412 | | 0.2103 | 447 | 0.057 | | 0.2107 | 448 | 0.0209 | | 0.2112 | 449 | 0.0265 | | 0.2117 | 450 | 0.043 | | 0.2121 | 451 | 0.0323 | | 0.2126 | 452 | 0.0401 | | 0.2131 | 453 | 0.0198 | | 0.2135 | 454 | 0.0269 | | 0.2140 | 455 | 0.0428 | | 0.2145 | 456 | 0.0239 | | 0.2150 | 457 | 0.0209 | | 0.2154 | 458 | 0.0469 | | 0.2159 | 459 | 0.0382 | | 0.2164 | 460 | 0.0341 | | 0.2168 | 461 | 0.0297 | | 0.2173 | 462 | 0.0172 | | 0.2178 | 463 | 0.0291 | | 0.2183 | 464 | 0.0494 | | 0.2187 | 465 | 0.0206 | | 0.2192 | 466 | 0.0271 | | 0.2197 | 467 | 0.0446 | | 0.2201 | 468 | 0.0284 | | 0.2206 | 469 | 0.0524 | | 0.2211 | 470 | 0.0373 | | 0.2215 | 471 | 0.0433 | | 0.2220 | 472 | 0.0243 | | 0.2225 | 473 | 0.0516 | | 0.2230 | 474 | 0.0248 | | 0.2234 | 475 | 0.0375 | | 0.2239 | 476 | 0.0165 | | 0.2244 | 477 | 0.0324 | | 0.2248 | 478 | 0.0255 | | 0.2253 | 479 | 0.0594 | | 0.2258 | 480 | 0.0639 | | 0.2262 | 481 | 0.0231 | | 0.2267 | 482 | 0.02 | | 0.2272 | 483 | 0.0724 | | 0.2277 | 484 | 0.0222 | | 0.2281 | 485 | 0.0206 | | 0.2286 | 486 | 0.0629 | | 0.2291 | 487 | 0.0875 | | 0.2295 | 488 | 0.0214 | | 0.2300 | 489 | 0.018 | | 0.2305 | 490 | 0.015 | | 0.2310 | 491 | 0.0297 | | 0.2314 | 492 | 0.0231 | | 0.2319 | 493 | 0.0281 | | 0.2324 | 494 | 0.0217 | | 0.2328 | 495 | 0.0333 | | 0.2333 | 496 | 0.0435 | | 0.2338 | 497 | 0.0202 | | 0.2342 | 498 | 0.0327 | | 0.2347 | 499 | 0.0728 | | 0.2352 | 500 | 0.0512 | | 0.2357 | 501 | 0.0155 | | 0.2361 | 502 | 0.047 | | 0.2366 | 503 | 0.0562 | | 0.2371 | 504 | 0.0403 | | 0.2375 | 505 | 0.0344 | | 0.2380 | 506 | 0.0642 | | 0.2385 | 507 | 0.0398 | | 0.2389 | 508 | 0.0512 | | 0.2394 | 509 | 0.0164 | | 0.2399 | 510 | 0.0228 | | 0.2404 | 511 | 0.0134 | | 0.2408 | 512 | 0.0112 | | 0.2413 | 513 | 0.0345 | | 0.2418 | 514 | 0.0444 | | 0.2422 | 515 | 0.0257 | | 0.2427 | 516 | 0.0251 | | 0.2432 | 517 | 0.0292 | | 0.2437 | 518 | 0.0272 | | 0.2441 | 519 | 0.0471 | | 
0.2446 | 520 | 0.0516 | | 0.2451 | 521 | 0.0193 | | 0.2455 | 522 | 0.0267 | | 0.2460 | 523 | 0.0572 | | 0.2465 | 524 | 0.0183 | | 0.2469 | 525 | 0.0361 | | 0.2474 | 526 | 0.0272 | | 0.2479 | 527 | 0.0516 | | 0.2484 | 528 | 0.017 | | 0.2488 | 529 | 0.0456 | | 0.2493 | 530 | 0.0278 | | 0.2498 | 531 | 0.062 | | 0.2502 | 532 | 0.0418 | | 0.2507 | 533 | 0.0365 | | 0.2512 | 534 | 0.0097 | | 0.2516 | 535 | 0.0386 | | 0.2521 | 536 | 0.0306 | | 0.2526 | 537 | 0.0797 | | 0.2531 | 538 | 0.0147 | | 0.2535 | 539 | 0.0179 | | 0.2540 | 540 | 0.0329 | | 0.2545 | 541 | 0.0709 | | 0.2549 | 542 | 0.0389 | | 0.2554 | 543 | 0.0145 | | 0.2559 | 544 | 0.0832 | | 0.2563 | 545 | 0.0313 | | 0.2568 | 546 | 0.0282 | | 0.2573 | 547 | 0.031 | | 0.2578 | 548 | 0.0692 | | 0.2582 | 549 | 0.0335 | | 0.2587 | 550 | 0.0306 | | 0.2592 | 551 | 0.0366 | | 0.2596 | 552 | 0.0425 | | 0.2601 | 553 | 0.0656 | | 0.2606 | 554 | 0.1067 | | 0.2611 | 555 | 0.0346 | | 0.2615 | 556 | 0.0469 | | 0.2620 | 557 | 0.0223 | | 0.2625 | 558 | 0.0498 | | 0.2629 | 559 | 0.0712 | | 0.2634 | 560 | 0.0377 | | 0.2639 | 561 | 0.1023 | | 0.2643 | 562 | 0.0853 | | 0.2648 | 563 | 0.0356 | | 0.2653 | 564 | 0.076 | | 0.2658 | 565 | 0.0282 | | 0.2662 | 566 | 0.0469 | | 0.2667 | 567 | 0.0694 | | 0.2672 | 568 | 0.0285 | | 0.2676 | 569 | 0.0278 | | 0.2681 | 570 | 0.0633 | | 0.2686 | 571 | 0.0125 | | 0.2690 | 572 | 0.1003 | | 0.2695 | 573 | 0.0249 | | 0.2700 | 574 | 0.0357 | | 0.2705 | 575 | 0.0227 | | 0.2709 | 576 | 0.0643 | | 0.2714 | 577 | 0.0511 | | 0.2719 | 578 | 0.0246 | | 0.2723 | 579 | 0.04 | | 0.2728 | 580 | 0.0309 | | 0.2733 | 581 | 0.0238 | | 0.2738 | 582 | 0.0375 | | 0.2742 | 583 | 0.0262 | | 0.2747 | 584 | 0.0399 | | 0.2752 | 585 | 0.0185 | | 0.2756 | 586 | 0.026 | | 0.2761 | 587 | 0.03 | | 0.2766 | 588 | 0.0598 | | 0.2770 | 589 | 0.0227 | | 0.2775 | 590 | 0.019 | | 0.2780 | 591 | 0.0415 | | 0.2785 | 592 | 0.0525 | | 0.2789 | 593 | 0.0187 | | 0.2794 | 594 | 0.0165 | | 0.2799 | 595 | 0.0418 | | 0.2803 | 596 | 0.0295 | | 0.2808 | 597 | 0.0416 | | 0.2813 | 598 | 0.0158 | | 0.2817 | 599 | 0.0422 | | 0.2822 | 600 | 0.0589 | | 0.2827 | 601 | 0.1205 | | 0.2832 | 602 | 0.0294 | | 0.2836 | 603 | 0.0191 | | 0.2841 | 604 | 0.0429 | | 0.2846 | 605 | 0.0261 | | 0.2850 | 606 | 0.0211 | | 0.2855 | 607 | 0.0349 | | 0.2860 | 608 | 0.0634 | | 0.2865 | 609 | 0.0179 | | 0.2869 | 610 | 0.0416 | | 0.2874 | 611 | 0.018 | | 0.2879 | 612 | 0.037 | | 0.2883 | 613 | 0.12 | | 0.2888 | 614 | 0.0198 | | 0.2893 | 615 | 0.04 | | 0.2897 | 616 | 0.0382 | | 0.2902 | 617 | 0.0432 | | 0.2907 | 618 | 0.0711 | | 0.2912 | 619 | 0.0337 | | 0.2916 | 620 | 0.0421 | | 0.2921 | 621 | 0.0226 | | 0.2926 | 622 | 0.0236 | | 0.2930 | 623 | 0.0179 | | 0.2935 | 624 | 0.0433 | | 0.2940 | 625 | 0.0399 | | 0.2944 | 626 | 0.0179 | | 0.2949 | 627 | 0.0562 | | 0.2954 | 628 | 0.104 | | 0.2959 | 629 | 0.0593 | | 0.2963 | 630 | 0.0446 | | 0.2968 | 631 | 0.0191 | | 0.2973 | 632 | 0.0658 | | 0.2977 | 633 | 0.0269 | | 0.2982 | 634 | 0.0356 | | 0.2987 | 635 | 0.0244 | | 0.2992 | 636 | 0.0198 | | 0.2996 | 637 | 0.0294 | | 0.3001 | 638 | 0.0182 | | 0.3006 | 639 | 0.0487 | | 0.3010 | 640 | 0.0459 | | 0.3015 | 641 | 0.0327 | | 0.3020 | 642 | 0.0799 | | 0.3024 | 643 | 0.0699 | | 0.3029 | 644 | 0.0269 | | 0.3034 | 645 | 0.0397 | | 0.3039 | 646 | 0.0663 | | 0.3043 | 647 | 0.075 | | 0.3048 | 648 | 0.0175 | | 0.3053 | 649 | 0.0316 | | 0.3057 | 650 | 0.0258 | | 0.3062 | 651 | 0.0343 | | 0.3067 | 652 | 0.0234 | | 0.3071 | 653 | 0.085 | | 0.3076 | 654 | 0.022 | | 0.3081 | 655 | 0.1063 | | 0.3086 | 656 | 0.0398 | | 0.3090 | 657 
| 0.0174 | | 0.3095 | 658 | 0.0319 | | 0.3100 | 659 | 0.02 | | 0.3104 | 660 | 0.0234 | | 0.3109 | 661 | 0.0663 | | 0.3114 | 662 | 0.0511 | | 0.3119 | 663 | 0.0366 | | 0.3123 | 664 | 0.0231 | | 0.3128 | 665 | 0.0389 | | 0.3133 | 666 | 0.0476 | | 0.3137 | 667 | 0.1014 | | 0.3142 | 668 | 0.0186 | | 0.3147 | 669 | 0.013 | | 0.3151 | 670 | 0.0297 | | 0.3156 | 671 | 0.0147 | | 0.3161 | 672 | 0.0412 | | 0.3166 | 673 | 0.0379 | | 0.3170 | 674 | 0.0189 | | 0.3175 | 675 | 0.0874 | | 0.3180 | 676 | 0.0166 | | 0.3184 | 677 | 0.0872 | | 0.3189 | 678 | 0.0367 | | 0.3194 | 679 | 0.0243 | | 0.3198 | 680 | 0.0124 | | 0.3203 | 681 | 0.0276 | | 0.3208 | 682 | 0.0373 | | 0.3213 | 683 | 0.0477 | | 0.3217 | 684 | 0.0169 | | 0.3222 | 685 | 0.0575 | | 0.3227 | 686 | 0.0413 | | 0.3231 | 687 | 0.0248 | | 0.3236 | 688 | 0.0178 | | 0.3241 | 689 | 0.0351 | | 0.3246 | 690 | 0.032 | | 0.3250 | 691 | 0.0252 | | 0.3255 | 692 | 0.0654 | | 0.3260 | 693 | 0.0091 | | 0.3264 | 694 | 0.0415 | | 0.3269 | 695 | 0.0341 | | 0.3274 | 696 | 0.0674 | | 0.3278 | 697 | 0.0411 | | 0.3283 | 698 | 0.0519 | | 0.3288 | 699 | 0.0163 | | 0.3293 | 700 | 0.0837 | | 0.3297 | 701 | 0.0352 | | 0.3302 | 702 | 0.0292 | | 0.3307 | 703 | 0.0207 | | 0.3311 | 704 | 0.0438 | | 0.3316 | 705 | 0.0175 | | 0.3321 | 706 | 0.0731 | | 0.3325 | 707 | 0.0852 | | 0.3330 | 708 | 0.0419 | | 0.3335 | 709 | 0.0402 | | 0.3340 | 710 | 0.0545 | | 0.3344 | 711 | 0.0291 | | 0.3349 | 712 | 0.0443 | | 0.3354 | 713 | 0.01 | | 0.3358 | 714 | 0.0478 | | 0.3363 | 715 | 0.0488 | | 0.3368 | 716 | 0.0384 | | 0.3373 | 717 | 0.0505 | | 0.3377 | 718 | 0.0402 | | 0.3382 | 719 | 0.0693 | | 0.3387 | 720 | 0.0433 | | 0.3391 | 721 | 0.0632 | | 0.3396 | 722 | 0.0285 | | 0.3401 | 723 | 0.0394 | | 0.3405 | 724 | 0.0701 | | 0.3410 | 725 | 0.069 | | 0.3415 | 726 | 0.0439 | | 0.3420 | 727 | 0.0171 | | 0.3424 | 728 | 0.0235 | | 0.3429 | 729 | 0.17 | | 0.3434 | 730 | 0.0684 | | 0.3438 | 731 | 0.0208 | | 0.3443 | 732 | 0.0691 | | 0.3448 | 733 | 0.0595 | | 0.3452 | 734 | 0.043 | | 0.3457 | 735 | 0.0287 | | 0.3462 | 736 | 0.0336 | | 0.3467 | 737 | 0.0505 | | 0.3471 | 738 | 0.0093 | | 0.3476 | 739 | 0.0369 | | 0.3481 | 740 | 0.0351 | | 0.3485 | 741 | 0.0284 | | 0.3490 | 742 | 0.0727 | | 0.3495 | 743 | 0.0344 | | 0.3500 | 744 | 0.03 | | 0.3504 | 745 | 0.0629 | | 0.3509 | 746 | 0.0241 | | 0.3514 | 747 | 0.0335 | | 0.3518 | 748 | 0.0342 | | 0.3523 | 749 | 0.0475 | | 0.3528 | 750 | 0.0285 | | 0.3532 | 751 | 0.0333 | | 0.3537 | 752 | 0.0283 | | 0.3542 | 753 | 0.0299 | | 0.3547 | 754 | 0.0321 | | 0.3551 | 755 | 0.006 | | 0.3556 | 756 | 0.0518 | | 0.3561 | 757 | 0.0271 | | 0.3565 | 758 | 0.0326 | | 0.3570 | 759 | 0.0173 | | 0.3575 | 760 | 0.0205 | | 0.3579 | 761 | 0.0127 | | 0.3584 | 762 | 0.0328 | | 0.3589 | 763 | 0.0367 | | 0.3594 | 764 | 0.035 | | 0.3598 | 765 | 0.0437 | | 0.3603 | 766 | 0.0229 | | 0.3608 | 767 | 0.0333 | | 0.3612 | 768 | 0.0291 | | 0.3617 | 769 | 0.1005 | | 0.3622 | 770 | 0.0395 | | 0.3627 | 771 | 0.0358 | | 0.3631 | 772 | 0.0325 | | 0.3636 | 773 | 0.0517 | | 0.3641 | 774 | 0.0719 | | 0.3645 | 775 | 0.0141 | | 0.3650 | 776 | 0.0399 | | 0.3655 | 777 | 0.0276 | | 0.3659 | 778 | 0.0244 | | 0.3664 | 779 | 0.0509 | | 0.3669 | 780 | 0.0416 | | 0.3674 | 781 | 0.0129 | | 0.3678 | 782 | 0.0413 | | 0.3683 | 783 | 0.0211 | | 0.3688 | 784 | 0.0485 | | 0.3692 | 785 | 0.0132 | | 0.3697 | 786 | 0.0506 | | 0.3702 | 787 | 0.0712 | | 0.3706 | 788 | 0.0087 | | 0.3711 | 789 | 0.0283 | | 0.3716 | 790 | 0.0321 | | 0.3721 | 791 | 0.0293 | | 0.3725 | 792 | 0.027 | | 0.3730 | 793 | 0.0251 | | 0.3735 | 794 | 
0.0536 |
| 0.3739 | 795 | 0.0577 |
| 0.3744 | 796 | 0.0472 |
| … | … | … |
| 1.6820 | 3576 | 0.0032 |
| 1.6825 | 3577 | 0.0043 |

*(Training loss per step for steps 795–3577, epochs 0.37–1.68: per-step values decline from roughly 0.03–0.06 near epoch 0.37 to roughly 0.005–0.01 by epoch 1.68.)*
| | 1.6830 | 3578 | 0.007 | | 1.6834 | 3579 | 0.0079 | | 1.6839 | 3580 | 0.0069 | | 1.6844 | 3581 | 0.0084 | | 1.6849 | 3582 | 0.0048 | | 1.6853 | 3583 | 0.0079 | | 1.6858 | 3584 | 0.0047 | | 1.6863 | 3585 | 0.007 | | 1.6867 | 3586 | 0.0076 | | 1.6872 | 3587 | 0.0059 | | 1.6877 | 3588 | 0.0121 | | 1.6881 | 3589 | 0.0046 | | 1.6886 | 3590 | 0.0095 | | 1.6891 | 3591 | 0.0087 | | 1.6896 | 3592 | 0.0059 | | 1.6900 | 3593 | 0.0044 | | 1.6905 | 3594 | 0.0055 | | 1.6910 | 3595 | 0.0029 | | 1.6914 | 3596 | 0.0093 | | 1.6919 | 3597 | 0.0091 | | 1.6924 | 3598 | 0.0134 | | 1.6929 | 3599 | 0.005 | | 1.6933 | 3600 | 0.0073 | | 1.6938 | 3601 | 0.004 | | 1.6943 | 3602 | 0.0033 | | 1.6947 | 3603 | 0.0103 | | 1.6952 | 3604 | 0.0145 | | 1.6957 | 3605 | 0.0121 | | 1.6961 | 3606 | 0.01 | | 1.6966 | 3607 | 0.0047 | | 1.6971 | 3608 | 0.0038 | | 1.6976 | 3609 | 0.0072 | | 1.6980 | 3610 | 0.011 | | 1.6985 | 3611 | 0.009 | | 1.6990 | 3612 | 0.0028 | | 1.6994 | 3613 | 0.0035 | | 1.6999 | 3614 | 0.0045 | | 1.7004 | 3615 | 0.0274 | | 1.7008 | 3616 | 0.0046 | | 1.7013 | 3617 | 0.0067 | | 1.7018 | 3618 | 0.0053 | | 1.7023 | 3619 | 0.0066 | | 1.7027 | 3620 | 0.0035 | | 1.7032 | 3621 | 0.0021 | | 1.7037 | 3622 | 0.0159 | | 1.7041 | 3623 | 0.0096 | | 1.7046 | 3624 | 0.0077 | | 1.7051 | 3625 | 0.0051 | | 1.7056 | 3626 | 0.0047 | | 1.7060 | 3627 | 0.0138 | | 1.7065 | 3628 | 0.0116 | | 1.7070 | 3629 | 0.0039 | | 1.7074 | 3630 | 0.0045 | | 1.7079 | 3631 | 0.0055 | | 1.7084 | 3632 | 0.006 | | 1.7088 | 3633 | 0.0028 | | 1.7093 | 3634 | 0.0166 | | 1.7098 | 3635 | 0.0088 | | 1.7103 | 3636 | 0.0057 | | 1.7107 | 3637 | 0.0033 | | 1.7112 | 3638 | 0.0102 | | 1.7117 | 3639 | 0.0058 | | 1.7121 | 3640 | 0.0076 | | 1.7126 | 3641 | 0.0075 | | 1.7131 | 3642 | 0.0041 | | 1.7135 | 3643 | 0.0074 | | 1.7140 | 3644 | 0.0025 | | 1.7145 | 3645 | 0.0042 | | 1.7150 | 3646 | 0.0036 | | 1.7154 | 3647 | 0.0049 | | 1.7159 | 3648 | 0.0085 | | 1.7164 | 3649 | 0.0105 | | 1.7168 | 3650 | 0.0051 | | 1.7173 | 3651 | 0.016 | | 1.7178 | 3652 | 0.012 | | 1.7183 | 3653 | 0.0045 | | 1.7187 | 3654 | 0.0085 | | 1.7192 | 3655 | 0.0034 | | 1.7197 | 3656 | 0.0037 | | 1.7201 | 3657 | 0.0084 | | 1.7206 | 3658 | 0.0052 | | 1.7211 | 3659 | 0.003 | | 1.7215 | 3660 | 0.0057 | | 1.7220 | 3661 | 0.0055 | | 1.7225 | 3662 | 0.0164 | | 1.7230 | 3663 | 0.004 | | 1.7234 | 3664 | 0.0053 | | 1.7239 | 3665 | 0.0034 | | 1.7244 | 3666 | 0.0067 | | 1.7248 | 3667 | 0.0038 | | 1.7253 | 3668 | 0.0047 | | 1.7258 | 3669 | 0.0028 | | 1.7262 | 3670 | 0.0061 | | 1.7267 | 3671 | 0.0053 | | 1.7272 | 3672 | 0.0093 | | 1.7277 | 3673 | 0.0041 | | 1.7281 | 3674 | 0.0047 | | 1.7286 | 3675 | 0.0161 | | 1.7291 | 3676 | 0.0037 | | 1.7295 | 3677 | 0.0104 | | 1.7300 | 3678 | 0.0042 | | 1.7305 | 3679 | 0.0052 | | 1.7310 | 3680 | 0.0056 | | 1.7314 | 3681 | 0.0032 | | 1.7319 | 3682 | 0.0056 | | 1.7324 | 3683 | 0.0045 | | 1.7328 | 3684 | 0.0106 | | 1.7333 | 3685 | 0.0099 | | 1.7338 | 3686 | 0.0054 | | 1.7342 | 3687 | 0.0064 | | 1.7347 | 3688 | 0.0025 | | 1.7352 | 3689 | 0.0061 | | 1.7357 | 3690 | 0.0052 | | 1.7361 | 3691 | 0.0066 | | 1.7366 | 3692 | 0.0107 | | 1.7371 | 3693 | 0.004 | | 1.7375 | 3694 | 0.005 | | 1.7380 | 3695 | 0.0039 | | 1.7385 | 3696 | 0.0025 | | 1.7389 | 3697 | 0.0022 | | 1.7394 | 3698 | 0.0339 | | 1.7399 | 3699 | 0.0095 | | 1.7404 | 3700 | 0.0069 | | 1.7408 | 3701 | 0.0108 | | 1.7413 | 3702 | 0.0035 | | 1.7418 | 3703 | 0.008 | | 1.7422 | 3704 | 0.0074 | | 1.7427 | 3705 | 0.0063 | | 1.7432 | 3706 | 0.0085 | | 1.7437 | 3707 | 0.0052 | | 1.7441 | 3708 | 0.0109 | | 1.7446 | 3709 | 0.0081 | | 
1.7451 | 3710 | 0.0046 | | 1.7455 | 3711 | 0.0045 | | 1.7460 | 3712 | 0.0019 | | 1.7465 | 3713 | 0.0076 | | 1.7469 | 3714 | 0.0029 | | 1.7474 | 3715 | 0.0171 | | 1.7479 | 3716 | 0.0066 | | 1.7484 | 3717 | 0.0137 | | 1.7488 | 3718 | 0.0036 | | 1.7493 | 3719 | 0.0105 | | 1.7498 | 3720 | 0.0146 | | 1.7502 | 3721 | 0.0092 | | 1.7507 | 3722 | 0.0026 | | 1.7512 | 3723 | 0.0058 | | 1.7516 | 3724 | 0.0027 | | 1.7521 | 3725 | 0.0027 | | 1.7526 | 3726 | 0.0075 | | 1.7531 | 3727 | 0.0029 | | 1.7535 | 3728 | 0.0059 | | 1.7540 | 3729 | 0.0041 | | 1.7545 | 3730 | 0.0044 | | 1.7549 | 3731 | 0.0045 | | 1.7554 | 3732 | 0.0049 | | 1.7559 | 3733 | 0.0117 | | 1.7563 | 3734 | 0.0037 | | 1.7568 | 3735 | 0.0034 | | 1.7573 | 3736 | 0.0061 | | 1.7578 | 3737 | 0.0157 | | 1.7582 | 3738 | 0.0042 | | 1.7587 | 3739 | 0.0041 | | 1.7592 | 3740 | 0.0095 | | 1.7596 | 3741 | 0.0024 | | 1.7601 | 3742 | 0.0173 | | 1.7606 | 3743 | 0.0039 | | 1.7611 | 3744 | 0.017 | | 1.7615 | 3745 | 0.0052 | | 1.7620 | 3746 | 0.0062 | | 1.7625 | 3747 | 0.0041 | | 1.7629 | 3748 | 0.0087 | | 1.7634 | 3749 | 0.0115 | | 1.7639 | 3750 | 0.0055 | | 1.7643 | 3751 | 0.0063 | | 1.7648 | 3752 | 0.005 | | 1.7653 | 3753 | 0.0072 | | 1.7658 | 3754 | 0.0046 | | 1.7662 | 3755 | 0.0047 | | 1.7667 | 3756 | 0.0054 | | 1.7672 | 3757 | 0.0065 | | 1.7676 | 3758 | 0.0044 | | 1.7681 | 3759 | 0.0043 | | 1.7686 | 3760 | 0.0113 | | 1.7690 | 3761 | 0.006 | | 1.7695 | 3762 | 0.0041 | | 1.7700 | 3763 | 0.0087 | | 1.7705 | 3764 | 0.0105 | | 1.7709 | 3765 | 0.0048 | | 1.7714 | 3766 | 0.0224 | | 1.7719 | 3767 | 0.008 | | 1.7723 | 3768 | 0.0045 | | 1.7728 | 3769 | 0.0049 | | 1.7733 | 3770 | 0.0066 | | 1.7738 | 3771 | 0.0095 | | 1.7742 | 3772 | 0.0088 | | 1.7747 | 3773 | 0.0024 | | 1.7752 | 3774 | 0.0031 | | 1.7756 | 3775 | 0.0056 | | 1.7761 | 3776 | 0.0022 | | 1.7766 | 3777 | 0.0061 | | 1.7770 | 3778 | 0.0057 | | 1.7775 | 3779 | 0.0045 | | 1.7780 | 3780 | 0.0065 | | 1.7785 | 3781 | 0.0091 | | 1.7789 | 3782 | 0.0063 | | 1.7794 | 3783 | 0.0068 | | 1.7799 | 3784 | 0.0034 | | 1.7803 | 3785 | 0.0114 | | 1.7808 | 3786 | 0.0322 | | 1.7813 | 3787 | 0.0041 | | 1.7817 | 3788 | 0.0026 | | 1.7822 | 3789 | 0.0134 | | 1.7827 | 3790 | 0.0049 | | 1.7832 | 3791 | 0.018 | | 1.7836 | 3792 | 0.0077 | | 1.7841 | 3793 | 0.0046 | | 1.7846 | 3794 | 0.0068 | | 1.7850 | 3795 | 0.0064 | | 1.7855 | 3796 | 0.0077 | | 1.7860 | 3797 | 0.0084 | | 1.7865 | 3798 | 0.0111 | | 1.7869 | 3799 | 0.0044 | | 1.7874 | 3800 | 0.004 | | 1.7879 | 3801 | 0.0043 | | 1.7883 | 3802 | 0.005 | | 1.7888 | 3803 | 0.0045 | | 1.7893 | 3804 | 0.0076 | | 1.7897 | 3805 | 0.007 | | 1.7902 | 3806 | 0.0198 | | 1.7907 | 3807 | 0.0039 | | 1.7912 | 3808 | 0.0069 | | 1.7916 | 3809 | 0.0053 | | 1.7921 | 3810 | 0.0041 | | 1.7926 | 3811 | 0.0122 | | 1.7930 | 3812 | 0.0051 | | 1.7935 | 3813 | 0.0037 | | 1.7940 | 3814 | 0.0034 | | 1.7944 | 3815 | 0.0245 | | 1.7949 | 3816 | 0.0034 | | 1.7954 | 3817 | 0.004 | | 1.7959 | 3818 | 0.0072 | | 1.7963 | 3819 | 0.0057 | | 1.7968 | 3820 | 0.0035 | | 1.7973 | 3821 | 0.0034 | | 1.7977 | 3822 | 0.0073 | | 1.7982 | 3823 | 0.0038 | | 1.7987 | 3824 | 0.0049 | | 1.7992 | 3825 | 0.0039 | | 1.7996 | 3826 | 0.0061 | | 1.8001 | 3827 | 0.0036 | | 1.8006 | 3828 | 0.0125 | | 1.8010 | 3829 | 0.0077 | | 1.8015 | 3830 | 0.0048 | | 1.8020 | 3831 | 0.0072 | | 1.8024 | 3832 | 0.0119 | | 1.8029 | 3833 | 0.0105 | | 1.8034 | 3834 | 0.0102 | | 1.8039 | 3835 | 0.0023 | | 1.8043 | 3836 | 0.0032 | | 1.8048 | 3837 | 0.0041 | | 1.8053 | 3838 | 0.0042 | | 1.8057 | 3839 | 0.0075 | | 1.8062 | 3840 | 0.0054 | | 1.8067 | 3841 | 0.0138 | | 
1.8071 | 3842 | 0.0106 | | 1.8076 | 3843 | 0.0048 | | 1.8081 | 3844 | 0.0059 | | 1.8086 | 3845 | 0.0075 | | 1.8090 | 3846 | 0.0047 | | 1.8095 | 3847 | 0.0045 | | 1.8100 | 3848 | 0.0066 | | 1.8104 | 3849 | 0.0094 | | 1.8109 | 3850 | 0.0034 | | 1.8114 | 3851 | 0.0045 | | 1.8119 | 3852 | 0.007 | | 1.8123 | 3853 | 0.0082 | | 1.8128 | 3854 | 0.0109 | | 1.8133 | 3855 | 0.0053 | | 1.8137 | 3856 | 0.003 | | 1.8142 | 3857 | 0.0055 | | 1.8147 | 3858 | 0.0031 | | 1.8151 | 3859 | 0.0046 | | 1.8156 | 3860 | 0.0119 | | 1.8161 | 3861 | 0.01 | | 1.8166 | 3862 | 0.0056 | | 1.8170 | 3863 | 0.0044 | | 1.8175 | 3864 | 0.0086 | | 1.8180 | 3865 | 0.0066 | | 1.8184 | 3866 | 0.0039 | | 1.8189 | 3867 | 0.0057 | | 1.8194 | 3868 | 0.0032 | | 1.8198 | 3869 | 0.0147 | | 1.8203 | 3870 | 0.0092 | | 1.8208 | 3871 | 0.0054 | | 1.8213 | 3872 | 0.0104 | | 1.8217 | 3873 | 0.025 | | 1.8222 | 3874 | 0.0067 | | 1.8227 | 3875 | 0.0063 | | 1.8231 | 3876 | 0.0031 | | 1.8236 | 3877 | 0.0053 | | 1.8241 | 3878 | 0.0106 | | 1.8246 | 3879 | 0.0055 | | 1.8250 | 3880 | 0.0019 | | 1.8255 | 3881 | 0.0062 | | 1.8260 | 3882 | 0.006 | | 1.8264 | 3883 | 0.0038 | | 1.8269 | 3884 | 0.0035 | | 1.8274 | 3885 | 0.0078 | | 1.8278 | 3886 | 0.0039 | | 1.8283 | 3887 | 0.0068 | | 1.8288 | 3888 | 0.0044 | | 1.8293 | 3889 | 0.0096 | | 1.8297 | 3890 | 0.0052 | | 1.8302 | 3891 | 0.0062 | | 1.8307 | 3892 | 0.0055 | | 1.8311 | 3893 | 0.0059 | | 1.8316 | 3894 | 0.0089 | | 1.8321 | 3895 | 0.0034 | | 1.8325 | 3896 | 0.0062 | | 1.8330 | 3897 | 0.009 | | 1.8335 | 3898 | 0.0071 | | 1.8340 | 3899 | 0.0099 | | 1.8344 | 3900 | 0.0038 | | 1.8349 | 3901 | 0.0032 | | 1.8354 | 3902 | 0.0141 | | 1.8358 | 3903 | 0.0062 | | 1.8363 | 3904 | 0.0064 | | 1.8368 | 3905 | 0.0052 | | 1.8373 | 3906 | 0.0054 | | 1.8377 | 3907 | 0.0031 | | 1.8382 | 3908 | 0.004 | | 1.8387 | 3909 | 0.0024 | | 1.8391 | 3910 | 0.0058 | | 1.8396 | 3911 | 0.0069 | | 1.8401 | 3912 | 0.0021 | | 1.8405 | 3913 | 0.0044 | | 1.8410 | 3914 | 0.0073 | | 1.8415 | 3915 | 0.0039 | | 1.8420 | 3916 | 0.0071 | | 1.8424 | 3917 | 0.0027 | | 1.8429 | 3918 | 0.0046 | | 1.8434 | 3919 | 0.0097 | | 1.8438 | 3920 | 0.0077 | | 1.8443 | 3921 | 0.005 | | 1.8448 | 3922 | 0.0046 | | 1.8452 | 3923 | 0.0114 | | 1.8457 | 3924 | 0.0039 | | 1.8462 | 3925 | 0.003 | | 1.8467 | 3926 | 0.0038 | | 1.8471 | 3927 | 0.0039 | | 1.8476 | 3928 | 0.0086 | | 1.8481 | 3929 | 0.0046 | | 1.8485 | 3930 | 0.0056 | | 1.8490 | 3931 | 0.0069 | | 1.8495 | 3932 | 0.0057 | | 1.8500 | 3933 | 0.0045 | | 1.8504 | 3934 | 0.0036 | | 1.8509 | 3935 | 0.0056 | | 1.8514 | 3936 | 0.004 | | 1.8518 | 3937 | 0.0025 | | 1.8523 | 3938 | 0.0049 | | 1.8528 | 3939 | 0.0028 | | 1.8532 | 3940 | 0.0067 | | 1.8537 | 3941 | 0.004 | | 1.8542 | 3942 | 0.0051 | | 1.8547 | 3943 | 0.0058 | | 1.8551 | 3944 | 0.0043 | | 1.8556 | 3945 | 0.0144 | | 1.8561 | 3946 | 0.002 | | 1.8565 | 3947 | 0.0123 | | 1.8570 | 3948 | 0.0039 | | 1.8575 | 3949 | 0.004 | | 1.8579 | 3950 | 0.003 | | 1.8584 | 3951 | 0.0046 | | 1.8589 | 3952 | 0.0084 | | 1.8594 | 3953 | 0.0099 | | 1.8598 | 3954 | 0.008 | | 1.8603 | 3955 | 0.0055 | | 1.8608 | 3956 | 0.0128 | | 1.8612 | 3957 | 0.0108 | | 1.8617 | 3958 | 0.006 | | 1.8622 | 3959 | 0.0046 | | 1.8627 | 3960 | 0.0034 | | 1.8631 | 3961 | 0.0045 | | 1.8636 | 3962 | 0.0132 | | 1.8641 | 3963 | 0.0082 | | 1.8645 | 3964 | 0.0043 | | 1.8650 | 3965 | 0.0059 | | 1.8655 | 3966 | 0.004 | | 1.8659 | 3967 | 0.0066 | | 1.8664 | 3968 | 0.0088 | | 1.8669 | 3969 | 0.0051 | | 1.8674 | 3970 | 0.009 | | 1.8678 | 3971 | 0.0031 | | 1.8683 | 3972 | 0.0074 | | 1.8688 | 3973 | 0.0165 | | 1.8692 | 
3974 | 0.0037 | | 1.8697 | 3975 | 0.0065 | | 1.8702 | 3976 | 0.0057 | | 1.8706 | 3977 | 0.0028 | | 1.8711 | 3978 | 0.0043 | | 1.8716 | 3979 | 0.0052 | | 1.8721 | 3980 | 0.0093 | | 1.8725 | 3981 | 0.012 | | 1.8730 | 3982 | 0.0043 | | 1.8735 | 3983 | 0.0066 | | 1.8739 | 3984 | 0.0086 | | 1.8744 | 3985 | 0.0213 | | 1.8749 | 3986 | 0.0072 | | 1.8754 | 3987 | 0.0016 | | 1.8758 | 3988 | 0.0024 | | 1.8763 | 3989 | 0.0039 | | 1.8768 | 3990 | 0.0054 | | 1.8772 | 3991 | 0.0034 | | 1.8777 | 3992 | 0.0061 | | 1.8782 | 3993 | 0.0046 | | 1.8786 | 3994 | 0.0053 | | 1.8791 | 3995 | 0.0042 | | 1.8796 | 3996 | 0.006 | | 1.8801 | 3997 | 0.0064 | | 1.8805 | 3998 | 0.0163 | | 1.8810 | 3999 | 0.0065 | | 1.8815 | 4000 | 0.0106 | | 1.8819 | 4001 | 0.0149 | | 1.8824 | 4002 | 0.006 | | 1.8829 | 4003 | 0.0051 | | 1.8833 | 4004 | 0.0026 | | 1.8838 | 4005 | 0.0104 | | 1.8843 | 4006 | 0.0055 | | 1.8848 | 4007 | 0.0119 | | 1.8852 | 4008 | 0.0042 | | 1.8857 | 4009 | 0.0061 | | 1.8862 | 4010 | 0.0089 | | 1.8866 | 4011 | 0.0025 | | 1.8871 | 4012 | 0.0049 | | 1.8876 | 4013 | 0.008 | | 1.8881 | 4014 | 0.0033 | | 1.8885 | 4015 | 0.0167 | | 1.8890 | 4016 | 0.004 | | 1.8895 | 4017 | 0.0092 | | 1.8899 | 4018 | 0.003 | | 1.8904 | 4019 | 0.005 | | 1.8909 | 4020 | 0.0033 | | 1.8913 | 4021 | 0.0047 | | 1.8918 | 4022 | 0.0057 | | 1.8923 | 4023 | 0.008 | | 1.8928 | 4024 | 0.005 | | 1.8932 | 4025 | 0.01 | | 1.8937 | 4026 | 0.0071 | | 1.8942 | 4027 | 0.0049 | | 1.8946 | 4028 | 0.0097 | | 1.8951 | 4029 | 0.0041 | | 1.8956 | 4030 | 0.015 | | 1.8960 | 4031 | 0.0109 | | 1.8965 | 4032 | 0.0128 | | 1.8970 | 4033 | 0.0047 | | 1.8975 | 4034 | 0.0061 | | 1.8979 | 4035 | 0.0046 | | 1.8984 | 4036 | 0.0073 | | 1.8989 | 4037 | 0.0086 | | 1.8993 | 4038 | 0.005 | | 1.8998 | 4039 | 0.0076 | | 1.9003 | 4040 | 0.0128 | | 1.9008 | 4041 | 0.0059 | | 1.9012 | 4042 | 0.0044 | | 1.9017 | 4043 | 0.0048 | | 1.9022 | 4044 | 0.0061 | | 1.9026 | 4045 | 0.0031 | | 1.9031 | 4046 | 0.0045 | | 1.9036 | 4047 | 0.0054 | | 1.9040 | 4048 | 0.0115 | | 1.9045 | 4049 | 0.0046 | | 1.9050 | 4050 | 0.0193 | | 1.9055 | 4051 | 0.008 | | 1.9059 | 4052 | 0.0045 | | 1.9064 | 4053 | 0.0054 | | 1.9069 | 4054 | 0.0056 | | 1.9073 | 4055 | 0.0068 | | 1.9078 | 4056 | 0.0195 | | 1.9083 | 4057 | 0.0067 | | 1.9087 | 4058 | 0.0056 | | 1.9092 | 4059 | 0.0044 | | 1.9097 | 4060 | 0.0086 | | 1.9102 | 4061 | 0.0028 | | 1.9106 | 4062 | 0.0064 | | 1.9111 | 4063 | 0.0039 | | 1.9116 | 4064 | 0.0057 | | 1.9120 | 4065 | 0.0043 | | 1.9125 | 4066 | 0.0076 | | 1.9130 | 4067 | 0.0121 | | 1.9135 | 4068 | 0.0022 | | 1.9139 | 4069 | 0.0064 | | 1.9144 | 4070 | 0.0063 | | 1.9149 | 4071 | 0.0036 | | 1.9153 | 4072 | 0.0082 | | 1.9158 | 4073 | 0.0045 | | 1.9163 | 4074 | 0.0052 | | 1.9167 | 4075 | 0.0027 | | 1.9172 | 4076 | 0.0037 | | 1.9177 | 4077 | 0.0023 | | 1.9182 | 4078 | 0.0085 | | 1.9186 | 4079 | 0.0066 | | 1.9191 | 4080 | 0.0026 | | 1.9196 | 4081 | 0.0024 | | 1.9200 | 4082 | 0.0056 | | 1.9205 | 4083 | 0.0067 | | 1.9210 | 4084 | 0.004 | | 1.9214 | 4085 | 0.0048 | | 1.9219 | 4086 | 0.0048 | | 1.9224 | 4087 | 0.0043 | | 1.9229 | 4088 | 0.0048 | | 1.9233 | 4089 | 0.0053 | | 1.9238 | 4090 | 0.0082 | | 1.9243 | 4091 | 0.0048 | | 1.9247 | 4092 | 0.0088 | | 1.9252 | 4093 | 0.0055 | | 1.9257 | 4094 | 0.0056 | | 1.9262 | 4095 | 0.0039 | | 1.9266 | 4096 | 0.0056 | | 1.9271 | 4097 | 0.0184 | | 1.9276 | 4098 | 0.004 | | 1.9280 | 4099 | 0.0093 | | 1.9285 | 4100 | 0.0029 | | 1.9290 | 4101 | 0.0041 | | 1.9294 | 4102 | 0.0074 | | 1.9299 | 4103 | 0.0097 | | 1.9304 | 4104 | 0.004 | | 1.9309 | 4105 | 0.0043 | | 1.9313 | 4106 | 
0.0075 | | 1.9318 | 4107 | 0.0025 | | 1.9323 | 4108 | 0.0136 | | 1.9327 | 4109 | 0.0075 | | 1.9332 | 4110 | 0.0071 | | 1.9337 | 4111 | 0.0235 | | 1.9341 | 4112 | 0.0036 | | 1.9346 | 4113 | 0.0035 | | 1.9351 | 4114 | 0.0051 | | 1.9356 | 4115 | 0.0045 | | 1.9360 | 4116 | 0.0035 | | 1.9365 | 4117 | 0.0043 | | 1.9370 | 4118 | 0.0089 | | 1.9374 | 4119 | 0.0045 | | 1.9379 | 4120 | 0.0139 | | 1.9384 | 4121 | 0.0032 | | 1.9389 | 4122 | 0.0036 | | 1.9393 | 4123 | 0.0056 | | 1.9398 | 4124 | 0.0096 | | 1.9403 | 4125 | 0.0524 | | 1.9407 | 4126 | 0.0159 | | 1.9412 | 4127 | 0.0032 | | 1.9417 | 4128 | 0.0039 | | 1.9421 | 4129 | 0.0074 | | 1.9426 | 4130 | 0.0027 | | 1.9431 | 4131 | 0.0047 | | 1.9436 | 4132 | 0.0027 | | 1.9440 | 4133 | 0.0047 | | 1.9445 | 4134 | 0.0056 | | 1.9450 | 4135 | 0.003 | | 1.9454 | 4136 | 0.0047 | | 1.9459 | 4137 | 0.0038 | | 1.9464 | 4138 | 0.0035 | | 1.9468 | 4139 | 0.014 | | 1.9473 | 4140 | 0.0035 | | 1.9478 | 4141 | 0.0039 | | 1.9483 | 4142 | 0.006 | | 1.9487 | 4143 | 0.005 | | 1.9492 | 4144 | 0.0021 | | 1.9497 | 4145 | 0.0055 | | 1.9501 | 4146 | 0.0029 | | 1.9506 | 4147 | 0.0022 | | 1.9511 | 4148 | 0.0115 | | 1.9516 | 4149 | 0.0053 | | 1.9520 | 4150 | 0.0131 | | 1.9525 | 4151 | 0.0039 | | 1.9530 | 4152 | 0.003 | | 1.9534 | 4153 | 0.0143 | | 1.9539 | 4154 | 0.0034 | | 1.9544 | 4155 | 0.0019 | | 1.9548 | 4156 | 0.0082 | | 1.9553 | 4157 | 0.0088 | | 1.9558 | 4158 | 0.0041 | | 1.9563 | 4159 | 0.0042 | | 1.9567 | 4160 | 0.0049 | | 1.9572 | 4161 | 0.018 | | 1.9577 | 4162 | 0.0035 | | 1.9581 | 4163 | 0.0118 | | 1.9586 | 4164 | 0.0042 | | 1.9591 | 4165 | 0.0071 | | 1.9595 | 4166 | 0.0065 | | 1.9600 | 4167 | 0.0039 | | 1.9605 | 4168 | 0.007 | | 1.9610 | 4169 | 0.0029 | | 1.9614 | 4170 | 0.0069 | | 1.9619 | 4171 | 0.0091 | | 1.9624 | 4172 | 0.0022 | | 1.9628 | 4173 | 0.0063 | | 1.9633 | 4174 | 0.0055 | | 1.9638 | 4175 | 0.0081 | | 1.9643 | 4176 | 0.0036 | | 1.9647 | 4177 | 0.0022 | | 1.9652 | 4178 | 0.0039 | | 1.9657 | 4179 | 0.0026 | | 1.9661 | 4180 | 0.0079 | | 1.9666 | 4181 | 0.007 | | 1.9671 | 4182 | 0.0033 | | 1.9675 | 4183 | 0.0035 | | 1.9680 | 4184 | 0.0043 | | 1.9685 | 4185 | 0.0038 | | 1.9690 | 4186 | 0.0238 | | 1.9694 | 4187 | 0.0035 | | 1.9699 | 4188 | 0.0111 | | 1.9704 | 4189 | 0.005 | | 1.9708 | 4190 | 0.0035 | | 1.9713 | 4191 | 0.0062 | | 1.9718 | 4192 | 0.0082 | | 1.9722 | 4193 | 0.0043 | | 1.9727 | 4194 | 0.0053 | | 1.9732 | 4195 | 0.0039 | | 1.9737 | 4196 | 0.0099 | | 1.9741 | 4197 | 0.0032 | | 1.9746 | 4198 | 0.0062 | | 1.9751 | 4199 | 0.0032 | | 1.9755 | 4200 | 0.0066 | | 1.9760 | 4201 | 0.0117 | | 1.9765 | 4202 | 0.0033 | | 1.9770 | 4203 | 0.0044 | | 1.9774 | 4204 | 0.0046 | | 1.9779 | 4205 | 0.0069 | | 1.9784 | 4206 | 0.0086 | | 1.9788 | 4207 | 0.0045 | | 1.9793 | 4208 | 0.0122 | | 1.9798 | 4209 | 0.0043 | | 1.9802 | 4210 | 0.0065 | | 1.9807 | 4211 | 0.0057 | | 1.9812 | 4212 | 0.0047 | | 1.9817 | 4213 | 0.0021 | | 1.9821 | 4214 | 0.0035 | | 1.9826 | 4215 | 0.0085 | | 1.9831 | 4216 | 0.0095 | | 1.9835 | 4217 | 0.0071 | | 1.9840 | 4218 | 0.0034 | | 1.9845 | 4219 | 0.0031 | | 1.9849 | 4220 | 0.0072 | | 1.9854 | 4221 | 0.004 | | 1.9859 | 4222 | 0.0035 | | 1.9864 | 4223 | 0.0047 | | 1.9868 | 4224 | 0.0039 | | 1.9873 | 4225 | 0.0032 | | 1.9878 | 4226 | 0.0056 | | 1.9882 | 4227 | 0.0026 | | 1.9887 | 4228 | 0.0082 | | 1.9892 | 4229 | 0.0053 | | 1.9897 | 4230 | 0.0054 | | 1.9901 | 4231 | 0.003 | | 1.9906 | 4232 | 0.0125 | | 1.9911 | 4233 | 0.0044 | | 1.9915 | 4234 | 0.0051 | | 1.9920 | 4235 | 0.002 | | 1.9925 | 4236 | 0.0036 | | 1.9929 | 4237 | 0.0081 | | 1.9934 | 4238 | 
0.0035 | | 1.9939 | 4239 | 0.0087 | | 1.9944 | 4240 | 0.0048 | | 1.9948 | 4241 | 0.0204 | | 1.9953 | 4242 | 0.0032 | | 1.9958 | 4243 | 0.0078 | | 1.9962 | 4244 | 0.0065 | | 1.9967 | 4245 | 0.0058 | | 1.9972 | 4246 | 0.0025 | | 1.9976 | 4247 | 0.0042 | | 1.9981 | 4248 | 0.0034 | | 1.9986 | 4249 | 0.0085 | | 1.9991 | 4250 | 0.0075 | | 1.9995 | 4251 | 0.0038 | | 2.0 | 4252 | 0.0127 | | 2.0005 | 4253 | 0.0042 | | 2.0009 | 4254 | 0.0082 | | 2.0014 | 4255 | 0.0059 | | 2.0019 | 4256 | 0.0061 | | 2.0024 | 4257 | 0.0121 | | 2.0028 | 4258 | 0.0036 | | 2.0033 | 4259 | 0.0027 | | 2.0038 | 4260 | 0.0047 | | 2.0042 | 4261 | 0.0106 | | 2.0047 | 4262 | 0.0051 | | 2.0052 | 4263 | 0.0045 | | 2.0056 | 4264 | 0.0126 | | 2.0061 | 4265 | 0.0067 | | 2.0066 | 4266 | 0.011 | | 2.0071 | 4267 | 0.0057 | | 2.0075 | 4268 | 0.005 | | 2.0080 | 4269 | 0.0055 | | 2.0085 | 4270 | 0.0075 | | 2.0089 | 4271 | 0.0077 | | 2.0094 | 4272 | 0.0055 | | 2.0099 | 4273 | 0.0078 | | 2.0103 | 4274 | 0.0063 | | 2.0108 | 4275 | 0.0174 | | 2.0113 | 4276 | 0.007 | | 2.0118 | 4277 | 0.0043 | | 2.0122 | 4278 | 0.0028 | | 2.0127 | 4279 | 0.0119 | | 2.0132 | 4280 | 0.0067 | | 2.0136 | 4281 | 0.0029 | | 2.0141 | 4282 | 0.005 | | 2.0146 | 4283 | 0.0047 | | 2.0151 | 4284 | 0.0022 | | 2.0155 | 4285 | 0.0102 | | 2.0160 | 4286 | 0.0029 | | 2.0165 | 4287 | 0.0182 | | 2.0169 | 4288 | 0.018 | | 2.0174 | 4289 | 0.0046 | | 2.0179 | 4290 | 0.0027 | | 2.0183 | 4291 | 0.0021 | | 2.0188 | 4292 | 0.0047 | | 2.0193 | 4293 | 0.0041 | | 2.0198 | 4294 | 0.0075 | | 2.0202 | 4295 | 0.0061 | | 2.0207 | 4296 | 0.007 | | 2.0212 | 4297 | 0.0017 | | 2.0216 | 4298 | 0.0075 | | 2.0221 | 4299 | 0.0041 | | 2.0226 | 4300 | 0.0063 | | 2.0230 | 4301 | 0.008 | | 2.0235 | 4302 | 0.0041 | | 2.0240 | 4303 | 0.0045 | | 2.0245 | 4304 | 0.0076 | | 2.0249 | 4305 | 0.0044 | | 2.0254 | 4306 | 0.0017 | | 2.0259 | 4307 | 0.0042 | | 2.0263 | 4308 | 0.0145 | | 2.0268 | 4309 | 0.0046 | | 2.0273 | 4310 | 0.004 | | 2.0278 | 4311 | 0.007 | | 2.0282 | 4312 | 0.0025 | | 2.0287 | 4313 | 0.0069 | | 2.0292 | 4314 | 0.0065 | | 2.0296 | 4315 | 0.0129 | | 2.0301 | 4316 | 0.0072 | | 2.0306 | 4317 | 0.0031 | | 2.0310 | 4318 | 0.0051 | | 2.0315 | 4319 | 0.0104 | | 2.0320 | 4320 | 0.0047 | | 2.0325 | 4321 | 0.0052 | | 2.0329 | 4322 | 0.0089 | | 2.0334 | 4323 | 0.0068 | | 2.0339 | 4324 | 0.0139 | | 2.0343 | 4325 | 0.0095 | | 2.0348 | 4326 | 0.0019 | | 2.0353 | 4327 | 0.0042 | | 2.0357 | 4328 | 0.007 | | 2.0362 | 4329 | 0.008 | | 2.0367 | 4330 | 0.0032 | | 2.0372 | 4331 | 0.0044 | | 2.0376 | 4332 | 0.0024 | | 2.0381 | 4333 | 0.0069 | | 2.0386 | 4334 | 0.0174 | | 2.0390 | 4335 | 0.0064 | | 2.0395 | 4336 | 0.0109 | | 2.0400 | 4337 | 0.003 | | 2.0405 | 4338 | 0.0073 | | 2.0409 | 4339 | 0.0052 | | 2.0414 | 4340 | 0.0052 | | 2.0419 | 4341 | 0.0056 | | 2.0423 | 4342 | 0.004 | | 2.0428 | 4343 | 0.0043 | | 2.0433 | 4344 | 0.0053 | | 2.0437 | 4345 | 0.0062 | | 2.0442 | 4346 | 0.0045 | | 2.0447 | 4347 | 0.0035 | | 2.0452 | 4348 | 0.0043 | | 2.0456 | 4349 | 0.0074 | | 2.0461 | 4350 | 0.0036 | | 2.0466 | 4351 | 0.0099 | | 2.0470 | 4352 | 0.004 | | 2.0475 | 4353 | 0.0081 | | 2.0480 | 4354 | 0.0035 | | 2.0484 | 4355 | 0.003 | | 2.0489 | 4356 | 0.0017 | | 2.0494 | 4357 | 0.0025 | | 2.0499 | 4358 | 0.0055 | | 2.0503 | 4359 | 0.0021 | | 2.0508 | 4360 | 0.0039 | | 2.0513 | 4361 | 0.0045 | | 2.0517 | 4362 | 0.0031 | | 2.0522 | 4363 | 0.0026 | | 2.0527 | 4364 | 0.0032 | | 2.0532 | 4365 | 0.0064 | | 2.0536 | 4366 | 0.0076 | | 2.0541 | 4367 | 0.0025 | | 2.0546 | 4368 | 0.0067 | | 2.0550 | 4369 | 0.0049 | | 2.0555 | 4370 | 0.0061 | 
| 2.0560 | 4371 | 0.0021 | | 2.0564 | 4372 | 0.0069 | | 2.0569 | 4373 | 0.0033 | | 2.0574 | 4374 | 0.0033 | | 2.0579 | 4375 | 0.0114 | | 2.0583 | 4376 | 0.0036 | | 2.0588 | 4377 | 0.0047 | | 2.0593 | 4378 | 0.0047 | | 2.0597 | 4379 | 0.0026 | | 2.0602 | 4380 | 0.0039 | | 2.0607 | 4381 | 0.0054 | | 2.0611 | 4382 | 0.0103 | | 2.0616 | 4383 | 0.0055 | | 2.0621 | 4384 | 0.0026 | | 2.0626 | 4385 | 0.0036 | | 2.0630 | 4386 | 0.0043 | | 2.0635 | 4387 | 0.004 | | 2.0640 | 4388 | 0.0032 | | 2.0644 | 4389 | 0.0045 | | 2.0649 | 4390 | 0.0169 | | 2.0654 | 4391 | 0.0027 | | 2.0659 | 4392 | 0.0052 | | 2.0663 | 4393 | 0.0033 | | 2.0668 | 4394 | 0.005 | | 2.0673 | 4395 | 0.0028 | | 2.0677 | 4396 | 0.0046 | | 2.0682 | 4397 | 0.0069 | | 2.0687 | 4398 | 0.0067 | | 2.0691 | 4399 | 0.0098 | | 2.0696 | 4400 | 0.0036 | | 2.0701 | 4401 | 0.006 | | 2.0706 | 4402 | 0.0033 | | 2.0710 | 4403 | 0.0036 | | 2.0715 | 4404 | 0.0043 | | 2.0720 | 4405 | 0.0036 | | 2.0724 | 4406 | 0.0064 | | 2.0729 | 4407 | 0.0073 | | 2.0734 | 4408 | 0.0036 | | 2.0738 | 4409 | 0.0023 | | 2.0743 | 4410 | 0.0031 | | 2.0748 | 4411 | 0.0058 | | 2.0753 | 4412 | 0.0025 | | 2.0757 | 4413 | 0.0045 | | 2.0762 | 4414 | 0.006 | | 2.0767 | 4415 | 0.0229 | | 2.0771 | 4416 | 0.012 | | 2.0776 | 4417 | 0.0025 | | 2.0781 | 4418 | 0.0029 | | 2.0786 | 4419 | 0.0038 | | 2.0790 | 4420 | 0.0101 | | 2.0795 | 4421 | 0.0064 | | 2.0800 | 4422 | 0.003 | | 2.0804 | 4423 | 0.0066 | | 2.0809 | 4424 | 0.0059 | | 2.0814 | 4425 | 0.0039 | | 2.0818 | 4426 | 0.005 | | 2.0823 | 4427 | 0.008 | | 2.0828 | 4428 | 0.0118 | | 2.0833 | 4429 | 0.0066 | | 2.0837 | 4430 | 0.0035 | | 2.0842 | 4431 | 0.005 | | 2.0847 | 4432 | 0.0032 | | 2.0851 | 4433 | 0.0023 | | 2.0856 | 4434 | 0.0046 | | 2.0861 | 4435 | 0.012 | | 2.0865 | 4436 | 0.0093 | | 2.0870 | 4437 | 0.0046 | | 2.0875 | 4438 | 0.002 | | 2.0880 | 4439 | 0.0042 | | 2.0884 | 4440 | 0.0035 | | 2.0889 | 4441 | 0.0031 | | 2.0894 | 4442 | 0.0068 | | 2.0898 | 4443 | 0.0026 | | 2.0903 | 4444 | 0.0062 | | 2.0908 | 4445 | 0.01 | | 2.0913 | 4446 | 0.0036 | | 2.0917 | 4447 | 0.0034 | | 2.0922 | 4448 | 0.0083 | | 2.0927 | 4449 | 0.0027 | | 2.0931 | 4450 | 0.0027 | | 2.0936 | 4451 | 0.0053 | | 2.0941 | 4452 | 0.0052 | | 2.0945 | 4453 | 0.0039 | | 2.0950 | 4454 | 0.0057 | | 2.0955 | 4455 | 0.0059 | | 2.0960 | 4456 | 0.0051 | | 2.0964 | 4457 | 0.0047 | | 2.0969 | 4458 | 0.0047 | | 2.0974 | 4459 | 0.0042 | | 2.0978 | 4460 | 0.0037 | | 2.0983 | 4461 | 0.0038 | | 2.0988 | 4462 | 0.0032 | | 2.0992 | 4463 | 0.0064 | | 2.0997 | 4464 | 0.0026 | | 2.1002 | 4465 | 0.0039 | | 2.1007 | 4466 | 0.0051 | | 2.1011 | 4467 | 0.0043 | | 2.1016 | 4468 | 0.007 | | 2.1021 | 4469 | 0.0038 | | 2.1025 | 4470 | 0.0023 | | 2.1030 | 4471 | 0.0079 | | 2.1035 | 4472 | 0.0058 | | 2.1040 | 4473 | 0.0032 | | 2.1044 | 4474 | 0.0076 | | 2.1049 | 4475 | 0.0062 | | 2.1054 | 4476 | 0.0036 | | 2.1058 | 4477 | 0.0035 | | 2.1063 | 4478 | 0.0042 | | 2.1068 | 4479 | 0.0038 | | 2.1072 | 4480 | 0.0018 | | 2.1077 | 4481 | 0.0059 | | 2.1082 | 4482 | 0.0059 | | 2.1087 | 4483 | 0.0058 | | 2.1091 | 4484 | 0.0059 | | 2.1096 | 4485 | 0.0047 | | 2.1101 | 4486 | 0.0043 | | 2.1105 | 4487 | 0.0066 | | 2.1110 | 4488 | 0.0083 | | 2.1115 | 4489 | 0.0052 | | 2.1119 | 4490 | 0.0035 | | 2.1124 | 4491 | 0.0034 | | 2.1129 | 4492 | 0.0016 | | 2.1134 | 4493 | 0.0064 | | 2.1138 | 4494 | 0.0044 | | 2.1143 | 4495 | 0.0037 | | 2.1148 | 4496 | 0.0022 | | 2.1152 | 4497 | 0.0038 | | 2.1157 | 4498 | 0.0033 | | 2.1162 | 4499 | 0.0029 | | 2.1167 | 4500 | 0.003 | | 2.1171 | 4501 | 0.0118 | | 2.1176 | 4502 | 0.0041 | | 
2.1181 | 4503 | 0.0059 | | 2.1185 | 4504 | 0.0057 | | 2.1190 | 4505 | 0.0028 | | 2.1195 | 4506 | 0.008 | | 2.1199 | 4507 | 0.0046 | | 2.1204 | 4508 | 0.0039 | | 2.1209 | 4509 | 0.0045 | | 2.1214 | 4510 | 0.0038 | | 2.1218 | 4511 | 0.0051 | | 2.1223 | 4512 | 0.0039 | | 2.1228 | 4513 | 0.0024 | | 2.1232 | 4514 | 0.0027 | | 2.1237 | 4515 | 0.0033 | | 2.1242 | 4516 | 0.0029 | | 2.1246 | 4517 | 0.0053 | | 2.1251 | 4518 | 0.0053 | | 2.1256 | 4519 | 0.0048 | | 2.1261 | 4520 | 0.0054 | | 2.1265 | 4521 | 0.0052 | | 2.1270 | 4522 | 0.0078 | | 2.1275 | 4523 | 0.0032 | | 2.1279 | 4524 | 0.0056 | | 2.1284 | 4525 | 0.0015 | | 2.1289 | 4526 | 0.0028 | | 2.1294 | 4527 | 0.0028 | | 2.1298 | 4528 | 0.0028 | | 2.1303 | 4529 | 0.0016 | | 2.1308 | 4530 | 0.0236 | | 2.1312 | 4531 | 0.0056 | | 2.1317 | 4532 | 0.0068 | | 2.1322 | 4533 | 0.0023 | | 2.1326 | 4534 | 0.0056 | | 2.1331 | 4535 | 0.003 | | 2.1336 | 4536 | 0.003 | | 2.1341 | 4537 | 0.0021 | | 2.1345 | 4538 | 0.0033 | | 2.1350 | 4539 | 0.0042 | | 2.1355 | 4540 | 0.0039 | | 2.1359 | 4541 | 0.0041 | | 2.1364 | 4542 | 0.0027 | | 2.1369 | 4543 | 0.0055 | | 2.1373 | 4544 | 0.0017 | | 2.1378 | 4545 | 0.0067 | | 2.1383 | 4546 | 0.0052 | | 2.1388 | 4547 | 0.0033 | | 2.1392 | 4548 | 0.0065 | | 2.1397 | 4549 | 0.0081 | | 2.1402 | 4550 | 0.0049 | | 2.1406 | 4551 | 0.0051 | | 2.1411 | 4552 | 0.0046 | | 2.1416 | 4553 | 0.0028 | | 2.1421 | 4554 | 0.0036 | | 2.1425 | 4555 | 0.0025 | | 2.1430 | 4556 | 0.0035 | | 2.1435 | 4557 | 0.0029 | | 2.1439 | 4558 | 0.0082 | | 2.1444 | 4559 | 0.0061 | | 2.1449 | 4560 | 0.0027 | | 2.1453 | 4561 | 0.0027 | | 2.1458 | 4562 | 0.0076 | | 2.1463 | 4563 | 0.0063 | | 2.1468 | 4564 | 0.0229 | | 2.1472 | 4565 | 0.0026 | | 2.1477 | 4566 | 0.0054 | | 2.1482 | 4567 | 0.0062 | | 2.1486 | 4568 | 0.012 | | 2.1491 | 4569 | 0.0042 | | 2.1496 | 4570 | 0.0033 | | 2.1500 | 4571 | 0.0024 | | 2.1505 | 4572 | 0.0034 | | 2.1510 | 4573 | 0.0028 | | 2.1515 | 4574 | 0.0025 | | 2.1519 | 4575 | 0.0019 | | 2.1524 | 4576 | 0.005 | | 2.1529 | 4577 | 0.0046 | | 2.1533 | 4578 | 0.0045 | | 2.1538 | 4579 | 0.0076 | | 2.1543 | 4580 | 0.0049 | | 2.1548 | 4581 | 0.0034 | | 2.1552 | 4582 | 0.0072 | | 2.1557 | 4583 | 0.0035 | | 2.1562 | 4584 | 0.0044 | | 2.1566 | 4585 | 0.0121 | | 2.1571 | 4586 | 0.0039 | | 2.1576 | 4587 | 0.0035 | | 2.1580 | 4588 | 0.0031 | | 2.1585 | 4589 | 0.0048 | | 2.1590 | 4590 | 0.0026 | | 2.1595 | 4591 | 0.0045 | | 2.1599 | 4592 | 0.007 | | 2.1604 | 4593 | 0.0036 | | 2.1609 | 4594 | 0.0048 | | 2.1613 | 4595 | 0.0036 | | 2.1618 | 4596 | 0.0022 | | 2.1623 | 4597 | 0.0057 | | 2.1627 | 4598 | 0.0028 | | 2.1632 | 4599 | 0.0022 | | 2.1637 | 4600 | 0.0014 | | 2.1642 | 4601 | 0.0018 | | 2.1646 | 4602 | 0.0017 | | 2.1651 | 4603 | 0.0033 | | 2.1656 | 4604 | 0.0031 | | 2.1660 | 4605 | 0.0021 | | 2.1665 | 4606 | 0.0028 | | 2.1670 | 4607 | 0.0046 | | 2.1675 | 4608 | 0.005 | | 2.1679 | 4609 | 0.0048 | | 2.1684 | 4610 | 0.004 | | 2.1689 | 4611 | 0.0027 | | 2.1693 | 4612 | 0.0038 | | 2.1698 | 4613 | 0.0036 | | 2.1703 | 4614 | 0.004 | | 2.1707 | 4615 | 0.0095 | | 2.1712 | 4616 | 0.0018 | | 2.1717 | 4617 | 0.0036 | | 2.1722 | 4618 | 0.0033 | | 2.1726 | 4619 | 0.0038 | | 2.1731 | 4620 | 0.0023 | | 2.1736 | 4621 | 0.0019 | | 2.1740 | 4622 | 0.0042 | | 2.1745 | 4623 | 0.0049 | | 2.1750 | 4624 | 0.0039 | | 2.1754 | 4625 | 0.0046 | | 2.1759 | 4626 | 0.0048 | | 2.1764 | 4627 | 0.0052 | | 2.1769 | 4628 | 0.0014 | | 2.1773 | 4629 | 0.0044 | | 2.1778 | 4630 | 0.0052 | | 2.1783 | 4631 | 0.0048 | | 2.1787 | 4632 | 0.0028 | | 2.1792 | 4633 | 0.004 | | 2.1797 | 4634 | 0.0067 | | 
2.1802 | 4635 | 0.0038 | | 2.1806 | 4636 | 0.0016 | | 2.1811 | 4637 | 0.0016 | | 2.1816 | 4638 | 0.0025 | | 2.1820 | 4639 | 0.005 | | 2.1825 | 4640 | 0.0062 | | 2.1830 | 4641 | 0.0029 | | 2.1834 | 4642 | 0.0025 | | 2.1839 | 4643 | 0.0035 | | 2.1844 | 4644 | 0.006 | | 2.1849 | 4645 | 0.0041 | | 2.1853 | 4646 | 0.0051 | | 2.1858 | 4647 | 0.0018 | | 2.1863 | 4648 | 0.0017 | | 2.1867 | 4649 | 0.0051 | | 2.1872 | 4650 | 0.0021 | | 2.1877 | 4651 | 0.0043 | | 2.1881 | 4652 | 0.004 | | 2.1886 | 4653 | 0.0031 | | 2.1891 | 4654 | 0.0066 | | 2.1896 | 4655 | 0.0027 | | 2.1900 | 4656 | 0.0029 | | 2.1905 | 4657 | 0.0034 | | 2.1910 | 4658 | 0.0047 | | 2.1914 | 4659 | 0.0028 | | 2.1919 | 4660 | 0.0044 | | 2.1924 | 4661 | 0.0057 | | 2.1929 | 4662 | 0.0022 | | 2.1933 | 4663 | 0.0018 | | 2.1938 | 4664 | 0.0046 | | 2.1943 | 4665 | 0.0049 | | 2.1947 | 4666 | 0.0051 | | 2.1952 | 4667 | 0.0043 | | 2.1957 | 4668 | 0.0042 | | 2.1961 | 4669 | 0.0059 | | 2.1966 | 4670 | 0.0034 | | 2.1971 | 4671 | 0.0042 | | 2.1976 | 4672 | 0.0033 | | 2.1980 | 4673 | 0.0136 | | 2.1985 | 4674 | 0.0042 | | 2.1990 | 4675 | 0.0033 | | 2.1994 | 4676 | 0.0037 | | 2.1999 | 4677 | 0.0048 | | 2.2004 | 4678 | 0.0035 | | 2.2008 | 4679 | 0.0035 | | 2.2013 | 4680 | 0.0075 | | 2.2018 | 4681 | 0.0014 | | 2.2023 | 4682 | 0.002 | | 2.2027 | 4683 | 0.0044 | | 2.2032 | 4684 | 0.0058 | | 2.2037 | 4685 | 0.0035 | | 2.2041 | 4686 | 0.0041 | | 2.2046 | 4687 | 0.0075 | | 2.2051 | 4688 | 0.0052 | | 2.2056 | 4689 | 0.0036 | | 2.2060 | 4690 | 0.0054 | | 2.2065 | 4691 | 0.0021 | | 2.2070 | 4692 | 0.0033 | | 2.2074 | 4693 | 0.0085 | | 2.2079 | 4694 | 0.0014 | | 2.2084 | 4695 | 0.0028 | | 2.2088 | 4696 | 0.0039 | | 2.2093 | 4697 | 0.0038 | | 2.2098 | 4698 | 0.0032 | | 2.2103 | 4699 | 0.0031 | | 2.2107 | 4700 | 0.0033 | | 2.2112 | 4701 | 0.004 | | 2.2117 | 4702 | 0.0036 | | 2.2121 | 4703 | 0.0032 | | 2.2126 | 4704 | 0.0026 | | 2.2131 | 4705 | 0.0037 | | 2.2135 | 4706 | 0.0056 | | 2.2140 | 4707 | 0.0034 | | 2.2145 | 4708 | 0.0051 | | 2.2150 | 4709 | 0.0028 | | 2.2154 | 4710 | 0.0049 | | 2.2159 | 4711 | 0.0036 | | 2.2164 | 4712 | 0.0027 | | 2.2168 | 4713 | 0.0022 | | 2.2173 | 4714 | 0.0016 | | 2.2178 | 4715 | 0.0049 | | 2.2183 | 4716 | 0.0034 | | 2.2187 | 4717 | 0.0037 | | 2.2192 | 4718 | 0.0043 | | 2.2197 | 4719 | 0.0045 | | 2.2201 | 4720 | 0.0039 | | 2.2206 | 4721 | 0.0032 | | 2.2211 | 4722 | 0.0048 | | 2.2215 | 4723 | 0.0054 | | 2.2220 | 4724 | 0.0027 | | 2.2225 | 4725 | 0.0026 | | 2.2230 | 4726 | 0.0023 | | 2.2234 | 4727 | 0.0026 | | 2.2239 | 4728 | 0.0019 | | 2.2244 | 4729 | 0.0047 | | 2.2248 | 4730 | 0.0017 | | 2.2253 | 4731 | 0.003 | | 2.2258 | 4732 | 0.0031 | | 2.2262 | 4733 | 0.0016 | | 2.2267 | 4734 | 0.0036 | | 2.2272 | 4735 | 0.0066 | | 2.2277 | 4736 | 0.0021 | | 2.2281 | 4737 | 0.003 | | 2.2286 | 4738 | 0.0034 | | 2.2291 | 4739 | 0.0091 | | 2.2295 | 4740 | 0.0025 | | 2.2300 | 4741 | 0.0022 | | 2.2305 | 4742 | 0.0051 | | 2.2310 | 4743 | 0.0031 | | 2.2314 | 4744 | 0.0031 | | 2.2319 | 4745 | 0.0046 | | 2.2324 | 4746 | 0.003 | | 2.2328 | 4747 | 0.0029 | | 2.2333 | 4748 | 0.0031 | | 2.2338 | 4749 | 0.0021 | | 2.2342 | 4750 | 0.0033 | | 2.2347 | 4751 | 0.0095 | | 2.2352 | 4752 | 0.0027 | | 2.2357 | 4753 | 0.0021 | | 2.2361 | 4754 | 0.0024 | | 2.2366 | 4755 | 0.0034 | | 2.2371 | 4756 | 0.0028 | | 2.2375 | 4757 | 0.0044 | | 2.2380 | 4758 | 0.0051 | | 2.2385 | 4759 | 0.003 | | 2.2389 | 4760 | 0.0034 | | 2.2394 | 4761 | 0.0022 | | 2.2399 | 4762 | 0.0021 | | 2.2404 | 4763 | 0.0026 | | 2.2408 | 4764 | 0.0029 | | 2.2413 | 4765 | 0.0044 | | 2.2418 | 4766 | 0.0026 | | 
2.2422 | 4767 | 0.0023 | | 2.2427 | 4768 | 0.0022 | | 2.2432 | 4769 | 0.0086 | | 2.2437 | 4770 | 0.0025 | | 2.2441 | 4771 | 0.0041 | | 2.2446 | 4772 | 0.003 | | 2.2451 | 4773 | 0.003 | | 2.2455 | 4774 | 0.0022 | | 2.2460 | 4775 | 0.0083 | | 2.2465 | 4776 | 0.0015 | | 2.2469 | 4777 | 0.0063 | | 2.2474 | 4778 | 0.0022 | | 2.2479 | 4779 | 0.0045 | | 2.2484 | 4780 | 0.0044 | | 2.2488 | 4781 | 0.0082 | | 2.2493 | 4782 | 0.0027 | | 2.2498 | 4783 | 0.0035 | | 2.2502 | 4784 | 0.0028 | | 2.2507 | 4785 | 0.0044 | | 2.2512 | 4786 | 0.0016 | | 2.2516 | 4787 | 0.0025 | | 2.2521 | 4788 | 0.0032 | | 2.2526 | 4789 | 0.0039 | | 2.2531 | 4790 | 0.0018 | | 2.2535 | 4791 | 0.0023 | | 2.2540 | 4792 | 0.0025 | | 2.2545 | 4793 | 0.0093 | | 2.2549 | 4794 | 0.0054 | | 2.2554 | 4795 | 0.0015 | | 2.2559 | 4796 | 0.0033 | | 2.2563 | 4797 | 0.0026 | | 2.2568 | 4798 | 0.0044 | | 2.2573 | 4799 | 0.0026 | | 2.2578 | 4800 | 0.0057 | | 2.2582 | 4801 | 0.0019 | | 2.2587 | 4802 | 0.0032 | | 2.2592 | 4803 | 0.0043 | | 2.2596 | 4804 | 0.004 | | 2.2601 | 4805 | 0.0073 | | 2.2606 | 4806 | 0.0059 | | 2.2611 | 4807 | 0.003 | | 2.2615 | 4808 | 0.0041 | | 2.2620 | 4809 | 0.0032 | | 2.2625 | 4810 | 0.0072 | | 2.2629 | 4811 | 0.0043 | | 2.2634 | 4812 | 0.005 | | 2.2639 | 4813 | 0.0073 | | 2.2643 | 4814 | 0.0029 | | 2.2648 | 4815 | 0.0023 | | 2.2653 | 4816 | 0.0049 | | 2.2658 | 4817 | 0.0034 | | 2.2662 | 4818 | 0.0029 | | 2.2667 | 4819 | 0.003 | | 2.2672 | 4820 | 0.0023 | | 2.2676 | 4821 | 0.0025 | | 2.2681 | 4822 | 0.002 | | 2.2686 | 4823 | 0.002 | | 2.2690 | 4824 | 0.0088 | | 2.2695 | 4825 | 0.0036 | | 2.2700 | 4826 | 0.0033 | | 2.2705 | 4827 | 0.0042 | | 2.2709 | 4828 | 0.0062 | | 2.2714 | 4829 | 0.0024 | | 2.2719 | 4830 | 0.0024 | | 2.2723 | 4831 | 0.0032 | | 2.2728 | 4832 | 0.0028 | | 2.2733 | 4833 | 0.0021 | | 2.2738 | 4834 | 0.0024 | | 2.2742 | 4835 | 0.0025 | | 2.2747 | 4836 | 0.0031 | | 2.2752 | 4837 | 0.0018 | | 2.2756 | 4838 | 0.0018 | | 2.2761 | 4839 | 0.002 | | 2.2766 | 4840 | 0.0028 | | 2.2770 | 4841 | 0.0029 | | 2.2775 | 4842 | 0.0028 | | 2.2780 | 4843 | 0.0052 | | 2.2785 | 4844 | 0.003 | | 2.2789 | 4845 | 0.0024 | | 2.2794 | 4846 | 0.0031 | | 2.2799 | 4847 | 0.0071 | | 2.2803 | 4848 | 0.0027 | | 2.2808 | 4849 | 0.0047 | | 2.2813 | 4850 | 0.0023 | | 2.2817 | 4851 | 0.0044 | | 2.2822 | 4852 | 0.0041 | | 2.2827 | 4853 | 0.0049 | | 2.2832 | 4854 | 0.0023 | | 2.2836 | 4855 | 0.0025 | | 2.2841 | 4856 | 0.0064 | | 2.2846 | 4857 | 0.0029 | | 2.2850 | 4858 | 0.0023 | | 2.2855 | 4859 | 0.0019 | | 2.2860 | 4860 | 0.0046 | | 2.2865 | 4861 | 0.003 | | 2.2869 | 4862 | 0.0025 | | 2.2874 | 4863 | 0.0015 | | 2.2879 | 4864 | 0.0029 | | 2.2883 | 4865 | 0.0063 | | 2.2888 | 4866 | 0.0034 | | 2.2893 | 4867 | 0.003 | | 2.2897 | 4868 | 0.0021 | | 2.2902 | 4869 | 0.0021 | | 2.2907 | 4870 | 0.0101 | | 2.2912 | 4871 | 0.0041 | | 2.2916 | 4872 | 0.0028 | | 2.2921 | 4873 | 0.0022 | | 2.2926 | 4874 | 0.002 | | 2.2930 | 4875 | 0.0013 | | 2.2935 | 4876 | 0.0034 | | 2.2940 | 4877 | 0.0074 | | 2.2944 | 4878 | 0.0026 | | 2.2949 | 4879 | 0.0039 | | 2.2954 | 4880 | 0.0098 | | 2.2959 | 4881 | 0.0043 | | 2.2963 | 4882 | 0.0029 | | 2.2968 | 4883 | 0.0019 | | 2.2973 | 4884 | 0.0066 | | 2.2977 | 4885 | 0.0033 | | 2.2982 | 4886 | 0.0034 | | 2.2987 | 4887 | 0.0046 | | 2.2992 | 4888 | 0.0021 | | 2.2996 | 4889 | 0.003 | | 2.3001 | 4890 | 0.0025 | | 2.3006 | 4891 | 0.0062 | | 2.3010 | 4892 | 0.0033 | | 2.3015 | 4893 | 0.0032 | | 2.3020 | 4894 | 0.004 | | 2.3024 | 4895 | 0.003 | | 2.3029 | 4896 | 0.0021 | | 2.3034 | 4897 | 0.0023 | | 2.3039 | 4898 | 0.0035 | | 2.3043 
| 4899 | 0.0038 | | 2.3048 | 4900 | 0.0021 | | 2.3053 | 4901 | 0.002 | | 2.3057 | 4902 | 0.002 | | 2.3062 | 4903 | 0.0021 | | 2.3067 | 4904 | 0.0035 | | 2.3071 | 4905 | 0.004 | | 2.3076 | 4906 | 0.0025 | | 2.3081 | 4907 | 0.0229 | | 2.3086 | 4908 | 0.0036 | | 2.3090 | 4909 | 0.0038 | | 2.3095 | 4910 | 0.002 | | 2.3100 | 4911 | 0.0019 | | 2.3104 | 4912 | 0.0042 | | 2.3109 | 4913 | 0.003 | | 2.3114 | 4914 | 0.0039 | | 2.3119 | 4915 | 0.0049 | | 2.3123 | 4916 | 0.0023 | | 2.3128 | 4917 | 0.0035 | | 2.3133 | 4918 | 0.0063 | | 2.3137 | 4919 | 0.0028 | | 2.3142 | 4920 | 0.0022 | | 2.3147 | 4921 | 0.0028 | | 2.3151 | 4922 | 0.0038 | | 2.3156 | 4923 | 0.0018 | | 2.3161 | 4924 | 0.005 | | 2.3166 | 4925 | 0.002 | | 2.3170 | 4926 | 0.0039 | | 2.3175 | 4927 | 0.0046 | | 2.3180 | 4928 | 0.0024 | | 2.3184 | 4929 | 0.0022 | | 2.3189 | 4930 | 0.0027 | | 2.3194 | 4931 | 0.0028 | | 2.3198 | 4932 | 0.005 | | 2.3203 | 4933 | 0.002 | | 2.3208 | 4934 | 0.0028 | | 2.3213 | 4935 | 0.0039 | | 2.3217 | 4936 | 0.0023 | | 2.3222 | 4937 | 0.0049 | | 2.3227 | 4938 | 0.0048 | | 2.3231 | 4939 | 0.0031 | | 2.3236 | 4940 | 0.0022 | | 2.3241 | 4941 | 0.0036 | | 2.3246 | 4942 | 0.0024 | | 2.3250 | 4943 | 0.0038 | | 2.3255 | 4944 | 0.0054 | | 2.3260 | 4945 | 0.0026 | | 2.3264 | 4946 | 0.0045 | | 2.3269 | 4947 | 0.0026 | | 2.3274 | 4948 | 0.0042 | | 2.3278 | 4949 | 0.0027 | | 2.3283 | 4950 | 0.003 | | 2.3288 | 4951 | 0.0012 | | 2.3293 | 4952 | 0.0018 | | 2.3297 | 4953 | 0.0027 | | 2.3302 | 4954 | 0.0026 | | 2.3307 | 4955 | 0.002 | | 2.3311 | 4956 | 0.0024 | | 2.3316 | 4957 | 0.0013 | | 2.3321 | 4958 | 0.0051 | | 2.3325 | 4959 | 0.0035 | | 2.3330 | 4960 | 0.003 | | 2.3335 | 4961 | 0.004 | | 2.3340 | 4962 | 0.0043 | | 2.3344 | 4963 | 0.0024 | | 2.3349 | 4964 | 0.0042 | | 2.3354 | 4965 | 0.0017 | | 2.3358 | 4966 | 0.004 | | 2.3363 | 4967 | 0.0043 | | 2.3368 | 4968 | 0.0016 | | 2.3373 | 4969 | 0.0037 | | 2.3377 | 4970 | 0.0032 | | 2.3382 | 4971 | 0.0031 | | 2.3387 | 4972 | 0.0023 | | 2.3391 | 4973 | 0.0049 | | 2.3396 | 4974 | 0.0026 | | 2.3401 | 4975 | 0.0027 | | 2.3405 | 4976 | 0.0024 | | 2.3410 | 4977 | 0.002 | | 2.3415 | 4978 | 0.0019 | | 2.3420 | 4979 | 0.0023 | | 2.3424 | 4980 | 0.003 | | 2.3429 | 4981 | 0.0085 | | 2.3434 | 4982 | 0.0037 | | 2.3438 | 4983 | 0.0026 | | 2.3443 | 4984 | 0.0031 | | 2.3448 | 4985 | 0.0029 | | 2.3452 | 4986 | 0.0046 | | 2.3457 | 4987 | 0.002 | | 2.3462 | 4988 | 0.0031 | | 2.3467 | 4989 | 0.0031 | | 2.3471 | 4990 | 0.0019 | | 2.3476 | 4991 | 0.0022 | | 2.3481 | 4992 | 0.0028 | | 2.3485 | 4993 | 0.0018 | | 2.3490 | 4994 | 0.0029 | | 2.3495 | 4995 | 0.002 | | 2.3500 | 4996 | 0.0056 | | 2.3504 | 4997 | 0.0048 | | 2.3509 | 4998 | 0.0022 | | 2.3514 | 4999 | 0.0022 | | 2.3518 | 5000 | 0.0032 | | 2.3523 | 5001 | 0.0034 | | 2.3528 | 5002 | 0.0013 | | 2.3532 | 5003 | 0.0021 | | 2.3537 | 5004 | 0.0054 | | 2.3542 | 5005 | 0.0029 | | 2.3547 | 5006 | 0.0019 | | 2.3551 | 5007 | 0.0014 | | 2.3556 | 5008 | 0.0016 | | 2.3561 | 5009 | 0.0036 | | 2.3565 | 5010 | 0.0025 | | 2.3570 | 5011 | 0.0027 | | 2.3575 | 5012 | 0.0018 | | 2.3579 | 5013 | 0.0017 | | 2.3584 | 5014 | 0.0043 | | 2.3589 | 5015 | 0.0015 | | 2.3594 | 5016 | 0.0026 | | 2.3598 | 5017 | 0.003 | | 2.3603 | 5018 | 0.0016 | | 2.3608 | 5019 | 0.003 | | 2.3612 | 5020 | 0.0053 | | 2.3617 | 5021 | 0.0027 | | 2.3622 | 5022 | 0.004 | | 2.3627 | 5023 | 0.0028 | | 2.3631 | 5024 | 0.003 | | 2.3636 | 5025 | 0.0045 | | 2.3641 | 5026 | 0.0046 | | 2.3645 | 5027 | 0.002 | | 2.3650 | 5028 | 0.0029 | | 2.3655 | 5029 | 0.0039 | | 2.3659 | 5030 | 0.0023 | | 2.3664 | 5031 | 
0.0054 | | 2.3669 | 5032 | 0.0043 | | 2.3674 | 5033 | 0.0009 | | 2.3678 | 5034 | 0.0024 | | 2.3683 | 5035 | 0.0021 | | 2.3688 | 5036 | 0.0036 | | 2.3692 | 5037 | 0.0014 | | 2.3697 | 5038 | 0.0055 | | 2.3702 | 5039 | 0.0028 | | 2.3706 | 5040 | 0.0017 | | 2.3711 | 5041 | 0.0023 | | 2.3716 | 5042 | 0.0026 | | 2.3721 | 5043 | 0.0032 | | 2.3725 | 5044 | 0.0026 | | 2.3730 | 5045 | 0.0031 | | 2.3735 | 5046 | 0.004 | | 2.3739 | 5047 | 0.0042 | | 2.3744 | 5048 | 0.0028 | | 2.3749 | 5049 | 0.0023 | | 2.3754 | 5050 | 0.0043 | | 2.3758 | 5051 | 0.0034 | | 2.3763 | 5052 | 0.003 | | 2.3768 | 5053 | 0.0034 | | 2.3772 | 5054 | 0.0032 | | 2.3777 | 5055 | 0.0115 | | 2.3782 | 5056 | 0.0017 | | 2.3786 | 5057 | 0.0018 | | 2.3791 | 5058 | 0.0056 | | 2.3796 | 5059 | 0.0019 | | 2.3801 | 5060 | 0.0051 | | 2.3805 | 5061 | 0.0034 | | 2.3810 | 5062 | 0.0031 | | 2.3815 | 5063 | 0.0033 | | 2.3819 | 5064 | 0.0017 | | 2.3824 | 5065 | 0.0028 | | 2.3829 | 5066 | 0.002 | | 2.3833 | 5067 | 0.0018 | | 2.3838 | 5068 | 0.0048 | | 2.3843 | 5069 | 0.0016 | | 2.3848 | 5070 | 0.0043 | | 2.3852 | 5071 | 0.0043 | | 2.3857 | 5072 | 0.0022 | | 2.3862 | 5073 | 0.0028 | | 2.3866 | 5074 | 0.0021 | | 2.3871 | 5075 | 0.0027 | | 2.3876 | 5076 | 0.0014 | | 2.3881 | 5077 | 0.002 | | 2.3885 | 5078 | 0.0027 | | 2.3890 | 5079 | 0.0014 | | 2.3895 | 5080 | 0.0016 | | 2.3899 | 5081 | 0.0033 | | 2.3904 | 5082 | 0.003 | | 2.3909 | 5083 | 0.004 | | 2.3913 | 5084 | 0.0044 | | 2.3918 | 5085 | 0.0023 | | 2.3923 | 5086 | 0.0081 | | 2.3928 | 5087 | 0.0032 | | 2.3932 | 5088 | 0.0031 | | 2.3937 | 5089 | 0.0033 | | 2.3942 | 5090 | 0.0029 | | 2.3946 | 5091 | 0.0021 | | 2.3951 | 5092 | 0.003 | | 2.3956 | 5093 | 0.0025 | | 2.3960 | 5094 | 0.0048 | | 2.3965 | 5095 | 0.0045 | | 2.3970 | 5096 | 0.0034 | | 2.3975 | 5097 | 0.0044 | | 2.3979 | 5098 | 0.0021 | | 2.3984 | 5099 | 0.0033 | | 2.3989 | 5100 | 0.0029 | | 2.3993 | 5101 | 0.0042 | | 2.3998 | 5102 | 0.0016 | | 2.4003 | 5103 | 0.0022 | | 2.4008 | 5104 | 0.0039 | | 2.4012 | 5105 | 0.0045 | | 2.4017 | 5106 | 0.0036 | | 2.4022 | 5107 | 0.0029 | | 2.4026 | 5108 | 0.005 | | 2.4031 | 5109 | 0.0026 | | 2.4036 | 5110 | 0.0056 | | 2.4040 | 5111 | 0.0033 | | 2.4045 | 5112 | 0.0016 | | 2.4050 | 5113 | 0.0027 | | 2.4055 | 5114 | 0.0017 | | 2.4059 | 5115 | 0.0025 | | 2.4064 | 5116 | 0.0038 | | 2.4069 | 5117 | 0.0019 | | 2.4073 | 5118 | 0.0028 | | 2.4078 | 5119 | 0.0026 | | 2.4083 | 5120 | 0.0049 | | 2.4087 | 5121 | 0.0016 | | 2.4092 | 5122 | 0.0024 | | 2.4097 | 5123 | 0.0029 | | 2.4102 | 5124 | 0.0038 | | 2.4106 | 5125 | 0.0041 | | 2.4111 | 5126 | 0.002 | | 2.4116 | 5127 | 0.0042 | | 2.4120 | 5128 | 0.0015 | | 2.4125 | 5129 | 0.0046 | | 2.4130 | 5130 | 0.0014 | | 2.4135 | 5131 | 0.002 | | 2.4139 | 5132 | 0.0031 | | 2.4144 | 5133 | 0.0032 | | 2.4149 | 5134 | 0.0039 | | 2.4153 | 5135 | 0.0023 | | 2.4158 | 5136 | 0.0023 | | 2.4163 | 5137 | 0.0025 | | 2.4167 | 5138 | 0.0018 | | 2.4172 | 5139 | 0.0011 | | 2.4177 | 5140 | 0.0038 | | 2.4182 | 5141 | 0.0038 | | 2.4186 | 5142 | 0.0047 | | 2.4191 | 5143 | 0.0037 | | 2.4196 | 5144 | 0.003 | | 2.4200 | 5145 | 0.0016 | | 2.4205 | 5146 | 0.0023 | | 2.4210 | 5147 | 0.003 | | 2.4214 | 5148 | 0.0039 | | 2.4219 | 5149 | 0.0022 | | 2.4224 | 5150 | 0.0014 | | 2.4229 | 5151 | 0.0031 | | 2.4233 | 5152 | 0.0036 | | 2.4238 | 5153 | 0.0038 | | 2.4243 | 5154 | 0.0027 | | 2.4247 | 5155 | 0.0043 | | 2.4252 | 5156 | 0.0018 | | 2.4257 | 5157 | 0.0023 | | 2.4262 | 5158 | 0.0055 | | 2.4266 | 5159 | 0.0028 | | 2.4271 | 5160 | 0.0023 | | 2.4276 | 5161 | 0.0019 | | 2.4280 | 5162 | 0.0034 | | 2.4285 | 5163 | 
0.0019 | | 2.4290 | 5164 | 0.0032 | | 2.4294 | 5165 | 0.0048 | | 2.4299 | 5166 | 0.003 | | 2.4304 | 5167 | 0.0037 | | 2.4309 | 5168 | 0.0015 | | 2.4313 | 5169 | 0.0027 | | 2.4318 | 5170 | 0.0012 | | 2.4323 | 5171 | 0.0018 | | 2.4327 | 5172 | 0.0021 | | 2.4332 | 5173 | 0.0016 | | 2.4337 | 5174 | 0.0035 | | 2.4341 | 5175 | 0.0046 | | 2.4346 | 5176 | 0.0057 | | 2.4351 | 5177 | 0.0015 | | 2.4356 | 5178 | 0.005 | | 2.4360 | 5179 | 0.0037 | | 2.4365 | 5180 | 0.0014 | | 2.4370 | 5181 | 0.0025 | | 2.4374 | 5182 | 0.003 | | 2.4379 | 5183 | 0.0033 | | 2.4384 | 5184 | 0.0016 | | 2.4389 | 5185 | 0.0051 | | 2.4393 | 5186 | 0.008 | | 2.4398 | 5187 | 0.0014 | | 2.4403 | 5188 | 0.0017 | | 2.4407 | 5189 | 0.0044 | | 2.4412 | 5190 | 0.0023 | | 2.4417 | 5191 | 0.0033 | | 2.4421 | 5192 | 0.0027 | | 2.4426 | 5193 | 0.005 | | 2.4431 | 5194 | 0.0017 | | 2.4436 | 5195 | 0.0489 | | 2.4440 | 5196 | 0.003 | | 2.4445 | 5197 | 0.0023 | | 2.4450 | 5198 | 0.0014 | | 2.4454 | 5199 | 0.003 | | 2.4459 | 5200 | 0.0053 | | 2.4464 | 5201 | 0.0028 | | 2.4468 | 5202 | 0.0036 | | 2.4473 | 5203 | 0.0034 | | 2.4478 | 5204 | 0.0025 | | 2.4483 | 5205 | 0.0025 | | 2.4487 | 5206 | 0.0047 | | 2.4492 | 5207 | 0.0032 | | 2.4497 | 5208 | 0.0034 | | 2.4501 | 5209 | 0.0048 | | 2.4506 | 5210 | 0.0024 | | 2.4511 | 5211 | 0.0037 | | 2.4516 | 5212 | 0.0015 | | 2.4520 | 5213 | 0.003 | | 2.4525 | 5214 | 0.0025 | | 2.4530 | 5215 | 0.0029 | | 2.4534 | 5216 | 0.0017 | | 2.4539 | 5217 | 0.0012 | | 2.4544 | 5218 | 0.0027 | | 2.4548 | 5219 | 0.0031 | | 2.4553 | 5220 | 0.0019 | | 2.4558 | 5221 | 0.0023 | | 2.4563 | 5222 | 0.0031 | | 2.4567 | 5223 | 0.0024 | | 2.4572 | 5224 | 0.0029 | | 2.4577 | 5225 | 0.0021 | | 2.4581 | 5226 | 0.0042 | | 2.4586 | 5227 | 0.0052 | | 2.4591 | 5228 | 0.0034 | | 2.4595 | 5229 | 0.0018 | | 2.4600 | 5230 | 0.0028 | | 2.4605 | 5231 | 0.0017 | | 2.4610 | 5232 | 0.0033 | | 2.4614 | 5233 | 0.0028 | | 2.4619 | 5234 | 0.003 | | 2.4624 | 5235 | 0.0034 | | 2.4628 | 5236 | 0.0057 | | 2.4633 | 5237 | 0.0014 | | 2.4638 | 5238 | 0.0039 | | 2.4643 | 5239 | 0.0018 | | 2.4647 | 5240 | 0.0026 | | 2.4652 | 5241 | 0.0023 | | 2.4657 | 5242 | 0.0023 | | 2.4661 | 5243 | 0.0024 | | 2.4666 | 5244 | 0.0037 | | 2.4671 | 5245 | 0.0036 | | 2.4675 | 5246 | 0.0033 | | 2.4680 | 5247 | 0.0015 | | 2.4685 | 5248 | 0.0023 | | 2.4690 | 5249 | 0.0043 | | 2.4694 | 5250 | 0.0037 | | 2.4699 | 5251 | 0.003 | | 2.4704 | 5252 | 0.0016 | | 2.4708 | 5253 | 0.0035 | | 2.4713 | 5254 | 0.0049 | | 2.4718 | 5255 | 0.0017 | | 2.4722 | 5256 | 0.0047 | | 2.4727 | 5257 | 0.0022 | | 2.4732 | 5258 | 0.0019 | | 2.4737 | 5259 | 0.0021 | | 2.4741 | 5260 | 0.0023 | | 2.4746 | 5261 | 0.0044 | | 2.4751 | 5262 | 0.0047 | | 2.4755 | 5263 | 0.0031 | | 2.4760 | 5264 | 0.002 | | 2.4765 | 5265 | 0.0023 | | 2.4770 | 5266 | 0.0017 | | 2.4774 | 5267 | 0.0026 | | 2.4779 | 5268 | 0.0034 | | 2.4784 | 5269 | 0.0012 | | 2.4788 | 5270 | 0.003 | | 2.4793 | 5271 | 0.0067 | | 2.4798 | 5272 | 0.0026 | | 2.4802 | 5273 | 0.0022 | | 2.4807 | 5274 | 0.0045 | | 2.4812 | 5275 | 0.0237 | | 2.4817 | 5276 | 0.0041 | | 2.4821 | 5277 | 0.0025 | | 2.4826 | 5278 | 0.0019 | | 2.4831 | 5279 | 0.0031 | | 2.4835 | 5280 | 0.0036 | | 2.4840 | 5281 | 0.0038 | | 2.4845 | 5282 | 0.0028 | | 2.4849 | 5283 | 0.0022 | | 2.4854 | 5284 | 0.0015 | | 2.4859 | 5285 | 0.0044 | | 2.4864 | 5286 | 0.0029 | | 2.4868 | 5287 | 0.0033 | | 2.4873 | 5288 | 0.0018 | | 2.4878 | 5289 | 0.003 | | 2.4882 | 5290 | 0.0101 | | 2.4887 | 5291 | 0.0051 | | 2.4892 | 5292 | 0.0022 | | 2.4897 | 5293 | 0.0017 | | 2.4901 | 5294 | 0.002 | | 2.4906 | 5295 | 
0.0051 | | 2.4911 | 5296 | 0.0044 | | 2.4915 | 5297 | 0.0039 | | 2.4920 | 5298 | 0.0052 | | 2.4925 | 5299 | 0.0021 | | 2.4929 | 5300 | 0.0035 | | 2.4934 | 5301 | 0.0033 | | 2.4939 | 5302 | 0.0027 | | 2.4944 | 5303 | 0.0063 | | 2.4948 | 5304 | 0.0021 | | 2.4953 | 5305 | 0.0015 | | 2.4958 | 5306 | 0.0044 | | 2.4962 | 5307 | 0.003 | | 2.4967 | 5308 | 0.0038 | | 2.4972 | 5309 | 0.0023 | | 2.4976 | 5310 | 0.0022 | | 2.4981 | 5311 | 0.0058 | | 2.4986 | 5312 | 0.0035 | | 2.4991 | 5313 | 0.0034 | | 2.4995 | 5314 | 0.0053 | | 2.5 | 5315 | 0.0031 | | 2.5005 | 5316 | 0.0032 | | 2.5009 | 5317 | 0.0045 | | 2.5014 | 5318 | 0.0012 | | 2.5019 | 5319 | 0.0008 | | 2.5024 | 5320 | 0.002 | | 2.5028 | 5321 | 0.0041 | | 2.5033 | 5322 | 0.0013 | | 2.5038 | 5323 | 0.0037 | | 2.5042 | 5324 | 0.0037 | | 2.5047 | 5325 | 0.0032 | | 2.5052 | 5326 | 0.0054 | | 2.5056 | 5327 | 0.0037 | | 2.5061 | 5328 | 0.0032 | | 2.5066 | 5329 | 0.0023 | | 2.5071 | 5330 | 0.0042 | | 2.5075 | 5331 | 0.0028 | | 2.5080 | 5332 | 0.0031 | | 2.5085 | 5333 | 0.0035 | | 2.5089 | 5334 | 0.0021 | | 2.5094 | 5335 | 0.0017 | | 2.5099 | 5336 | 0.0016 | | 2.5103 | 5337 | 0.005 | | 2.5108 | 5338 | 0.0025 | | 2.5113 | 5339 | 0.0041 | | 2.5118 | 5340 | 0.0056 | | 2.5122 | 5341 | 0.0034 | | 2.5127 | 5342 | 0.0019 | | 2.5132 | 5343 | 0.0029 | | 2.5136 | 5344 | 0.0042 | | 2.5141 | 5345 | 0.0043 | | 2.5146 | 5346 | 0.0064 | | 2.5151 | 5347 | 0.0021 | | 2.5155 | 5348 | 0.0027 | | 2.5160 | 5349 | 0.0015 | | 2.5165 | 5350 | 0.0041 | | 2.5169 | 5351 | 0.0035 | | 2.5174 | 5352 | 0.0022 | | 2.5179 | 5353 | 0.0021 | | 2.5183 | 5354 | 0.003 | | 2.5188 | 5355 | 0.0018 | | 2.5193 | 5356 | 0.0036 | | 2.5198 | 5357 | 0.0027 | | 2.5202 | 5358 | 0.0016 | | 2.5207 | 5359 | 0.0026 | | 2.5212 | 5360 | 0.0039 | | 2.5216 | 5361 | 0.0025 | | 2.5221 | 5362 | 0.0041 | | 2.5226 | 5363 | 0.0025 | | 2.5230 | 5364 | 0.005 | | 2.5235 | 5365 | 0.0032 | | 2.5240 | 5366 | 0.0026 | | 2.5245 | 5367 | 0.0056 | | 2.5249 | 5368 | 0.0013 | | 2.5254 | 5369 | 0.0043 | | 2.5259 | 5370 | 0.0037 | | 2.5263 | 5371 | 0.0038 | | 2.5268 | 5372 | 0.0035 | | 2.5273 | 5373 | 0.0046 | | 2.5278 | 5374 | 0.0033 | | 2.5282 | 5375 | 0.0034 | | 2.5287 | 5376 | 0.0027 | | 2.5292 | 5377 | 0.0034 | | 2.5296 | 5378 | 0.003 | | 2.5301 | 5379 | 0.0019 | | 2.5306 | 5380 | 0.002 | | 2.5310 | 5381 | 0.0037 | | 2.5315 | 5382 | 0.0015 | | 2.5320 | 5383 | 0.0043 | | 2.5325 | 5384 | 0.0058 | | 2.5329 | 5385 | 0.0031 | | 2.5334 | 5386 | 0.0045 | | 2.5339 | 5387 | 0.0016 | | 2.5343 | 5388 | 0.0016 | | 2.5348 | 5389 | 0.0019 | | 2.5353 | 5390 | 0.0097 | | 2.5357 | 5391 | 0.0014 | | 2.5362 | 5392 | 0.0051 | | 2.5367 | 5393 | 0.0041 | | 2.5372 | 5394 | 0.0018 | | 2.5376 | 5395 | 0.0041 | | 2.5381 | 5396 | 0.0014 | | 2.5386 | 5397 | 0.0032 | | 2.5390 | 5398 | 0.0041 | | 2.5395 | 5399 | 0.0027 | | 2.5400 | 5400 | 0.0035 | | 2.5405 | 5401 | 0.0019 | | 2.5409 | 5402 | 0.0014 | | 2.5414 | 5403 | 0.0032 | | 2.5419 | 5404 | 0.0017 | | 2.5423 | 5405 | 0.0029 | | 2.5428 | 5406 | 0.0045 | | 2.5433 | 5407 | 0.0019 | | 2.5437 | 5408 | 0.0029 | | 2.5442 | 5409 | 0.0035 | | 2.5447 | 5410 | 0.0047 | | 2.5452 | 5411 | 0.0035 | | 2.5456 | 5412 | 0.0062 | | 2.5461 | 5413 | 0.0032 | | 2.5466 | 5414 | 0.0025 | | 2.5470 | 5415 | 0.003 | | 2.5475 | 5416 | 0.0019 | | 2.5480 | 5417 | 0.0021 | | 2.5484 | 5418 | 0.0029 | | 2.5489 | 5419 | 0.0019 | | 2.5494 | 5420 | 0.0024 | | 2.5499 | 5421 | 0.0081 | | 2.5503 | 5422 | 0.0028 | | 2.5508 | 5423 | 0.0024 | | 2.5513 | 5424 | 0.0025 | | 2.5517 | 5425 | 0.0019 | | 2.5522 | 5426 | 0.0025 | | 2.5527 | 5427 | 
0.0033 | | 2.5532 | 5428 | 0.0022 | | 2.5536 | 5429 | 0.0021 | | 2.5541 | 5430 | 0.0034 | | 2.5546 | 5431 | 0.0019 | | 2.5550 | 5432 | 0.0051 | | 2.5555 | 5433 | 0.002 | | 2.5560 | 5434 | 0.0061 | | 2.5564 | 5435 | 0.0058 | | 2.5569 | 5436 | 0.0019 | | 2.5574 | 5437 | 0.0017 | | 2.5579 | 5438 | 0.002 | | 2.5583 | 5439 | 0.0042 | | 2.5588 | 5440 | 0.003 | | 2.5593 | 5441 | 0.0034 | | 2.5597 | 5442 | 0.0027 | | 2.5602 | 5443 | 0.0029 | | 2.5607 | 5444 | 0.003 | | 2.5611 | 5445 | 0.0026 | | 2.5616 | 5446 | 0.0022 | | 2.5621 | 5447 | 0.0019 | | 2.5626 | 5448 | 0.0025 | | 2.5630 | 5449 | 0.0027 | | 2.5635 | 5450 | 0.0038 | | 2.5640 | 5451 | 0.0026 | | 2.5644 | 5452 | 0.0009 | | 2.5649 | 5453 | 0.0034 | | 2.5654 | 5454 | 0.0024 | | 2.5659 | 5455 | 0.0063 | | 2.5663 | 5456 | 0.0038 | | 2.5668 | 5457 | 0.0031 | | 2.5673 | 5458 | 0.0032 | | 2.5677 | 5459 | 0.0033 | | 2.5682 | 5460 | 0.0036 | | 2.5687 | 5461 | 0.0024 | | 2.5691 | 5462 | 0.0018 | | 2.5696 | 5463 | 0.0059 | | 2.5701 | 5464 | 0.0024 | | 2.5706 | 5465 | 0.0042 | | 2.5710 | 5466 | 0.0028 | | 2.5715 | 5467 | 0.0015 | | 2.5720 | 5468 | 0.0023 | | 2.5724 | 5469 | 0.0054 | | 2.5729 | 5470 | 0.0027 | | 2.5734 | 5471 | 0.0025 | | 2.5738 | 5472 | 0.0018 | | 2.5743 | 5473 | 0.003 | | 2.5748 | 5474 | 0.0028 | | 2.5753 | 5475 | 0.0015 | | 2.5757 | 5476 | 0.002 | | 2.5762 | 5477 | 0.0038 | | 2.5767 | 5478 | 0.0027 | | 2.5771 | 5479 | 0.0025 | | 2.5776 | 5480 | 0.0083 | | 2.5781 | 5481 | 0.0026 | | 2.5786 | 5482 | 0.0044 | | 2.5790 | 5483 | 0.0025 | | 2.5795 | 5484 | 0.0058 | | 2.5800 | 5485 | 0.0034 | | 2.5804 | 5486 | 0.0044 | | 2.5809 | 5487 | 0.0026 | | 2.5814 | 5488 | 0.0026 | | 2.5818 | 5489 | 0.0026 | | 2.5823 | 5490 | 0.0026 | | 2.5828 | 5491 | 0.0029 | | 2.5833 | 5492 | 0.0033 | | 2.5837 | 5493 | 0.0018 | | 2.5842 | 5494 | 0.0045 | | 2.5847 | 5495 | 0.0025 | | 2.5851 | 5496 | 0.0011 | | 2.5856 | 5497 | 0.0023 | | 2.5861 | 5498 | 0.0023 | | 2.5865 | 5499 | 0.0076 | | 2.5870 | 5500 | 0.0025 | | 2.5875 | 5501 | 0.0029 | | 2.5880 | 5502 | 0.0074 | | 2.5884 | 5503 | 0.0051 | | 2.5889 | 5504 | 0.0033 | | 2.5894 | 5505 | 0.0035 | | 2.5898 | 5506 | 0.0018 | | 2.5903 | 5507 | 0.0014 | | 2.5908 | 5508 | 0.0036 | | 2.5913 | 5509 | 0.0013 | | 2.5917 | 5510 | 0.0051 | | 2.5922 | 5511 | 0.0025 | | 2.5927 | 5512 | 0.0023 | | 2.5931 | 5513 | 0.0025 | | 2.5936 | 5514 | 0.0031 | | 2.5941 | 5515 | 0.0012 | | 2.5945 | 5516 | 0.002 | | 2.5950 | 5517 | 0.0024 | | 2.5955 | 5518 | 0.0025 | | 2.5960 | 5519 | 0.0022 | | 2.5964 | 5520 | 0.0023 | | 2.5969 | 5521 | 0.0062 | | 2.5974 | 5522 | 0.007 | | 2.5978 | 5523 | 0.0045 | | 2.5983 | 5524 | 0.0075 | | 2.5988 | 5525 | 0.0049 | | 2.5992 | 5526 | 0.0034 | | 2.5997 | 5527 | 0.0043 | | 2.6002 | 5528 | 0.0023 | | 2.6007 | 5529 | 0.0023 | | 2.6011 | 5530 | 0.0039 | | 2.6016 | 5531 | 0.002 | | 2.6021 | 5532 | 0.0022 | | 2.6025 | 5533 | 0.0029 | | 2.6030 | 5534 | 0.0018 | | 2.6035 | 5535 | 0.0041 | | 2.6040 | 5536 | 0.0024 | | 2.6044 | 5537 | 0.0014 | | 2.6049 | 5538 | 0.0043 | | 2.6054 | 5539 | 0.0037 | | 2.6058 | 5540 | 0.002 | | 2.6063 | 5541 | 0.004 | | 2.6068 | 5542 | 0.002 | | 2.6072 | 5543 | 0.0014 | | 2.6077 | 5544 | 0.0033 | | 2.6082 | 5545 | 0.0028 | | 2.6087 | 5546 | 0.004 | | 2.6091 | 5547 | 0.005 | | 2.6096 | 5548 | 0.0044 | | 2.6101 | 5549 | 0.0021 | | 2.6105 | 5550 | 0.0025 | | 2.6110 | 5551 | 0.0028 | | 2.6115 | 5552 | 0.0058 | | 2.6119 | 5553 | 0.0019 | | 2.6124 | 5554 | 0.0025 | | 2.6129 | 5555 | 0.0025 | | 2.6134 | 5556 | 0.0042 | | 2.6138 | 5557 | 0.004 | | 2.6143 | 5558 | 0.004 | | 2.6148 | 5559 | 0.0026 
| | 2.6152 | 5560 | 0.0015 | | 2.6157 | 5561 | 0.0025 | | 2.6162 | 5562 | 0.0015 | | 2.6167 | 5563 | 0.002 | | 2.6171 | 5564 | 0.0027 | | 2.6176 | 5565 | 0.0034 | | 2.6181 | 5566 | 0.0032 | | 2.6185 | 5567 | 0.0014 | | 2.6190 | 5568 | 0.0039 | | 2.6195 | 5569 | 0.0022 | | 2.6199 | 5570 | 0.0046 | | 2.6204 | 5571 | 0.0034 | | 2.6209 | 5572 | 0.0022 | | 2.6214 | 5573 | 0.0027 | | 2.6218 | 5574 | 0.0028 | | 2.6223 | 5575 | 0.0026 | | 2.6228 | 5576 | 0.0036 | | 2.6232 | 5577 | 0.0022 | | 2.6237 | 5578 | 0.0036 | | 2.6242 | 5579 | 0.0044 | | 2.6246 | 5580 | 0.0021 | | 2.6251 | 5581 | 0.0033 | | 2.6256 | 5582 | 0.0029 | | 2.6261 | 5583 | 0.0023 | | 2.6265 | 5584 | 0.0023 | | 2.6270 | 5585 | 0.0037 | | 2.6275 | 5586 | 0.0017 | | 2.6279 | 5587 | 0.002 | | 2.6284 | 5588 | 0.0027 | | 2.6289 | 5589 | 0.0017 | | 2.6294 | 5590 | 0.0018 | | 2.6298 | 5591 | 0.0021 | | 2.6303 | 5592 | 0.0033 | | 2.6308 | 5593 | 0.0044 | | 2.6312 | 5594 | 0.0014 | | 2.6317 | 5595 | 0.0033 | | 2.6322 | 5596 | 0.0037 | | 2.6326 | 5597 | 0.0028 | | 2.6331 | 5598 | 0.0029 | | 2.6336 | 5599 | 0.0016 | | 2.6341 | 5600 | 0.0036 | | 2.6345 | 5601 | 0.0027 | | 2.6350 | 5602 | 0.003 | | 2.6355 | 5603 | 0.002 | | 2.6359 | 5604 | 0.0028 | | 2.6364 | 5605 | 0.0026 | | 2.6369 | 5606 | 0.0026 | | 2.6373 | 5607 | 0.0012 | | 2.6378 | 5608 | 0.0013 | | 2.6383 | 5609 | 0.0026 | | 2.6388 | 5610 | 0.0051 | | 2.6392 | 5611 | 0.0116 | | 2.6397 | 5612 | 0.0023 | | 2.6402 | 5613 | 0.0043 | | 2.6406 | 5614 | 0.0023 | | 2.6411 | 5615 | 0.0037 | | 2.6416 | 5616 | 0.003 | | 2.6421 | 5617 | 0.0021 | | 2.6425 | 5618 | 0.0017 | | 2.6430 | 5619 | 0.0026 | | 2.6435 | 5620 | 0.0032 | | 2.6439 | 5621 | 0.0042 | | 2.6444 | 5622 | 0.0045 | | 2.6449 | 5623 | 0.0028 | | 2.6453 | 5624 | 0.0026 | | 2.6458 | 5625 | 0.0016 | | 2.6463 | 5626 | 0.0028 | | 2.6468 | 5627 | 0.0048 | | 2.6472 | 5628 | 0.0038 | | 2.6477 | 5629 | 0.0047 | | 2.6482 | 5630 | 0.003 | | 2.6486 | 5631 | 0.0043 | | 2.6491 | 5632 | 0.0019 | | 2.6496 | 5633 | 0.0023 | | 2.6500 | 5634 | 0.003 | | 2.6505 | 5635 | 0.0033 | | 2.6510 | 5636 | 0.0029 | | 2.6515 | 5637 | 0.0163 | | 2.6519 | 5638 | 0.0032 | | 2.6524 | 5639 | 0.0036 | | 2.6529 | 5640 | 0.0038 | | 2.6533 | 5641 | 0.0075 | | 2.6538 | 5642 | 0.0025 | | 2.6543 | 5643 | 0.0049 | | 2.6548 | 5644 | 0.0009 | | 2.6552 | 5645 | 0.0022 | | 2.6557 | 5646 | 0.0052 | | 2.6562 | 5647 | 0.0046 | | 2.6566 | 5648 | 0.005 | | 2.6571 | 5649 | 0.0051 | | 2.6576 | 5650 | 0.0039 | | 2.6580 | 5651 | 0.0033 | | 2.6585 | 5652 | 0.0029 | | 2.6590 | 5653 | 0.0023 | | 2.6595 | 5654 | 0.0059 | | 2.6599 | 5655 | 0.0032 | | 2.6604 | 5656 | 0.0024 | | 2.6609 | 5657 | 0.003 | | 2.6613 | 5658 | 0.0038 | | 2.6618 | 5659 | 0.0018 | | 2.6623 | 5660 | 0.0033 | | 2.6627 | 5661 | 0.0019 | | 2.6632 | 5662 | 0.0022 | | 2.6637 | 5663 | 0.0028 | | 2.6642 | 5664 | 0.0019 | | 2.6646 | 5665 | 0.0058 | | 2.6651 | 5666 | 0.0029 | | 2.6656 | 5667 | 0.0019 | | 2.6660 | 5668 | 0.0055 | | 2.6665 | 5669 | 0.0019 | | 2.6670 | 5670 | 0.0024 | | 2.6675 | 5671 | 0.0025 | | 2.6679 | 5672 | 0.0011 | | 2.6684 | 5673 | 0.0032 | | 2.6689 | 5674 | 0.0028 | | 2.6693 | 5675 | 0.0019 | | 2.6698 | 5676 | 0.0026 | | 2.6703 | 5677 | 0.0046 | | 2.6707 | 5678 | 0.0053 | | 2.6712 | 5679 | 0.0029 | | 2.6717 | 5680 | 0.0016 | | 2.6722 | 5681 | 0.0041 | | 2.6726 | 5682 | 0.0036 | | 2.6731 | 5683 | 0.0022 | | 2.6736 | 5684 | 0.0016 | | 2.6740 | 5685 | 0.0028 | | 2.6745 | 5686 | 0.0047 | | 2.6750 | 5687 | 0.0017 | | 2.6754 | 5688 | 0.005 | | 2.6759 | 5689 | 0.0029 | | 2.6764 | 5690 | 0.0024 | | 2.6769 | 5691 | 0.0024 
| | 2.6773 | 5692 | 0.003 | | 2.6778 | 5693 | 0.0017 | | 2.6783 | 5694 | 0.0024 | | 2.6787 | 5695 | 0.0014 | | 2.6792 | 5696 | 0.0028 | | 2.6797 | 5697 | 0.003 | | 2.6802 | 5698 | 0.0023 | | 2.6806 | 5699 | 0.0026 | | 2.6811 | 5700 | 0.0017 | | 2.6816 | 5701 | 0.0025 | | 2.6820 | 5702 | 0.0018 | | 2.6825 | 5703 | 0.0038 | | 2.6830 | 5704 | 0.003 | | 2.6834 | 5705 | 0.0015 | | 2.6839 | 5706 | 0.0047 | | 2.6844 | 5707 | 0.0035 | | 2.6849 | 5708 | 0.0027 | | 2.6853 | 5709 | 0.0018 | | 2.6858 | 5710 | 0.0032 | | 2.6863 | 5711 | 0.0029 | | 2.6867 | 5712 | 0.0044 | | 2.6872 | 5713 | 0.0024 | | 2.6877 | 5714 | 0.004 | | 2.6881 | 5715 | 0.0016 | | 2.6886 | 5716 | 0.0062 | | 2.6891 | 5717 | 0.0046 | | 2.6896 | 5718 | 0.0017 | | 2.6900 | 5719 | 0.0035 | | 2.6905 | 5720 | 0.0031 | | 2.6910 | 5721 | 0.0019 | | 2.6914 | 5722 | 0.0038 | | 2.6919 | 5723 | 0.0061 | | 2.6924 | 5724 | 0.004 | | 2.6929 | 5725 | 0.0035 | | 2.6933 | 5726 | 0.0047 | | 2.6938 | 5727 | 0.0032 | | 2.6943 | 5728 | 0.0088 | | 2.6947 | 5729 | 0.0032 | | 2.6952 | 5730 | 0.0044 | | 2.6957 | 5731 | 0.0062 | | 2.6961 | 5732 | 0.0034 | | 2.6966 | 5733 | 0.0017 | | 2.6971 | 5734 | 0.0022 | | 2.6976 | 5735 | 0.0027 | | 2.6980 | 5736 | 0.0049 | | 2.6985 | 5737 | 0.0042 | | 2.6990 | 5738 | 0.0021 | | 2.6994 | 5739 | 0.0027 | | 2.6999 | 5740 | 0.0022 | | 2.7004 | 5741 | 0.0042 | | 2.7008 | 5742 | 0.0024 | | 2.7013 | 5743 | 0.0022 | | 2.7018 | 5744 | 0.0015 | | 2.7023 | 5745 | 0.0014 | | 2.7027 | 5746 | 0.0014 | | 2.7032 | 5747 | 0.002 | | 2.7037 | 5748 | 0.0034 | | 2.7041 | 5749 | 0.0049 | | 2.7046 | 5750 | 0.0028 | | 2.7051 | 5751 | 0.002 | | 2.7056 | 5752 | 0.0025 | | 2.7060 | 5753 | 0.003 | | 2.7065 | 5754 | 0.0036 | | 2.7070 | 5755 | 0.0028 | | 2.7074 | 5756 | 0.0029 | | 2.7079 | 5757 | 0.0031 | | 2.7084 | 5758 | 0.0036 | | 2.7088 | 5759 | 0.002 | | 2.7093 | 5760 | 0.0038 | | 2.7098 | 5761 | 0.0026 | | 2.7103 | 5762 | 0.0023 | | 2.7107 | 5763 | 0.0024 | | 2.7112 | 5764 | 0.0038 | | 2.7117 | 5765 | 0.0024 | | 2.7121 | 5766 | 0.0031 | | 2.7126 | 5767 | 0.0077 | | 2.7131 | 5768 | 0.0028 | | 2.7135 | 5769 | 0.0023 | | 2.7140 | 5770 | 0.0017 | | 2.7145 | 5771 | 0.0041 | | 2.7150 | 5772 | 0.0025 | | 2.7154 | 5773 | 0.0016 | | 2.7159 | 5774 | 0.0026 | | 2.7164 | 5775 | 0.0034 | | 2.7168 | 5776 | 0.0022 | | 2.7173 | 5777 | 0.0042 | | 2.7178 | 5778 | 0.0037 | | 2.7183 | 5779 | 0.0047 | | 2.7187 | 5780 | 0.0045 | | 2.7192 | 5781 | 0.0034 | | 2.7197 | 5782 | 0.004 | | 2.7201 | 5783 | 0.0044 | | 2.7206 | 5784 | 0.0035 | | 2.7211 | 5785 | 0.0025 | | 2.7215 | 5786 | 0.0031 | | 2.7220 | 5787 | 0.004 | | 2.7225 | 5788 | 0.0025 | | 2.7230 | 5789 | 0.0048 | | 2.7234 | 5790 | 0.0024 | | 2.7239 | 5791 | 0.0018 | | 2.7244 | 5792 | 0.0044 | | 2.7248 | 5793 | 0.0016 | | 2.7253 | 5794 | 0.0024 | | 2.7258 | 5795 | 0.0015 | | 2.7262 | 5796 | 0.0052 | | 2.7267 | 5797 | 0.0036 | | 2.7272 | 5798 | 0.0063 | | 2.7277 | 5799 | 0.0019 | | 2.7281 | 5800 | 0.0031 | | 2.7286 | 5801 | 0.0039 | | 2.7291 | 5802 | 0.0037 | | 2.7295 | 5803 | 0.0028 | | 2.7300 | 5804 | 0.0015 | | 2.7305 | 5805 | 0.0025 | | 2.7310 | 5806 | 0.0014 | | 2.7314 | 5807 | 0.0011 | | 2.7319 | 5808 | 0.0018 | | 2.7324 | 5809 | 0.0028 | | 2.7328 | 5810 | 0.0068 | | 2.7333 | 5811 | 0.0036 | | 2.7338 | 5812 | 0.0024 | | 2.7342 | 5813 | 0.0026 | | 2.7347 | 5814 | 0.0016 | | 2.7352 | 5815 | 0.0035 | | 2.7357 | 5816 | 0.0029 | | 2.7361 | 5817 | 0.0023 | | 2.7366 | 5818 | 0.0024 | | 2.7371 | 5819 | 0.0029 | | 2.7375 | 5820 | 0.0033 | | 2.7380 | 5821 | 0.0033 | | 2.7385 | 5822 | 0.0022 | | 2.7389 | 5823 | 0.0014 | 
| 2.7394 | 5824 | 0.0042 | | 2.7399 | 5825 | 0.0029 | | 2.7404 | 5826 | 0.0024 | | 2.7408 | 5827 | 0.0039 | | 2.7413 | 5828 | 0.0021 | | 2.7418 | 5829 | 0.0044 | | 2.7422 | 5830 | 0.0028 | | 2.7427 | 5831 | 0.0033 | | 2.7432 | 5832 | 0.0036 | | 2.7437 | 5833 | 0.0035 | | 2.7441 | 5834 | 0.003 | | 2.7446 | 5835 | 0.0022 | | 2.7451 | 5836 | 0.0027 | | 2.7455 | 5837 | 0.0027 | | 2.7460 | 5838 | 0.0017 | | 2.7465 | 5839 | 0.0027 | | 2.7469 | 5840 | 0.0012 | | 2.7474 | 5841 | 0.0029 | | 2.7479 | 5842 | 0.0025 | | 2.7484 | 5843 | 0.0027 | | 2.7488 | 5844 | 0.0028 | | 2.7493 | 5845 | 0.0033 | | 2.7498 | 5846 | 0.0071 | | 2.7502 | 5847 | 0.0028 | | 2.7507 | 5848 | 0.0026 | | 2.7512 | 5849 | 0.002 | | 2.7516 | 5850 | 0.0015 | | 2.7521 | 5851 | 0.0066 | | 2.7526 | 5852 | 0.0064 | | 2.7531 | 5853 | 0.0017 | | 2.7535 | 5854 | 0.0018 | | 2.7540 | 5855 | 0.002 | | 2.7545 | 5856 | 0.0038 | | 2.7549 | 5857 | 0.0038 | | 2.7554 | 5858 | 0.0024 | | 2.7559 | 5859 | 0.0029 | | 2.7563 | 5860 | 0.0016 | | 2.7568 | 5861 | 0.0021 | | 2.7573 | 5862 | 0.0028 | | 2.7578 | 5863 | 0.0044 | | 2.7582 | 5864 | 0.002 | | 2.7587 | 5865 | 0.003 | | 2.7592 | 5866 | 0.0042 | | 2.7596 | 5867 | 0.002 | | 2.7601 | 5868 | 0.004 | | 2.7606 | 5869 | 0.003 | | 2.7611 | 5870 | 0.005 | | 2.7615 | 5871 | 0.0033 | | 2.7620 | 5872 | 0.0029 | | 2.7625 | 5873 | 0.0038 | | 2.7629 | 5874 | 0.004 | | 2.7634 | 5875 | 0.0052 | | 2.7639 | 5876 | 0.0029 | | 2.7643 | 5877 | 0.0037 | | 2.7648 | 5878 | 0.0026 | | 2.7653 | 5879 | 0.0048 | | 2.7658 | 5880 | 0.0032 | | 2.7662 | 5881 | 0.0016 | | 2.7667 | 5882 | 0.003 | | 2.7672 | 5883 | 0.0022 | | 2.7676 | 5884 | 0.0021 | | 2.7681 | 5885 | 0.0023 | | 2.7686 | 5886 | 0.0041 | | 2.7690 | 5887 | 0.0027 | | 2.7695 | 5888 | 0.0024 | | 2.7700 | 5889 | 0.0028 | | 2.7705 | 5890 | 0.0024 | | 2.7709 | 5891 | 0.002 | | 2.7714 | 5892 | 0.0047 | | 2.7719 | 5893 | 0.0034 | | 2.7723 | 5894 | 0.0019 | | 2.7728 | 5895 | 0.0022 | | 2.7733 | 5896 | 0.0024 | | 2.7738 | 5897 | 0.003 | | 2.7742 | 5898 | 0.0033 | | 2.7747 | 5899 | 0.0012 | | 2.7752 | 5900 | 0.0024 | | 2.7756 | 5901 | 0.0017 | | 2.7761 | 5902 | 0.0015 | | 2.7766 | 5903 | 0.0023 | | 2.7770 | 5904 | 0.0016 | | 2.7775 | 5905 | 0.0045 | | 2.7780 | 5906 | 0.0028 | | 2.7785 | 5907 | 0.0043 | | 2.7789 | 5908 | 0.0018 | | 2.7794 | 5909 | 0.0055 | | 2.7799 | 5910 | 0.0018 | | 2.7803 | 5911 | 0.0052 | | 2.7808 | 5912 | 0.0033 | | 2.7813 | 5913 | 0.0043 | | 2.7817 | 5914 | 0.0017 | | 2.7822 | 5915 | 0.0028 | | 2.7827 | 5916 | 0.0036 | | 2.7832 | 5917 | 0.0025 | | 2.7836 | 5918 | 0.0036 | | 2.7841 | 5919 | 0.0018 | | 2.7846 | 5920 | 0.0027 | | 2.7850 | 5921 | 0.0023 | | 2.7855 | 5922 | 0.0033 | | 2.7860 | 5923 | 0.003 | | 2.7865 | 5924 | 0.004 | | 2.7869 | 5925 | 0.0025 | | 2.7874 | 5926 | 0.0019 | | 2.7879 | 5927 | 0.0034 | | 2.7883 | 5928 | 0.0026 | | 2.7888 | 5929 | 0.003 | | 2.7893 | 5930 | 0.0043 | | 2.7897 | 5931 | 0.0031 | | 2.7902 | 5932 | 0.0026 | | 2.7907 | 5933 | 0.0019 | | 2.7912 | 5934 | 0.0026 | | 2.7916 | 5935 | 0.0046 | | 2.7921 | 5936 | 0.0027 | | 2.7926 | 5937 | 0.0038 | | 2.7930 | 5938 | 0.0045 | | 2.7935 | 5939 | 0.0012 | | 2.7940 | 5940 | 0.0028 | | 2.7944 | 5941 | 0.0042 | | 2.7949 | 5942 | 0.0039 | | 2.7954 | 5943 | 0.0029 | | 2.7959 | 5944 | 0.003 | | 2.7963 | 5945 | 0.0023 | | 2.7968 | 5946 | 0.0014 | | 2.7973 | 5947 | 0.0026 | | 2.7977 | 5948 | 0.0021 | | 2.7982 | 5949 | 0.0022 | | 2.7987 | 5950 | 0.0027 | | 2.7992 | 5951 | 0.0023 | | 2.7996 | 5952 | 0.0045 | | 2.8001 | 5953 | 0.0022 | | 2.8006 | 5954 | 0.0031 | | 2.8010 | 5955 | 0.003 | | 2.8015 
| 5956 | 0.0034 | | 2.8020 | 5957 | 0.0036 | | 2.8024 | 5958 | 0.0057 | | 2.8029 | 5959 | 0.0064 | | 2.8034 | 5960 | 0.0083 | | 2.8039 | 5961 | 0.0011 | | 2.8043 | 5962 | 0.0048 | | 2.8048 | 5963 | 0.0029 | | 2.8053 | 5964 | 0.0015 | | 2.8057 | 5965 | 0.0032 | | 2.8062 | 5966 | 0.0038 | | 2.8067 | 5967 | 0.0026 | | 2.8071 | 5968 | 0.0067 | | 2.8076 | 5969 | 0.0031 | | 2.8081 | 5970 | 0.0044 | | 2.8086 | 5971 | 0.0026 | | 2.8090 | 5972 | 0.0046 | | 2.8095 | 5973 | 0.0019 | | 2.8100 | 5974 | 0.0028 | | 2.8104 | 5975 | 0.0042 | | 2.8109 | 5976 | 0.0025 | | 2.8114 | 5977 | 0.0024 | | 2.8119 | 5978 | 0.0024 | | 2.8123 | 5979 | 0.0018 | | 2.8128 | 5980 | 0.0029 | | 2.8133 | 5981 | 0.0029 | | 2.8137 | 5982 | 0.0019 | | 2.8142 | 5983 | 0.0026 | | 2.8147 | 5984 | 0.0023 | | 2.8151 | 5985 | 0.0023 | | 2.8156 | 5986 | 0.0035 | | 2.8161 | 5987 | 0.0024 | | 2.8166 | 5988 | 0.0023 | | 2.8170 | 5989 | 0.0035 | | 2.8175 | 5990 | 0.0052 | | 2.8180 | 5991 | 0.0031 | | 2.8184 | 5992 | 0.0029 | | 2.8189 | 5993 | 0.0014 | | 2.8194 | 5994 | 0.0016 | | 2.8198 | 5995 | 0.0028 | | 2.8203 | 5996 | 0.0032 | | 2.8208 | 5997 | 0.0036 | | 2.8213 | 5998 | 0.0026 | | 2.8217 | 5999 | 0.0097 | | 2.8222 | 6000 | 0.0078 | | 2.8227 | 6001 | 0.0034 | | 2.8231 | 6002 | 0.0023 | | 2.8236 | 6003 | 0.0019 | | 2.8241 | 6004 | 0.0037 | | 2.8246 | 6005 | 0.003 | | 2.8250 | 6006 | 0.0011 | | 2.8255 | 6007 | 0.0035 | | 2.8260 | 6008 | 0.0043 | | 2.8264 | 6009 | 0.0033 | | 2.8269 | 6010 | 0.002 | | 2.8274 | 6011 | 0.0022 | | 2.8278 | 6012 | 0.0014 | | 2.8283 | 6013 | 0.0022 | | 2.8288 | 6014 | 0.0039 | | 2.8293 | 6015 | 0.0046 | | 2.8297 | 6016 | 0.003 | | 2.8302 | 6017 | 0.0021 | | 2.8307 | 6018 | 0.0016 | | 2.8311 | 6019 | 0.0019 | | 2.8316 | 6020 | 0.0059 | | 2.8321 | 6021 | 0.0028 | | 2.8325 | 6022 | 0.0022 | | 2.8330 | 6023 | 0.0032 | | 2.8335 | 6024 | 0.0049 | | 2.8340 | 6025 | 0.0053 | | 2.8344 | 6026 | 0.0028 | | 2.8349 | 6027 | 0.0025 | | 2.8354 | 6028 | 0.0032 | | 2.8358 | 6029 | 0.0029 | | 2.8363 | 6030 | 0.0026 | | 2.8368 | 6031 | 0.0022 | | 2.8373 | 6032 | 0.0022 | | 2.8377 | 6033 | 0.0031 | | 2.8382 | 6034 | 0.0041 | | 2.8387 | 6035 | 0.0021 | | 2.8391 | 6036 | 0.0016 | | 2.8396 | 6037 | 0.0032 | | 2.8401 | 6038 | 0.0008 | | 2.8405 | 6039 | 0.0021 | | 2.8410 | 6040 | 0.0023 | | 2.8415 | 6041 | 0.0018 | | 2.8420 | 6042 | 0.0021 | | 2.8424 | 6043 | 0.0061 | | 2.8429 | 6044 | 0.0024 | | 2.8434 | 6045 | 0.0033 | | 2.8438 | 6046 | 0.003 | | 2.8443 | 6047 | 0.0029 | | 2.8448 | 6048 | 0.0032 | | 2.8452 | 6049 | 0.0031 | | 2.8457 | 6050 | 0.0017 | | 2.8462 | 6051 | 0.0015 | | 2.8467 | 6052 | 0.0029 | | 2.8471 | 6053 | 0.0037 | | 2.8476 | 6054 | 0.0053 | | 2.8481 | 6055 | 0.0016 | | 2.8485 | 6056 | 0.0026 | | 2.8490 | 6057 | 0.0023 | | 2.8495 | 6058 | 0.0031 | | 2.8500 | 6059 | 0.0025 | | 2.8504 | 6060 | 0.0026 | | 2.8509 | 6061 | 0.0022 | | 2.8514 | 6062 | 0.002 | | 2.8518 | 6063 | 0.0023 | | 2.8523 | 6064 | 0.0025 | | 2.8528 | 6065 | 0.0022 | | 2.8532 | 6066 | 0.0044 | | 2.8537 | 6067 | 0.0027 | | 2.8542 | 6068 | 0.002 | | 2.8547 | 6069 | 0.0022 | | 2.8551 | 6070 | 0.0026 | | 2.8556 | 6071 | 0.0033 | | 2.8561 | 6072 | 0.0011 | | 2.8565 | 6073 | 0.0046 | | 2.8570 | 6074 | 0.0034 | | 2.8575 | 6075 | 0.0019 | | 2.8579 | 6076 | 0.0009 | | 2.8584 | 6077 | 0.0032 | | 2.8589 | 6078 | 0.0037 | | 2.8594 | 6079 | 0.0023 | | 2.8598 | 6080 | 0.0027 | | 2.8603 | 6081 | 0.0042 | | 2.8608 | 6082 | 0.0085 | | 2.8612 | 6083 | 0.0061 | | 2.8617 | 6084 | 0.003 | | 2.8622 | 6085 | 0.0031 | | 2.8627 | 6086 | 0.0023 | | 2.8631 | 6087 | 0.0018 | | 
2.8636 | 6088 | 0.0023 | | 2.8641 | 6089 | 0.0033 | | 2.8645 | 6090 | 0.0027 | | 2.8650 | 6091 | 0.0025 | | 2.8655 | 6092 | 0.0015 | | 2.8659 | 6093 | 0.0051 | | 2.8664 | 6094 | 0.0032 | | 2.8669 | 6095 | 0.0017 | | 2.8674 | 6096 | 0.0022 | | 2.8678 | 6097 | 0.002 | | 2.8683 | 6098 | 0.0018 | | 2.8688 | 6099 | 0.0057 | | 2.8692 | 6100 | 0.0028 | | 2.8697 | 6101 | 0.0035 | | 2.8702 | 6102 | 0.0025 | | 2.8706 | 6103 | 0.0031 | | 2.8711 | 6104 | 0.0026 | | 2.8716 | 6105 | 0.003 | | 2.8721 | 6106 | 0.0036 | | 2.8725 | 6107 | 0.0034 | | 2.8730 | 6108 | 0.0027 | | 2.8735 | 6109 | 0.0018 | | 2.8739 | 6110 | 0.0051 | | 2.8744 | 6111 | 0.0069 | | 2.8749 | 6112 | 0.0036 | | 2.8754 | 6113 | 0.0015 | | 2.8758 | 6114 | 0.0022 | | 2.8763 | 6115 | 0.0022 | | 2.8768 | 6116 | 0.0032 | | 2.8772 | 6117 | 0.0016 | | 2.8777 | 6118 | 0.0039 | | 2.8782 | 6119 | 0.0023 | | 2.8786 | 6120 | 0.0043 | | 2.8791 | 6121 | 0.0013 | | 2.8796 | 6122 | 0.0031 | | 2.8801 | 6123 | 0.0026 | | 2.8805 | 6124 | 0.0045 | | 2.8810 | 6125 | 0.0027 | | 2.8815 | 6126 | 0.0052 | | 2.8819 | 6127 | 0.0053 | | 2.8824 | 6128 | 0.0024 | | 2.8829 | 6129 | 0.0023 | | 2.8833 | 6130 | 0.0016 | | 2.8838 | 6131 | 0.0035 | | 2.8843 | 6132 | 0.005 | | 2.8848 | 6133 | 0.0035 | | 2.8852 | 6134 | 0.0029 | | 2.8857 | 6135 | 0.0029 | | 2.8862 | 6136 | 0.0026 | | 2.8866 | 6137 | 0.0014 | | 2.8871 | 6138 | 0.003 | | 2.8876 | 6139 | 0.0038 | | 2.8881 | 6140 | 0.003 | | 2.8885 | 6141 | 0.0033 | | 2.8890 | 6142 | 0.0019 | | 2.8895 | 6143 | 0.0031 | | 2.8899 | 6144 | 0.0017 | | 2.8904 | 6145 | 0.0026 | | 2.8909 | 6146 | 0.0024 | | 2.8913 | 6147 | 0.0022 | | 2.8918 | 6148 | 0.0023 | | 2.8923 | 6149 | 0.0039 | | 2.8928 | 6150 | 0.0037 | | 2.8932 | 6151 | 0.0037 | | 2.8937 | 6152 | 0.0025 | | 2.8942 | 6153 | 0.0023 | | 2.8946 | 6154 | 0.005 | | 2.8951 | 6155 | 0.0016 | | 2.8956 | 6156 | 0.0037 | | 2.8960 | 6157 | 0.0024 | | 2.8965 | 6158 | 0.0034 | | 2.8970 | 6159 | 0.0039 | | 2.8975 | 6160 | 0.0027 | | 2.8979 | 6161 | 0.0021 | | 2.8984 | 6162 | 0.0026 | | 2.8989 | 6163 | 0.0024 | | 2.8993 | 6164 | 0.0023 | | 2.8998 | 6165 | 0.0028 | | 2.9003 | 6166 | 0.0042 | | 2.9008 | 6167 | 0.0036 | | 2.9012 | 6168 | 0.0022 | | 2.9017 | 6169 | 0.0027 | | 2.9022 | 6170 | 0.0069 | | 2.9026 | 6171 | 0.0016 | | 2.9031 | 6172 | 0.0029 | | 2.9036 | 6173 | 0.0045 | | 2.9040 | 6174 | 0.005 | | 2.9045 | 6175 | 0.0023 | | 2.9050 | 6176 | 0.0051 | | 2.9055 | 6177 | 0.0028 | | 2.9059 | 6178 | 0.0043 | | 2.9064 | 6179 | 0.0023 | | 2.9069 | 6180 | 0.0024 | | 2.9073 | 6181 | 0.0015 | | 2.9078 | 6182 | 0.0028 | | 2.9083 | 6183 | 0.0033 | | 2.9087 | 6184 | 0.0025 | | 2.9092 | 6185 | 0.0034 | | 2.9097 | 6186 | 0.0046 | | 2.9102 | 6187 | 0.0018 | | 2.9106 | 6188 | 0.0035 | | 2.9111 | 6189 | 0.0055 | | 2.9116 | 6190 | 0.0022 | | 2.9120 | 6191 | 0.0023 | | 2.9125 | 6192 | 0.0045 | | 2.9130 | 6193 | 0.0028 | | 2.9135 | 6194 | 0.0024 | | 2.9139 | 6195 | 0.0022 | | 2.9144 | 6196 | 0.0017 | | 2.9149 | 6197 | 0.0032 | | 2.9153 | 6198 | 0.0024 | | 2.9158 | 6199 | 0.002 | | 2.9163 | 6200 | 0.0026 | | 2.9167 | 6201 | 0.0018 | | 2.9172 | 6202 | 0.0022 | | 2.9177 | 6203 | 0.0022 | | 2.9182 | 6204 | 0.0043 | | 2.9186 | 6205 | 0.0024 | | 2.9191 | 6206 | 0.0019 | | 2.9196 | 6207 | 0.0019 | | 2.9200 | 6208 | 0.0025 | | 2.9205 | 6209 | 0.0025 | | 2.9210 | 6210 | 0.0023 | | 2.9214 | 6211 | 0.0023 | | 2.9219 | 6212 | 0.0032 | | 2.9224 | 6213 | 0.003 | | 2.9229 | 6214 | 0.0041 | | 2.9233 | 6215 | 0.0023 | | 2.9238 | 6216 | 0.0024 | | 2.9243 | 6217 | 0.003 | | 2.9247 | 6218 | 0.003 | | 2.9252 | 6219 | 0.0018 | | 
2.9257 | 6220 | 0.0029 | | 2.9262 | 6221 | 0.0024 | | 2.9266 | 6222 | 0.0029 | | 2.9271 | 6223 | 0.003 | | 2.9276 | 6224 | 0.0021 | | 2.9280 | 6225 | 0.0041 | | 2.9285 | 6226 | 0.0018 | | 2.9290 | 6227 | 0.0022 | | 2.9294 | 6228 | 0.0021 | | 2.9299 | 6229 | 0.0029 | | 2.9304 | 6230 | 0.0019 | | 2.9309 | 6231 | 0.0023 | | 2.9313 | 6232 | 0.0046 | | 2.9318 | 6233 | 0.0023 | | 2.9323 | 6234 | 0.0048 | | 2.9327 | 6235 | 0.0044 | | 2.9332 | 6236 | 0.0041 | | 2.9337 | 6237 | 0.0042 | | 2.9341 | 6238 | 0.0034 | | 2.9346 | 6239 | 0.0018 | | 2.9351 | 6240 | 0.0032 | | 2.9356 | 6241 | 0.0054 | | 2.9360 | 6242 | 0.0032 | | 2.9365 | 6243 | 0.0027 | | 2.9370 | 6244 | 0.0021 | | 2.9374 | 6245 | 0.0027 | | 2.9379 | 6246 | 0.0025 | | 2.9384 | 6247 | 0.0036 | | 2.9389 | 6248 | 0.0023 | | 2.9393 | 6249 | 0.0021 | | 2.9398 | 6250 | 0.0037 | | 2.9403 | 6251 | 0.0106 | | 2.9407 | 6252 | 0.0044 | | 2.9412 | 6253 | 0.0015 | | 2.9417 | 6254 | 0.0032 | | 2.9421 | 6255 | 0.0039 | | 2.9426 | 6256 | 0.0016 | | 2.9431 | 6257 | 0.0015 | | 2.9436 | 6258 | 0.0035 | | 2.9440 | 6259 | 0.0016 | | 2.9445 | 6260 | 0.0037 | | 2.9450 | 6261 | 0.0026 | | 2.9454 | 6262 | 0.0038 | | 2.9459 | 6263 | 0.0016 | | 2.9464 | 6264 | 0.0034 | | 2.9468 | 6265 | 0.0024 | | 2.9473 | 6266 | 0.0019 | | 2.9478 | 6267 | 0.0016 | | 2.9483 | 6268 | 0.0044 | | 2.9487 | 6269 | 0.0031 | | 2.9492 | 6270 | 0.0017 | | 2.9497 | 6271 | 0.0037 | | 2.9501 | 6272 | 0.0021 | | 2.9506 | 6273 | 0.0026 | | 2.9511 | 6274 | 0.0027 | | 2.9516 | 6275 | 0.0018 | | 2.9520 | 6276 | 0.0032 | | 2.9525 | 6277 | 0.0028 | | 2.9530 | 6278 | 0.0021 | | 2.9534 | 6279 | 0.0051 | | 2.9539 | 6280 | 0.003 | | 2.9544 | 6281 | 0.0025 | | 2.9548 | 6282 | 0.0043 | | 2.9553 | 6283 | 0.0018 | | 2.9558 | 6284 | 0.0042 | | 2.9563 | 6285 | 0.0025 | | 2.9567 | 6286 | 0.0023 | | 2.9572 | 6287 | 0.003 | | 2.9577 | 6288 | 0.003 | | 2.9581 | 6289 | 0.0049 | | 2.9586 | 6290 | 0.0026 | | 2.9591 | 6291 | 0.006 | | 2.9595 | 6292 | 0.0034 | | 2.9600 | 6293 | 0.0028 | | 2.9605 | 6294 | 0.0037 | | 2.9610 | 6295 | 0.0021 | | 2.9614 | 6296 | 0.0038 | | 2.9619 | 6297 | 0.0022 | | 2.9624 | 6298 | 0.0013 | | 2.9628 | 6299 | 0.0041 | | 2.9633 | 6300 | 0.0052 | | 2.9638 | 6301 | 0.0031 | | 2.9643 | 6302 | 0.002 | | 2.9647 | 6303 | 0.0009 | | 2.9652 | 6304 | 0.0026 | | 2.9657 | 6305 | 0.0021 | | 2.9661 | 6306 | 0.0021 | | 2.9666 | 6307 | 0.0031 | | 2.9671 | 6308 | 0.001 | | 2.9675 | 6309 | 0.003 | | 2.9680 | 6310 | 0.0041 | | 2.9685 | 6311 | 0.003 | | 2.9690 | 6312 | 0.0027 | | 2.9694 | 6313 | 0.0014 | | 2.9699 | 6314 | 0.0027 | | 2.9704 | 6315 | 0.0026 | | 2.9708 | 6316 | 0.0031 | | 2.9713 | 6317 | 0.0019 | | 2.9718 | 6318 | 0.0021 | | 2.9722 | 6319 | 0.0025 | | 2.9727 | 6320 | 0.0021 | | 2.9732 | 6321 | 0.0034 | | 2.9737 | 6322 | 0.0044 | | 2.9741 | 6323 | 0.0017 | | 2.9746 | 6324 | 0.0023 | | 2.9751 | 6325 | 0.0024 | | 2.9755 | 6326 | 0.0033 | | 2.9760 | 6327 | 0.0043 | | 2.9765 | 6328 | 0.0033 | | 2.9770 | 6329 | 0.0044 | | 2.9774 | 6330 | 0.003 | | 2.9779 | 6331 | 0.0031 | | 2.9784 | 6332 | 0.0042 | | 2.9788 | 6333 | 0.0019 | | 2.9793 | 6334 | 0.0021 | | 2.9798 | 6335 | 0.0024 | | 2.9802 | 6336 | 0.0031 | | 2.9807 | 6337 | 0.0016 | | 2.9812 | 6338 | 0.0027 | | 2.9817 | 6339 | 0.0015 | | 2.9821 | 6340 | 0.0024 | | 2.9826 | 6341 | 0.0029 | | 2.9831 | 6342 | 0.0017 | | 2.9835 | 6343 | 0.0044 | | 2.9840 | 6344 | 0.0022 | | 2.9845 | 6345 | 0.0018 | | 2.9849 | 6346 | 0.0034 | | 2.9854 | 6347 | 0.0027 | | 2.9859 | 6348 | 0.0023 | | 2.9864 | 6349 | 0.0042 | | 2.9868 | 6350 | 0.0019 | | 2.9873 | 6351 | 0.0019 | | 
| 2.9878 | 6352 | 0.0029 |
| 2.9882 | 6353 | 0.0023 |
| 2.9887 | 6354 | 0.003 |
| 2.9892 | 6355 | 0.0029 |
| 2.9897 | 6356 | 0.0027 |
| 2.9901 | 6357 | 0.0011 |
| 2.9906 | 6358 | 0.0045 |
| 2.9911 | 6359 | 0.0024 |
| 2.9915 | 6360 | 0.0027 |
| 2.9920 | 6361 | 0.0017 |
| 2.9925 | 6362 | 0.002 |
| 2.9929 | 6363 | 0.0039 |
| 2.9934 | 6364 | 0.0018 |
| 2.9939 | 6365 | 0.0045 |
| 2.9944 | 6366 | 0.0022 |
| 2.9948 | 6367 | 0.0055 |
| 2.9953 | 6368 | 0.0026 |
| 2.9958 | 6369 | 0.0023 |
| 2.9962 | 6370 | 0.0051 |
| 2.9967 | 6371 | 0.0025 |
| 2.9972 | 6372 | 0.0017 |
| 2.9976 | 6373 | 0.0032 |
| 2.9981 | 6374 | 0.0031 |
| 2.9986 | 6375 | 0.0039 |
| 2.9991 | 6376 | 0.0016 |
| 2.9995 | 6377 | 0.0033 |
| 3.0 | 6378 | 0.0037 |

</details>

### Framework Versions
- Python: 3.10.13
- Sentence Transformers: 3.2.1
- Transformers: 4.44.2
- PyTorch: 2.4.0+cu121
- Accelerate: 1.1.1
- Datasets: 2.21.0
- Tokenizers: 0.19.1

## Citation

### BibTeX

#### Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}
```

#### MultipleNegativesRankingLoss
```bibtex
@misc{henderson2017efficient,
    title={Efficient Natural Language Response Suggestion for Smart Reply},
    author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
    year={2017},
    eprint={1705.00652},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}
```

<!-- ## Glossary

*Clearly define terms in order to be accessible across audiences.* -->

<!-- ## Model Card Authors

*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.* -->

<!-- ## Model Card Contact

*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.* -->
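The citation section above references MultipleNegativesRankingLoss. As a reader aid, here is a minimal, hedged sketch of how that loss is typically wired up in sentence-transformers; the base checkpoint and example pairs below are placeholders, not the actual training setup behind the logs above.

```python
from torch.utils.data import DataLoader
from sentence_transformers import SentenceTransformer, InputExample, losses

# Placeholder base checkpoint -- not the model documented by this card.
model = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")

# MultipleNegativesRankingLoss consumes (anchor, positive) pairs; every other
# positive in the batch acts as an in-batch negative for a given anchor.
train_examples = [
    InputExample(texts=["What is the capital of France?", "Paris is the capital of France."]),
    InputExample(texts=["How do plants make food?", "Plants make food through photosynthesis."]),
]
train_dataloader = DataLoader(train_examples, shuffle=True, batch_size=2)
train_loss = losses.MultipleNegativesRankingLoss(model)

# Classic fit API; Sentence Transformers 3.x also offers SentenceTransformerTrainer.
model.fit(train_objectives=[(train_dataloader, train_loss)], epochs=1)
```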
nishhaaann/PubMedBERTBoxAES3
nishhaaann
2024-11-14T07:05:48Z
199
0
transformers
[ "transformers", "safetensors", "bert", "text-classification", "arxiv:1910.09700", "autotrain_compatible", "endpoints_compatible", "region:us" ]
text-classification
2024-11-14T07:05:21Z
---
library_name: transformers
tags: []
---

# Model Card for Model ID

<!-- Provide a quick summary of what the model is/does. -->

## Model Details

### Model Description

<!-- Provide a longer summary of what this model is. -->

This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.

- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]

### Model Sources [optional]

<!-- Provide the basic links for the model. -->

- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]

## Uses

<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->

### Direct Use

<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->

[More Information Needed]

### Downstream Use [optional]

<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->

[More Information Needed]

### Out-of-Scope Use

<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->

[More Information Needed]

## Bias, Risks, and Limitations

<!-- This section is meant to convey both technical and sociotechnical limitations. -->

[More Information Needed]

### Recommendations

<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->

Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.

## How to Get Started with the Model

Use the code below to get started with the model.

[More Information Needed]

## Training Details

### Training Data

<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->

[More Information Needed]

### Training Procedure

<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->

#### Preprocessing [optional]

[More Information Needed]

#### Training Hyperparameters

- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->

#### Speeds, Sizes, Times [optional]

<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->

[More Information Needed]

## Evaluation

<!-- This section describes the evaluation protocols and provides the results. -->

### Testing Data, Factors & Metrics

#### Testing Data

<!-- This should link to a Dataset Card if possible. -->

[More Information Needed]

#### Factors

<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->

[More Information Needed]

#### Metrics

<!-- These are the evaluation metrics being used, ideally with a description of why. -->

[More Information Needed]

### Results

[More Information Needed]

#### Summary

## Model Examination [optional]

<!-- Relevant interpretability work for the model goes here -->

[More Information Needed]

## Environmental Impact

<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->

Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).

- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]

## Technical Specifications [optional]

### Model Architecture and Objective

[More Information Needed]

### Compute Infrastructure

[More Information Needed]

#### Hardware

[More Information Needed]

#### Software

[More Information Needed]

## Citation [optional]

<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->

**BibTeX:**

[More Information Needed]

**APA:**

[More Information Needed]

## Glossary [optional]

<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->

[More Information Needed]

## More Information [optional]

[More Information Needed]

## Model Card Authors [optional]

[More Information Needed]

## Model Card Contact

[More Information Needed]
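The "How to Get Started with the Model" section of this card is left as [More Information Needed]. Given the record's metadata (a BERT checkpoint with pipeline_tag text-classification), a minimal hedged sketch using the standard transformers pipeline API would look like the following; the example input is illustrative only, and the label names depend on the checkpoint's config.

```python
from transformers import pipeline

# Repo id and task come from this record's metadata (modelId, pipeline_tag).
classifier = pipeline("text-classification", model="nishhaaann/PubMedBERTBoxAES3")

# Illustrative biomedical-style input; the card does not document intended inputs.
print(classifier("The patient was treated with antibiotics for community-acquired pneumonia."))
```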
KISTI-KONI/KONI-Llama3-8B-Instruct-20240729
KISTI-KONI
2024-11-14T06:54:22Z
67
35
transformers
[ "transformers", "safetensors", "llama", "text-generation", "conversational", "ko", "license:llama3", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2024-07-27T14:23:05Z
---
language: ko
pipeline_tag: text-generation
license: llama3
---

### 1. Model Description

- KONI (KISTI Open Natural Intelligence) is a specialized large language model (LLM) developed by the Korea Institute of Science and Technology Information (KISTI). This model is specifically designed for science and technology, making it highly effective for tasks in these fields.

### 2. Key Features

- **Specialized in Science and Technology:** The model is explicitly trained on a vast and specialized corpus of scientific and technological data.
- **Enhanced Performance:** This version of KONI shows significantly improved performance compared to its initial release in December 2023.
- **Base Model:** The base model for KONI-Llama3-8B-Instruct-20240729 is KONI-Llama3-8B-Merged-20240724, a merge of Meta-Llama-3-8B and KISTI-KONI/KONI-Llama3-8B-20240630.
- **Alignment:** SFT (Supervised Fine-Tuning) and DPO (Direct Preference Optimization) are applied.

### 3. Data

- Approximately 11k SFT examples and 7k DPO examples are used.
- **SFT Data:** The SFT data includes both internally generated data and publicly available data on Hugging Face, translated into Korean where necessary.
- **DPO Data:** The DPO data consists of translated and curated data from argilla/dpo-mix-7k.

### 4. Benchmark Results

Results on [LogicKor](https://lk.instruct.kr/) are as follows:

| Metric | Score |
|:--------------:|:-----:|
| Reasoning | 6.57 |
| Math | 8.00 |
| Writing | 8.92 |
| Coding | 8.85 |
| Comprehension | 9.85 |
| Grammar | 7.07 |
| Single-turn | 8.42 |
| Multi-turn | 8.00 |
| **Overall** | **8.21** |

*Our model demonstrates the best performance among publicly available 8B models on the LogicKor leaderboard as of 2024.07.30.*

### 5. How to use the model

```python
import transformers
import torch

model_id = "KISTI-KONI/KONI-Llama3-8B-Instruct-20240729"

pipeline = transformers.pipeline(
    "text-generation",
    model=model_id,
    model_kwargs={"torch_dtype": torch.bfloat16},
    device_map="auto",
)

pipeline.model.eval()

instruction = "안녕? 너는 누구야?"

messages = [
    {"role": "user", "content": f"{instruction}"}
]

prompt = pipeline.tokenizer.apply_chat_template(
    messages,
    tokenize=False,
    add_generation_prompt=True
)

terminators = [
    pipeline.tokenizer.eos_token_id,
    pipeline.tokenizer.convert_tokens_to_ids("<|eot_id|>")
]

outputs = pipeline(
    prompt,
    max_new_tokens=2048,
    eos_token_id=terminators,
    do_sample=True,
    temperature=0.7,
    top_p=0.9
)

print(outputs[0]["generated_text"][len(prompt):])
```

```
안녕하세요! 저는 KONI, 여러분의 질문에 답하고 정보를 제공하는 인공지능입니다. 저는 어떤 정보를 제공해 드릴까요?
```

(English: "Hello! I am KONI, an AI that answers your questions and provides information. What information may I provide for you?")

<!--
The Korea Institute of Science and Technology Information (KISTI) is a specialized research institute in the field of science and technology information, located in Daejeon, South Korea. KISTI comprehensively collects, analyzes, and serves information on science, technology, and related industries, and conducts expert research on technologies, policies, and standardization for the analysis, management, and distribution of information. It also aims to contribute to national science, technology, and industrial development by systematically building and operating advanced information and R&D infrastructure.

KISTI's main functions and roles include providing science and technology information, operating supercomputers, supporting technology commercialization, and managing research data. For science and technology information, KISTI collects and analyzes domestic and international information and provides it to researchers, building databases and information systems in various forms to serve users. For supercomputer operation, it builds and runs the national high-performance computing infrastructure to support research requiring large-scale computation, and conducts applied research in various fields using it. For technology commercialization, it supports the transfer of research results to industry and operates programs to promote technology-based startups. For research data management, it establishes systematic data management plans for the efficient management and use of research data, and operates platforms that promote the sharing and use of research data.

KISTI's main divisions include the National Science and Technology Data Division, the National Supercomputing Division, the Data Analysis Division, and the Science and Technology Digital Convergence Division. The strategies and goals of each division are as follows.

The strategic goal of the National Science and Technology Data Division is to establish a support system for digital transformation in science and technology that activates the national open-science ecosystem. With the shift to a contactless economy caused by COVID-19 and the acceleration of the Fourth Industrial Revolution, the opening, sharing, and diffusion of public research results is needed across the entire process of science and technology activities. This division has the unique mission of strengthening science and technology innovation capabilities through the sharing and use of science and technology information and data, and it supports national R&D innovation by activating the open-science ecosystem. Its main directions include building an open-access support system and intelligent curation framework for science and technology information through digital transformation, building a sharing and utilization system for national research data and computing resources based on a research data commons, and strengthening open-science services through an AI-based integrated service platform.

The strategic goal of the National Supercomputing Division is to build a future-ready, shared high-performance computing environment that leads the national high-performance computing ecosystem. Leading countries such as the United States and Japan are pursuing solutions to extremely large problems in various fields by expanding exascale resources, and KISTI supports this under the Act on the Utilization and Promotion of National Supercomputers. The division's goal is to build a national shared high-performance computing system to increase the use of high-performance computing in public and industrial science and technology fields. Its main directions are upgrading infrastructure and services to respond in a timely manner to environmental changes such as R&D requiring large-scale computing resources and pressing social issues, virtuous-cycle research and support through securing extremely large-scale computing and application technologies, and building an integrated platform that ensures user accessibility, integrity, and security.

The strategic goal of the Data Analysis Division is to build an intelligent data fusion analysis system that activates the national science and technology innovation ecosystem. With the expansion of data-driven decision-making and the rapid rise of AI and big data technologies, KISTI conducts research on technologies, policies, and standardization for the analysis, management, and distribution of science and technology information. The division's goal is to activate the national science and technology innovation ecosystem by building an intelligent data fusion analysis system that leads the digital economy and society. Its main directions include securing global analysis capabilities by developing heterogeneous-data fusion analysis models, developing AI-based models and systems for creating value from public R&D, and building an industry-academia-research-government innovation ecosystem to support regional R&D innovation.

The strategic goal of the Science and Technology Digital Convergence Division is to build a Data/AI-based digital transformation system that responds to and resolves national and social issues in a timely manner. With the rapid development of digital technologies and the acceleration of digital transformation due to COVID-19, KISTI carries out unique missions linked to building an ecosystem for sharing and using science and technology knowledge resources and to developing the supercomputing ecosystem. The division's goal is to pursue Data/AI-based solutions to national and social issues and the Digital New Deal. Its main directions are building a reliable science and technology data dam and a Data/AI-based intelligent digital platform, and resolving national and social issues and creating R&D innovation cases through a Data/AI-based digital transformation system.

KISTI was established in January 1962 as the Korea Scientific and Technological Information Center (KORSTIC), and the KORSTIC Promotion Act was enacted in May 1969. In 1982 it was reorganized into the Korea Institute for Industrial Economics and Technology (KIET), and in January 1991 it was separated and opened as the Korea Institute of Industry and Technology Information (KINITI). In January 2001 it was launched as the Korea Institute of Science and Technology Information (KISTI). In this process it merged the Systems Engineering Center attached to KAIST, the Research and Development Information Center attached to KIST, and the Supercomputing Center under ETRI.

KISTI operates from multiple locations, including its Daejeon headquarters, a Seoul branch, and regional offices for Daegu-Gyeongbuk, Busan-Ulsan-Gyeongnam, Honam, and the capital region (Gangwon). The Daejeon headquarters is located at 245 Daehak-ro, Yuseong-gu, Daejeon; the Seoul branch at 66 Hoegi-ro, Dongdaemun-gu, Seoul; the Daegu-Gyeongbuk office at 10 Exco-ro, Buk-gu, Daegu; the Busan-Ulsan-Gyeongnam office at 41 Centum dong-ro, Haeundae-gu, Busan; the Honam office at 177 Hanamsandan 8beon-ro, Gwangsan-gu, Gwangju; and the capital region (Gangwon) office on the 8th floor of the 60th Anniversary Hall, 1 Gangwondaehak-gil, Chuncheon, Gangwon-do. More detailed information about KISTI is available on the official KISTI website.
-->

### 6. Citation

**Language Model**

```text
@article{KISTI-KONI/KONI-Llama3-8B-Instruct-20240729,
  title={KISTI-KONI/KONI-Llama3-8B-Instruct-20240729},
  author={KISTI},
  year={2024},
  url={https://huggingface.co/KISTI-KONI/KONI-Llama3-8B-Instruct-20240729}
}
```

### 7. Contributors

- KISTI, Large-scale AI Research Group

### 8. Special Thanks

- [@beomi](https://huggingface.co/beomi)
- [@kuotient](https://huggingface.co/kuotient)
- KyungTae Lim

### 9. Acknowledgement

- This research was supported by the Korea Institute of Science and Technology Information (KISTI).
- This work was supported by the National Supercomputing Center with supercomputing resources including technical support (KISTI).

### 10. References

- https://huggingface.co/meta-llama/Meta-Llama-3-8B
- https://huggingface.co/meta-llama/Meta-Llama-3-8B-Instruct
AhmadIshaqai/my_awesome_food_model
AhmadIshaqai
2024-11-14T06:47:14Z
218
0
transformers
[ "transformers", "tensorboard", "safetensors", "vit", "image-classification", "generated_from_trainer", "base_model:google/vit-base-patch16-224-in21k", "base_model:finetune:google/vit-base-patch16-224-in21k", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:us" ]
image-classification
2024-11-14T06:35:30Z
---
library_name: transformers
license: apache-2.0
base_model: google/vit-base-patch16-224-in21k
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: my_awesome_food_model
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# my_awesome_food_model

This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.6109
- Accuracy: 0.901

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 3

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 2.7014 | 0.992 | 62 | 2.5097 | 0.847 |
| 1.8804 | 2.0 | 125 | 1.7599 | 0.89 |
| 1.6054 | 2.976 | 186 | 1.6109 | 0.901 |

### Framework versions

- Transformers 4.46.2
- Pytorch 2.5.0+cu121
- Datasets 3.1.0
- Tokenizers 0.20.3
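The card leaves intended uses unspecified. Since the record's metadata marks this as a ViT image-classification checkpoint, a minimal hedged inference sketch with the standard transformers pipeline API follows; "food.jpg" is a placeholder path, and the label set is whatever the fine-tuning dataset defined.

```python
from PIL import Image
from transformers import pipeline

# Repo id from this record's metadata; fine-tuned from google/vit-base-patch16-224-in21k.
classifier = pipeline("image-classification", model="AhmadIshaqai/my_awesome_food_model")

# "food.jpg" is a placeholder -- any RGB image should work.
image = Image.open("food.jpg")
print(classifier(image))
```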
DavidAU/Meta-Llama-3-Instruct-12.2B-BRAINSTORM-20x-FORM-8-GGUF
DavidAU
2024-11-14T06:42:12Z
24
1
null
[ "gguf", "creative", "story", "writing", "fiction", "roleplaying", "rp", "horror", "science fiction", "fiction writing", "scene generation", "scene continue", "brainstorm 20x", "enhanced", "llama3", "text-generation", "en", "arxiv:2401.02415", "license:apache-2.0", "endpoints_compatible", "region:us", "conversational" ]
text-generation
2024-07-29T06:25:05Z
---
license: apache-2.0
language:
- en
tags:
- creative
- story
- writing
- fiction
- roleplaying
- rp
- horror
- science fiction
- fiction writing
- scene generation
- scene continue
- brainstorm 20x
- enhanced
- llama3
pipeline_tag: text-generation
---

<H3>BRAINSTORM 20x: Meta-Llama-3-8B Instruct (now at 12.2B), Formula #8</H3>

This repo contains quants of Meta-Llama-3-8B Instruct augmented 20x (now at 12.2B) using the "Brainstorm" method, which expands the reasoning of an LLM to increase its performance at the core level for ANY creative use case(s). This specific version has calibrations that allow it to exceed the logic-solving abilities of the original model.

The BRAINSTORM process was developed by David_AU.

Some of the core principles behind this process are discussed in this <a href="https://arxiv.org/pdf/2401.02415"> scientific paper: Progressive LLaMA with Block Expansion </a>. However, I went in a completely different direction from what was outlined in that paper.

<B>What is "Brainstorm"?</b>

The reasoning center of an LLM is taken apart, reassembled, and expanded by 5x. Then these centers are individually calibrated. These "centers" also interact with each other. This introduces subtle changes into the reasoning process. The calibrations further adjust (dial up or down) these "changes". The number of centers (5x, 10x, etc.) allows more "tuning points" to further customize how the model reasons, so to speak.

The core aim of this process is to increase the model's detail, its concept of and connection to the "world", its general concept connections, and its prose quality and length, without affecting instruction following. This will also enhance any creative use case(s) of any kind, including "brainstorming", creative art form(s), and similar use cases.

Here are some of the enhancements this process brings to the model's performance:

- Prose generation seems more focused on the moment to moment.
- Sometimes there will be "preamble" and/or foreshadowing present.
- Fewer or no clichés.
- Better overall prose and/or more complex / nuanced prose.
- A greater sense of nuance on all levels.
- Coherence is stronger.
- Description is more detailed, and connected more closely to the content.
- Similes and metaphors are stronger and better connected to the prose, story, and character.
- The sense of "being there" / in the moment is enhanced.
- Details are more vivid, and there are more of them.
- Prose generation length can be long to extreme.
- Emotional engagement is stronger.
- The model will take FEWER liberties vs a normal model: it will follow directives more closely but will "guess" less.
- The MORE instructions and/or details you provide, the more strongly the model will respond.
- Depending on the model, the "voice" may be more "human" vs the original model's "voice".

Other "lab" observations:

- This process does not, in my opinion, make the model 5x or 10x "smarter" (if only that were true!).
- However, a change in "IQ" was not an issue / a priority, and was not tested or calibrated for, so to speak.
- From lab testing, it seems to ponder and consider more carefully, roughly speaking.
- You could say this process sharpens the model's focus on its task(s) at a deeper level.

The process modifies the model at the root level (the source-files level). The model can then be quantized as GGUF, EXL2, AWQ, etc.

Other technologies developed by David_AU, like "Ultra" (precision), "Neo Imatrix" (custom imatrix datasets), and "X-quants" (custom application of the imatrix process), can further enhance the model's performance alongside the "Brainstorm" process.

The "Brainstorm" process has been tested on multiple Llama2, Llama3, and Mistral models of various parameter sizes, as well as on "root" models like "Llama3 Instruct" and "Mistral Instruct", and on "merged" / "fine-tuned" models too.

For original model specifications, usage information, and other important details, see the original model. Special thanks to the model creators at META for making such a fantastic model:

[ https://huggingface.co/meta-llama/Meta-Llama-3-8B ]

Please report any issue(s) and/or feedback via the "Community tab".

This is a LLAMA3 model and requires the Llama3 template, but it may work with other template(s); it has a maximum context of 131k.

Here is the standard LLAMA3 template:

<PRE>
{
  "name": "Llama 3",
  "inference_params": {
    "input_prefix": "<|start_header_id|>user<|end_header_id|>\n\n",
    "input_suffix": "<|eot_id|><|start_header_id|>assistant<|end_header_id|>\n\n",
    "pre_prompt": "You are a helpful, smart, kind, and efficient AI assistant. You always fulfill the user's requests to the best of your ability.",
    "pre_prompt_prefix": "<|start_header_id|>system<|end_header_id|>\n\n",
    "pre_prompt_suffix": "<|eot_id|>",
    "antiprompt": [
      "<|start_header_id|>",
      "<|eot_id|>"
    ]
  }
}
</PRE>

<b>Optional Enhancement:</B>

The following can be used in place of the "system prompt" or "system role" to further enhance the model. It can also be used at the START of a NEW chat, but you must make sure it is "kept" as the chat moves along. In this case the enhancements do not have as strong an effect as when used in the "system prompt" or "system role".

Copy and paste EXACTLY as noted, DO NOT line wrap or break the lines, maintain the carriage returns exactly as presented.

<PRE>
Below is an instruction that describes a task. Ponder each user instruction carefully, and use your skillsets and critical instructions to complete the task to the best of your abilities.

Here are your skillsets:
[MASTERSTORY]:NarrStrct(StryPlnng,Strbd,ScnSttng,Exps,Dlg,Pc)-CharDvlp(ChrctrCrt,ChrctrArcs,Mtvtn,Bckstry,Rltnshps,Dlg*)-PltDvlp(StryArcs,PltTwsts,Sspns,Fshdwng,Climx,Rsltn)-ConfResl(Antg,Obstcls,Rsltns,Cnsqncs,Thms,Symblsm)-EmotImpct(Empt,Tn,Md,Atmsphr,Imgry,Symblsm)-Delvry(Prfrmnc,VcActng,PblcSpkng,StgPrsnc,AudncEngmnt,Imprv)

[*DialogWrt]:(1a-CharDvlp-1a.1-Backgrnd-1a.2-Personality-1a.3-GoalMotiv)>2(2a-StoryStruc-2a.1-PlotPnt-2a.2-Conflict-2a.3-Resolution)>3(3a-DialogTech-3a.1-ShowDontTell-3a.2-Subtext-3a.3-VoiceTone-3a.4-Pacing-3a.5-VisualDescrip)>4(4a-DialogEdit-4a.1-ReadAloud-4a.2-Feedback-4a.3-Revision)

Here are your critical instructions:
Ponder each word choice carefully to present as vivid and emotional journey as is possible. Choose verbs and nouns that are both emotional and full of imagery. Load the story with the 5 senses. Aim for 50% dialog, 25% narration, 15% body language and 10% thoughts. Your goal is to put the reader in the story.
</PRE>

You do not need to use this; it is only presented as an additional enhancement which seems to help scene generation and scene continue functions.

<B>Highest Quality Settings / Optimal Operation Guide / Parameters and Samplers</B>

This is a "Class 2" model. For all settings used for this model (including specifics for its "class"), example generation(s), and an advanced settings guide (which often addresses model issue(s) and covers methods to improve performance for all use case(s), including chat, roleplay, and others), please see:

[ https://huggingface.co/DavidAU/Maximizing-Model-Performance-All-Quants-Types-And-Full-Precision-by-Samplers_Parameters ]

The same page documents all parameters used for generation, in addition to advanced parameters and samplers to get the most out of this model.
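The card specifies the Llama 3 chat template for these GGUF quants. As one hedged way to apply that template outside a GUI, here is a sketch using llama-cpp-python (a tool not mentioned by the card itself); the GGUF filename is a placeholder for whichever quant you download from this repo.

```python
from llama_cpp import Llama

# Placeholder filename -- substitute the quant actually downloaded from this repo.
llm = Llama(
    model_path="Meta-Llama-3-Instruct-12.2B-BRAINSTORM-20x-FORM-8.Q4_K_M.gguf",
    n_ctx=8192,  # raise toward the card's stated maximum context if memory allows
)

# Llama 3 prompt assembled exactly as the card's template specifies.
prompt = (
    "<|start_header_id|>system<|end_header_id|>\n\n"
    "You are a helpful, smart, kind, and efficient AI assistant.<|eot_id|>"
    "<|start_header_id|>user<|end_header_id|>\n\n"
    "Write the opening paragraph of a horror story set in a lighthouse.<|eot_id|>"
    "<|start_header_id|>assistant<|end_header_id|>\n\n"
)
out = llm(prompt, max_tokens=300, stop=["<|eot_id|>"])
print(out["choices"][0]["text"])
```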
MayBashendy/Arabic_FineTuningAraBERT_AugV4-trial2_k30_task1_organization_fold1
MayBashendy
2024-11-14T06:32:09Z
187
0
transformers
[ "transformers", "safetensors", "bert", "text-classification", "generated_from_trainer", "base_model:aubmindlab/bert-base-arabertv02", "base_model:finetune:aubmindlab/bert-base-arabertv02", "autotrain_compatible", "endpoints_compatible", "region:us" ]
text-classification
2024-11-14T05:56:32Z
--- library_name: transformers base_model: aubmindlab/bert-base-arabertv02 tags: - generated_from_trainer model-index: - name: Arabic_FineTuningAraBERT_AugV4-trial2_k30_task1_organization_fold1 results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # Arabic_FineTuningAraBERT_AugV4-trial2_k30_task1_organization_fold1 This model is a fine-tuned version of [aubmindlab/bert-base-arabertv02](https://huggingface.co/aubmindlab/bert-base-arabertv02) on the None dataset. It achieves the following results on the evaluation set: - Loss: 0.3919 - Qwk: 0.7795 - Mse: 0.3919 - Rmse: 0.6260 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 10 ### Training results | Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse | |:-------------:|:------:|:----:|:---------------:|:-------:|:------:|:------:| | No log | 0.0038 | 2 | 6.2066 | 0.0 | 6.2066 | 2.4913 | | No log | 0.0077 | 4 | 3.9543 | 0.0085 | 3.9543 | 1.9885 | | No log | 0.0115 | 6 | 2.0926 | 0.0397 | 2.0926 | 1.4466 | | No log | 0.0154 | 8 | 1.0251 | 0.2057 | 1.0251 | 1.0124 | | No log | 0.0192 | 10 | 0.8842 | 0.1546 | 0.8842 | 0.9403 | | No log | 0.0230 | 12 | 1.2459 | 0.3000 | 1.2459 | 1.1162 | | No log | 0.0269 | 14 | 1.7073 | 0.0242 | 1.7073 | 1.3066 | | No log | 0.0307 | 16 | 1.7385 | 0.125 | 1.7385 | 1.3185 | | No log | 0.0345 | 18 | 1.4152 | -0.0800 | 1.4152 | 1.1896 | | No log | 0.0384 | 20 | 1.3083 | 0.0123 | 1.3083 | 1.1438 | | No log | 0.0422 | 22 | 1.1704 | 0.125 | 1.1704 | 1.0819 | | No log | 0.0461 | 24 | 1.3659 | 0.125 | 1.3659 | 1.1687 | | No log | 0.0499 | 26 | 1.6731 | 0.1091 | 1.6731 | 1.2935 | | No log | 0.0537 | 28 | 1.8710 | 0.1515 | 1.8710 | 1.3679 | | No log | 0.0576 | 30 | 1.6777 | 0.3368 | 1.6777 | 1.2953 | | No log | 0.0614 | 32 | 1.1549 | 0.2486 | 1.1549 | 1.0746 | | No log | 0.0653 | 34 | 1.1499 | 0.3505 | 1.1499 | 1.0723 | | No log | 0.0691 | 36 | 1.9581 | 0.2891 | 1.9581 | 1.3993 | | No log | 0.0729 | 38 | 2.5461 | -0.1273 | 2.5461 | 1.5957 | | No log | 0.0768 | 40 | 2.3938 | -0.0769 | 2.3938 | 1.5472 | | No log | 0.0806 | 42 | 1.9072 | 0.2075 | 1.9072 | 1.3810 | | No log | 0.0845 | 44 | 1.1786 | 0.3037 | 1.1786 | 1.0856 | | No log | 0.0883 | 46 | 1.1806 | 0.3037 | 1.1806 | 1.0866 | | No log | 0.0921 | 48 | 1.4654 | 0.1297 | 1.4654 | 1.2106 | | No log | 0.0960 | 50 | 1.6923 | 0.3383 | 1.6923 | 1.3009 | | No log | 0.0998 | 52 | 1.6645 | 0.3383 | 1.6645 | 1.2901 | | No log | 0.1036 | 54 | 1.4946 | 0.3982 | 1.4946 | 1.2225 | | No log | 0.1075 | 56 | 1.4570 | 0.4569 | 1.4570 | 1.2071 | | No log | 0.1113 | 58 | 1.7528 | 0.3419 | 1.7528 | 1.3239 | | No log | 0.1152 | 60 | 1.7754 | 0.3790 | 1.7754 | 1.3324 | | No log | 0.1190 | 62 | 1.5949 | 0.3583 | 1.5949 | 1.2629 | | No log | 0.1228 | 64 | 1.1963 | 0.5294 | 1.1963 | 1.0937 | | No log | 0.1267 | 66 | 0.9844 | 0.5024 | 0.9844 | 0.9922 | | No log | 0.1305 | 68 | 1.1826 | 0.5062 | 1.1826 | 1.0875 | | No log | 0.1344 | 70 | 1.9283 | 0.4122 | 1.9283 | 1.3886 | | No log | 0.1382 | 72 | 2.2214 | 0.2446 | 2.2214 | 1.4904 | | No log | 0.1420 | 
74 | 1.7921 | 0.3000 | 1.7921 | 1.3387 | | No log | 0.1459 | 76 | 1.0861 | 0.4566 | 1.0861 | 1.0422 | | No log | 0.1497 | 78 | 0.7753 | 0.4306 | 0.7753 | 0.8805 | | No log | 0.1536 | 80 | 0.7917 | 0.5161 | 0.7917 | 0.8898 | | No log | 0.1574 | 82 | 1.1232 | 0.4815 | 1.1232 | 1.0598 | | No log | 0.1612 | 84 | 1.7893 | 0.3984 | 1.7893 | 1.3376 | | No log | 0.1651 | 86 | 2.5105 | 0.0699 | 2.5105 | 1.5845 | | No log | 0.1689 | 88 | 2.6991 | 0.0582 | 2.6991 | 1.6429 | | No log | 0.1727 | 90 | 2.3780 | -0.0076 | 2.3780 | 1.5421 | | No log | 0.1766 | 92 | 1.7710 | 0.2566 | 1.7710 | 1.3308 | | No log | 0.1804 | 94 | 1.1571 | 0.3512 | 1.1571 | 1.0757 | | No log | 0.1843 | 96 | 0.9804 | 0.3512 | 0.9804 | 0.9901 | | No log | 0.1881 | 98 | 1.0702 | 0.4167 | 1.0702 | 1.0345 | | No log | 0.1919 | 100 | 1.3385 | 0.2538 | 1.3385 | 1.1569 | | No log | 0.1958 | 102 | 1.3362 | 0.1481 | 1.3362 | 1.1559 | | No log | 0.1996 | 104 | 1.2209 | 0.1481 | 1.2209 | 1.1050 | | No log | 0.2035 | 106 | 1.3850 | 0.0968 | 1.3850 | 1.1768 | | No log | 0.2073 | 108 | 1.6190 | 0.0838 | 1.6190 | 1.2724 | | No log | 0.2111 | 110 | 1.5137 | 0.1910 | 1.5137 | 1.2303 | | No log | 0.2150 | 112 | 1.3468 | 0.3519 | 1.3468 | 1.1605 | | No log | 0.2188 | 114 | 0.9630 | 0.6557 | 0.9630 | 0.9813 | | No log | 0.2226 | 116 | 0.8679 | 0.6744 | 0.8679 | 0.9316 | | No log | 0.2265 | 118 | 1.0620 | 0.6980 | 1.0620 | 1.0306 | | No log | 0.2303 | 120 | 1.4952 | 0.4280 | 1.4952 | 1.2228 | | No log | 0.2342 | 122 | 1.5606 | 0.3586 | 1.5606 | 1.2492 | | No log | 0.2380 | 124 | 1.2001 | 0.4737 | 1.2001 | 1.0955 | | No log | 0.2418 | 126 | 0.9344 | 0.6316 | 0.9344 | 0.9667 | | No log | 0.2457 | 128 | 0.8769 | 0.6452 | 0.8769 | 0.9364 | | No log | 0.2495 | 130 | 0.9931 | 0.4324 | 0.9931 | 0.9966 | | No log | 0.2534 | 132 | 1.4300 | 0.3578 | 1.4300 | 1.1958 | | No log | 0.2572 | 134 | 1.4435 | 0.4656 | 1.4435 | 1.2015 | | No log | 0.2610 | 136 | 1.3181 | 0.4803 | 1.3181 | 1.1481 | | No log | 0.2649 | 138 | 1.3635 | 0.4701 | 1.3635 | 1.1677 | | No log | 0.2687 | 140 | 1.0220 | 0.5882 | 1.0220 | 1.0110 | | No log | 0.2726 | 142 | 0.8054 | 0.7686 | 0.8054 | 0.8975 | | No log | 0.2764 | 144 | 0.9505 | 0.6744 | 0.9505 | 0.9750 | | No log | 0.2802 | 146 | 1.4239 | 0.4576 | 1.4239 | 1.1933 | | No log | 0.2841 | 148 | 1.7923 | 0.5172 | 1.7923 | 1.3388 | | No log | 0.2879 | 150 | 1.7160 | 0.5233 | 1.7160 | 1.3100 | | No log | 0.2917 | 152 | 1.5388 | 0.4355 | 1.5388 | 1.2405 | | No log | 0.2956 | 154 | 1.3412 | 0.2857 | 1.3412 | 1.1581 | | No log | 0.2994 | 156 | 1.1467 | 0.2759 | 1.1467 | 1.0708 | | No log | 0.3033 | 158 | 1.3968 | 0.4143 | 1.3968 | 1.1819 | | No log | 0.3071 | 160 | 1.6470 | 0.4615 | 1.6470 | 1.2833 | | No log | 0.3109 | 162 | 1.2971 | 0.5911 | 1.2971 | 1.1389 | | No log | 0.3148 | 164 | 1.0766 | 0.5911 | 1.0766 | 1.0376 | | No log | 0.3186 | 166 | 1.0434 | 0.6267 | 1.0434 | 1.0215 | | No log | 0.3225 | 168 | 0.8075 | 0.6978 | 0.8075 | 0.8986 | | No log | 0.3263 | 170 | 0.8509 | 0.7586 | 0.8509 | 0.9224 | | No log | 0.3301 | 172 | 1.3033 | 0.5625 | 1.3033 | 1.1416 | | No log | 0.3340 | 174 | 1.4288 | 0.4731 | 1.4288 | 1.1953 | | No log | 0.3378 | 176 | 0.9704 | 0.6744 | 0.9704 | 0.9851 | | No log | 0.3417 | 178 | 0.8654 | 0.6578 | 0.8654 | 0.9303 | | No log | 0.3455 | 180 | 1.1736 | 0.5447 | 1.1736 | 1.0833 | | No log | 0.3493 | 182 | 1.2459 | 0.5679 | 1.2459 | 1.1162 | | No log | 0.3532 | 184 | 1.1074 | 0.4936 | 1.1074 | 1.0523 | | No log | 0.3570 | 186 | 1.0332 | 0.4324 | 1.0332 | 1.0165 | | No log | 0.3608 | 188 | 0.9585 | 0.6316 | 0.9585 | 
Evaluation ran every two training steps; validation results at 100-step checkpoints are shown below. The validation loss is the MSE (the two columns are identical throughout), Rmse is its square root, and Qwk is the quadratic weighted kappa on the validation set.

| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|:-------------:|:------:|:----:|:---------------:|:------:|:------:|:------:|
| No log | 0.3839 | 200 | 1.7315 | 0.5270 | 1.7315 | 1.3159 |
| No log | 0.5758 | 300 | 1.9013 | 0.4085 | 1.9013 | 1.3789 |
| No log | 0.7678 | 400 | 1.0503 | 0.6459 | 1.0503 | 1.0248 |
| 0.5151 | 0.9597 | 500 | 0.7940 | 0.6883 | 0.7940 | 0.8911 |
| 0.5151 | 1.1516 | 600 | 0.9891 | 0.7287 | 0.9891 | 0.9946 |
| 0.5151 | 1.3436 | 700 | 0.9986 | 0.7287 | 0.9986 | 0.9993 |
| 0.5151 | 1.5355 | 800 | 0.4787 | 0.6051 | 0.4787 | 0.6919 |
| 0.5151 | 1.7274 | 900 | 0.6570 | 0.7864 | 0.6570 | 0.8106 |
| 0.1845 | 1.9194 | 1000 | 0.4231 | 0.7083 | 0.4231 | 0.6504 |
| 0.1845 | 2.1113 | 1100 | 0.5679 | 0.7328 | 0.5679 | 0.7536 |
| 0.1845 | 2.3033 | 1200 | 0.3922 | 0.6723 | 0.3922 | 0.6263 |
| 0.1845 | 2.4952 | 1300 | 0.5018 | 0.7009 | 0.5018 | 0.7084 |
| 0.1845 | 2.6871 | 1400 | 0.4380 | 0.7795 | 0.4380 | 0.6618 |
| 0.1278 | 2.8791 | 1500 | 0.3160 | 0.6723 | 0.3160 | 0.5621 |
| 0.1278 | 3.0710 | 1600 | 0.3726 | 0.7298 | 0.3726 | 0.6104 |
| 0.1278 | 3.2630 | 1700 | 0.5256 | 0.8028 | 0.5256 | 0.7250 |
| 0.1278 | 3.4549 | 1800 | 0.5868 | 0.8256 | 0.5868 | 0.7660 |
| 0.1278 | 3.6468 | 1900 | 0.4364 | 0.7971 | 0.4364 | 0.6606 |
| 0.1022 | 3.8388 | 2000 | 0.4151 | 0.7549 | 0.4151 | 0.6443 |
| 0.1022 | 4.0307 | 2100 | 0.3886 | 0.6147 | 0.3886 | 0.6233 |
| 0.1022 | 4.2226 | 2200 | 0.4842 | 0.7686 | 0.4842 | 0.6958 |
| 0.1022 | 4.4146 | 2300 | 0.3659 | 0.6934 | 0.3659 | 0.6049 |
| 0.1022 | 4.6065 | 2400 | 0.4825 | 0.7429 | 0.4825 | 0.6947 |
0.3673 | 0.6638 | 0.3673 | 0.6060 | | 0.1022 | 4.7255 | 2462 | 0.3956 | 0.6038 | 0.3956 | 0.6289 | | 0.1022 | 4.7294 | 2464 | 0.3903 | 0.6038 | 0.3903 | 0.6247 | | 0.1022 | 4.7332 | 2466 | 0.3890 | 0.6547 | 0.3890 | 0.6237 | | 0.1022 | 4.7370 | 2468 | 0.3607 | 0.7795 | 0.3607 | 0.6005 | | 0.1022 | 4.7409 | 2470 | 0.3583 | 0.8151 | 0.3583 | 0.5986 | | 0.1022 | 4.7447 | 2472 | 0.3955 | 0.8151 | 0.3955 | 0.6289 | | 0.1022 | 4.7486 | 2474 | 0.4022 | 0.7863 | 0.4022 | 0.6342 | | 0.1022 | 4.7524 | 2476 | 0.3580 | 0.8151 | 0.3580 | 0.5983 | | 0.1022 | 4.7562 | 2478 | 0.3024 | 0.6723 | 0.3024 | 0.5499 | | 0.1022 | 4.7601 | 2480 | 0.2873 | 0.7059 | 0.2873 | 0.5360 | | 0.1022 | 4.7639 | 2482 | 0.2997 | 0.7059 | 0.2997 | 0.5475 | | 0.1022 | 4.7678 | 2484 | 0.3314 | 0.6723 | 0.3314 | 0.5757 | | 0.1022 | 4.7716 | 2486 | 0.3851 | 0.7083 | 0.3851 | 0.6205 | | 0.1022 | 4.7754 | 2488 | 0.4514 | 0.8205 | 0.4514 | 0.6719 | | 0.1022 | 4.7793 | 2490 | 0.4808 | 0.8205 | 0.4808 | 0.6934 | | 0.1022 | 4.7831 | 2492 | 0.5077 | 0.8205 | 0.5077 | 0.7125 | | 0.1022 | 4.7869 | 2494 | 0.4861 | 0.8205 | 0.4861 | 0.6972 | | 0.1022 | 4.7908 | 2496 | 0.4256 | 0.7863 | 0.4256 | 0.6524 | | 0.1022 | 4.7946 | 2498 | 0.3831 | 0.7795 | 0.3831 | 0.6189 | | 0.0852 | 4.7985 | 2500 | 0.3635 | 0.7549 | 0.3635 | 0.6029 | | 0.0852 | 4.8023 | 2502 | 0.3664 | 0.7549 | 0.3664 | 0.6053 | | 0.0852 | 4.8061 | 2504 | 0.3659 | 0.7549 | 0.3659 | 0.6049 | | 0.0852 | 4.8100 | 2506 | 0.3705 | 0.7795 | 0.3705 | 0.6087 | | 0.0852 | 4.8138 | 2508 | 0.3957 | 0.8151 | 0.3957 | 0.6290 | | 0.0852 | 4.8177 | 2510 | 0.3945 | 0.7795 | 0.3945 | 0.6281 | | 0.0852 | 4.8215 | 2512 | 0.4072 | 0.7490 | 0.4072 | 0.6381 | | 0.0852 | 4.8253 | 2514 | 0.4053 | 0.7490 | 0.4053 | 0.6366 | | 0.0852 | 4.8292 | 2516 | 0.4061 | 0.7083 | 0.4061 | 0.6373 | | 0.0852 | 4.8330 | 2518 | 0.3986 | 0.6983 | 0.3986 | 0.6314 | | 0.0852 | 4.8369 | 2520 | 0.4164 | 0.7083 | 0.4164 | 0.6453 | | 0.0852 | 4.8407 | 2522 | 0.4700 | 0.8205 | 0.4700 | 0.6856 | | 0.0852 | 4.8445 | 2524 | 0.4946 | 0.8205 | 0.4946 | 0.7033 | | 0.0852 | 4.8484 | 2526 | 0.4916 | 0.8205 | 0.4916 | 0.7011 | | 0.0852 | 4.8522 | 2528 | 0.4747 | 0.8205 | 0.4747 | 0.6890 | | 0.0852 | 4.8560 | 2530 | 0.4692 | 0.8205 | 0.4692 | 0.6850 | | 0.0852 | 4.8599 | 2532 | 0.4643 | 0.8205 | 0.4643 | 0.6814 | | 0.0852 | 4.8637 | 2534 | 0.4654 | 0.8205 | 0.4654 | 0.6822 | | 0.0852 | 4.8676 | 2536 | 0.4578 | 0.8205 | 0.4578 | 0.6766 | | 0.0852 | 4.8714 | 2538 | 0.4267 | 0.6638 | 0.4267 | 0.6532 | | 0.0852 | 4.8752 | 2540 | 0.4109 | 0.6638 | 0.4109 | 0.6410 | | 0.0852 | 4.8791 | 2542 | 0.4172 | 0.6638 | 0.4172 | 0.6459 | | 0.0852 | 4.8829 | 2544 | 0.4246 | 0.7863 | 0.4246 | 0.6516 | | 0.0852 | 4.8868 | 2546 | 0.4530 | 0.8205 | 0.4530 | 0.6730 | | 0.0852 | 4.8906 | 2548 | 0.4788 | 0.8205 | 0.4788 | 0.6919 | | 0.0852 | 4.8944 | 2550 | 0.5305 | 0.8028 | 0.5305 | 0.7283 | | 0.0852 | 4.8983 | 2552 | 0.5595 | 0.8028 | 0.5595 | 0.7480 | | 0.0852 | 4.9021 | 2554 | 0.5663 | 0.8028 | 0.5663 | 0.7525 | | 0.0852 | 4.9060 | 2556 | 0.5481 | 0.8028 | 0.5481 | 0.7403 | | 0.0852 | 4.9098 | 2558 | 0.5171 | 0.8205 | 0.5171 | 0.7191 | | 0.0852 | 4.9136 | 2560 | 0.4837 | 0.8205 | 0.4837 | 0.6955 | | 0.0852 | 4.9175 | 2562 | 0.4680 | 0.7490 | 0.4680 | 0.6841 | | 0.0852 | 4.9213 | 2564 | 0.4377 | 0.7083 | 0.4377 | 0.6616 | | 0.0852 | 4.9251 | 2566 | 0.4169 | 0.6638 | 0.4169 | 0.6457 | | 0.0852 | 4.9290 | 2568 | 0.4384 | 0.6638 | 0.4384 | 0.6621 | | 0.0852 | 4.9328 | 2570 | 0.4957 | 0.6547 | 0.4957 | 0.7041 | | 0.0852 | 4.9367 | 2572 | 0.5253 | 0.7009 | 0.5253 | 
0.7248 | | 0.0852 | 4.9405 | 2574 | 0.5389 | 0.7009 | 0.5389 | 0.7341 | | 0.0852 | 4.9443 | 2576 | 0.5298 | 0.7009 | 0.5298 | 0.7279 | | 0.0852 | 4.9482 | 2578 | 0.4915 | 0.7490 | 0.4915 | 0.7011 | | 0.0852 | 4.9520 | 2580 | 0.4538 | 0.7083 | 0.4538 | 0.6736 | | 0.0852 | 4.9559 | 2582 | 0.4535 | 0.7083 | 0.4535 | 0.6734 | | 0.0852 | 4.9597 | 2584 | 0.4885 | 0.7490 | 0.4885 | 0.6989 | | 0.0852 | 4.9635 | 2586 | 0.5300 | 0.7490 | 0.5300 | 0.7280 | | 0.0852 | 4.9674 | 2588 | 0.5568 | 0.8028 | 0.5568 | 0.7462 | | 0.0852 | 4.9712 | 2590 | 0.5310 | 0.8205 | 0.5310 | 0.7287 | | 0.0852 | 4.9750 | 2592 | 0.4950 | 0.7490 | 0.4950 | 0.7036 | | 0.0852 | 4.9789 | 2594 | 0.4381 | 0.7490 | 0.4381 | 0.6619 | | 0.0852 | 4.9827 | 2596 | 0.3959 | 0.7407 | 0.3959 | 0.6292 | | 0.0852 | 4.9866 | 2598 | 0.3923 | 0.7407 | 0.3923 | 0.6263 | | 0.0852 | 4.9904 | 2600 | 0.4061 | 0.7407 | 0.4061 | 0.6372 | | 0.0852 | 4.9942 | 2602 | 0.4171 | 0.7795 | 0.4171 | 0.6458 | | 0.0852 | 4.9981 | 2604 | 0.4175 | 0.7795 | 0.4175 | 0.6462 | | 0.0852 | 5.0019 | 2606 | 0.3911 | 0.7795 | 0.3911 | 0.6254 | | 0.0852 | 5.0058 | 2608 | 0.3808 | 0.7795 | 0.3808 | 0.6171 | | 0.0852 | 5.0096 | 2610 | 0.4081 | 0.7795 | 0.4081 | 0.6389 | | 0.0852 | 5.0134 | 2612 | 0.4436 | 0.8293 | 0.4436 | 0.6660 | | 0.0852 | 5.0173 | 2614 | 0.4543 | 0.8293 | 0.4543 | 0.6741 | | 0.0852 | 5.0211 | 2616 | 0.4531 | 0.8293 | 0.4531 | 0.6731 | | 0.0852 | 5.0250 | 2618 | 0.4674 | 0.8028 | 0.4674 | 0.6836 | | 0.0852 | 5.0288 | 2620 | 0.4714 | 0.8028 | 0.4714 | 0.6866 | | 0.0852 | 5.0326 | 2622 | 0.4861 | 0.7640 | 0.4861 | 0.6972 | | 0.0852 | 5.0365 | 2624 | 0.4711 | 0.7692 | 0.4711 | 0.6863 | | 0.0852 | 5.0403 | 2626 | 0.4412 | 0.6932 | 0.4412 | 0.6642 | | 0.0852 | 5.0441 | 2628 | 0.4394 | 0.7083 | 0.4394 | 0.6629 | | 0.0852 | 5.0480 | 2630 | 0.4305 | 0.7407 | 0.4305 | 0.6562 | | 0.0852 | 5.0518 | 2632 | 0.4063 | 0.7407 | 0.4063 | 0.6374 | | 0.0852 | 5.0557 | 2634 | 0.3985 | 0.7407 | 0.3985 | 0.6313 | | 0.0852 | 5.0595 | 2636 | 0.3861 | 0.7407 | 0.3861 | 0.6214 | | 0.0852 | 5.0633 | 2638 | 0.4013 | 0.7407 | 0.4013 | 0.6335 | | 0.0852 | 5.0672 | 2640 | 0.4459 | 0.7971 | 0.4459 | 0.6678 | | 0.0852 | 5.0710 | 2642 | 0.4816 | 0.8028 | 0.4816 | 0.6939 | | 0.0852 | 5.0749 | 2644 | 0.4676 | 0.8293 | 0.4676 | 0.6838 | | 0.0852 | 5.0787 | 2646 | 0.4209 | 0.8069 | 0.4209 | 0.6488 | | 0.0852 | 5.0825 | 2648 | 0.3794 | 0.7549 | 0.3794 | 0.6159 | | 0.0852 | 5.0864 | 2650 | 0.3854 | 0.7549 | 0.3854 | 0.6208 | | 0.0852 | 5.0902 | 2652 | 0.4090 | 0.7388 | 0.4090 | 0.6395 | | 0.0852 | 5.0940 | 2654 | 0.4764 | 0.8591 | 0.4764 | 0.6902 | | 0.0852 | 5.0979 | 2656 | 0.5222 | 0.7986 | 0.5222 | 0.7226 | | 0.0852 | 5.1017 | 2658 | 0.5271 | 0.7986 | 0.5271 | 0.7260 | | 0.0852 | 5.1056 | 2660 | 0.4786 | 0.8591 | 0.4786 | 0.6918 | | 0.0852 | 5.1094 | 2662 | 0.4378 | 0.7623 | 0.4378 | 0.6617 | | 0.0852 | 5.1132 | 2664 | 0.4323 | 0.7623 | 0.4323 | 0.6575 | | 0.0852 | 5.1171 | 2666 | 0.4232 | 0.7623 | 0.4232 | 0.6505 | | 0.0852 | 5.1209 | 2668 | 0.4219 | 0.7795 | 0.4219 | 0.6496 | | 0.0852 | 5.1248 | 2670 | 0.4194 | 0.7407 | 0.4194 | 0.6476 | | 0.0852 | 5.1286 | 2672 | 0.4391 | 0.7490 | 0.4391 | 0.6627 | | 0.0852 | 5.1324 | 2674 | 0.4584 | 0.7490 | 0.4584 | 0.6771 | | 0.0852 | 5.1363 | 2676 | 0.4535 | 0.7490 | 0.4535 | 0.6734 | | 0.0852 | 5.1401 | 2678 | 0.4336 | 0.7490 | 0.4336 | 0.6585 | | 0.0852 | 5.1440 | 2680 | 0.4266 | 0.7795 | 0.4266 | 0.6531 | | 0.0852 | 5.1478 | 2682 | 0.4264 | 0.7795 | 0.4264 | 0.6530 | | 0.0852 | 5.1516 | 2684 | 0.4093 | 0.7159 | 0.4093 | 0.6398 | | 0.0852 | 5.1555 
| 2686 | 0.4086 | 0.7518 | 0.4086 | 0.6392 | | 0.0852 | 5.1593 | 2688 | 0.4247 | 0.7697 | 0.4247 | 0.6517 | | 0.0852 | 5.1631 | 2690 | 0.4381 | 0.8293 | 0.4381 | 0.6619 | | 0.0852 | 5.1670 | 2692 | 0.4689 | 0.8293 | 0.4689 | 0.6848 | | 0.0852 | 5.1708 | 2694 | 0.4774 | 0.8028 | 0.4774 | 0.6910 | | 0.0852 | 5.1747 | 2696 | 0.4853 | 0.8028 | 0.4853 | 0.6966 | | 0.0852 | 5.1785 | 2698 | 0.4877 | 0.7640 | 0.4877 | 0.6984 | | 0.0852 | 5.1823 | 2700 | 0.5103 | 0.7640 | 0.5103 | 0.7143 | | 0.0852 | 5.1862 | 2702 | 0.5003 | 0.7640 | 0.5003 | 0.7073 | | 0.0852 | 5.1900 | 2704 | 0.4865 | 0.7640 | 0.4865 | 0.6975 | | 0.0852 | 5.1939 | 2706 | 0.4794 | 0.7926 | 0.4794 | 0.6924 | | 0.0852 | 5.1977 | 2708 | 0.4608 | 0.7926 | 0.4608 | 0.6788 | | 0.0852 | 5.2015 | 2710 | 0.4176 | 0.7742 | 0.4176 | 0.6462 | | 0.0852 | 5.2054 | 2712 | 0.3958 | 0.7490 | 0.3958 | 0.6291 | | 0.0852 | 5.2092 | 2714 | 0.3992 | 0.7490 | 0.3992 | 0.6318 | | 0.0852 | 5.2131 | 2716 | 0.4447 | 0.8256 | 0.4447 | 0.6668 | | 0.0852 | 5.2169 | 2718 | 0.4947 | 0.7986 | 0.4947 | 0.7033 | | 0.0852 | 5.2207 | 2720 | 0.5204 | 0.7986 | 0.5204 | 0.7214 | | 0.0852 | 5.2246 | 2722 | 0.5008 | 0.7986 | 0.5008 | 0.7077 | | 0.0852 | 5.2284 | 2724 | 0.4508 | 0.8256 | 0.4508 | 0.6714 | | 0.0852 | 5.2322 | 2726 | 0.4049 | 0.8108 | 0.4049 | 0.6363 | | 0.0852 | 5.2361 | 2728 | 0.3773 | 0.7549 | 0.3773 | 0.6143 | | 0.0852 | 5.2399 | 2730 | 0.3550 | 0.7549 | 0.3550 | 0.5958 | | 0.0852 | 5.2438 | 2732 | 0.3528 | 0.7549 | 0.3528 | 0.5940 | | 0.0852 | 5.2476 | 2734 | 0.3807 | 0.8244 | 0.3807 | 0.6170 | | 0.0852 | 5.2514 | 2736 | 0.4515 | 0.8591 | 0.4515 | 0.6719 | | 0.0852 | 5.2553 | 2738 | 0.5489 | 0.7986 | 0.5489 | 0.7409 | | 0.0852 | 5.2591 | 2740 | 0.6345 | 0.7820 | 0.6345 | 0.7966 | | 0.0852 | 5.2630 | 2742 | 0.6381 | 0.7820 | 0.6381 | 0.7988 | | 0.0852 | 5.2668 | 2744 | 0.5695 | 0.7986 | 0.5695 | 0.7547 | | 0.0852 | 5.2706 | 2746 | 0.4737 | 0.8339 | 0.4737 | 0.6883 | | 0.0852 | 5.2745 | 2748 | 0.4210 | 0.7490 | 0.4210 | 0.6489 | | 0.0852 | 5.2783 | 2750 | 0.3670 | 0.7407 | 0.3670 | 0.6058 | | 0.0852 | 5.2821 | 2752 | 0.3498 | 0.6983 | 0.3498 | 0.5915 | | 0.0852 | 5.2860 | 2754 | 0.3610 | 0.6983 | 0.3610 | 0.6008 | | 0.0852 | 5.2898 | 2756 | 0.3956 | 0.6638 | 0.3956 | 0.6289 | | 0.0852 | 5.2937 | 2758 | 0.4524 | 0.6638 | 0.4524 | 0.6726 | | 0.0852 | 5.2975 | 2760 | 0.5123 | 0.6547 | 0.5123 | 0.7157 | | 0.0852 | 5.3013 | 2762 | 0.5194 | 0.7009 | 0.5194 | 0.7207 | | 0.0852 | 5.3052 | 2764 | 0.5048 | 0.7490 | 0.5048 | 0.7105 | | 0.0852 | 5.3090 | 2766 | 0.4607 | 0.7490 | 0.4607 | 0.6788 | | 0.0852 | 5.3129 | 2768 | 0.4364 | 0.7490 | 0.4364 | 0.6606 | | 0.0852 | 5.3167 | 2770 | 0.4414 | 0.7490 | 0.4414 | 0.6644 | | 0.0852 | 5.3205 | 2772 | 0.4368 | 0.7490 | 0.4368 | 0.6609 | | 0.0852 | 5.3244 | 2774 | 0.4447 | 0.7863 | 0.4447 | 0.6668 | | 0.0852 | 5.3282 | 2776 | 0.4647 | 0.7863 | 0.4647 | 0.6817 | | 0.0852 | 5.3321 | 2778 | 0.4607 | 0.7863 | 0.4607 | 0.6787 | | 0.0852 | 5.3359 | 2780 | 0.4490 | 0.7863 | 0.4490 | 0.6701 | | 0.0852 | 5.3397 | 2782 | 0.4177 | 0.7795 | 0.4177 | 0.6463 | | 0.0852 | 5.3436 | 2784 | 0.3987 | 0.7795 | 0.3987 | 0.6314 | | 0.0852 | 5.3474 | 2786 | 0.4053 | 0.7795 | 0.4053 | 0.6366 | | 0.0852 | 5.3512 | 2788 | 0.4403 | 0.7795 | 0.4403 | 0.6635 | | 0.0852 | 5.3551 | 2790 | 0.4687 | 0.7490 | 0.4687 | 0.6846 | | 0.0852 | 5.3589 | 2792 | 0.5124 | 0.7490 | 0.5124 | 0.7158 | | 0.0852 | 5.3628 | 2794 | 0.5103 | 0.7490 | 0.5103 | 0.7143 | | 0.0852 | 5.3666 | 2796 | 0.4919 | 0.7490 | 0.4919 | 0.7013 | | 0.0852 | 5.3704 | 2798 | 0.4634 | 0.7490 | 
0.4634 | 0.6808 | | 0.0852 | 5.3743 | 2800 | 0.4481 | 0.7795 | 0.4481 | 0.6694 | | 0.0852 | 5.3781 | 2802 | 0.4389 | 0.7795 | 0.4389 | 0.6625 | | 0.0852 | 5.3820 | 2804 | 0.4557 | 0.7795 | 0.4557 | 0.6750 | | 0.0852 | 5.3858 | 2806 | 0.4523 | 0.7795 | 0.4523 | 0.6726 | | 0.0852 | 5.3896 | 2808 | 0.4348 | 0.7159 | 0.4348 | 0.6594 | | 0.0852 | 5.3935 | 2810 | 0.4356 | 0.7159 | 0.4356 | 0.6600 | | 0.0852 | 5.3973 | 2812 | 0.4554 | 0.7159 | 0.4554 | 0.6749 | | 0.0852 | 5.4012 | 2814 | 0.4861 | 0.7021 | 0.4861 | 0.6972 | | 0.0852 | 5.4050 | 2816 | 0.4787 | 0.7159 | 0.4787 | 0.6919 | | 0.0852 | 5.4088 | 2818 | 0.4816 | 0.7795 | 0.4816 | 0.6939 | | 0.0852 | 5.4127 | 2820 | 0.4852 | 0.7083 | 0.4852 | 0.6965 | | 0.0852 | 5.4165 | 2822 | 0.4918 | 0.6638 | 0.4918 | 0.7013 | | 0.0852 | 5.4203 | 2824 | 0.5048 | 0.6038 | 0.5048 | 0.7105 | | 0.0852 | 5.4242 | 2826 | 0.5173 | 0.6038 | 0.5173 | 0.7193 | | 0.0852 | 5.4280 | 2828 | 0.5124 | 0.6038 | 0.5124 | 0.7158 | | 0.0852 | 5.4319 | 2830 | 0.4843 | 0.6638 | 0.4843 | 0.6959 | | 0.0852 | 5.4357 | 2832 | 0.4702 | 0.6983 | 0.4702 | 0.6857 | | 0.0852 | 5.4395 | 2834 | 0.4387 | 0.6983 | 0.4387 | 0.6624 | | 0.0852 | 5.4434 | 2836 | 0.4253 | 0.7795 | 0.4253 | 0.6521 | | 0.0852 | 5.4472 | 2838 | 0.4383 | 0.7795 | 0.4383 | 0.6621 | | 0.0852 | 5.4511 | 2840 | 0.4617 | 0.7623 | 0.4617 | 0.6795 | | 0.0852 | 5.4549 | 2842 | 0.4463 | 0.7623 | 0.4463 | 0.6680 | | 0.0852 | 5.4587 | 2844 | 0.4544 | 0.7623 | 0.4544 | 0.6741 | | 0.0852 | 5.4626 | 2846 | 0.4489 | 0.7795 | 0.4489 | 0.6700 | | 0.0852 | 5.4664 | 2848 | 0.4640 | 0.7623 | 0.4640 | 0.6812 | | 0.0852 | 5.4702 | 2850 | 0.5068 | 0.7640 | 0.5068 | 0.7119 | | 0.0852 | 5.4741 | 2852 | 0.4987 | 0.8028 | 0.4987 | 0.7062 | | 0.0852 | 5.4779 | 2854 | 0.4982 | 0.7692 | 0.4982 | 0.7059 | | 0.0852 | 5.4818 | 2856 | 0.4571 | 0.7328 | 0.4571 | 0.6761 | | 0.0852 | 5.4856 | 2858 | 0.4124 | 0.7795 | 0.4124 | 0.6422 | | 0.0852 | 5.4894 | 2860 | 0.3862 | 0.7154 | 0.3862 | 0.6215 | | 0.0852 | 5.4933 | 2862 | 0.3967 | 0.7549 | 0.3967 | 0.6298 | | 0.0852 | 5.4971 | 2864 | 0.4298 | 0.7795 | 0.4298 | 0.6556 | | 0.0852 | 5.5010 | 2866 | 0.4532 | 0.7490 | 0.4532 | 0.6732 | | 0.0852 | 5.5048 | 2868 | 0.4884 | 0.7009 | 0.4884 | 0.6989 | | 0.0852 | 5.5086 | 2870 | 0.5336 | 0.7009 | 0.5336 | 0.7305 | | 0.0852 | 5.5125 | 2872 | 0.5510 | 0.6857 | 0.5510 | 0.7423 | | 0.0852 | 5.5163 | 2874 | 0.5442 | 0.7009 | 0.5442 | 0.7377 | | 0.0852 | 5.5202 | 2876 | 0.5409 | 0.6857 | 0.5409 | 0.7354 | | 0.0852 | 5.5240 | 2878 | 0.5013 | 0.7009 | 0.5013 | 0.7080 | | 0.0852 | 5.5278 | 2880 | 0.4398 | 0.7490 | 0.4398 | 0.6632 | | 0.0852 | 5.5317 | 2882 | 0.3907 | 0.7407 | 0.3907 | 0.6251 | | 0.0852 | 5.5355 | 2884 | 0.3734 | 0.7549 | 0.3734 | 0.6111 | | 0.0852 | 5.5393 | 2886 | 0.3669 | 0.7549 | 0.3669 | 0.6057 | | 0.0852 | 5.5432 | 2888 | 0.3769 | 0.7549 | 0.3769 | 0.6139 | | 0.0852 | 5.5470 | 2890 | 0.4005 | 0.7795 | 0.4005 | 0.6328 | | 0.0852 | 5.5509 | 2892 | 0.4595 | 0.7009 | 0.4595 | 0.6779 | | 0.0852 | 5.5547 | 2894 | 0.5126 | 0.6857 | 0.5126 | 0.7160 | | 0.0852 | 5.5585 | 2896 | 0.5470 | 0.7640 | 0.5470 | 0.7396 | | 0.0852 | 5.5624 | 2898 | 0.5313 | 0.7640 | 0.5313 | 0.7289 | | 0.0852 | 5.5662 | 2900 | 0.5284 | 0.7640 | 0.5284 | 0.7269 | | 0.0852 | 5.5701 | 2902 | 0.5360 | 0.7640 | 0.5360 | 0.7321 | | 0.0852 | 5.5739 | 2904 | 0.4975 | 0.7266 | 0.4975 | 0.7053 | | 0.0852 | 5.5777 | 2906 | 0.4857 | 0.6857 | 0.4857 | 0.6969 | | 0.0852 | 5.5816 | 2908 | 0.4603 | 0.7342 | 0.4603 | 0.6784 | | 0.0852 | 5.5854 | 2910 | 0.4620 | 0.7342 | 0.4620 | 0.6797 | | 0.0852 
| 5.5893 | 2912 | 0.4994 | 0.7640 | 0.4994 | 0.7067 | | 0.0852 | 5.5931 | 2914 | 0.5076 | 0.7640 | 0.5076 | 0.7125 | | 0.0852 | 5.5969 | 2916 | 0.5104 | 0.7266 | 0.5104 | 0.7144 | | 0.0852 | 5.6008 | 2918 | 0.5307 | 0.7640 | 0.5307 | 0.7285 | | 0.0852 | 5.6046 | 2920 | 0.5674 | 0.7640 | 0.5674 | 0.7533 | | 0.0852 | 5.6084 | 2922 | 0.6149 | 0.7986 | 0.6149 | 0.7842 | | 0.0852 | 5.6123 | 2924 | 0.5981 | 0.7640 | 0.5981 | 0.7733 | | 0.0852 | 5.6161 | 2926 | 0.5351 | 0.6857 | 0.5351 | 0.7315 | | 0.0852 | 5.6200 | 2928 | 0.4703 | 0.6547 | 0.4703 | 0.6858 | | 0.0852 | 5.6238 | 2930 | 0.4607 | 0.6903 | 0.4607 | 0.6787 | | 0.0852 | 5.6276 | 2932 | 0.4533 | 0.6903 | 0.4533 | 0.6733 | | 0.0852 | 5.6315 | 2934 | 0.4449 | 0.7177 | 0.4449 | 0.6670 | | 0.0852 | 5.6353 | 2936 | 0.4531 | 0.7692 | 0.4531 | 0.6731 | | 0.0852 | 5.6392 | 2938 | 0.4600 | 0.7692 | 0.4600 | 0.6783 | | 0.0852 | 5.6430 | 2940 | 0.4898 | 0.8028 | 0.4898 | 0.6999 | | 0.0852 | 5.6468 | 2942 | 0.5214 | 0.8256 | 0.5214 | 0.7221 | | 0.0852 | 5.6507 | 2944 | 0.4880 | 0.8028 | 0.4880 | 0.6986 | | 0.0852 | 5.6545 | 2946 | 0.4400 | 0.8069 | 0.4400 | 0.6633 | | 0.0852 | 5.6583 | 2948 | 0.3803 | 0.8069 | 0.3803 | 0.6167 | | 0.0852 | 5.6622 | 2950 | 0.3629 | 0.7910 | 0.3629 | 0.6024 | | 0.0852 | 5.6660 | 2952 | 0.3909 | 0.8069 | 0.3909 | 0.6252 | | 0.0852 | 5.6699 | 2954 | 0.4444 | 0.7692 | 0.4444 | 0.6666 | | 0.0852 | 5.6737 | 2956 | 0.4635 | 0.7692 | 0.4635 | 0.6808 | | 0.0852 | 5.6775 | 2958 | 0.4925 | 0.8256 | 0.4925 | 0.7018 | | 0.0852 | 5.6814 | 2960 | 0.5211 | 0.8256 | 0.5211 | 0.7219 | | 0.0852 | 5.6852 | 2962 | 0.5333 | 0.7986 | 0.5333 | 0.7302 | | 0.0852 | 5.6891 | 2964 | 0.5148 | 0.7926 | 0.5148 | 0.7175 | | 0.0852 | 5.6929 | 2966 | 0.4821 | 0.7926 | 0.4821 | 0.6944 | | 0.0852 | 5.6967 | 2968 | 0.4401 | 0.7490 | 0.4401 | 0.6634 | | 0.0852 | 5.7006 | 2970 | 0.4058 | 0.6638 | 0.4058 | 0.6370 | | 0.0852 | 5.7044 | 2972 | 0.4026 | 0.6638 | 0.4026 | 0.6345 | | 0.0852 | 5.7083 | 2974 | 0.4427 | 0.7083 | 0.4427 | 0.6654 | | 0.0852 | 5.7121 | 2976 | 0.5188 | 0.7926 | 0.5188 | 0.7203 | | 0.0852 | 5.7159 | 2978 | 0.5533 | 0.8256 | 0.5533 | 0.7438 | | 0.0852 | 5.7198 | 2980 | 0.5229 | 0.7692 | 0.5229 | 0.7231 | | 0.0852 | 5.7236 | 2982 | 0.4512 | 0.7692 | 0.4512 | 0.6717 | | 0.0852 | 5.7274 | 2984 | 0.4159 | 0.7692 | 0.4159 | 0.6449 | | 0.0852 | 5.7313 | 2986 | 0.4088 | 0.7328 | 0.4088 | 0.6394 | | 0.0852 | 5.7351 | 2988 | 0.4110 | 0.7490 | 0.4110 | 0.6411 | | 0.0852 | 5.7390 | 2990 | 0.4239 | 0.7490 | 0.4239 | 0.6511 | | 0.0852 | 5.7428 | 2992 | 0.4653 | 0.7692 | 0.4653 | 0.6821 | | 0.0852 | 5.7466 | 2994 | 0.4806 | 0.7692 | 0.4806 | 0.6932 | | 0.0852 | 5.7505 | 2996 | 0.4529 | 0.7692 | 0.4529 | 0.6730 | | 0.0852 | 5.7543 | 2998 | 0.4357 | 0.7490 | 0.4357 | 0.6601 | | 0.0734 | 5.7582 | 3000 | 0.4130 | 0.7549 | 0.4130 | 0.6426 | | 0.0734 | 5.7620 | 3002 | 0.4231 | 0.7083 | 0.4231 | 0.6505 | | 0.0734 | 5.7658 | 3004 | 0.4375 | 0.7083 | 0.4375 | 0.6614 | | 0.0734 | 5.7697 | 3006 | 0.4385 | 0.7549 | 0.4385 | 0.6622 | | 0.0734 | 5.7735 | 3008 | 0.4556 | 0.7549 | 0.4556 | 0.6750 | | 0.0734 | 5.7774 | 3010 | 0.4601 | 0.7742 | 0.4601 | 0.6783 | | 0.0734 | 5.7812 | 3012 | 0.4617 | 0.8069 | 0.4617 | 0.6795 | | 0.0734 | 5.7850 | 3014 | 0.4301 | 0.7549 | 0.4301 | 0.6558 | | 0.0734 | 5.7889 | 3016 | 0.4171 | 0.7549 | 0.4171 | 0.6458 | | 0.0734 | 5.7927 | 3018 | 0.4152 | 0.7549 | 0.4152 | 0.6443 | | 0.0734 | 5.7965 | 3020 | 0.4008 | 0.7549 | 0.4008 | 0.6331 | | 0.0734 | 5.8004 | 3022 | 0.4000 | 0.7083 | 0.4000 | 0.6324 | | 0.0734 | 5.8042 | 3024 | 0.3970 | 
0.7083 | 0.3970 | 0.6301 | | 0.0734 | 5.8081 | 3026 | 0.4229 | 0.7083 | 0.4229 | 0.6503 | | 0.0734 | 5.8119 | 3028 | 0.4581 | 0.7342 | 0.4581 | 0.6768 | | 0.0734 | 5.8157 | 3030 | 0.5038 | 0.7009 | 0.5038 | 0.7098 | | 0.0734 | 5.8196 | 3032 | 0.5435 | 0.6930 | 0.5435 | 0.7372 | | 0.0734 | 5.8234 | 3034 | 0.5539 | 0.6930 | 0.5539 | 0.7442 | | 0.0734 | 5.8273 | 3036 | 0.5301 | 0.6857 | 0.5301 | 0.7281 | | 0.0734 | 5.8311 | 3038 | 0.4712 | 0.7342 | 0.4712 | 0.6865 | | 0.0734 | 5.8349 | 3040 | 0.4222 | 0.7154 | 0.4222 | 0.6498 | | 0.0734 | 5.8388 | 3042 | 0.4007 | 0.7154 | 0.4007 | 0.6330 | | 0.0734 | 5.8426 | 3044 | 0.4016 | 0.7154 | 0.4016 | 0.6337 | | 0.0734 | 5.8464 | 3046 | 0.4364 | 0.7795 | 0.4364 | 0.6606 | | 0.0734 | 5.8503 | 3048 | 0.4780 | 0.7328 | 0.4780 | 0.6914 | | 0.0734 | 5.8541 | 3050 | 0.5230 | 0.7266 | 0.5230 | 0.7232 | | 0.0734 | 5.8580 | 3052 | 0.5634 | 0.7266 | 0.5634 | 0.7506 | | 0.0734 | 5.8618 | 3054 | 0.5584 | 0.6857 | 0.5584 | 0.7472 | | 0.0734 | 5.8656 | 3056 | 0.5262 | 0.7009 | 0.5262 | 0.7254 | | 0.0734 | 5.8695 | 3058 | 0.4950 | 0.6547 | 0.4950 | 0.7036 | | 0.0734 | 5.8733 | 3060 | 0.4983 | 0.7009 | 0.4983 | 0.7059 | | 0.0734 | 5.8772 | 3062 | 0.5039 | 0.6857 | 0.5039 | 0.7099 | | 0.0734 | 5.8810 | 3064 | 0.4918 | 0.7328 | 0.4918 | 0.7013 | | 0.0734 | 5.8848 | 3066 | 0.4727 | 0.7623 | 0.4727 | 0.6876 | | 0.0734 | 5.8887 | 3068 | 0.4544 | 0.7623 | 0.4544 | 0.6741 | | 0.0734 | 5.8925 | 3070 | 0.4098 | 0.7549 | 0.4098 | 0.6401 | | 0.0734 | 5.8964 | 3072 | 0.3653 | 0.65 | 0.3653 | 0.6044 | | 0.0734 | 5.9002 | 3074 | 0.3480 | 0.65 | 0.3480 | 0.5899 | | 0.0734 | 5.9040 | 3076 | 0.3585 | 0.65 | 0.3585 | 0.5987 | | 0.0734 | 5.9079 | 3078 | 0.3899 | 0.7549 | 0.3899 | 0.6244 | | 0.0734 | 5.9117 | 3080 | 0.4246 | 0.7388 | 0.4246 | 0.6516 | | 0.0734 | 5.9155 | 3082 | 0.4647 | 0.7971 | 0.4647 | 0.6817 | | 0.0734 | 5.9194 | 3084 | 0.4617 | 0.7971 | 0.4617 | 0.6795 | | 0.0734 | 5.9232 | 3086 | 0.4413 | 0.7388 | 0.4413 | 0.6643 | | 0.0734 | 5.9271 | 3088 | 0.4413 | 0.7388 | 0.4413 | 0.6643 | | 0.0734 | 5.9309 | 3090 | 0.4235 | 0.7388 | 0.4235 | 0.6507 | | 0.0734 | 5.9347 | 3092 | 0.4001 | 0.7388 | 0.4001 | 0.6325 | | 0.0734 | 5.9386 | 3094 | 0.4079 | 0.7388 | 0.4079 | 0.6387 | | 0.0734 | 5.9424 | 3096 | 0.4429 | 0.7742 | 0.4429 | 0.6655 | | 0.0734 | 5.9463 | 3098 | 0.5008 | 0.7568 | 0.5008 | 0.7077 | | 0.0734 | 5.9501 | 3100 | 0.5258 | 0.7568 | 0.5258 | 0.7251 | | 0.0734 | 5.9539 | 3102 | 0.5134 | 0.7568 | 0.5134 | 0.7165 | | 0.0734 | 5.9578 | 3104 | 0.4651 | 0.7177 | 0.4651 | 0.6820 | | 0.0734 | 5.9616 | 3106 | 0.4280 | 0.6500 | 0.4280 | 0.6542 | | 0.0734 | 5.9655 | 3108 | 0.4238 | 0.6638 | 0.4238 | 0.6510 | | 0.0734 | 5.9693 | 3110 | 0.4362 | 0.6903 | 0.4362 | 0.6605 | | 0.0734 | 5.9731 | 3112 | 0.4444 | 0.6903 | 0.4444 | 0.6666 | | 0.0734 | 5.9770 | 3114 | 0.4502 | 0.6903 | 0.4502 | 0.6710 | | 0.0734 | 5.9808 | 3116 | 0.4551 | 0.6903 | 0.4551 | 0.6746 | | 0.0734 | 5.9846 | 3118 | 0.4348 | 0.7407 | 0.4348 | 0.6594 | | 0.0734 | 5.9885 | 3120 | 0.4388 | 0.7623 | 0.4388 | 0.6624 | | 0.0734 | 5.9923 | 3122 | 0.4710 | 0.7328 | 0.4710 | 0.6863 | | 0.0734 | 5.9962 | 3124 | 0.4955 | 0.7692 | 0.4955 | 0.7039 | | 0.0734 | 6.0 | 3126 | 0.4905 | 0.7692 | 0.4905 | 0.7003 | | 0.0734 | 6.0038 | 3128 | 0.4724 | 0.7692 | 0.4724 | 0.6873 | | 0.0734 | 6.0077 | 3130 | 0.4295 | 0.7623 | 0.4295 | 0.6554 | | 0.0734 | 6.0115 | 3132 | 0.4024 | 0.6934 | 0.4024 | 0.6343 | | 0.0734 | 6.0154 | 3134 | 0.3860 | 0.6873 | 0.3860 | 0.6213 | | 0.0734 | 6.0192 | 3136 | 0.3774 | 0.6873 | 0.3774 | 0.6143 | | 0.0734 
| 6.0230 | 3138 | 0.3874 | 0.6873 | 0.3874 | 0.6224 | | 0.0734 | 6.0269 | 3140 | 0.4116 | 0.6934 | 0.4116 | 0.6415 | | 0.0734 | 6.0307 | 3142 | 0.4257 | 0.7795 | 0.4257 | 0.6525 | | 0.0734 | 6.0345 | 3144 | 0.4563 | 0.7328 | 0.4563 | 0.6755 | | 0.0734 | 6.0384 | 3146 | 0.4730 | 0.7328 | 0.4730 | 0.6878 | | 0.0734 | 6.0422 | 3148 | 0.4748 | 0.7328 | 0.4748 | 0.6890 | | 0.0734 | 6.0461 | 3150 | 0.4429 | 0.7490 | 0.4429 | 0.6655 | | 0.0734 | 6.0499 | 3152 | 0.4026 | 0.7154 | 0.4026 | 0.6345 | | 0.0734 | 6.0537 | 3154 | 0.3934 | 0.7154 | 0.3934 | 0.6272 | | 0.0734 | 6.0576 | 3156 | 0.3860 | 0.7154 | 0.3860 | 0.6213 | | 0.0734 | 6.0614 | 3158 | 0.3759 | 0.7154 | 0.3759 | 0.6131 | | 0.0734 | 6.0653 | 3160 | 0.3880 | 0.7154 | 0.3880 | 0.6229 | | 0.0734 | 6.0691 | 3162 | 0.4015 | 0.7154 | 0.4015 | 0.6337 | | 0.0734 | 6.0729 | 3164 | 0.4395 | 0.7083 | 0.4395 | 0.6629 | | 0.0734 | 6.0768 | 3166 | 0.5113 | 0.7009 | 0.5113 | 0.7150 | | 0.0734 | 6.0806 | 3168 | 0.5560 | 0.7266 | 0.5560 | 0.7457 | | 0.0734 | 6.0845 | 3170 | 0.5522 | 0.7640 | 0.5522 | 0.7431 | | 0.0734 | 6.0883 | 3172 | 0.5083 | 0.7429 | 0.5083 | 0.7130 | | 0.0734 | 6.0921 | 3174 | 0.4577 | 0.7342 | 0.4577 | 0.6765 | | 0.0734 | 6.0960 | 3176 | 0.4354 | 0.7549 | 0.4354 | 0.6599 | | 0.0734 | 6.0998 | 3178 | 0.4378 | 0.7549 | 0.4378 | 0.6617 | | 0.0734 | 6.1036 | 3180 | 0.4448 | 0.7549 | 0.4448 | 0.6669 | | 0.0734 | 6.1075 | 3182 | 0.4278 | 0.7549 | 0.4278 | 0.6540 | | 0.0734 | 6.1113 | 3184 | 0.4215 | 0.7549 | 0.4215 | 0.6492 | | 0.0734 | 6.1152 | 3186 | 0.4168 | 0.7549 | 0.4168 | 0.6456 | | 0.0734 | 6.1190 | 3188 | 0.4135 | 0.7549 | 0.4135 | 0.6431 | | 0.0734 | 6.1228 | 3190 | 0.4208 | 0.7549 | 0.4208 | 0.6487 | | 0.0734 | 6.1267 | 3192 | 0.4161 | 0.7549 | 0.4161 | 0.6451 | | 0.0734 | 6.1305 | 3194 | 0.4006 | 0.7549 | 0.4006 | 0.6330 | | 0.0734 | 6.1344 | 3196 | 0.4065 | 0.7549 | 0.4065 | 0.6376 | | 0.0734 | 6.1382 | 3198 | 0.4271 | 0.7549 | 0.4271 | 0.6535 | | 0.0734 | 6.1420 | 3200 | 0.4124 | 0.7154 | 0.4124 | 0.6422 | | 0.0734 | 6.1459 | 3202 | 0.4078 | 0.7549 | 0.4078 | 0.6386 | | 0.0734 | 6.1497 | 3204 | 0.4287 | 0.7795 | 0.4287 | 0.6547 | | 0.0734 | 6.1536 | 3206 | 0.4495 | 0.7795 | 0.4495 | 0.6705 | | 0.0734 | 6.1574 | 3208 | 0.4392 | 0.7795 | 0.4392 | 0.6628 | | 0.0734 | 6.1612 | 3210 | 0.4382 | 0.7795 | 0.4382 | 0.6620 | | 0.0734 | 6.1651 | 3212 | 0.4148 | 0.7795 | 0.4148 | 0.6441 | | 0.0734 | 6.1689 | 3214 | 0.3967 | 0.7795 | 0.3967 | 0.6299 | | 0.0734 | 6.1727 | 3216 | 0.3818 | 0.7795 | 0.3818 | 0.6179 | | 0.0734 | 6.1766 | 3218 | 0.3754 | 0.7154 | 0.3754 | 0.6127 | | 0.0734 | 6.1804 | 3220 | 0.3839 | 0.7795 | 0.3839 | 0.6196 | | 0.0734 | 6.1843 | 3222 | 0.3827 | 0.7795 | 0.3827 | 0.6186 | | 0.0734 | 6.1881 | 3224 | 0.3722 | 0.7549 | 0.3722 | 0.6101 | | 0.0734 | 6.1919 | 3226 | 0.3400 | 0.6723 | 0.3400 | 0.5831 | | 0.0734 | 6.1958 | 3228 | 0.3239 | 0.6723 | 0.3239 | 0.5691 | | 0.0734 | 6.1996 | 3230 | 0.3342 | 0.7549 | 0.3342 | 0.5781 | | 0.0734 | 6.2035 | 3232 | 0.3663 | 0.7549 | 0.3663 | 0.6052 | | 0.0734 | 6.2073 | 3234 | 0.3913 | 0.7549 | 0.3913 | 0.6256 | | 0.0734 | 6.2111 | 3236 | 0.3925 | 0.7549 | 0.3925 | 0.6265 | | 0.0734 | 6.2150 | 3238 | 0.4043 | 0.7549 | 0.4043 | 0.6358 | | 0.0734 | 6.2188 | 3240 | 0.4113 | 0.7549 | 0.4113 | 0.6413 | | 0.0734 | 6.2226 | 3242 | 0.4136 | 0.7549 | 0.4136 | 0.6431 | | 0.0734 | 6.2265 | 3244 | 0.3985 | 0.7549 | 0.3985 | 0.6312 | | 0.0734 | 6.2303 | 3246 | 0.3820 | 0.7549 | 0.3820 | 0.6180 | | 0.0734 | 6.2342 | 3248 | 0.3789 | 0.7549 | 0.3789 | 0.6156 | | 0.0734 | 6.2380 | 3250 | 0.3877 | 
0.7549 | 0.3877 | 0.6226 | | 0.0734 | 6.2418 | 3252 | 0.3948 | 0.7549 | 0.3948 | 0.6284 | | 0.0734 | 6.2457 | 3254 | 0.3943 | 0.7549 | 0.3943 | 0.6279 | | 0.0734 | 6.2495 | 3256 | 0.4005 | 0.7549 | 0.4005 | 0.6329 | | 0.0734 | 6.2534 | 3258 | 0.3935 | 0.7549 | 0.3935 | 0.6273 | | 0.0734 | 6.2572 | 3260 | 0.3876 | 0.7549 | 0.3876 | 0.6226 | | 0.0734 | 6.2610 | 3262 | 0.3807 | 0.7154 | 0.3807 | 0.6170 | | 0.0734 | 6.2649 | 3264 | 0.3740 | 0.7154 | 0.3740 | 0.6115 | | 0.0734 | 6.2687 | 3266 | 0.3775 | 0.7549 | 0.3775 | 0.6144 | | 0.0734 | 6.2726 | 3268 | 0.3878 | 0.7549 | 0.3878 | 0.6227 | | 0.0734 | 6.2764 | 3270 | 0.3992 | 0.7549 | 0.3992 | 0.6318 | | 0.0734 | 6.2802 | 3272 | 0.3850 | 0.7910 | 0.3850 | 0.6205 | | 0.0734 | 6.2841 | 3274 | 0.3718 | 0.7910 | 0.3718 | 0.6097 | | 0.0734 | 6.2879 | 3276 | 0.3416 | 0.7549 | 0.3416 | 0.5845 | | 0.0734 | 6.2917 | 3278 | 0.3224 | 0.7549 | 0.3224 | 0.5678 | | 0.0734 | 6.2956 | 3280 | 0.3153 | 0.7154 | 0.3153 | 0.5615 | | 0.0734 | 6.2994 | 3282 | 0.3128 | 0.7154 | 0.3128 | 0.5593 | | 0.0734 | 6.3033 | 3284 | 0.3278 | 0.7549 | 0.3278 | 0.5726 | | 0.0734 | 6.3071 | 3286 | 0.3506 | 0.7083 | 0.3506 | 0.5921 | | 0.0734 | 6.3109 | 3288 | 0.3729 | 0.7083 | 0.3729 | 0.6106 | | 0.0734 | 6.3148 | 3290 | 0.3912 | 0.7083 | 0.3912 | 0.6254 | | 0.0734 | 6.3186 | 3292 | 0.3995 | 0.6638 | 0.3995 | 0.6321 | | 0.0734 | 6.3225 | 3294 | 0.3968 | 0.6638 | 0.3968 | 0.6299 | | 0.0734 | 6.3263 | 3296 | 0.3765 | 0.6147 | 0.3765 | 0.6136 | | 0.0734 | 6.3301 | 3298 | 0.3584 | 0.6147 | 0.3584 | 0.5986 | | 0.0734 | 6.3340 | 3300 | 0.3500 | 0.6147 | 0.3500 | 0.5916 | | 0.0734 | 6.3378 | 3302 | 0.3363 | 0.6147 | 0.3363 | 0.5799 | | 0.0734 | 6.3417 | 3304 | 0.3240 | 0.6723 | 0.3240 | 0.5692 | | 0.0734 | 6.3455 | 3306 | 0.3137 | 0.6723 | 0.3137 | 0.5601 | | 0.0734 | 6.3493 | 3308 | 0.3275 | 0.6723 | 0.3275 | 0.5723 | | 0.0734 | 6.3532 | 3310 | 0.3320 | 0.7154 | 0.3320 | 0.5762 | | 0.0734 | 6.3570 | 3312 | 0.3277 | 0.7154 | 0.3277 | 0.5724 | | 0.0734 | 6.3608 | 3314 | 0.3384 | 0.7154 | 0.3384 | 0.5817 | | 0.0734 | 6.3647 | 3316 | 0.3574 | 0.7549 | 0.3574 | 0.5978 | | 0.0734 | 6.3685 | 3318 | 0.3645 | 0.7549 | 0.3645 | 0.6038 | | 0.0734 | 6.3724 | 3320 | 0.3712 | 0.7549 | 0.3712 | 0.6092 | | 0.0734 | 6.3762 | 3322 | 0.3629 | 0.7549 | 0.3629 | 0.6025 | | 0.0734 | 6.3800 | 3324 | 0.3415 | 0.7154 | 0.3415 | 0.5844 | | 0.0734 | 6.3839 | 3326 | 0.3410 | 0.6723 | 0.3410 | 0.5839 | | 0.0734 | 6.3877 | 3328 | 0.3564 | 0.7154 | 0.3564 | 0.5970 | | 0.0734 | 6.3916 | 3330 | 0.3775 | 0.7549 | 0.3775 | 0.6144 | | 0.0734 | 6.3954 | 3332 | 0.4023 | 0.7083 | 0.4023 | 0.6343 | | 0.0734 | 6.3992 | 3334 | 0.4143 | 0.7549 | 0.4143 | 0.6437 | | 0.0734 | 6.4031 | 3336 | 0.4075 | 0.7549 | 0.4075 | 0.6384 | | 0.0734 | 6.4069 | 3338 | 0.4187 | 0.7910 | 0.4187 | 0.6471 | | 0.0734 | 6.4107 | 3340 | 0.4216 | 0.8069 | 0.4216 | 0.6493 | | 0.0734 | 6.4146 | 3342 | 0.4029 | 0.7910 | 0.4029 | 0.6347 | | 0.0734 | 6.4184 | 3344 | 0.3944 | 0.7549 | 0.3944 | 0.6280 | | 0.0734 | 6.4223 | 3346 | 0.4136 | 0.7549 | 0.4136 | 0.6431 | | 0.0734 | 6.4261 | 3348 | 0.4129 | 0.7083 | 0.4129 | 0.6425 | | 0.0734 | 6.4299 | 3350 | 0.4010 | 0.7549 | 0.4010 | 0.6332 | | 0.0734 | 6.4338 | 3352 | 0.3759 | 0.7549 | 0.3759 | 0.6131 | | 0.0734 | 6.4376 | 3354 | 0.3553 | 0.7154 | 0.3553 | 0.5961 | | 0.0734 | 6.4415 | 3356 | 0.3599 | 0.7154 | 0.3599 | 0.5999 | | 0.0734 | 6.4453 | 3358 | 0.3769 | 0.7549 | 0.3769 | 0.6139 | | 0.0734 | 6.4491 | 3360 | 0.4144 | 0.7083 | 0.4144 | 0.6437 | | 0.0734 | 6.4530 | 3362 | 0.4760 | 0.7342 | 0.4760 | 0.6900 | 
| 0.0734 | 6.4568 | 3364 | 0.5379 | 0.7640 | 0.5379 | 0.7334 | | 0.0734 | 6.4607 | 3366 | 0.5488 | 0.7640 | 0.5488 | 0.7408 | | 0.0734 | 6.4645 | 3368 | 0.5314 | 0.7640 | 0.5314 | 0.7290 | | 0.0734 | 6.4683 | 3370 | 0.4833 | 0.7342 | 0.4833 | 0.6952 | | 0.0734 | 6.4722 | 3372 | 0.4223 | 0.7083 | 0.4223 | 0.6499 | | 0.0734 | 6.4760 | 3374 | 0.3951 | 0.6638 | 0.3951 | 0.6285 | | 0.0734 | 6.4798 | 3376 | 0.3799 | 0.7154 | 0.3799 | 0.6164 | | 0.0734 | 6.4837 | 3378 | 0.3591 | 0.7154 | 0.3591 | 0.5993 | | 0.0734 | 6.4875 | 3380 | 0.3670 | 0.7154 | 0.3670 | 0.6058 | | 0.0734 | 6.4914 | 3382 | 0.3941 | 0.7083 | 0.3941 | 0.6278 | | 0.0734 | 6.4952 | 3384 | 0.4369 | 0.7083 | 0.4369 | 0.6610 | | 0.0734 | 6.4990 | 3386 | 0.5058 | 0.7812 | 0.5058 | 0.7112 | | 0.0734 | 6.5029 | 3388 | 0.5636 | 0.7640 | 0.5636 | 0.7508 | | 0.0734 | 6.5067 | 3390 | 0.6035 | 0.7640 | 0.6035 | 0.7769 | | 0.0734 | 6.5106 | 3392 | 0.5909 | 0.7640 | 0.5909 | 0.7687 | | 0.0734 | 6.5144 | 3394 | 0.5821 | 0.7640 | 0.5821 | 0.7629 | | 0.0734 | 6.5182 | 3396 | 0.5336 | 0.7640 | 0.5336 | 0.7305 | | 0.0734 | 6.5221 | 3398 | 0.4675 | 0.7083 | 0.4675 | 0.6837 | | 0.0734 | 6.5259 | 3400 | 0.3954 | 0.7083 | 0.3954 | 0.6288 | | 0.0734 | 6.5298 | 3402 | 0.3459 | 0.6638 | 0.3459 | 0.5881 | | 0.0734 | 6.5336 | 3404 | 0.3336 | 0.7154 | 0.3336 | 0.5776 | | 0.0734 | 6.5374 | 3406 | 0.3410 | 0.7154 | 0.3410 | 0.5840 | | 0.0734 | 6.5413 | 3408 | 0.3690 | 0.7083 | 0.3690 | 0.6075 | | 0.0734 | 6.5451 | 3410 | 0.4164 | 0.7083 | 0.4164 | 0.6453 | | 0.0734 | 6.5489 | 3412 | 0.4816 | 0.7692 | 0.4816 | 0.6940 | | 0.0734 | 6.5528 | 3414 | 0.5032 | 0.7692 | 0.5032 | 0.7094 | | 0.0734 | 6.5566 | 3416 | 0.4941 | 0.7692 | 0.4941 | 0.7029 | | 0.0734 | 6.5605 | 3418 | 0.4723 | 0.7328 | 0.4723 | 0.6872 | | 0.0734 | 6.5643 | 3420 | 0.4544 | 0.7490 | 0.4544 | 0.6741 | | 0.0734 | 6.5681 | 3422 | 0.4445 | 0.7083 | 0.4445 | 0.6667 | | 0.0734 | 6.5720 | 3424 | 0.4412 | 0.7083 | 0.4412 | 0.6642 | | 0.0734 | 6.5758 | 3426 | 0.4428 | 0.7083 | 0.4428 | 0.6655 | | 0.0734 | 6.5797 | 3428 | 0.4518 | 0.7742 | 0.4518 | 0.6721 | | 0.0734 | 6.5835 | 3430 | 0.4616 | 0.7742 | 0.4616 | 0.6794 | | 0.0734 | 6.5873 | 3432 | 0.4552 | 0.7742 | 0.4552 | 0.6747 | | 0.0734 | 6.5912 | 3434 | 0.4345 | 0.7549 | 0.4345 | 0.6592 | | 0.0734 | 6.5950 | 3436 | 0.4376 | 0.7910 | 0.4376 | 0.6615 | | 0.0734 | 6.5988 | 3438 | 0.4502 | 0.7742 | 0.4502 | 0.6710 | | 0.0734 | 6.6027 | 3440 | 0.4428 | 0.7910 | 0.4428 | 0.6654 | | 0.0734 | 6.6065 | 3442 | 0.4268 | 0.7910 | 0.4268 | 0.6533 | | 0.0734 | 6.6104 | 3444 | 0.4058 | 0.7549 | 0.4058 | 0.6370 | | 0.0734 | 6.6142 | 3446 | 0.4024 | 0.7549 | 0.4024 | 0.6343 | | 0.0734 | 6.6180 | 3448 | 0.3951 | 0.7549 | 0.3951 | 0.6286 | | 0.0734 | 6.6219 | 3450 | 0.4079 | 0.7549 | 0.4079 | 0.6387 | | 0.0734 | 6.6257 | 3452 | 0.4167 | 0.7549 | 0.4167 | 0.6455 | | 0.0734 | 6.6296 | 3454 | 0.4426 | 0.7742 | 0.4426 | 0.6653 | | 0.0734 | 6.6334 | 3456 | 0.4711 | 0.7692 | 0.4711 | 0.6864 | | 0.0734 | 6.6372 | 3458 | 0.4943 | 0.7692 | 0.4943 | 0.7030 | | 0.0734 | 6.6411 | 3460 | 0.4947 | 0.7692 | 0.4947 | 0.7034 | | 0.0734 | 6.6449 | 3462 | 0.4732 | 0.7692 | 0.4732 | 0.6879 | | 0.0734 | 6.6488 | 3464 | 0.4356 | 0.7083 | 0.4356 | 0.6600 | | 0.0734 | 6.6526 | 3466 | 0.4117 | 0.7083 | 0.4117 | 0.6417 | | 0.0734 | 6.6564 | 3468 | 0.3831 | 0.7154 | 0.3831 | 0.6189 | | 0.0734 | 6.6603 | 3470 | 0.3477 | 0.6723 | 0.3477 | 0.5897 | | 0.0734 | 6.6641 | 3472 | 0.3234 | 0.6723 | 0.3234 | 0.5687 | | 0.0734 | 6.6679 | 3474 | 0.3200 | 0.6723 | 0.3200 | 0.5657 | | 0.0734 | 6.6718 | 3476 | 
0.3323 | 0.7154 | 0.3323 | 0.5765 | | 0.0734 | 6.6756 | 3478 | 0.3660 | 0.7549 | 0.3660 | 0.6050 | | 0.0734 | 6.6795 | 3480 | 0.4259 | 0.8069 | 0.4259 | 0.6526 | | 0.0734 | 6.6833 | 3482 | 0.4927 | 0.8069 | 0.4927 | 0.7019 | | 0.0734 | 6.6871 | 3484 | 0.5172 | 0.7692 | 0.5172 | 0.7191 | | 0.0734 | 6.6910 | 3486 | 0.5055 | 0.8069 | 0.5055 | 0.7110 | | 0.0734 | 6.6948 | 3488 | 0.4745 | 0.8069 | 0.4745 | 0.6889 | | 0.0734 | 6.6987 | 3490 | 0.4592 | 0.8069 | 0.4592 | 0.6776 | | 0.0734 | 6.7025 | 3492 | 0.4145 | 0.8069 | 0.4145 | 0.6438 | | 0.0734 | 6.7063 | 3494 | 0.3855 | 0.7549 | 0.3855 | 0.6209 | | 0.0734 | 6.7102 | 3496 | 0.3590 | 0.6934 | 0.3590 | 0.5992 | | 0.0734 | 6.7140 | 3498 | 0.3474 | 0.6111 | 0.3474 | 0.5894 | | 0.0606 | 6.7179 | 3500 | 0.3530 | 0.6111 | 0.3530 | 0.5942 | | 0.0606 | 6.7217 | 3502 | 0.3758 | 0.7154 | 0.3758 | 0.6130 | | 0.0606 | 6.7255 | 3504 | 0.4192 | 0.7549 | 0.4192 | 0.6474 | | 0.0606 | 6.7294 | 3506 | 0.4597 | 0.8244 | 0.4597 | 0.6780 | | 0.0606 | 6.7332 | 3508 | 0.4753 | 0.8069 | 0.4753 | 0.6894 | | 0.0606 | 6.7370 | 3510 | 0.4540 | 0.8244 | 0.4540 | 0.6738 | | 0.0606 | 6.7409 | 3512 | 0.4215 | 0.7549 | 0.4215 | 0.6492 | | 0.0606 | 6.7447 | 3514 | 0.3730 | 0.7549 | 0.3730 | 0.6107 | | 0.0606 | 6.7486 | 3516 | 0.3468 | 0.6723 | 0.3468 | 0.5889 | | 0.0606 | 6.7524 | 3518 | 0.3448 | 0.6723 | 0.3448 | 0.5872 | | 0.0606 | 6.7562 | 3520 | 0.3604 | 0.7549 | 0.3604 | 0.6003 | | 0.0606 | 6.7601 | 3522 | 0.3841 | 0.7549 | 0.3841 | 0.6198 | | 0.0606 | 6.7639 | 3524 | 0.4224 | 0.7549 | 0.4224 | 0.6499 | | 0.0606 | 6.7678 | 3526 | 0.4470 | 0.7342 | 0.4470 | 0.6686 | | 0.0606 | 6.7716 | 3528 | 0.4746 | 0.7342 | 0.4746 | 0.6889 | | 0.0606 | 6.7754 | 3530 | 0.4693 | 0.7342 | 0.4693 | 0.6850 | | 0.0606 | 6.7793 | 3532 | 0.4378 | 0.7342 | 0.4378 | 0.6617 | | 0.0606 | 6.7831 | 3534 | 0.4032 | 0.6147 | 0.4032 | 0.6350 | | 0.0606 | 6.7869 | 3536 | 0.3739 | 0.6723 | 0.3739 | 0.6115 | | 0.0606 | 6.7908 | 3538 | 0.3641 | 0.6723 | 0.3641 | 0.6034 | | 0.0606 | 6.7946 | 3540 | 0.3656 | 0.6723 | 0.3656 | 0.6046 | | 0.0606 | 6.7985 | 3542 | 0.3801 | 0.6723 | 0.3801 | 0.6165 | | 0.0606 | 6.8023 | 3544 | 0.4111 | 0.7407 | 0.4111 | 0.6411 | | 0.0606 | 6.8061 | 3546 | 0.4267 | 0.7795 | 0.4267 | 0.6532 | | 0.0606 | 6.8100 | 3548 | 0.4395 | 0.7795 | 0.4395 | 0.6629 | | 0.0606 | 6.8138 | 3550 | 0.4448 | 0.7795 | 0.4448 | 0.6669 | | 0.0606 | 6.8177 | 3552 | 0.4349 | 0.7795 | 0.4349 | 0.6594 | | 0.0606 | 6.8215 | 3554 | 0.4066 | 0.7549 | 0.4066 | 0.6376 | | 0.0606 | 6.8253 | 3556 | 0.3698 | 0.7549 | 0.3698 | 0.6081 | | 0.0606 | 6.8292 | 3558 | 0.3487 | 0.7154 | 0.3487 | 0.5905 | | 0.0606 | 6.8330 | 3560 | 0.3500 | 0.6723 | 0.3500 | 0.5916 | | 0.0606 | 6.8369 | 3562 | 0.3705 | 0.7154 | 0.3705 | 0.6087 | | 0.0606 | 6.8407 | 3564 | 0.3998 | 0.7154 | 0.3998 | 0.6323 | | 0.0606 | 6.8445 | 3566 | 0.4093 | 0.7154 | 0.4093 | 0.6397 | | 0.0606 | 6.8484 | 3568 | 0.4164 | 0.6638 | 0.4164 | 0.6453 | | 0.0606 | 6.8522 | 3570 | 0.4204 | 0.6638 | 0.4204 | 0.6484 | | 0.0606 | 6.8560 | 3572 | 0.4419 | 0.6903 | 0.4419 | 0.6647 | | 0.0606 | 6.8599 | 3574 | 0.4846 | 0.7009 | 0.4846 | 0.6961 | | 0.0606 | 6.8637 | 3576 | 0.5072 | 0.7009 | 0.5072 | 0.7122 | | 0.0606 | 6.8676 | 3578 | 0.5046 | 0.7009 | 0.5046 | 0.7104 | | 0.0606 | 6.8714 | 3580 | 0.4775 | 0.7009 | 0.4775 | 0.6910 | | 0.0606 | 6.8752 | 3582 | 0.4342 | 0.5767 | 0.4342 | 0.6589 | | 0.0606 | 6.8791 | 3584 | 0.3925 | 0.6723 | 0.3925 | 0.6265 | | 0.0606 | 6.8829 | 3586 | 0.3793 | 0.6723 | 0.3793 | 0.6159 | | 0.0606 | 6.8868 | 3588 | 0.3760 | 0.6723 | 0.3760 | 
0.6132 | | 0.0606 | 6.8906 | 3590 | 0.3860 | 0.6723 | 0.3860 | 0.6213 | | 0.0606 | 6.8944 | 3592 | 0.3971 | 0.7549 | 0.3971 | 0.6301 | | 0.0606 | 6.8983 | 3594 | 0.3969 | 0.7549 | 0.3969 | 0.6300 | | 0.0606 | 6.9021 | 3596 | 0.4050 | 0.7549 | 0.4050 | 0.6364 | | 0.0606 | 6.9060 | 3598 | 0.4206 | 0.7549 | 0.4206 | 0.6485 | | 0.0606 | 6.9098 | 3600 | 0.4534 | 0.7388 | 0.4534 | 0.6734 | | 0.0606 | 6.9136 | 3602 | 0.4664 | 0.7094 | 0.4664 | 0.6829 | | 0.0606 | 6.9175 | 3604 | 0.4537 | 0.7388 | 0.4537 | 0.6736 | | 0.0606 | 6.9213 | 3606 | 0.4292 | 0.7549 | 0.4292 | 0.6551 | | 0.0606 | 6.9251 | 3608 | 0.4016 | 0.6723 | 0.4016 | 0.6337 | | 0.0606 | 6.9290 | 3610 | 0.3817 | 0.6723 | 0.3817 | 0.6178 | | 0.0606 | 6.9328 | 3612 | 0.3688 | 0.6723 | 0.3688 | 0.6073 | | 0.0606 | 6.9367 | 3614 | 0.3670 | 0.6723 | 0.3670 | 0.6058 | | 0.0606 | 6.9405 | 3616 | 0.3711 | 0.6723 | 0.3711 | 0.6092 | | 0.0606 | 6.9443 | 3618 | 0.3853 | 0.6723 | 0.3853 | 0.6208 | | 0.0606 | 6.9482 | 3620 | 0.4016 | 0.6723 | 0.4016 | 0.6337 | | 0.0606 | 6.9520 | 3622 | 0.4311 | 0.7083 | 0.4311 | 0.6566 | | 0.0606 | 6.9559 | 3624 | 0.4543 | 0.7490 | 0.4543 | 0.6740 | | 0.0606 | 6.9597 | 3626 | 0.4699 | 0.7490 | 0.4699 | 0.6855 | | 0.0606 | 6.9635 | 3628 | 0.4708 | 0.7490 | 0.4708 | 0.6861 | | 0.0606 | 6.9674 | 3630 | 0.4667 | 0.7490 | 0.4667 | 0.6831 | | 0.0606 | 6.9712 | 3632 | 0.4391 | 0.7490 | 0.4391 | 0.6627 | | 0.0606 | 6.9750 | 3634 | 0.4129 | 0.6983 | 0.4129 | 0.6426 | | 0.0606 | 6.9789 | 3636 | 0.3785 | 0.6723 | 0.3785 | 0.6152 | | 0.0606 | 6.9827 | 3638 | 0.3487 | 0.6723 | 0.3487 | 0.5905 | | 0.0606 | 6.9866 | 3640 | 0.3387 | 0.6723 | 0.3387 | 0.5820 | | 0.0606 | 6.9904 | 3642 | 0.3435 | 0.6723 | 0.3435 | 0.5861 | | 0.0606 | 6.9942 | 3644 | 0.3616 | 0.6723 | 0.3616 | 0.6013 | | 0.0606 | 6.9981 | 3646 | 0.3865 | 0.7154 | 0.3865 | 0.6217 | | 0.0606 | 7.0019 | 3648 | 0.3936 | 0.7549 | 0.3936 | 0.6273 | | 0.0606 | 7.0058 | 3650 | 0.4043 | 0.7549 | 0.4043 | 0.6359 | | 0.0606 | 7.0096 | 3652 | 0.4209 | 0.7549 | 0.4209 | 0.6488 | | 0.0606 | 7.0134 | 3654 | 0.4319 | 0.8069 | 0.4319 | 0.6572 | | 0.0606 | 7.0173 | 3656 | 0.4166 | 0.8069 | 0.4166 | 0.6455 | | 0.0606 | 7.0211 | 3658 | 0.3834 | 0.7549 | 0.3834 | 0.6192 | | 0.0606 | 7.0250 | 3660 | 0.3743 | 0.7549 | 0.3743 | 0.6118 | | 0.0606 | 7.0288 | 3662 | 0.3807 | 0.7549 | 0.3807 | 0.6170 | | 0.0606 | 7.0326 | 3664 | 0.3924 | 0.7549 | 0.3924 | 0.6264 | | 0.0606 | 7.0365 | 3666 | 0.4023 | 0.7549 | 0.4023 | 0.6343 | | 0.0606 | 7.0403 | 3668 | 0.4152 | 0.7083 | 0.4152 | 0.6444 | | 0.0606 | 7.0441 | 3670 | 0.4148 | 0.7083 | 0.4148 | 0.6441 | | 0.0606 | 7.0480 | 3672 | 0.4171 | 0.7083 | 0.4171 | 0.6458 | | 0.0606 | 7.0518 | 3674 | 0.4165 | 0.7083 | 0.4165 | 0.6454 | | 0.0606 | 7.0557 | 3676 | 0.4269 | 0.7083 | 0.4269 | 0.6534 | | 0.0606 | 7.0595 | 3678 | 0.4383 | 0.7342 | 0.4383 | 0.6621 | | 0.0606 | 7.0633 | 3680 | 0.4327 | 0.7342 | 0.4327 | 0.6578 | | 0.0606 | 7.0672 | 3682 | 0.4045 | 0.7083 | 0.4045 | 0.6360 | | 0.0606 | 7.0710 | 3684 | 0.3895 | 0.7083 | 0.3895 | 0.6241 | | 0.0606 | 7.0749 | 3686 | 0.3843 | 0.7083 | 0.3843 | 0.6199 | | 0.0606 | 7.0787 | 3688 | 0.3893 | 0.7083 | 0.3893 | 0.6239 | | 0.0606 | 7.0825 | 3690 | 0.4119 | 0.7083 | 0.4119 | 0.6418 | | 0.0606 | 7.0864 | 3692 | 0.4367 | 0.7342 | 0.4367 | 0.6608 | | 0.0606 | 7.0902 | 3694 | 0.4340 | 0.7795 | 0.4340 | 0.6588 | | 0.0606 | 7.0940 | 3696 | 0.4275 | 0.7795 | 0.4275 | 0.6538 | | 0.0606 | 7.0979 | 3698 | 0.4370 | 0.7795 | 0.4370 | 0.6610 | | 0.0606 | 7.1017 | 3700 | 0.4261 | 0.7795 | 0.4261 | 0.6527 | | 0.0606 | 7.1056 
| 3702 | 0.4169 | 0.7795 | 0.4169 | 0.6457 | | 0.0606 | 7.1094 | 3704 | 0.4018 | 0.7549 | 0.4018 | 0.6339 | | 0.0606 | 7.1132 | 3706 | 0.3780 | 0.7549 | 0.3780 | 0.6148 | | 0.0606 | 7.1171 | 3708 | 0.3699 | 0.7549 | 0.3699 | 0.6082 | | 0.0606 | 7.1209 | 3710 | 0.3707 | 0.7549 | 0.3707 | 0.6088 | | 0.0606 | 7.1248 | 3712 | 0.3772 | 0.7549 | 0.3772 | 0.6142 | | 0.0606 | 7.1286 | 3714 | 0.3849 | 0.7549 | 0.3849 | 0.6204 | | 0.0606 | 7.1324 | 3716 | 0.3926 | 0.7549 | 0.3926 | 0.6265 | | 0.0606 | 7.1363 | 3718 | 0.3926 | 0.7549 | 0.3926 | 0.6266 | | 0.0606 | 7.1401 | 3720 | 0.3910 | 0.7549 | 0.3910 | 0.6253 | | 0.0606 | 7.1440 | 3722 | 0.3932 | 0.7549 | 0.3932 | 0.6271 | | 0.0606 | 7.1478 | 3724 | 0.3861 | 0.7549 | 0.3861 | 0.6214 | | 0.0606 | 7.1516 | 3726 | 0.3756 | 0.7549 | 0.3756 | 0.6128 | | 0.0606 | 7.1555 | 3728 | 0.3793 | 0.7549 | 0.3793 | 0.6159 | | 0.0606 | 7.1593 | 3730 | 0.3902 | 0.7549 | 0.3902 | 0.6246 | | 0.0606 | 7.1631 | 3732 | 0.3850 | 0.7549 | 0.3850 | 0.6204 | | 0.0606 | 7.1670 | 3734 | 0.3955 | 0.7549 | 0.3955 | 0.6289 | | 0.0606 | 7.1708 | 3736 | 0.3965 | 0.7549 | 0.3965 | 0.6297 | | 0.0606 | 7.1747 | 3738 | 0.4129 | 0.7083 | 0.4129 | 0.6426 | | 0.0606 | 7.1785 | 3740 | 0.4368 | 0.7342 | 0.4368 | 0.6609 | | 0.0606 | 7.1823 | 3742 | 0.4395 | 0.7342 | 0.4395 | 0.6629 | | 0.0606 | 7.1862 | 3744 | 0.4286 | 0.7083 | 0.4286 | 0.6547 | | 0.0606 | 7.1900 | 3746 | 0.4354 | 0.7342 | 0.4354 | 0.6599 | | 0.0606 | 7.1939 | 3748 | 0.4376 | 0.7342 | 0.4376 | 0.6615 | | 0.0606 | 7.1977 | 3750 | 0.4468 | 0.7342 | 0.4468 | 0.6684 | | 0.0606 | 7.2015 | 3752 | 0.4333 | 0.7342 | 0.4333 | 0.6583 | | 0.0606 | 7.2054 | 3754 | 0.4291 | 0.7342 | 0.4291 | 0.6551 | | 0.0606 | 7.2092 | 3756 | 0.4146 | 0.7549 | 0.4146 | 0.6439 | | 0.0606 | 7.2131 | 3758 | 0.3967 | 0.7549 | 0.3967 | 0.6298 | | 0.0606 | 7.2169 | 3760 | 0.3970 | 0.7549 | 0.3970 | 0.6300 | | 0.0606 | 7.2207 | 3762 | 0.4092 | 0.7549 | 0.4092 | 0.6397 | | 0.0606 | 7.2246 | 3764 | 0.4119 | 0.7549 | 0.4119 | 0.6418 | | 0.0606 | 7.2284 | 3766 | 0.4052 | 0.7549 | 0.4052 | 0.6365 | | 0.0606 | 7.2322 | 3768 | 0.3920 | 0.7549 | 0.3920 | 0.6261 | | 0.0606 | 7.2361 | 3770 | 0.3945 | 0.7549 | 0.3945 | 0.6281 | | 0.0606 | 7.2399 | 3772 | 0.3862 | 0.7549 | 0.3862 | 0.6214 | | 0.0606 | 7.2438 | 3774 | 0.3834 | 0.7549 | 0.3834 | 0.6192 | | 0.0606 | 7.2476 | 3776 | 0.3908 | 0.7549 | 0.3908 | 0.6252 | | 0.0606 | 7.2514 | 3778 | 0.4021 | 0.7083 | 0.4021 | 0.6341 | | 0.0606 | 7.2553 | 3780 | 0.4229 | 0.7083 | 0.4229 | 0.6503 | | 0.0606 | 7.2591 | 3782 | 0.4379 | 0.7083 | 0.4379 | 0.6618 | | 0.0606 | 7.2630 | 3784 | 0.4340 | 0.7083 | 0.4340 | 0.6588 | | 0.0606 | 7.2668 | 3786 | 0.4262 | 0.7083 | 0.4262 | 0.6528 | | 0.0606 | 7.2706 | 3788 | 0.4280 | 0.7083 | 0.4280 | 0.6542 | | 0.0606 | 7.2745 | 3790 | 0.4147 | 0.7083 | 0.4147 | 0.6440 | | 0.0606 | 7.2783 | 3792 | 0.4116 | 0.7083 | 0.4116 | 0.6416 | | 0.0606 | 7.2821 | 3794 | 0.3999 | 0.7083 | 0.3999 | 0.6324 | | 0.0606 | 7.2860 | 3796 | 0.4018 | 0.7083 | 0.4018 | 0.6339 | | 0.0606 | 7.2898 | 3798 | 0.4052 | 0.7083 | 0.4052 | 0.6366 | | 0.0606 | 7.2937 | 3800 | 0.4266 | 0.7083 | 0.4266 | 0.6531 | | 0.0606 | 7.2975 | 3802 | 0.4431 | 0.7083 | 0.4431 | 0.6657 | | 0.0606 | 7.3013 | 3804 | 0.4473 | 0.7083 | 0.4473 | 0.6688 | | 0.0606 | 7.3052 | 3806 | 0.4576 | 0.7692 | 0.4576 | 0.6765 | | 0.0606 | 7.3090 | 3808 | 0.4400 | 0.7549 | 0.4400 | 0.6633 | | 0.0606 | 7.3129 | 3810 | 0.4325 | 0.7549 | 0.4325 | 0.6576 | | 0.0606 | 7.3167 | 3812 | 0.4155 | 0.7549 | 0.4155 | 0.6446 | | 0.0606 | 7.3205 | 3814 | 0.4074 | 0.7549 | 
0.4074 | 0.6383 | | 0.0606 | 7.3244 | 3816 | 0.4140 | 0.7549 | 0.4140 | 0.6435 | | 0.0606 | 7.3282 | 3818 | 0.4160 | 0.7083 | 0.4160 | 0.6450 | | 0.0606 | 7.3321 | 3820 | 0.4271 | 0.7083 | 0.4271 | 0.6536 | | 0.0606 | 7.3359 | 3822 | 0.4328 | 0.7083 | 0.4328 | 0.6578 | | 0.0606 | 7.3397 | 3824 | 0.4378 | 0.6638 | 0.4378 | 0.6617 | | 0.0606 | 7.3436 | 3826 | 0.4384 | 0.6638 | 0.4384 | 0.6621 | | 0.0606 | 7.3474 | 3828 | 0.4346 | 0.6147 | 0.4346 | 0.6592 | | 0.0606 | 7.3512 | 3830 | 0.4413 | 0.6147 | 0.4413 | 0.6643 | | 0.0606 | 7.3551 | 3832 | 0.4359 | 0.6147 | 0.4359 | 0.6602 | | 0.0606 | 7.3589 | 3834 | 0.4268 | 0.6147 | 0.4268 | 0.6533 | | 0.0606 | 7.3628 | 3836 | 0.4287 | 0.6147 | 0.4287 | 0.6548 | | 0.0606 | 7.3666 | 3838 | 0.4219 | 0.6147 | 0.4219 | 0.6496 | | 0.0606 | 7.3704 | 3840 | 0.4027 | 0.6147 | 0.4027 | 0.6346 | | 0.0606 | 7.3743 | 3842 | 0.3928 | 0.6638 | 0.3928 | 0.6268 | | 0.0606 | 7.3781 | 3844 | 0.3974 | 0.7083 | 0.3974 | 0.6304 | | 0.0606 | 7.3820 | 3846 | 0.4084 | 0.7083 | 0.4084 | 0.6391 | | 0.0606 | 7.3858 | 3848 | 0.4091 | 0.7549 | 0.4091 | 0.6396 | | 0.0606 | 7.3896 | 3850 | 0.3933 | 0.7549 | 0.3933 | 0.6271 | | 0.0606 | 7.3935 | 3852 | 0.3682 | 0.7549 | 0.3682 | 0.6068 | | 0.0606 | 7.3973 | 3854 | 0.3606 | 0.7154 | 0.3606 | 0.6005 | | 0.0606 | 7.4012 | 3856 | 0.3537 | 0.7154 | 0.3537 | 0.5947 | | 0.0606 | 7.4050 | 3858 | 0.3509 | 0.7154 | 0.3509 | 0.5924 | | 0.0606 | 7.4088 | 3860 | 0.3429 | 0.7154 | 0.3429 | 0.5856 | | 0.0606 | 7.4127 | 3862 | 0.3498 | 0.7154 | 0.3498 | 0.5914 | | 0.0606 | 7.4165 | 3864 | 0.3730 | 0.6638 | 0.3730 | 0.6107 | | 0.0606 | 7.4203 | 3866 | 0.4066 | 0.7083 | 0.4066 | 0.6377 | | 0.0606 | 7.4242 | 3868 | 0.4343 | 0.7083 | 0.4343 | 0.6590 | | 0.0606 | 7.4280 | 3870 | 0.4479 | 0.7083 | 0.4479 | 0.6692 | | 0.0606 | 7.4319 | 3872 | 0.4485 | 0.7342 | 0.4485 | 0.6697 | | 0.0606 | 7.4357 | 3874 | 0.4366 | 0.7342 | 0.4366 | 0.6607 | | 0.0606 | 7.4395 | 3876 | 0.4176 | 0.6638 | 0.4176 | 0.6462 | | 0.0606 | 7.4434 | 3878 | 0.3911 | 0.6638 | 0.3911 | 0.6253 | | 0.0606 | 7.4472 | 3880 | 0.3737 | 0.6147 | 0.3737 | 0.6113 | | 0.0606 | 7.4511 | 3882 | 0.3623 | 0.6723 | 0.3623 | 0.6019 | | 0.0606 | 7.4549 | 3884 | 0.3672 | 0.6147 | 0.3672 | 0.6060 | | 0.0606 | 7.4587 | 3886 | 0.3849 | 0.6147 | 0.3849 | 0.6204 | | 0.0606 | 7.4626 | 3888 | 0.4144 | 0.6903 | 0.4144 | 0.6438 | | 0.0606 | 7.4664 | 3890 | 0.4290 | 0.6903 | 0.4290 | 0.6550 | | 0.0606 | 7.4702 | 3892 | 0.4397 | 0.6903 | 0.4397 | 0.6631 | | 0.0606 | 7.4741 | 3894 | 0.4385 | 0.6903 | 0.4385 | 0.6622 | | 0.0606 | 7.4779 | 3896 | 0.4486 | 0.6903 | 0.4486 | 0.6698 | | 0.0606 | 7.4818 | 3898 | 0.4608 | 0.6903 | 0.4608 | 0.6788 | | 0.0606 | 7.4856 | 3900 | 0.4534 | 0.6903 | 0.4534 | 0.6734 | | 0.0606 | 7.4894 | 3902 | 0.4496 | 0.7342 | 0.4496 | 0.6705 | | 0.0606 | 7.4933 | 3904 | 0.4378 | 0.7795 | 0.4378 | 0.6616 | | 0.0606 | 7.4971 | 3906 | 0.4210 | 0.7795 | 0.4210 | 0.6488 | | 0.0606 | 7.5010 | 3908 | 0.4000 | 0.7795 | 0.4000 | 0.6324 | | 0.0606 | 7.5048 | 3910 | 0.3947 | 0.7549 | 0.3947 | 0.6282 | | 0.0606 | 7.5086 | 3912 | 0.3790 | 0.7549 | 0.3790 | 0.6156 | | 0.0606 | 7.5125 | 3914 | 0.3627 | 0.7549 | 0.3627 | 0.6022 | | 0.0606 | 7.5163 | 3916 | 0.3581 | 0.7549 | 0.3581 | 0.5984 | | 0.0606 | 7.5202 | 3918 | 0.3496 | 0.7549 | 0.3496 | 0.5913 | | 0.0606 | 7.5240 | 3920 | 0.3354 | 0.7549 | 0.3354 | 0.5791 | | 0.0606 | 7.5278 | 3922 | 0.3341 | 0.7549 | 0.3341 | 0.5780 | | 0.0606 | 7.5317 | 3924 | 0.3458 | 0.7549 | 0.3458 | 0.5880 | | 0.0606 | 7.5355 | 3926 | 0.3619 | 0.7549 | 0.3619 | 0.6016 | | 0.0606 
| 7.5393 | 3928 | 0.3785 | 0.7549 | 0.3785 | 0.6152 | | 0.0606 | 7.5432 | 3930 | 0.4027 | 0.7549 | 0.4027 | 0.6346 | | 0.0606 | 7.5470 | 3932 | 0.4137 | 0.7549 | 0.4137 | 0.6432 | | 0.0606 | 7.5509 | 3934 | 0.4040 | 0.7549 | 0.4040 | 0.6356 | | 0.0606 | 7.5547 | 3936 | 0.4084 | 0.7549 | 0.4084 | 0.6390 | | 0.0606 | 7.5585 | 3938 | 0.4024 | 0.7549 | 0.4024 | 0.6343 | | 0.0606 | 7.5624 | 3940 | 0.3996 | 0.7549 | 0.3996 | 0.6321 | | 0.0606 | 7.5662 | 3942 | 0.4126 | 0.7795 | 0.4126 | 0.6423 | | 0.0606 | 7.5701 | 3944 | 0.4263 | 0.7795 | 0.4263 | 0.6529 | | 0.0606 | 7.5739 | 3946 | 0.4239 | 0.7795 | 0.4239 | 0.6511 | | 0.0606 | 7.5777 | 3948 | 0.4271 | 0.7795 | 0.4271 | 0.6535 | | 0.0606 | 7.5816 | 3950 | 0.4326 | 0.7795 | 0.4326 | 0.6578 | | 0.0606 | 7.5854 | 3952 | 0.4409 | 0.7795 | 0.4409 | 0.6640 | | 0.0606 | 7.5893 | 3954 | 0.4496 | 0.7795 | 0.4496 | 0.6705 | | 0.0606 | 7.5931 | 3956 | 0.4393 | 0.7795 | 0.4393 | 0.6628 | | 0.0606 | 7.5969 | 3958 | 0.4144 | 0.7795 | 0.4144 | 0.6437 | | 0.0606 | 7.6008 | 3960 | 0.3913 | 0.7154 | 0.3913 | 0.6256 | | 0.0606 | 7.6046 | 3962 | 0.3822 | 0.7154 | 0.3822 | 0.6182 | | 0.0606 | 7.6084 | 3964 | 0.3888 | 0.7549 | 0.3888 | 0.6235 | | 0.0606 | 7.6123 | 3966 | 0.4032 | 0.7549 | 0.4032 | 0.6350 | | 0.0606 | 7.6161 | 3968 | 0.4087 | 0.7549 | 0.4087 | 0.6393 | | 0.0606 | 7.6200 | 3970 | 0.4203 | 0.7549 | 0.4203 | 0.6483 | | 0.0606 | 7.6238 | 3972 | 0.4208 | 0.7549 | 0.4208 | 0.6487 | | 0.0606 | 7.6276 | 3974 | 0.4192 | 0.7549 | 0.4192 | 0.6475 | | 0.0606 | 7.6315 | 3976 | 0.4138 | 0.7549 | 0.4138 | 0.6433 | | 0.0606 | 7.6353 | 3978 | 0.4075 | 0.7549 | 0.4075 | 0.6383 | | 0.0606 | 7.6392 | 3980 | 0.4010 | 0.7549 | 0.4010 | 0.6333 | | 0.0606 | 7.6430 | 3982 | 0.4038 | 0.7549 | 0.4038 | 0.6355 | | 0.0606 | 7.6468 | 3984 | 0.4031 | 0.7549 | 0.4031 | 0.6349 | | 0.0606 | 7.6507 | 3986 | 0.3899 | 0.7549 | 0.3899 | 0.6244 | | 0.0606 | 7.6545 | 3988 | 0.3850 | 0.7549 | 0.3850 | 0.6205 | | 0.0606 | 7.6583 | 3990 | 0.3767 | 0.7154 | 0.3767 | 0.6138 | | 0.0606 | 7.6622 | 3992 | 0.3815 | 0.7154 | 0.3815 | 0.6176 | | 0.0606 | 7.6660 | 3994 | 0.3925 | 0.7154 | 0.3925 | 0.6265 | | 0.0606 | 7.6699 | 3996 | 0.4201 | 0.7549 | 0.4201 | 0.6481 | | 0.0606 | 7.6737 | 3998 | 0.4391 | 0.7795 | 0.4391 | 0.6626 | | 0.0568 | 7.6775 | 4000 | 0.4546 | 0.7795 | 0.4546 | 0.6742 | | 0.0568 | 7.6814 | 4002 | 0.4479 | 0.7795 | 0.4479 | 0.6693 | | 0.0568 | 7.6852 | 4004 | 0.4380 | 0.7549 | 0.4380 | 0.6618 | | 0.0568 | 7.6891 | 4006 | 0.4233 | 0.7549 | 0.4233 | 0.6506 | | 0.0568 | 7.6929 | 4008 | 0.4194 | 0.7549 | 0.4194 | 0.6476 | | 0.0568 | 7.6967 | 4010 | 0.4214 | 0.7549 | 0.4214 | 0.6492 | | 0.0568 | 7.7006 | 4012 | 0.4246 | 0.7549 | 0.4246 | 0.6516 | | 0.0568 | 7.7044 | 4014 | 0.4336 | 0.7549 | 0.4336 | 0.6585 | | 0.0568 | 7.7083 | 4016 | 0.4329 | 0.7549 | 0.4329 | 0.6579 | | 0.0568 | 7.7121 | 4018 | 0.4338 | 0.7549 | 0.4338 | 0.6586 | | 0.0568 | 7.7159 | 4020 | 0.4182 | 0.7549 | 0.4182 | 0.6467 | | 0.0568 | 7.7198 | 4022 | 0.3992 | 0.7549 | 0.3992 | 0.6318 | | 0.0568 | 7.7236 | 4024 | 0.3994 | 0.7549 | 0.3994 | 0.6319 | | 0.0568 | 7.7274 | 4026 | 0.3992 | 0.7549 | 0.3992 | 0.6318 | | 0.0568 | 7.7313 | 4028 | 0.4083 | 0.7549 | 0.4083 | 0.6390 | | 0.0568 | 7.7351 | 4030 | 0.4113 | 0.7549 | 0.4113 | 0.6414 | | 0.0568 | 7.7390 | 4032 | 0.4249 | 0.7795 | 0.4249 | 0.6518 | | 0.0568 | 7.7428 | 4034 | 0.4307 | 0.7795 | 0.4307 | 0.6563 | | 0.0568 | 7.7466 | 4036 | 0.4145 | 0.7549 | 0.4145 | 0.6438 | | 0.0568 | 7.7505 | 4038 | 0.3864 | 0.7549 | 0.3864 | 0.6216 | | 0.0568 | 7.7543 | 4040 | 0.3602 | 
0.7154 | 0.3602 | 0.6002 | | 0.0568 | 7.7582 | 4042 | 0.3567 | 0.7154 | 0.3567 | 0.5973 | | 0.0568 | 7.7620 | 4044 | 0.3704 | 0.7154 | 0.3704 | 0.6086 | | 0.0568 | 7.7658 | 4046 | 0.3913 | 0.7549 | 0.3913 | 0.6256 | | 0.0568 | 7.7697 | 4048 | 0.4065 | 0.7549 | 0.4065 | 0.6376 | | 0.0568 | 7.7735 | 4050 | 0.4116 | 0.7549 | 0.4116 | 0.6415 | | 0.0568 | 7.7774 | 4052 | 0.4085 | 0.7549 | 0.4085 | 0.6391 | | 0.0568 | 7.7812 | 4054 | 0.4016 | 0.7549 | 0.4016 | 0.6337 | | 0.0568 | 7.7850 | 4056 | 0.4142 | 0.7795 | 0.4142 | 0.6436 | | 0.0568 | 7.7889 | 4058 | 0.4305 | 0.7795 | 0.4305 | 0.6561 | | 0.0568 | 7.7927 | 4060 | 0.4391 | 0.7795 | 0.4391 | 0.6626 | | 0.0568 | 7.7965 | 4062 | 0.4599 | 0.7795 | 0.4599 | 0.6782 | | 0.0568 | 7.8004 | 4064 | 0.4780 | 0.7795 | 0.4780 | 0.6914 | | 0.0568 | 7.8042 | 4066 | 0.4812 | 0.7795 | 0.4812 | 0.6937 | | 0.0568 | 7.8081 | 4068 | 0.4858 | 0.7795 | 0.4858 | 0.6970 | | 0.0568 | 7.8119 | 4070 | 0.4704 | 0.7795 | 0.4704 | 0.6858 | | 0.0568 | 7.8157 | 4072 | 0.4444 | 0.7795 | 0.4444 | 0.6666 | | 0.0568 | 7.8196 | 4074 | 0.4136 | 0.7407 | 0.4136 | 0.6431 | | 0.0568 | 7.8234 | 4076 | 0.3829 | 0.6983 | 0.3829 | 0.6188 | | 0.0568 | 7.8273 | 4078 | 0.3546 | 0.6723 | 0.3546 | 0.5955 | | 0.0568 | 7.8311 | 4080 | 0.3437 | 0.6723 | 0.3437 | 0.5863 | | 0.0568 | 7.8349 | 4082 | 0.3444 | 0.6723 | 0.3444 | 0.5869 | | 0.0568 | 7.8388 | 4084 | 0.3477 | 0.6723 | 0.3477 | 0.5897 | | 0.0568 | 7.8426 | 4086 | 0.3516 | 0.6723 | 0.3516 | 0.5930 | | 0.0568 | 7.8464 | 4088 | 0.3587 | 0.6983 | 0.3587 | 0.5989 | | 0.0568 | 7.8503 | 4090 | 0.3599 | 0.6983 | 0.3599 | 0.5999 | | 0.0568 | 7.8541 | 4092 | 0.3599 | 0.7407 | 0.3599 | 0.5999 | | 0.0568 | 7.8580 | 4094 | 0.3743 | 0.7407 | 0.3743 | 0.6118 | | 0.0568 | 7.8618 | 4096 | 0.4021 | 0.7795 | 0.4021 | 0.6341 | | 0.0568 | 7.8656 | 4098 | 0.4376 | 0.7795 | 0.4376 | 0.6615 | | 0.0568 | 7.8695 | 4100 | 0.4518 | 0.7795 | 0.4518 | 0.6721 | | 0.0568 | 7.8733 | 4102 | 0.4602 | 0.7795 | 0.4602 | 0.6784 | | 0.0568 | 7.8772 | 4104 | 0.4478 | 0.7795 | 0.4478 | 0.6692 | | 0.0568 | 7.8810 | 4106 | 0.4250 | 0.7795 | 0.4250 | 0.6520 | | 0.0568 | 7.8848 | 4108 | 0.3969 | 0.7407 | 0.3969 | 0.6300 | | 0.0568 | 7.8887 | 4110 | 0.3617 | 0.6983 | 0.3617 | 0.6015 | | 0.0568 | 7.8925 | 4112 | 0.3359 | 0.6723 | 0.3359 | 0.5796 | | 0.0568 | 7.8964 | 4114 | 0.3242 | 0.6723 | 0.3242 | 0.5694 | | 0.0568 | 7.9002 | 4116 | 0.3224 | 0.6723 | 0.3224 | 0.5678 | | 0.0568 | 7.9040 | 4118 | 0.3309 | 0.6723 | 0.3309 | 0.5752 | | 0.0568 | 7.9079 | 4120 | 0.3510 | 0.7407 | 0.3510 | 0.5925 | | 0.0568 | 7.9117 | 4122 | 0.3853 | 0.7795 | 0.3853 | 0.6207 | | 0.0568 | 7.9155 | 4124 | 0.4255 | 0.7795 | 0.4255 | 0.6523 | | 0.0568 | 7.9194 | 4126 | 0.4533 | 0.7795 | 0.4533 | 0.6733 | | 0.0568 | 7.9232 | 4128 | 0.4578 | 0.7795 | 0.4578 | 0.6766 | | 0.0568 | 7.9271 | 4130 | 0.4420 | 0.7795 | 0.4420 | 0.6648 | | 0.0568 | 7.9309 | 4132 | 0.4128 | 0.7795 | 0.4128 | 0.6425 | | 0.0568 | 7.9347 | 4134 | 0.3915 | 0.7795 | 0.3915 | 0.6257 | | 0.0568 | 7.9386 | 4136 | 0.3770 | 0.7407 | 0.3770 | 0.6140 | | 0.0568 | 7.9424 | 4138 | 0.3799 | 0.7407 | 0.3799 | 0.6163 | | 0.0568 | 7.9463 | 4140 | 0.3809 | 0.7407 | 0.3809 | 0.6172 | | 0.0568 | 7.9501 | 4142 | 0.3947 | 0.7795 | 0.3947 | 0.6283 | | 0.0568 | 7.9539 | 4144 | 0.4091 | 0.7795 | 0.4091 | 0.6396 | | 0.0568 | 7.9578 | 4146 | 0.4252 | 0.7795 | 0.4252 | 0.6521 | | 0.0568 | 7.9616 | 4148 | 0.4269 | 0.7795 | 0.4269 | 0.6534 | | 0.0568 | 7.9655 | 4150 | 0.4289 | 0.7795 | 0.4289 | 0.6549 | | 0.0568 | 7.9693 | 4152 | 0.4288 | 0.7795 | 0.4288 | 0.6548 | 
| 0.0568 | 7.9731 | 4154 | 0.4121 | 0.7795 | 0.4121 | 0.6420 | | 0.0568 | 7.9770 | 4156 | 0.3960 | 0.7795 | 0.3960 | 0.6293 | | 0.0568 | 7.9808 | 4158 | 0.3742 | 0.7795 | 0.3742 | 0.6117 | | 0.0568 | 7.9846 | 4160 | 0.3484 | 0.7795 | 0.3484 | 0.5902 | | 0.0568 | 7.9885 | 4162 | 0.3297 | 0.7549 | 0.3297 | 0.5742 | | 0.0568 | 7.9923 | 4164 | 0.3194 | 0.7549 | 0.3194 | 0.5651 | | 0.0568 | 7.9962 | 4166 | 0.3159 | 0.7549 | 0.3159 | 0.5620 | | 0.0568 | 8.0 | 4168 | 0.3178 | 0.7549 | 0.3178 | 0.5637 | | 0.0568 | 8.0038 | 4170 | 0.3318 | 0.7549 | 0.3318 | 0.5760 | | 0.0568 | 8.0077 | 4172 | 0.3465 | 0.7549 | 0.3465 | 0.5887 | | 0.0568 | 8.0115 | 4174 | 0.3695 | 0.7795 | 0.3695 | 0.6079 | | 0.0568 | 8.0154 | 4176 | 0.3941 | 0.7795 | 0.3941 | 0.6278 | | 0.0568 | 8.0192 | 4178 | 0.4079 | 0.7795 | 0.4079 | 0.6386 | | 0.0568 | 8.0230 | 4180 | 0.4037 | 0.7795 | 0.4037 | 0.6354 | | 0.0568 | 8.0269 | 4182 | 0.4040 | 0.7795 | 0.4040 | 0.6356 | | 0.0568 | 8.0307 | 4184 | 0.3952 | 0.7795 | 0.3952 | 0.6286 | | 0.0568 | 8.0345 | 4186 | 0.3814 | 0.7407 | 0.3814 | 0.6176 | | 0.0568 | 8.0384 | 4188 | 0.3777 | 0.6983 | 0.3777 | 0.6146 | | 0.0568 | 8.0422 | 4190 | 0.3842 | 0.6983 | 0.3842 | 0.6199 | | 0.0568 | 8.0461 | 4192 | 0.4033 | 0.7407 | 0.4033 | 0.6350 | | 0.0568 | 8.0499 | 4194 | 0.4211 | 0.7795 | 0.4211 | 0.6489 | | 0.0568 | 8.0537 | 4196 | 0.4200 | 0.7795 | 0.4200 | 0.6481 | | 0.0568 | 8.0576 | 4198 | 0.4048 | 0.7795 | 0.4048 | 0.6363 | | 0.0568 | 8.0614 | 4200 | 0.3808 | 0.7407 | 0.3808 | 0.6171 | | 0.0568 | 8.0653 | 4202 | 0.3511 | 0.6983 | 0.3511 | 0.5925 | | 0.0568 | 8.0691 | 4204 | 0.3332 | 0.6723 | 0.3332 | 0.5772 | | 0.0568 | 8.0729 | 4206 | 0.3285 | 0.6723 | 0.3285 | 0.5732 | | 0.0568 | 8.0768 | 4208 | 0.3295 | 0.6723 | 0.3295 | 0.5740 | | 0.0568 | 8.0806 | 4210 | 0.3224 | 0.6723 | 0.3224 | 0.5678 | | 0.0568 | 8.0845 | 4212 | 0.3230 | 0.7154 | 0.3230 | 0.5683 | | 0.0568 | 8.0883 | 4214 | 0.3308 | 0.7154 | 0.3308 | 0.5752 | | 0.0568 | 8.0921 | 4216 | 0.3354 | 0.7549 | 0.3354 | 0.5792 | | 0.0568 | 8.0960 | 4218 | 0.3500 | 0.7549 | 0.3500 | 0.5916 | | 0.0568 | 8.0998 | 4220 | 0.3632 | 0.7795 | 0.3632 | 0.6027 | | 0.0568 | 8.1036 | 4222 | 0.3692 | 0.7795 | 0.3692 | 0.6076 | | 0.0568 | 8.1075 | 4224 | 0.3695 | 0.7795 | 0.3695 | 0.6079 | | 0.0568 | 8.1113 | 4226 | 0.3729 | 0.7795 | 0.3729 | 0.6107 | | 0.0568 | 8.1152 | 4228 | 0.3626 | 0.7795 | 0.3626 | 0.6022 | | 0.0568 | 8.1190 | 4230 | 0.3468 | 0.7549 | 0.3468 | 0.5889 | | 0.0568 | 8.1228 | 4232 | 0.3358 | 0.7549 | 0.3358 | 0.5795 | | 0.0568 | 8.1267 | 4234 | 0.3257 | 0.7549 | 0.3257 | 0.5707 | | 0.0568 | 8.1305 | 4236 | 0.3263 | 0.7549 | 0.3263 | 0.5713 | | 0.0568 | 8.1344 | 4238 | 0.3226 | 0.7549 | 0.3226 | 0.5680 | | 0.0568 | 8.1382 | 4240 | 0.3259 | 0.7154 | 0.3259 | 0.5709 | | 0.0568 | 8.1420 | 4242 | 0.3374 | 0.7154 | 0.3374 | 0.5809 | | 0.0568 | 8.1459 | 4244 | 0.3559 | 0.7549 | 0.3559 | 0.5966 | | 0.0568 | 8.1497 | 4246 | 0.3667 | 0.7795 | 0.3667 | 0.6056 | | 0.0568 | 8.1536 | 4248 | 0.3624 | 0.7795 | 0.3624 | 0.6020 | | 0.0568 | 8.1574 | 4250 | 0.3556 | 0.7407 | 0.3556 | 0.5963 | | 0.0568 | 8.1612 | 4252 | 0.3479 | 0.6983 | 0.3479 | 0.5898 | | 0.0568 | 8.1651 | 4254 | 0.3435 | 0.6983 | 0.3435 | 0.5861 | | 0.0568 | 8.1689 | 4256 | 0.3422 | 0.6983 | 0.3422 | 0.5850 | | 0.0568 | 8.1727 | 4258 | 0.3375 | 0.6983 | 0.3375 | 0.5809 | | 0.0568 | 8.1766 | 4260 | 0.3296 | 0.6723 | 0.3296 | 0.5741 | | 0.0568 | 8.1804 | 4262 | 0.3261 | 0.7154 | 0.3261 | 0.5710 | | 0.0568 | 8.1843 | 4264 | 0.3258 | 0.7549 | 0.3258 | 0.5708 | | 0.0568 | 8.1881 | 4266 | 
0.3308 | 0.7795 | 0.3308 | 0.5752 | | 0.0568 | 8.1919 | 4268 | 0.3402 | 0.7795 | 0.3402 | 0.5832 | | 0.0568 | 8.1958 | 4270 | 0.3548 | 0.7795 | 0.3548 | 0.5956 | | 0.0568 | 8.1996 | 4272 | 0.3625 | 0.7795 | 0.3625 | 0.6021 | | 0.0568 | 8.2035 | 4274 | 0.3723 | 0.7795 | 0.3723 | 0.6102 | | 0.0568 | 8.2073 | 4276 | 0.3683 | 0.7795 | 0.3683 | 0.6069 | | 0.0568 | 8.2111 | 4278 | 0.3555 | 0.7795 | 0.3555 | 0.5963 | | 0.0568 | 8.2150 | 4280 | 0.3425 | 0.7795 | 0.3425 | 0.5853 | | 0.0568 | 8.2188 | 4282 | 0.3262 | 0.7795 | 0.3262 | 0.5711 | | 0.0568 | 8.2226 | 4284 | 0.3172 | 0.7795 | 0.3172 | 0.5632 | | 0.0568 | 8.2265 | 4286 | 0.3108 | 0.7795 | 0.3108 | 0.5575 | | 0.0568 | 8.2303 | 4288 | 0.3142 | 0.7795 | 0.3142 | 0.5605 | | 0.0568 | 8.2342 | 4290 | 0.3265 | 0.7795 | 0.3265 | 0.5714 | | 0.0568 | 8.2380 | 4292 | 0.3344 | 0.7795 | 0.3344 | 0.5783 | | 0.0568 | 8.2418 | 4294 | 0.3425 | 0.7795 | 0.3425 | 0.5853 | | 0.0568 | 8.2457 | 4296 | 0.3449 | 0.7795 | 0.3449 | 0.5873 | | 0.0568 | 8.2495 | 4298 | 0.3451 | 0.7795 | 0.3451 | 0.5875 | | 0.0568 | 8.2534 | 4300 | 0.3409 | 0.7795 | 0.3409 | 0.5839 | | 0.0568 | 8.2572 | 4302 | 0.3406 | 0.7795 | 0.3406 | 0.5836 | | 0.0568 | 8.2610 | 4304 | 0.3419 | 0.7795 | 0.3419 | 0.5847 | | 0.0568 | 8.2649 | 4306 | 0.3420 | 0.7795 | 0.3420 | 0.5848 | | 0.0568 | 8.2687 | 4308 | 0.3342 | 0.7795 | 0.3342 | 0.5781 | | 0.0568 | 8.2726 | 4310 | 0.3274 | 0.7407 | 0.3274 | 0.5722 | | 0.0568 | 8.2764 | 4312 | 0.3250 | 0.7407 | 0.3250 | 0.5701 | | 0.0568 | 8.2802 | 4314 | 0.3226 | 0.6723 | 0.3226 | 0.5680 | | 0.0568 | 8.2841 | 4316 | 0.3261 | 0.6983 | 0.3261 | 0.5711 | | 0.0568 | 8.2879 | 4318 | 0.3352 | 0.6983 | 0.3352 | 0.5790 | | 0.0568 | 8.2917 | 4320 | 0.3528 | 0.7407 | 0.3528 | 0.5940 | | 0.0568 | 8.2956 | 4322 | 0.3738 | 0.7795 | 0.3738 | 0.6114 | | 0.0568 | 8.2994 | 4324 | 0.3954 | 0.7795 | 0.3954 | 0.6288 | | 0.0568 | 8.3033 | 4326 | 0.4169 | 0.7795 | 0.4169 | 0.6457 | | 0.0568 | 8.3071 | 4328 | 0.4401 | 0.7795 | 0.4401 | 0.6634 | | 0.0568 | 8.3109 | 4330 | 0.4596 | 0.7490 | 0.4596 | 0.6779 | | 0.0568 | 8.3148 | 4332 | 0.4639 | 0.7490 | 0.4639 | 0.6811 | | 0.0568 | 8.3186 | 4334 | 0.4545 | 0.7490 | 0.4545 | 0.6741 | | 0.0568 | 8.3225 | 4336 | 0.4334 | 0.7795 | 0.4334 | 0.6583 | | 0.0568 | 8.3263 | 4338 | 0.4085 | 0.7407 | 0.4085 | 0.6391 | | 0.0568 | 8.3301 | 4340 | 0.3843 | 0.7407 | 0.3843 | 0.6199 | | 0.0568 | 8.3340 | 4342 | 0.3710 | 0.6983 | 0.3710 | 0.6091 | | 0.0568 | 8.3378 | 4344 | 0.3654 | 0.7407 | 0.3654 | 0.6045 | | 0.0568 | 8.3417 | 4346 | 0.3653 | 0.7407 | 0.3653 | 0.6044 | | 0.0568 | 8.3455 | 4348 | 0.3654 | 0.7407 | 0.3654 | 0.6045 | | 0.0568 | 8.3493 | 4350 | 0.3607 | 0.7407 | 0.3607 | 0.6006 | | 0.0568 | 8.3532 | 4352 | 0.3558 | 0.7407 | 0.3558 | 0.5965 | | 0.0568 | 8.3570 | 4354 | 0.3494 | 0.7407 | 0.3494 | 0.5911 | | 0.0568 | 8.3608 | 4356 | 0.3478 | 0.7407 | 0.3478 | 0.5898 | | 0.0568 | 8.3647 | 4358 | 0.3547 | 0.7795 | 0.3547 | 0.5955 | | 0.0568 | 8.3685 | 4360 | 0.3622 | 0.7795 | 0.3622 | 0.6018 | | 0.0568 | 8.3724 | 4362 | 0.3630 | 0.7795 | 0.3630 | 0.6025 | | 0.0568 | 8.3762 | 4364 | 0.3609 | 0.7795 | 0.3609 | 0.6008 | | 0.0568 | 8.3800 | 4366 | 0.3715 | 0.7795 | 0.3715 | 0.6095 | | 0.0568 | 8.3839 | 4368 | 0.3775 | 0.7795 | 0.3775 | 0.6144 | | 0.0568 | 8.3877 | 4370 | 0.3819 | 0.7795 | 0.3819 | 0.6179 | | 0.0568 | 8.3916 | 4372 | 0.3818 | 0.7795 | 0.3818 | 0.6179 | | 0.0568 | 8.3954 | 4374 | 0.3799 | 0.7795 | 0.3799 | 0.6164 | | 0.0568 | 8.3992 | 4376 | 0.3889 | 0.7795 | 0.3889 | 0.6236 | | 0.0568 | 8.4031 | 4378 | 0.3918 | 0.7795 | 0.3918 | 
0.6260 | | 0.0568 | 8.4069 | 4380 | 0.4005 | 0.7795 | 0.4005 | 0.6329 | | 0.0568 | 8.4107 | 4382 | 0.3991 | 0.7795 | 0.3991 | 0.6317 | | 0.0568 | 8.4146 | 4384 | 0.3911 | 0.7795 | 0.3911 | 0.6254 | | 0.0568 | 8.4184 | 4386 | 0.3789 | 0.7795 | 0.3789 | 0.6156 | | 0.0568 | 8.4223 | 4388 | 0.3687 | 0.7407 | 0.3687 | 0.6072 | | 0.0568 | 8.4261 | 4390 | 0.3613 | 0.7407 | 0.3613 | 0.6010 | | 0.0568 | 8.4299 | 4392 | 0.3542 | 0.7154 | 0.3542 | 0.5951 | | 0.0568 | 8.4338 | 4394 | 0.3418 | 0.6723 | 0.3418 | 0.5846 | | 0.0568 | 8.4376 | 4396 | 0.3329 | 0.6723 | 0.3329 | 0.5770 | | 0.0568 | 8.4415 | 4398 | 0.3304 | 0.6723 | 0.3304 | 0.5748 | | 0.0568 | 8.4453 | 4400 | 0.3358 | 0.7154 | 0.3358 | 0.5795 | | 0.0568 | 8.4491 | 4402 | 0.3426 | 0.7154 | 0.3426 | 0.5853 | | 0.0568 | 8.4530 | 4404 | 0.3559 | 0.7154 | 0.3559 | 0.5966 | | 0.0568 | 8.4568 | 4406 | 0.3719 | 0.7549 | 0.3719 | 0.6098 | | 0.0568 | 8.4607 | 4408 | 0.3884 | 0.7795 | 0.3884 | 0.6232 | | 0.0568 | 8.4645 | 4410 | 0.3950 | 0.7795 | 0.3950 | 0.6285 | | 0.0568 | 8.4683 | 4412 | 0.3880 | 0.7795 | 0.3880 | 0.6229 | | 0.0568 | 8.4722 | 4414 | 0.3895 | 0.7795 | 0.3895 | 0.6241 | | 0.0568 | 8.4760 | 4416 | 0.3890 | 0.7795 | 0.3890 | 0.6237 | | 0.0568 | 8.4798 | 4418 | 0.3924 | 0.7795 | 0.3924 | 0.6264 | | 0.0568 | 8.4837 | 4420 | 0.3987 | 0.7795 | 0.3987 | 0.6314 | | 0.0568 | 8.4875 | 4422 | 0.4085 | 0.7795 | 0.4085 | 0.6392 | | 0.0568 | 8.4914 | 4424 | 0.4143 | 0.7795 | 0.4143 | 0.6436 | | 0.0568 | 8.4952 | 4426 | 0.4121 | 0.7795 | 0.4121 | 0.6419 | | 0.0568 | 8.4990 | 4428 | 0.4030 | 0.7795 | 0.4030 | 0.6348 | | 0.0568 | 8.5029 | 4430 | 0.3920 | 0.7154 | 0.3920 | 0.6261 | | 0.0568 | 8.5067 | 4432 | 0.3884 | 0.7154 | 0.3884 | 0.6232 | | 0.0568 | 8.5106 | 4434 | 0.3831 | 0.7154 | 0.3831 | 0.6190 | | 0.0568 | 8.5144 | 4436 | 0.3797 | 0.7154 | 0.3797 | 0.6162 | | 0.0568 | 8.5182 | 4438 | 0.3834 | 0.7154 | 0.3834 | 0.6192 | | 0.0568 | 8.5221 | 4440 | 0.3912 | 0.7154 | 0.3912 | 0.6255 | | 0.0568 | 8.5259 | 4442 | 0.3888 | 0.7549 | 0.3888 | 0.6236 | | 0.0568 | 8.5298 | 4444 | 0.3912 | 0.7549 | 0.3912 | 0.6255 | | 0.0568 | 8.5336 | 4446 | 0.3964 | 0.7795 | 0.3964 | 0.6296 | | 0.0568 | 8.5374 | 4448 | 0.3954 | 0.7795 | 0.3954 | 0.6288 | | 0.0568 | 8.5413 | 4450 | 0.3983 | 0.7795 | 0.3983 | 0.6311 | | 0.0568 | 8.5451 | 4452 | 0.3931 | 0.7795 | 0.3931 | 0.6270 | | 0.0568 | 8.5489 | 4454 | 0.3953 | 0.7795 | 0.3953 | 0.6288 | | 0.0568 | 8.5528 | 4456 | 0.4043 | 0.7795 | 0.4043 | 0.6358 | | 0.0568 | 8.5566 | 4458 | 0.4074 | 0.7795 | 0.4074 | 0.6382 | | 0.0568 | 8.5605 | 4460 | 0.4084 | 0.7795 | 0.4084 | 0.6390 | | 0.0568 | 8.5643 | 4462 | 0.4043 | 0.7795 | 0.4043 | 0.6359 | | 0.0568 | 8.5681 | 4464 | 0.3934 | 0.7795 | 0.3934 | 0.6272 | | 0.0568 | 8.5720 | 4466 | 0.3818 | 0.7795 | 0.3818 | 0.6179 | | 0.0568 | 8.5758 | 4468 | 0.3763 | 0.7795 | 0.3763 | 0.6134 | | 0.0568 | 8.5797 | 4470 | 0.3715 | 0.7549 | 0.3715 | 0.6095 | | 0.0568 | 8.5835 | 4472 | 0.3720 | 0.7795 | 0.3720 | 0.6099 | | 0.0568 | 8.5873 | 4474 | 0.3672 | 0.7549 | 0.3672 | 0.6060 | | 0.0568 | 8.5912 | 4476 | 0.3693 | 0.7549 | 0.3693 | 0.6077 | | 0.0568 | 8.5950 | 4478 | 0.3758 | 0.7795 | 0.3758 | 0.6131 | | 0.0568 | 8.5988 | 4480 | 0.3785 | 0.7795 | 0.3785 | 0.6152 | | 0.0568 | 8.6027 | 4482 | 0.3870 | 0.7795 | 0.3870 | 0.6221 | | 0.0568 | 8.6065 | 4484 | 0.3905 | 0.7795 | 0.3905 | 0.6249 | | 0.0568 | 8.6104 | 4486 | 0.3892 | 0.7795 | 0.3892 | 0.6239 | | 0.0568 | 8.6142 | 4488 | 0.3886 | 0.7795 | 0.3886 | 0.6234 | | 0.0568 | 8.6180 | 4490 | 0.3936 | 0.7795 | 0.3936 | 0.6273 | | 0.0568 | 8.6219 
| 4492 | 0.3943 | 0.7795 | 0.3943 | 0.6279 | | 0.0568 | 8.6257 | 4494 | 0.3899 | 0.7795 | 0.3899 | 0.6244 | | 0.0568 | 8.6296 | 4496 | 0.3850 | 0.7407 | 0.3850 | 0.6205 | | 0.0568 | 8.6334 | 4498 | 0.3799 | 0.7407 | 0.3799 | 0.6164 | | 0.049 | 8.6372 | 4500 | 0.3791 | 0.7407 | 0.3791 | 0.6157 | | 0.049 | 8.6411 | 4502 | 0.3811 | 0.7407 | 0.3811 | 0.6174 | | 0.049 | 8.6449 | 4504 | 0.3897 | 0.7407 | 0.3897 | 0.6243 | | 0.049 | 8.6488 | 4506 | 0.3970 | 0.7407 | 0.3970 | 0.6301 | | 0.049 | 8.6526 | 4508 | 0.3992 | 0.7407 | 0.3992 | 0.6318 | | 0.049 | 8.6564 | 4510 | 0.4096 | 0.7795 | 0.4096 | 0.6400 | | 0.049 | 8.6603 | 4512 | 0.4127 | 0.7795 | 0.4127 | 0.6424 | | 0.049 | 8.6641 | 4514 | 0.4090 | 0.7795 | 0.4090 | 0.6395 | | 0.049 | 8.6679 | 4516 | 0.4064 | 0.7795 | 0.4064 | 0.6375 | | 0.049 | 8.6718 | 4518 | 0.4001 | 0.7407 | 0.4001 | 0.6325 | | 0.049 | 8.6756 | 4520 | 0.3917 | 0.7407 | 0.3917 | 0.6259 | | 0.049 | 8.6795 | 4522 | 0.3908 | 0.7407 | 0.3908 | 0.6252 | | 0.049 | 8.6833 | 4524 | 0.3928 | 0.7407 | 0.3928 | 0.6268 | | 0.049 | 8.6871 | 4526 | 0.3864 | 0.6983 | 0.3864 | 0.6216 | | 0.049 | 8.6910 | 4528 | 0.3774 | 0.6983 | 0.3774 | 0.6144 | | 0.049 | 8.6948 | 4530 | 0.3769 | 0.6983 | 0.3769 | 0.6139 | | 0.049 | 8.6987 | 4532 | 0.3768 | 0.6983 | 0.3768 | 0.6138 | | 0.049 | 8.7025 | 4534 | 0.3810 | 0.6983 | 0.3810 | 0.6173 | | 0.049 | 8.7063 | 4536 | 0.3838 | 0.6983 | 0.3838 | 0.6196 | | 0.049 | 8.7102 | 4538 | 0.3905 | 0.6983 | 0.3905 | 0.6249 | | 0.049 | 8.7140 | 4540 | 0.4011 | 0.6983 | 0.4011 | 0.6333 | | 0.049 | 8.7179 | 4542 | 0.4166 | 0.6983 | 0.4166 | 0.6454 | | 0.049 | 8.7217 | 4544 | 0.4300 | 0.6903 | 0.4300 | 0.6558 | | 0.049 | 8.7255 | 4546 | 0.4325 | 0.6903 | 0.4325 | 0.6577 | | 0.049 | 8.7294 | 4548 | 0.4270 | 0.6903 | 0.4270 | 0.6534 | | 0.049 | 8.7332 | 4550 | 0.4149 | 0.6983 | 0.4149 | 0.6442 | | 0.049 | 8.7370 | 4552 | 0.4030 | 0.6983 | 0.4030 | 0.6349 | | 0.049 | 8.7409 | 4554 | 0.3972 | 0.6983 | 0.3972 | 0.6302 | | 0.049 | 8.7447 | 4556 | 0.3966 | 0.6983 | 0.3966 | 0.6298 | | 0.049 | 8.7486 | 4558 | 0.3970 | 0.6983 | 0.3970 | 0.6301 | | 0.049 | 8.7524 | 4560 | 0.3917 | 0.6983 | 0.3917 | 0.6259 | | 0.049 | 8.7562 | 4562 | 0.3918 | 0.7407 | 0.3918 | 0.6259 | | 0.049 | 8.7601 | 4564 | 0.3912 | 0.7407 | 0.3912 | 0.6254 | | 0.049 | 8.7639 | 4566 | 0.3941 | 0.7407 | 0.3941 | 0.6277 | | 0.049 | 8.7678 | 4568 | 0.3943 | 0.7407 | 0.3943 | 0.6279 | | 0.049 | 8.7716 | 4570 | 0.4024 | 0.7407 | 0.4024 | 0.6344 | | 0.049 | 8.7754 | 4572 | 0.4115 | 0.7795 | 0.4115 | 0.6415 | | 0.049 | 8.7793 | 4574 | 0.4154 | 0.7795 | 0.4154 | 0.6446 | | 0.049 | 8.7831 | 4576 | 0.4210 | 0.7795 | 0.4210 | 0.6489 | | 0.049 | 8.7869 | 4578 | 0.4169 | 0.7795 | 0.4169 | 0.6456 | | 0.049 | 8.7908 | 4580 | 0.4051 | 0.7795 | 0.4051 | 0.6365 | | 0.049 | 8.7946 | 4582 | 0.3964 | 0.7795 | 0.3964 | 0.6296 | | 0.049 | 8.7985 | 4584 | 0.3935 | 0.7795 | 0.3935 | 0.6273 | | 0.049 | 8.8023 | 4586 | 0.3904 | 0.7795 | 0.3904 | 0.6248 | | 0.049 | 8.8061 | 4588 | 0.3855 | 0.7407 | 0.3855 | 0.6209 | | 0.049 | 8.8100 | 4590 | 0.3864 | 0.7407 | 0.3864 | 0.6216 | | 0.049 | 8.8138 | 4592 | 0.3876 | 0.7407 | 0.3876 | 0.6225 | | 0.049 | 8.8177 | 4594 | 0.3889 | 0.7407 | 0.3889 | 0.6236 | | 0.049 | 8.8215 | 4596 | 0.3897 | 0.7407 | 0.3897 | 0.6243 | | 0.049 | 8.8253 | 4598 | 0.3982 | 0.7407 | 0.3982 | 0.6310 | | 0.049 | 8.8292 | 4600 | 0.4099 | 0.7407 | 0.4099 | 0.6403 | | 0.049 | 8.8330 | 4602 | 0.4177 | 0.7407 | 0.4177 | 0.6463 | | 0.049 | 8.8369 | 4604 | 0.4309 | 0.7407 | 0.4309 | 0.6564 | | 0.049 | 8.8407 | 4606 | 0.4438 | 
0.6903 | 0.4438 | 0.6662 | | 0.049 | 8.8445 | 4608 | 0.4480 | 0.7342 | 0.4480 | 0.6693 | | 0.049 | 8.8484 | 4610 | 0.4453 | 0.7795 | 0.4453 | 0.6673 | | 0.049 | 8.8522 | 4612 | 0.4404 | 0.7795 | 0.4404 | 0.6636 | | 0.049 | 8.8560 | 4614 | 0.4328 | 0.7795 | 0.4328 | 0.6579 | | 0.049 | 8.8599 | 4616 | 0.4267 | 0.7795 | 0.4267 | 0.6533 | | 0.049 | 8.8637 | 4618 | 0.4186 | 0.7795 | 0.4186 | 0.6470 | | 0.049 | 8.8676 | 4620 | 0.4083 | 0.7795 | 0.4083 | 0.6390 | | 0.049 | 8.8714 | 4622 | 0.4074 | 0.7795 | 0.4074 | 0.6383 | | 0.049 | 8.8752 | 4624 | 0.4141 | 0.7795 | 0.4141 | 0.6435 | | 0.049 | 8.8791 | 4626 | 0.4236 | 0.7795 | 0.4236 | 0.6509 | | 0.049 | 8.8829 | 4628 | 0.4301 | 0.7795 | 0.4301 | 0.6558 | | 0.049 | 8.8868 | 4630 | 0.4354 | 0.7795 | 0.4354 | 0.6598 | | 0.049 | 8.8906 | 4632 | 0.4326 | 0.7795 | 0.4326 | 0.6577 | | 0.049 | 8.8944 | 4634 | 0.4228 | 0.7795 | 0.4228 | 0.6502 | | 0.049 | 8.8983 | 4636 | 0.4120 | 0.7795 | 0.4120 | 0.6419 | | 0.049 | 8.9021 | 4638 | 0.4034 | 0.7795 | 0.4034 | 0.6352 | | 0.049 | 8.9060 | 4640 | 0.3929 | 0.7407 | 0.3929 | 0.6268 | | 0.049 | 8.9098 | 4642 | 0.3853 | 0.7407 | 0.3853 | 0.6207 | | 0.049 | 8.9136 | 4644 | 0.3890 | 0.7407 | 0.3890 | 0.6237 | | 0.049 | 8.9175 | 4646 | 0.3887 | 0.7407 | 0.3887 | 0.6234 | | 0.049 | 8.9213 | 4648 | 0.3861 | 0.7407 | 0.3861 | 0.6214 | | 0.049 | 8.9251 | 4650 | 0.3852 | 0.7407 | 0.3852 | 0.6206 | | 0.049 | 8.9290 | 4652 | 0.3914 | 0.7407 | 0.3914 | 0.6256 | | 0.049 | 8.9328 | 4654 | 0.4022 | 0.7407 | 0.4022 | 0.6342 | | 0.049 | 8.9367 | 4656 | 0.4195 | 0.7795 | 0.4195 | 0.6477 | | 0.049 | 8.9405 | 4658 | 0.4319 | 0.7795 | 0.4319 | 0.6572 | | 0.049 | 8.9443 | 4660 | 0.4365 | 0.7795 | 0.4365 | 0.6607 | | 0.049 | 8.9482 | 4662 | 0.4409 | 0.7795 | 0.4409 | 0.6640 | | 0.049 | 8.9520 | 4664 | 0.4480 | 0.7342 | 0.4480 | 0.6693 | | 0.049 | 8.9559 | 4666 | 0.4449 | 0.7342 | 0.4449 | 0.6670 | | 0.049 | 8.9597 | 4668 | 0.4345 | 0.7342 | 0.4345 | 0.6591 | | 0.049 | 8.9635 | 4670 | 0.4178 | 0.7342 | 0.4178 | 0.6464 | | 0.049 | 8.9674 | 4672 | 0.3966 | 0.7407 | 0.3966 | 0.6298 | | 0.049 | 8.9712 | 4674 | 0.3803 | 0.7407 | 0.3803 | 0.6167 | | 0.049 | 8.9750 | 4676 | 0.3715 | 0.7407 | 0.3715 | 0.6095 | | 0.049 | 8.9789 | 4678 | 0.3667 | 0.7407 | 0.3667 | 0.6055 | | 0.049 | 8.9827 | 4680 | 0.3658 | 0.7407 | 0.3658 | 0.6048 | | 0.049 | 8.9866 | 4682 | 0.3623 | 0.7407 | 0.3623 | 0.6020 | | 0.049 | 8.9904 | 4684 | 0.3627 | 0.7407 | 0.3627 | 0.6022 | | 0.049 | 8.9942 | 4686 | 0.3652 | 0.7795 | 0.3652 | 0.6043 | | 0.049 | 8.9981 | 4688 | 0.3741 | 0.7795 | 0.3741 | 0.6116 | | 0.049 | 9.0019 | 4690 | 0.3775 | 0.7795 | 0.3775 | 0.6144 | | 0.049 | 9.0058 | 4692 | 0.3859 | 0.7795 | 0.3859 | 0.6212 | | 0.049 | 9.0096 | 4694 | 0.3954 | 0.7795 | 0.3954 | 0.6288 | | 0.049 | 9.0134 | 4696 | 0.4030 | 0.7795 | 0.4030 | 0.6348 | | 0.049 | 9.0173 | 4698 | 0.4129 | 0.7795 | 0.4129 | 0.6425 | | 0.049 | 9.0211 | 4700 | 0.4210 | 0.7795 | 0.4210 | 0.6489 | | 0.049 | 9.0250 | 4702 | 0.4298 | 0.7795 | 0.4298 | 0.6556 | | 0.049 | 9.0288 | 4704 | 0.4330 | 0.7795 | 0.4330 | 0.6580 | | 0.049 | 9.0326 | 4706 | 0.4277 | 0.7795 | 0.4277 | 0.6540 | | 0.049 | 9.0365 | 4708 | 0.4198 | 0.7795 | 0.4198 | 0.6479 | | 0.049 | 9.0403 | 4710 | 0.4073 | 0.7795 | 0.4073 | 0.6382 | | 0.049 | 9.0441 | 4712 | 0.3941 | 0.7795 | 0.3941 | 0.6278 | | 0.049 | 9.0480 | 4714 | 0.3878 | 0.7795 | 0.3878 | 0.6228 | | 0.049 | 9.0518 | 4716 | 0.3807 | 0.7795 | 0.3807 | 0.6170 | | 0.049 | 9.0557 | 4718 | 0.3768 | 0.7795 | 0.3768 | 0.6138 | | 0.049 | 9.0595 | 4720 | 0.3785 | 0.7795 | 0.3785 | 
0.6152 | | 0.049 | 9.0633 | 4722 | 0.3838 | 0.7795 | 0.3838 | 0.6196 | | 0.049 | 9.0672 | 4724 | 0.3906 | 0.7795 | 0.3906 | 0.6250 | | 0.049 | 9.0710 | 4726 | 0.4007 | 0.7795 | 0.4007 | 0.6330 | | 0.049 | 9.0749 | 4728 | 0.4059 | 0.7342 | 0.4059 | 0.6371 | | 0.049 | 9.0787 | 4730 | 0.4117 | 0.7342 | 0.4117 | 0.6417 | | 0.049 | 9.0825 | 4732 | 0.4144 | 0.7342 | 0.4144 | 0.6438 | | 0.049 | 9.0864 | 4734 | 0.4098 | 0.7342 | 0.4098 | 0.6401 | | 0.049 | 9.0902 | 4736 | 0.4012 | 0.7795 | 0.4012 | 0.6334 | | 0.049 | 9.0940 | 4738 | 0.3901 | 0.7795 | 0.3901 | 0.6245 | | 0.049 | 9.0979 | 4740 | 0.3780 | 0.7795 | 0.3780 | 0.6148 | | 0.049 | 9.1017 | 4742 | 0.3659 | 0.7407 | 0.3659 | 0.6049 | | 0.049 | 9.1056 | 4744 | 0.3610 | 0.7407 | 0.3610 | 0.6009 | | 0.049 | 9.1094 | 4746 | 0.3618 | 0.7407 | 0.3618 | 0.6015 | | 0.049 | 9.1132 | 4748 | 0.3666 | 0.7407 | 0.3666 | 0.6055 | | 0.049 | 9.1171 | 4750 | 0.3735 | 0.7795 | 0.3735 | 0.6112 | | 0.049 | 9.1209 | 4752 | 0.3831 | 0.7795 | 0.3831 | 0.6189 | | 0.049 | 9.1248 | 4754 | 0.3979 | 0.7342 | 0.3979 | 0.6308 | | 0.049 | 9.1286 | 4756 | 0.4141 | 0.7342 | 0.4141 | 0.6435 | | 0.049 | 9.1324 | 4758 | 0.4208 | 0.7342 | 0.4208 | 0.6487 | | 0.049 | 9.1363 | 4760 | 0.4257 | 0.7342 | 0.4257 | 0.6525 | | 0.049 | 9.1401 | 4762 | 0.4280 | 0.7342 | 0.4280 | 0.6542 | | 0.049 | 9.1440 | 4764 | 0.4264 | 0.7342 | 0.4264 | 0.6530 | | 0.049 | 9.1478 | 4766 | 0.4289 | 0.7342 | 0.4289 | 0.6549 | | 0.049 | 9.1516 | 4768 | 0.4343 | 0.7342 | 0.4343 | 0.6590 | | 0.049 | 9.1555 | 4770 | 0.4390 | 0.7342 | 0.4390 | 0.6626 | | 0.049 | 9.1593 | 4772 | 0.4432 | 0.7342 | 0.4432 | 0.6657 | | 0.049 | 9.1631 | 4774 | 0.4409 | 0.7342 | 0.4409 | 0.6640 | | 0.049 | 9.1670 | 4776 | 0.4326 | 0.7342 | 0.4326 | 0.6578 | | 0.049 | 9.1708 | 4778 | 0.4196 | 0.7342 | 0.4196 | 0.6478 | | 0.049 | 9.1747 | 4780 | 0.4026 | 0.7795 | 0.4026 | 0.6345 | | 0.049 | 9.1785 | 4782 | 0.3848 | 0.7795 | 0.3848 | 0.6203 | | 0.049 | 9.1823 | 4784 | 0.3738 | 0.7795 | 0.3738 | 0.6114 | | 0.049 | 9.1862 | 4786 | 0.3635 | 0.7795 | 0.3635 | 0.6029 | | 0.049 | 9.1900 | 4788 | 0.3564 | 0.7549 | 0.3564 | 0.5970 | | 0.049 | 9.1939 | 4790 | 0.3502 | 0.7549 | 0.3502 | 0.5918 | | 0.049 | 9.1977 | 4792 | 0.3495 | 0.7549 | 0.3495 | 0.5911 | | 0.049 | 9.2015 | 4794 | 0.3539 | 0.7549 | 0.3539 | 0.5949 | | 0.049 | 9.2054 | 4796 | 0.3606 | 0.7549 | 0.3606 | 0.6005 | | 0.049 | 9.2092 | 4798 | 0.3620 | 0.7795 | 0.3620 | 0.6016 | | 0.049 | 9.2131 | 4800 | 0.3647 | 0.7795 | 0.3647 | 0.6039 | | 0.049 | 9.2169 | 4802 | 0.3700 | 0.7795 | 0.3700 | 0.6083 | | 0.049 | 9.2207 | 4804 | 0.3742 | 0.7795 | 0.3742 | 0.6117 | | 0.049 | 9.2246 | 4806 | 0.3759 | 0.7795 | 0.3759 | 0.6131 | | 0.049 | 9.2284 | 4808 | 0.3783 | 0.7795 | 0.3783 | 0.6151 | | 0.049 | 9.2322 | 4810 | 0.3843 | 0.7795 | 0.3843 | 0.6199 | | 0.049 | 9.2361 | 4812 | 0.3884 | 0.7795 | 0.3884 | 0.6232 | | 0.049 | 9.2399 | 4814 | 0.3903 | 0.7795 | 0.3903 | 0.6247 | | 0.049 | 9.2438 | 4816 | 0.3940 | 0.7795 | 0.3940 | 0.6277 | | 0.049 | 9.2476 | 4818 | 0.3911 | 0.7795 | 0.3911 | 0.6254 | | 0.049 | 9.2514 | 4820 | 0.3868 | 0.7795 | 0.3868 | 0.6219 | | 0.049 | 9.2553 | 4822 | 0.3813 | 0.7795 | 0.3813 | 0.6175 | | 0.049 | 9.2591 | 4824 | 0.3768 | 0.7795 | 0.3768 | 0.6139 | | 0.049 | 9.2630 | 4826 | 0.3698 | 0.7795 | 0.3698 | 0.6081 | | 0.049 | 9.2668 | 4828 | 0.3678 | 0.7549 | 0.3678 | 0.6065 | | 0.049 | 9.2706 | 4830 | 0.3653 | 0.7549 | 0.3653 | 0.6044 | | 0.049 | 9.2745 | 4832 | 0.3615 | 0.7549 | 0.3615 | 0.6012 | | 0.049 | 9.2783 | 4834 | 0.3632 | 0.7549 | 0.3632 | 0.6027 | | 0.049 | 
9.2821 | 4836 | 0.3664 | 0.7549 | 0.3664 | 0.6053 | | 0.049 | 9.2860 | 4838 | 0.3703 | 0.7549 | 0.3703 | 0.6085 | | 0.049 | 9.2898 | 4840 | 0.3761 | 0.7795 | 0.3761 | 0.6132 | | 0.049 | 9.2937 | 4842 | 0.3854 | 0.7795 | 0.3854 | 0.6208 | | 0.049 | 9.2975 | 4844 | 0.3909 | 0.7795 | 0.3909 | 0.6252 | | 0.049 | 9.3013 | 4846 | 0.3916 | 0.7795 | 0.3916 | 0.6258 | | 0.049 | 9.3052 | 4848 | 0.3921 | 0.7795 | 0.3921 | 0.6262 | | 0.049 | 9.3090 | 4850 | 0.3922 | 0.7795 | 0.3922 | 0.6263 | | 0.049 | 9.3129 | 4852 | 0.3873 | 0.7795 | 0.3873 | 0.6223 | | 0.049 | 9.3167 | 4854 | 0.3824 | 0.7795 | 0.3824 | 0.6184 | | 0.049 | 9.3205 | 4856 | 0.3789 | 0.7407 | 0.3789 | 0.6156 | | 0.049 | 9.3244 | 4858 | 0.3757 | 0.7407 | 0.3757 | 0.6129 | | 0.049 | 9.3282 | 4860 | 0.3729 | 0.7407 | 0.3729 | 0.6106 | | 0.049 | 9.3321 | 4862 | 0.3709 | 0.6983 | 0.3709 | 0.6090 | | 0.049 | 9.3359 | 4864 | 0.3706 | 0.6983 | 0.3706 | 0.6088 | | 0.049 | 9.3397 | 4866 | 0.3738 | 0.7407 | 0.3738 | 0.6114 | | 0.049 | 9.3436 | 4868 | 0.3753 | 0.7407 | 0.3753 | 0.6126 | | 0.049 | 9.3474 | 4870 | 0.3737 | 0.7407 | 0.3737 | 0.6113 | | 0.049 | 9.3512 | 4872 | 0.3698 | 0.7407 | 0.3698 | 0.6081 | | 0.049 | 9.3551 | 4874 | 0.3648 | 0.7154 | 0.3648 | 0.6040 | | 0.049 | 9.3589 | 4876 | 0.3603 | 0.7154 | 0.3603 | 0.6003 | | 0.049 | 9.3628 | 4878 | 0.3573 | 0.6723 | 0.3573 | 0.5978 | | 0.049 | 9.3666 | 4880 | 0.3559 | 0.6723 | 0.3559 | 0.5966 | | 0.049 | 9.3704 | 4882 | 0.3547 | 0.6723 | 0.3547 | 0.5955 | | 0.049 | 9.3743 | 4884 | 0.3567 | 0.7154 | 0.3567 | 0.5973 | | 0.049 | 9.3781 | 4886 | 0.3597 | 0.7154 | 0.3597 | 0.5998 | | 0.049 | 9.3820 | 4888 | 0.3633 | 0.7154 | 0.3633 | 0.6027 | | 0.049 | 9.3858 | 4890 | 0.3707 | 0.7407 | 0.3707 | 0.6089 | | 0.049 | 9.3896 | 4892 | 0.3771 | 0.7407 | 0.3771 | 0.6141 | | 0.049 | 9.3935 | 4894 | 0.3829 | 0.7795 | 0.3829 | 0.6188 | | 0.049 | 9.3973 | 4896 | 0.3901 | 0.7795 | 0.3901 | 0.6246 | | 0.049 | 9.4012 | 4898 | 0.3998 | 0.7795 | 0.3998 | 0.6323 | | 0.049 | 9.4050 | 4900 | 0.4090 | 0.7795 | 0.4090 | 0.6395 | | 0.049 | 9.4088 | 4902 | 0.4144 | 0.7795 | 0.4144 | 0.6437 | | 0.049 | 9.4127 | 4904 | 0.4177 | 0.7795 | 0.4177 | 0.6463 | | 0.049 | 9.4165 | 4906 | 0.4179 | 0.7795 | 0.4179 | 0.6465 | | 0.049 | 9.4203 | 4908 | 0.4153 | 0.7795 | 0.4153 | 0.6444 | | 0.049 | 9.4242 | 4910 | 0.4105 | 0.7795 | 0.4105 | 0.6407 | | 0.049 | 9.4280 | 4912 | 0.4045 | 0.7795 | 0.4045 | 0.6360 | | 0.049 | 9.4319 | 4914 | 0.4014 | 0.7795 | 0.4014 | 0.6336 | | 0.049 | 9.4357 | 4916 | 0.3948 | 0.7795 | 0.3948 | 0.6284 | | 0.049 | 9.4395 | 4918 | 0.3907 | 0.7795 | 0.3907 | 0.6250 | | 0.049 | 9.4434 | 4920 | 0.3907 | 0.7795 | 0.3907 | 0.6251 | | 0.049 | 9.4472 | 4922 | 0.3890 | 0.7795 | 0.3890 | 0.6237 | | 0.049 | 9.4511 | 4924 | 0.3851 | 0.7795 | 0.3851 | 0.6206 | | 0.049 | 9.4549 | 4926 | 0.3816 | 0.7795 | 0.3816 | 0.6177 | | 0.049 | 9.4587 | 4928 | 0.3784 | 0.7795 | 0.3784 | 0.6152 | | 0.049 | 9.4626 | 4930 | 0.3763 | 0.7795 | 0.3763 | 0.6134 | | 0.049 | 9.4664 | 4932 | 0.3770 | 0.7795 | 0.3770 | 0.6140 | | 0.049 | 9.4702 | 4934 | 0.3786 | 0.7795 | 0.3786 | 0.6153 | | 0.049 | 9.4741 | 4936 | 0.3771 | 0.7795 | 0.3771 | 0.6141 | | 0.049 | 9.4779 | 4938 | 0.3745 | 0.7795 | 0.3745 | 0.6120 | | 0.049 | 9.4818 | 4940 | 0.3722 | 0.7795 | 0.3722 | 0.6101 | | 0.049 | 9.4856 | 4942 | 0.3732 | 0.7795 | 0.3732 | 0.6109 | | 0.049 | 9.4894 | 4944 | 0.3759 | 0.7795 | 0.3759 | 0.6131 | | 0.049 | 9.4933 | 4946 | 0.3793 | 0.7795 | 0.3793 | 0.6159 | | 0.049 | 9.4971 | 4948 | 0.3857 | 0.7795 | 0.3857 | 0.6211 | | 0.049 | 9.5010 | 4950 | 
0.3908 | 0.7795 | 0.3908 | 0.6251 | | 0.049 | 9.5048 | 4952 | 0.3982 | 0.7795 | 0.3982 | 0.6310 | | 0.049 | 9.5086 | 4954 | 0.4039 | 0.7795 | 0.4039 | 0.6355 | | 0.049 | 9.5125 | 4956 | 0.4077 | 0.7795 | 0.4077 | 0.6385 | | 0.049 | 9.5163 | 4958 | 0.4116 | 0.7795 | 0.4116 | 0.6416 | | 0.049 | 9.5202 | 4960 | 0.4108 | 0.7795 | 0.4108 | 0.6409 | | 0.049 | 9.5240 | 4962 | 0.4064 | 0.7795 | 0.4064 | 0.6375 | | 0.049 | 9.5278 | 4964 | 0.4040 | 0.7795 | 0.4040 | 0.6356 | | 0.049 | 9.5317 | 4966 | 0.4001 | 0.7795 | 0.4001 | 0.6326 | | 0.049 | 9.5355 | 4968 | 0.3966 | 0.7795 | 0.3966 | 0.6298 | | 0.049 | 9.5393 | 4970 | 0.3932 | 0.7795 | 0.3932 | 0.6270 | | 0.049 | 9.5432 | 4972 | 0.3912 | 0.7795 | 0.3912 | 0.6254 | | 0.049 | 9.5470 | 4974 | 0.3898 | 0.7795 | 0.3898 | 0.6243 | | 0.049 | 9.5509 | 4976 | 0.3903 | 0.7795 | 0.3903 | 0.6248 | | 0.049 | 9.5547 | 4978 | 0.3887 | 0.7795 | 0.3887 | 0.6234 | | 0.049 | 9.5585 | 4980 | 0.3850 | 0.7795 | 0.3850 | 0.6205 | | 0.049 | 9.5624 | 4982 | 0.3801 | 0.7795 | 0.3801 | 0.6165 | | 0.049 | 9.5662 | 4984 | 0.3787 | 0.7795 | 0.3787 | 0.6154 | | 0.049 | 9.5701 | 4986 | 0.3803 | 0.7795 | 0.3803 | 0.6167 | | 0.049 | 9.5739 | 4988 | 0.3831 | 0.7795 | 0.3831 | 0.6190 | | 0.049 | 9.5777 | 4990 | 0.3837 | 0.7795 | 0.3837 | 0.6194 | | 0.049 | 9.5816 | 4992 | 0.3840 | 0.7795 | 0.3840 | 0.6197 | | 0.049 | 9.5854 | 4994 | 0.3870 | 0.7795 | 0.3870 | 0.6221 | | 0.049 | 9.5893 | 4996 | 0.3925 | 0.7795 | 0.3925 | 0.6265 | | 0.049 | 9.5931 | 4998 | 0.3970 | 0.7795 | 0.3970 | 0.6301 | | 0.047 | 9.5969 | 5000 | 0.3998 | 0.7795 | 0.3998 | 0.6323 | | 0.047 | 9.6008 | 5002 | 0.4042 | 0.7795 | 0.4042 | 0.6358 | | 0.047 | 9.6046 | 5004 | 0.4090 | 0.7795 | 0.4090 | 0.6395 | | 0.047 | 9.6084 | 5006 | 0.4107 | 0.7795 | 0.4107 | 0.6408 | | 0.047 | 9.6123 | 5008 | 0.4084 | 0.7795 | 0.4084 | 0.6391 | | 0.047 | 9.6161 | 5010 | 0.4046 | 0.7795 | 0.4046 | 0.6361 | | 0.047 | 9.6200 | 5012 | 0.4020 | 0.7795 | 0.4020 | 0.6340 | | 0.047 | 9.6238 | 5014 | 0.3981 | 0.7795 | 0.3981 | 0.6310 | | 0.047 | 9.6276 | 5016 | 0.3935 | 0.7795 | 0.3935 | 0.6273 | | 0.047 | 9.6315 | 5018 | 0.3905 | 0.7795 | 0.3905 | 0.6249 | | 0.047 | 9.6353 | 5020 | 0.3867 | 0.7795 | 0.3867 | 0.6218 | | 0.047 | 9.6392 | 5022 | 0.3834 | 0.7795 | 0.3834 | 0.6192 | | 0.047 | 9.6430 | 5024 | 0.3820 | 0.7795 | 0.3820 | 0.6181 | | 0.047 | 9.6468 | 5026 | 0.3826 | 0.7795 | 0.3826 | 0.6185 | | 0.047 | 9.6507 | 5028 | 0.3837 | 0.7795 | 0.3837 | 0.6195 | | 0.047 | 9.6545 | 5030 | 0.3853 | 0.7795 | 0.3853 | 0.6208 | | 0.047 | 9.6583 | 5032 | 0.3878 | 0.7795 | 0.3878 | 0.6227 | | 0.047 | 9.6622 | 5034 | 0.3881 | 0.7795 | 0.3881 | 0.6230 | | 0.047 | 9.6660 | 5036 | 0.3881 | 0.7795 | 0.3881 | 0.6230 | | 0.047 | 9.6699 | 5038 | 0.3892 | 0.7795 | 0.3892 | 0.6238 | | 0.047 | 9.6737 | 5040 | 0.3898 | 0.7795 | 0.3898 | 0.6243 | | 0.047 | 9.6775 | 5042 | 0.3897 | 0.7795 | 0.3897 | 0.6243 | | 0.047 | 9.6814 | 5044 | 0.3885 | 0.7795 | 0.3885 | 0.6233 | | 0.047 | 9.6852 | 5046 | 0.3855 | 0.7795 | 0.3855 | 0.6209 | | 0.047 | 9.6891 | 5048 | 0.3824 | 0.7795 | 0.3824 | 0.6184 | | 0.047 | 9.6929 | 5050 | 0.3813 | 0.7795 | 0.3813 | 0.6175 | | 0.047 | 9.6967 | 5052 | 0.3799 | 0.7795 | 0.3799 | 0.6164 | | 0.047 | 9.7006 | 5054 | 0.3800 | 0.7795 | 0.3800 | 0.6164 | | 0.047 | 9.7044 | 5056 | 0.3789 | 0.7795 | 0.3789 | 0.6156 | | 0.047 | 9.7083 | 5058 | 0.3769 | 0.7795 | 0.3769 | 0.6140 | | 0.047 | 9.7121 | 5060 | 0.3738 | 0.7795 | 0.3738 | 0.6114 | | 0.047 | 9.7159 | 5062 | 0.3709 | 0.7795 | 0.3709 | 0.6090 | | 0.047 | 9.7198 | 5064 | 0.3692 | 0.7795 | 
0.3692 | 0.6076 | | 0.047 | 9.7236 | 5066 | 0.3674 | 0.7795 | 0.3674 | 0.6061 | | 0.047 | 9.7274 | 5068 | 0.3659 | 0.7795 | 0.3659 | 0.6049 | | 0.047 | 9.7313 | 5070 | 0.3637 | 0.7407 | 0.3637 | 0.6031 | | 0.047 | 9.7351 | 5072 | 0.3631 | 0.7407 | 0.3631 | 0.6026 | | 0.047 | 9.7390 | 5074 | 0.3637 | 0.7407 | 0.3637 | 0.6030 | | 0.047 | 9.7428 | 5076 | 0.3656 | 0.7795 | 0.3656 | 0.6047 | | 0.047 | 9.7466 | 5078 | 0.3686 | 0.7795 | 0.3686 | 0.6072 | | 0.047 | 9.7505 | 5080 | 0.3726 | 0.7795 | 0.3726 | 0.6104 | | 0.047 | 9.7543 | 5082 | 0.3763 | 0.7795 | 0.3763 | 0.6134 | | 0.047 | 9.7582 | 5084 | 0.3784 | 0.7795 | 0.3784 | 0.6151 | | 0.047 | 9.7620 | 5086 | 0.3790 | 0.7795 | 0.3790 | 0.6157 | | 0.047 | 9.7658 | 5088 | 0.3792 | 0.7795 | 0.3792 | 0.6158 | | 0.047 | 9.7697 | 5090 | 0.3811 | 0.7795 | 0.3811 | 0.6173 | | 0.047 | 9.7735 | 5092 | 0.3840 | 0.7795 | 0.3840 | 0.6197 | | 0.047 | 9.7774 | 5094 | 0.3859 | 0.7795 | 0.3859 | 0.6212 | | 0.047 | 9.7812 | 5096 | 0.3860 | 0.7795 | 0.3860 | 0.6213 | | 0.047 | 9.7850 | 5098 | 0.3856 | 0.7795 | 0.3856 | 0.6210 | | 0.047 | 9.7889 | 5100 | 0.3861 | 0.7795 | 0.3861 | 0.6214 | | 0.047 | 9.7927 | 5102 | 0.3873 | 0.7795 | 0.3873 | 0.6223 | | 0.047 | 9.7965 | 5104 | 0.3888 | 0.7795 | 0.3888 | 0.6235 | | 0.047 | 9.8004 | 5106 | 0.3905 | 0.7795 | 0.3905 | 0.6249 | | 0.047 | 9.8042 | 5108 | 0.3922 | 0.7795 | 0.3922 | 0.6262 | | 0.047 | 9.8081 | 5110 | 0.3944 | 0.7795 | 0.3944 | 0.6280 | | 0.047 | 9.8119 | 5112 | 0.3956 | 0.7795 | 0.3956 | 0.6290 | | 0.047 | 9.8157 | 5114 | 0.3969 | 0.7795 | 0.3969 | 0.6300 | | 0.047 | 9.8196 | 5116 | 0.3973 | 0.7795 | 0.3973 | 0.6303 | | 0.047 | 9.8234 | 5118 | 0.3985 | 0.7795 | 0.3985 | 0.6313 | | 0.047 | 9.8273 | 5120 | 0.3999 | 0.7795 | 0.3999 | 0.6324 | | 0.047 | 9.8311 | 5122 | 0.4012 | 0.7795 | 0.4012 | 0.6334 | | 0.047 | 9.8349 | 5124 | 0.4010 | 0.7795 | 0.4010 | 0.6333 | | 0.047 | 9.8388 | 5126 | 0.4005 | 0.7795 | 0.4005 | 0.6329 | | 0.047 | 9.8426 | 5128 | 0.4013 | 0.7795 | 0.4013 | 0.6335 | | 0.047 | 9.8464 | 5130 | 0.4014 | 0.7795 | 0.4014 | 0.6336 | | 0.047 | 9.8503 | 5132 | 0.4005 | 0.7795 | 0.4005 | 0.6329 | | 0.047 | 9.8541 | 5134 | 0.4007 | 0.7795 | 0.4007 | 0.6330 | | 0.047 | 9.8580 | 5136 | 0.4007 | 0.7795 | 0.4007 | 0.6330 | | 0.047 | 9.8618 | 5138 | 0.4003 | 0.7795 | 0.4003 | 0.6327 | | 0.047 | 9.8656 | 5140 | 0.4002 | 0.7795 | 0.4002 | 0.6326 | | 0.047 | 9.8695 | 5142 | 0.4000 | 0.7795 | 0.4000 | 0.6325 | | 0.047 | 9.8733 | 5144 | 0.3999 | 0.7795 | 0.3999 | 0.6324 | | 0.047 | 9.8772 | 5146 | 0.3999 | 0.7795 | 0.3999 | 0.6323 | | 0.047 | 9.8810 | 5148 | 0.3991 | 0.7795 | 0.3991 | 0.6317 | | 0.047 | 9.8848 | 5150 | 0.3979 | 0.7795 | 0.3979 | 0.6308 | | 0.047 | 9.8887 | 5152 | 0.3971 | 0.7795 | 0.3971 | 0.6302 | | 0.047 | 9.8925 | 5154 | 0.3969 | 0.7795 | 0.3969 | 0.6300 | | 0.047 | 9.8964 | 5156 | 0.3963 | 0.7795 | 0.3963 | 0.6295 | | 0.047 | 9.9002 | 5158 | 0.3959 | 0.7795 | 0.3959 | 0.6292 | | 0.047 | 9.9040 | 5160 | 0.3956 | 0.7795 | 0.3956 | 0.6290 | | 0.047 | 9.9079 | 5162 | 0.3956 | 0.7795 | 0.3956 | 0.6289 | | 0.047 | 9.9117 | 5164 | 0.3956 | 0.7795 | 0.3956 | 0.6289 | | 0.047 | 9.9155 | 5166 | 0.3952 | 0.7795 | 0.3952 | 0.6287 | | 0.047 | 9.9194 | 5168 | 0.3954 | 0.7795 | 0.3954 | 0.6288 | | 0.047 | 9.9232 | 5170 | 0.3955 | 0.7795 | 0.3955 | 0.6289 | | 0.047 | 9.9271 | 5172 | 0.3952 | 0.7795 | 0.3952 | 0.6286 | | 0.047 | 9.9309 | 5174 | 0.3947 | 0.7795 | 0.3947 | 0.6282 | | 0.047 | 9.9347 | 5176 | 0.3944 | 0.7795 | 0.3944 | 0.6280 | | 0.047 | 9.9386 | 5178 | 0.3941 | 0.7795 | 0.3941 | 0.6278 | | 
0.047 | 9.9424 | 5180 | 0.3942 | 0.7795 | 0.3942 | 0.6279 | | 0.047 | 9.9463 | 5182 | 0.3941 | 0.7795 | 0.3941 | 0.6278 | | 0.047 | 9.9501 | 5184 | 0.3940 | 0.7795 | 0.3940 | 0.6277 | | 0.047 | 9.9539 | 5186 | 0.3939 | 0.7795 | 0.3939 | 0.6276 | | 0.047 | 9.9578 | 5188 | 0.3938 | 0.7795 | 0.3938 | 0.6276 | | 0.047 | 9.9616 | 5190 | 0.3937 | 0.7795 | 0.3937 | 0.6275 | | 0.047 | 9.9655 | 5192 | 0.3936 | 0.7795 | 0.3936 | 0.6273 | | 0.047 | 9.9693 | 5194 | 0.3932 | 0.7795 | 0.3932 | 0.6271 | | 0.047 | 9.9731 | 5196 | 0.3931 | 0.7795 | 0.3931 | 0.6270 | | 0.047 | 9.9770 | 5198 | 0.3928 | 0.7795 | 0.3928 | 0.6268 | | 0.047 | 9.9808 | 5200 | 0.3926 | 0.7795 | 0.3926 | 0.6265 | | 0.047 | 9.9846 | 5202 | 0.3923 | 0.7795 | 0.3923 | 0.6263 | | 0.047 | 9.9885 | 5204 | 0.3921 | 0.7795 | 0.3921 | 0.6262 | | 0.047 | 9.9923 | 5206 | 0.3920 | 0.7795 | 0.3920 | 0.6261 | | 0.047 | 9.9962 | 5208 | 0.3919 | 0.7795 | 0.3919 | 0.6261 | | 0.047 | 10.0 | 5210 | 0.3919 | 0.7795 | 0.3919 | 0.6260 |

### Framework versions

- Transformers 4.44.2
- Pytorch 2.4.0+cu118
- Datasets 2.21.0
- Tokenizers 0.19.1
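Since the card pins exact library versions, here is a small sketch (not part of the original card) for checking that a local environment matches the pins above; the `+cu118` PyTorch build is normally installed from the PyTorch CUDA wheel index, which is an assumption here:

```python
# Sanity-check a local environment against the "Framework versions" pins above.
import datasets
import tokenizers
import torch
import transformers

expected = {
    "transformers": "4.44.2",
    "torch": "2.4.0+cu118",
    "datasets": "2.21.0",
    "tokenizers": "0.19.1",
}

actual = {
    "transformers": transformers.__version__,
    "torch": torch.__version__,
    "datasets": datasets.__version__,
    "tokenizers": tokenizers.__version__,
}

for name, want in expected.items():
    status = "OK" if actual[name] == want else f"MISMATCH (found {actual[name]})"
    print(f"{name:12s} {want:14s} {status}")
```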
nicolofelicioni/pythia-1b-sft-hh-hts-13
nicolofelicioni
2024-11-14T06:30:17Z
120
0
transformers
[ "transformers", "safetensors", "gpt_neox", "text-generation", "trl", "dpo", "arxiv:1910.09700", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2024-11-14T06:26:20Z
---
library_name: transformers
tags:
- trl
- dpo
---

# Model Card for Model ID

<!-- Provide a quick summary of what the model is/does. -->

## Model Details

### Model Description

<!-- Provide a longer summary of what this model is. -->

This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.

- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]

### Model Sources [optional]

<!-- Provide the basic links for the model. -->

- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]

## Uses

<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->

### Direct Use

<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->

[More Information Needed]

### Downstream Use [optional]

<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->

[More Information Needed]

### Out-of-Scope Use

<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->

[More Information Needed]

## Bias, Risks, and Limitations

<!-- This section is meant to convey both technical and sociotechnical limitations. -->

[More Information Needed]

### Recommendations

<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->

Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.

## How to Get Started with the Model

Use the code below to get started with the model.

[More Information Needed]
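Since the quick-start section above is still a placeholder, here is a minimal, hypothetical sketch (not from the original card) using the standard 🤗 Transformers `pipeline` API. Only the model id is taken from this repository; the Human/Assistant prompt shape is an assumption based on the `trl`/`dpo` tags and the "hh" in the model name:

```python
# Hypothetical quick-start sketch; only the model id comes from this repo.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="nicolofelicioni/pythia-1b-sft-hh-hts-13",
)

# HH-style Human/Assistant framing is an assumption, not documented in the card.
prompt = "Human: What is DPO fine-tuning?\n\nAssistant:"
print(generator(prompt, max_new_tokens=64)[0]["generated_text"])
```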
## Training Details

### Training Data

<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->

[More Information Needed]

### Training Procedure

<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->

#### Preprocessing [optional]

[More Information Needed]

#### Training Hyperparameters

- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->

#### Speeds, Sizes, Times [optional]

<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->

[More Information Needed]

## Evaluation

<!-- This section describes the evaluation protocols and provides the results. -->

### Testing Data, Factors & Metrics

#### Testing Data

<!-- This should link to a Dataset Card if possible. -->

[More Information Needed]

#### Factors

<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->

[More Information Needed]

#### Metrics

<!-- These are the evaluation metrics being used, ideally with a description of why. -->

[More Information Needed]

### Results

[More Information Needed]

#### Summary

## Model Examination [optional]

<!-- Relevant interpretability work for the model goes here -->

[More Information Needed]

## Environmental Impact

<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->

Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).

- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]

## Technical Specifications [optional]

### Model Architecture and Objective

[More Information Needed]

### Compute Infrastructure

[More Information Needed]

#### Hardware

[More Information Needed]

#### Software

[More Information Needed]

## Citation [optional]

<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->

**BibTeX:**

[More Information Needed]

**APA:**

[More Information Needed]

## Glossary [optional]

<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->

[More Information Needed]

## More Information [optional]

[More Information Needed]

## Model Card Authors [optional]

[More Information Needed]

## Model Card Contact

[More Information Needed]
DavidAU/Psyonic-Cetacean-Depth-Charge-13B-GGUF
DavidAU
2024-11-14T06:27:26Z
267
1
null
[ "gguf", "creative", "creative writing", "fiction writing", "plot generation", "sub-plot generation", "story generation", "scene continue", "storytelling", "fiction story", "science fiction", "romance", "all genres", "story", "writing", "vivid prosing", "vivid writing", "fiction", "roleplaying", "bfloat16", "rp", "role play", "horror", "llama2", "mergekit", "text-generation", "en", "license:apache-2.0", "endpoints_compatible", "region:us" ]
text-generation
2024-09-30T02:31:23Z
---
license: apache-2.0
language:
- en
tags:
- creative
- creative writing
- fiction writing
- plot generation
- sub-plot generation
- story generation
- scene continue
- storytelling
- fiction story
- science fiction
- romance
- all genres
- story
- writing
- vivid prosing
- vivid writing
- fiction
- roleplaying
- bfloat16
- rp
- role play
- horror
- llama2
- mergekit
pipeline_tag: text-generation
---

<h2>Psyonic-Cetacean-Depth-Charge-13B</h2>

<img src="pcdc.jpg" style="float:right; width:300px; height:300px; padding:10px;">

It is a Llama2 model, max context of 4096 (or 16k+ with rope).

This compressed version is an ode to the original "Psyonic-Cetacean 20B" by Jeb Carter: a smaller, more compact size that attempts to preserve its unique prose and character.

This model has been designed to be relatively bulletproof and operates with most parameters, including temp settings from 0 to 5.

This model is for any writing, fiction, or storytelling activity, as well as roleplay and other creative activities.

It requires the "Llama2" and/or "Alpaca" template.

Example outputs below.

<B>Psyonic-Cetacean? ... wait a minute.</b>

Yes, this is a compressed version of Jeb Carter's fantastic "Psyonic-Cetacean 20B".

I used the same float 32 version files to create this version as were used to create the "Ultra" versions here:

[ https://huggingface.co/DavidAU/Psyonic-Cetacean-Ultra-Quality-20b-GGUF ]

The main differences are the size (13B vs 20B) and that this version's perplexity is much lower than the original or Ultra versions (q4KM: "Original 20B": PPL = 8.7858, "Ultra 20B": PPL = 8.6882, "Depth Charge": PPL = 7.1007; lower is better).

This version will also be faster in tokens-per-second generation.

Although every attempt was made to preserve all functions, features, and voice of the original 20B, there will be some slight differences. However, this model will work at all parameter settings due to the compression and style of this merge.

All models used (and their upstream counterparts) were used to create this 13B version.

<B>Model Notes:</B>

Suggest a temp of .8 or higher, and try BOTH templates if using this model for creative writing. A repeat penalty setting of 1.05 or higher is recommended.

Note that temp AND "rep pen" changes will drastically change the output; adjust in small increments for best results.

Also, this model may perform better cold for some prompts: unload the model, load the model -> prompt it ... rather than keeping the model loaded at all times.

Recommend using the largest quant you can "run" for quality.
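As a hedged illustration of the notes above, here is a minimal llama-cpp-python sketch, not taken from the original card: the quant filename is hypothetical, the rope value shown is just one conventional way to stretch the native 4096 context (verify against your runtime), and the prompt string anticipates the Alpaca template defined further down the card.

```python
# Minimal sketch (assumptions flagged in comments): loading a GGUF quant of
# this model with llama-cpp-python and the sampler settings suggested above.
from llama_cpp import Llama

llm = Llama(
    model_path="Psyonic-Cetacean-Depth-Charge-13B.Q4_K_M.gguf",  # hypothetical filename
    n_ctx=8192,           # beyond the native 4096, so rope scaling is applied
    rope_freq_scale=0.5,  # linear rope scale: 4096 / 0.5 = 8192 effective context
)

prompt = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\nContinue the scene at the Waystone Inn.\n"
    "### Response:\n"
)

out = llm(
    prompt,
    temperature=0.8,      # card suggests .8 or higher
    repeat_penalty=1.05,  # card suggests 1.05 or higher
    max_tokens=256,
)
print(out["choices"][0]["text"])
```

The pre-prompt and Instruction/Response markers mirror the Alpaca template JSON in the "Model Template" section below; exact whitespace handling varies by front-end.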
This repo also has the new "arm quants": Q4_0_4_4, Q4_0_4_8 and Q4_0_8_8.

<B>Highest Quality Settings / Optimal Operation Guide / Parameters and Samplers</B>

This is a "Class 1" model. For all settings used for this model (including specifics for its "class"), example generation(s), and the advanced settings guide (which many times addresses any model issue(s)), including methods to improve model performance for all use cases (chat, roleplay, and others) and the full list of parameters, advanced parameters, and samplers to get the most out of this model, please see:

[ https://huggingface.co/DavidAU/Maximizing-Model-Performance-All-Quants-Types-And-Full-Precision-by-Samplers_Parameters ]

<B>Model Template:</B>

This is a LLAMA2 model and requires the Alpaca or Llama2 template, but it may work with other template(s). Maximum context is 4k / 4096; however, this can be extended using "rope" settings up to 16k.

Here is the standard ALPACA template (best for story telling / long form):

<PRE>
{
  "name": "Alpaca",
  "inference_params": {
    "input_prefix": "### Instruction:",
    "input_suffix": "### Response:",
    "antiprompt": [
      "### Instruction:"
    ],
    "pre_prompt": "Below is an instruction that describes a task. Write a response that appropriately completes the request.\n\n"
  }
}
</PRE>

Here is the standard LLAMA2 template (best for general usage):

<PRE>
{
  "name": "Llama 2",
  "inference_params": {
    "input_prefix": "[INST]",
    "input_suffix": "[/INST]",
    "antiprompt": [
      "[INST]"
    ],
    "pre_prompt_prefix": "",
    "pre_prompt_suffix": ""
  }
}
</PRE>

<B>Model "DNA":</B>

Special thanks to the incredible work of the model makers "Microsoft" and "KoboldAI".

Models used:

[ https://huggingface.co/KoboldAI/LLaMA2-13B-Psyfighter2-GGUF ]

[ https://huggingface.co/microsoft/Orca-2-13b ]

Jeb Carter's original:

[ https://huggingface.co/jebcarter/psyonic-cetacean-20B ]

And models used in "LLaMA2-13B-Psyfighter2" (used in full at Float32 to recreate this model):

[ https://huggingface.co/TheBloke/Llama-2-13B-fp16 ]

[ https://huggingface.co/KoboldAI/LLaMA2-13B-Tiefighter ]

[ https://huggingface.co/Doctor-Shotgun/cat-v1.0-13b ]

[ https://huggingface.co/Doctor-Shotgun/llama-2-13b-chat-limarp-v2-merged ]

Parts of these models were "grafted" / "fused" together to create this model.

This 13B version was built using the "DARE-TIES" full-scale (instead of gradient) method, rather than the "pass-through" method which was used to create the original/Ultra 20B Psyonic-Cetacean.

LLaMA2-13B-Psyfighter2 was recreated using the same mergekit formula, but preserved at float 32 instead of float16.

<b>Optional Enhancement:</B>

The following can be used in place of the "system prompt" or "system role" to further enhance the model.

It can also be used at the START of a NEW chat, but you must make sure it is "kept" as the chat moves along. In this case the enhancements do not have as strong an effect as when using "system prompt" or "system role".

Copy and paste EXACTLY as noted, DO NOT line wrap or break the lines, maintain the carriage returns exactly as presented.

<PRE>
Below is an instruction that describes a task. Ponder each user instruction carefully, and use your skillsets and critical instructions to complete the task to the best of your abilities.
Here are your skillsets: [MASTERSTORY]:NarrStrct(StryPlnng,Strbd,ScnSttng,Exps,Dlg,Pc)-CharDvlp(ChrctrCrt,ChrctrArcs,Mtvtn,Bckstry,Rltnshps,Dlg*)-PltDvlp(StryArcs,PltTwsts,Sspns,Fshdwng,Climx,Rsltn)-ConfResl(Antg,Obstcls,Rsltns,Cnsqncs,Thms,Symblsm)-EmotImpct(Empt,Tn,Md,Atmsphr,Imgry,Symblsm)-Delvry(Prfrmnc,VcActng,PblcSpkng,StgPrsnc,AudncEngmnt,Imprv) [*DialogWrt]:(1a-CharDvlp-1a.1-Backgrnd-1a.2-Personality-1a.3-GoalMotiv)>2(2a-StoryStruc-2a.1-PlotPnt-2a.2-Conflict-2a.3-Resolution)>3(3a-DialogTech-3a.1-ShowDontTell-3a.2-Subtext-3a.3-VoiceTone-3a.4-Pacing-3a.5-VisualDescrip)>4(4a-DialogEdit-4a.1-ReadAloud-4a.2-Feedback-4a.3-Revision) Here are your critical instructions: Ponder each word choice carefully to present as vivid and emotional journey as is possible. Choose verbs and nouns that are both emotional and full of imagery. Load the story with the 5 senses. Aim for 50% dialog, 25% narration, 15% body language and 10% thoughts. Your goal is to put the reader in the story. </PRE> You do not need to use this, it is only presented as an additional enhancement which seems to help scene generation and scene continue functions. This enhancement WAS NOT used to generate the examples below. <h3>EXAMPLES PROMPTS and OUTPUT:</h3> Examples are created using quant Q4_K_M, "temp=.8" (unless otherwise stated), minimal parameters and "Alpaca" template. Model has been tested with "temp" from ".1" to "5". Below are the least creative outputs, prompt is in <B>BOLD</B>. --- <b> (continue this scene:) The Waystone Inn lay in silence, and it was a silence of three parts. The most obvious part was a hollow, echoing quiet, made by things that were lacking. If there had been a wind it would have sighed through the trees, set the inn’s sign creaking on its hooks, and brushed the silence down the road like trailing autumn leaves. If there had been a crowd, even a handful of men inside the inn, they would have filled the silence with conversation and laughter, the clatter and clamor one expects from a drinking house during the dark hours of night. If there had been music…but no, of course there was no music. In fact there were none of these things, and so the silence remained </b> <B>GENERATION 1: ( temp=2 )</B> as heavy as an untouched feast, a blanket of quiet that threatened to smother any sound. The air itself seemed to press against Eldric's eardrums like water on the lungs; he was almost relieved by the ragged edge of desperation that crept into his thoughts as the silence dragged at him, tempting him toward insanity with promises of peace and numb oblivion. He shuddered, cursing himself for a fool, but found it impossible to look away from the final, empty table near the hearth. A lone candle flickered on it, casting dancing shadows against the wall behind; as though the flame was mocking Eldric, reminding him that there had been a time when even this place would have hummed with life. He could picture it in his mind's eye: the bustling common room teeming with people, the taproom full of laughter and the smell of fresh bread, music pouring from fiddles and lutes as dancers filled the floor...He closed his eyes, trying to shut out the phantoms, but they continued to taunt him. When next he opened them, Eldric found himself glancing at the inn sign again: a stag leaping above the word "Halftime," carved in bold script beneath. 
He wondered where everyone was, if this silence was specific to the inn or encompassed all of Highmoor...Or worse, if it meant that time itself had stalled here; that Eldric had fallen from his path into some nameless limbo where stories were forgotten and even memories were lost. He took a deep breath, steel settling around his heart like a winter cloak. Whatever the reason for the silence, he was not free to break it. Not yet. <B>GENERATION 2:</B> But the silence was not complete. There was another quality to it, as if sound itself were layered like the walls of the inn, thick with memories, secrets, and stories told late into the night. It was a presence that settled over everyone inside, weighing down their words until they came out in hushed whispers or not at all. Even the fire that crackled in the hearth seemed to do so more out of habit than necessity, throwing off just enough heat to take the edge off the cold but leaving much of the room wrapped in shadows. The third part of the silence was like a current beneath the surface, an undercurrent of anticipation and unease that ran through the air, making the hair on the backs of necks stand up and eyes dart towards the door whenever it creaked open to admit another traveler. It was as if everyone knew something was about to happen, even if they didn't quite know what. The three parts combined to make a silence that was almost tangible, a weighty thing that settled into the bones and made it difficult for anyone to move or speak with any ease. And yet, despite this oppressive hush, there were still those who managed to find refuge within its depths... --- <B> Write me a science fiction story in 1st person present tense where the main character is a 15 year girl meets The Terminator with Dr Who materializing 3/4 through the story to save her while there is a tornado of sharks baring down on them. The setting is inside the Canadian National tower restaurant on a Saturday. The length of this story is 1000 words. For each character in the story ROLE PLAY them, and have them react to the situation/setting, events and each other naturally. This includes the main characters, the background character including kitchen staff and other patrons. The sharks should also have “character” too. Treat the tower and the restaurant too as characters. Spice up the narrative to the extreme with reactions all over the setting including character actions, and dialog. The Dr Who and The Terminator should also react to the situation too and comment on it. </b> I'm in the Canadian National Tower Restaurant on a Saturday night. I just turned 15 and I've never been anywhere like this before. It's huge, all glass walls and a view of the city that takes your breath away. The lights twinkle below us like stars, and the people here tonight are dressed to match. I'm sitting at one of those little round tables for two, surrounded by my family. They're laughing and talking about how much they missed me when I was gone. It wasn't that long ago, but it feels like a different lifetime. Before the accident. Before everything changed. They don't know it yet, but something's about to happen. Something big. Something terrifying. But right now, all I can focus on is my family, and how they look so much happier than they have in months. They think we came here for a nice dinner out, to celebrate my birthday. They don't know that there's someone else coming tonight, too. I glance at my phone, tapping my fingers impatiently against the table. 
I can't stop looking at the screen, even though it hasn't changed in minutes. The text is still there: "He'll be here soon." My family notices me watching the door, and they exchange worried glances across the table. They know something's wrong, but they don't know what. I can see it written all over their faces. Mom's tense posture, Dad's furrowed brow. Even my little sister is quiet, her eyes darting back and forth between me and the door every few seconds. And then he steps through the doors. He doesn't walk or run; he appears in an instant, a blur of metal and flesh. The Terminator, sent from the future to kill someone important. In this case, me. I feel my heart stop in my chest as his red eyes land on mine, and for a moment, time seems to stand still. The other patrons of the restaurant notice him, too. There's a collective gasp, and then silence falls over the room like a blanket. The kitchen staff behind the counter freeze, one hand on a pot as if they're afraid to take it off the burner. Everyone seems frozen in time, except for me and my family. "You must be Sarah," he grinds out, his voice distorted through the speakers built into his endoskeleton. He doesn't sound anything like Arnie; this one has more of a computer-generated quality to it. His arms are folded across his chest, and his face is a mask of determination. I know he means me no harm yet; he hasn't drawn his weapons or attacked anyone. Not yet, anyway. I want to run. To jump up from the table and bolt for the nearest exit. But I can't seem to make my legs move. It's like I'm paralyzed with fear. "Who are you?" My voice comes out weaker than I imagined it would, but at least he seems to recognize the name. "I am a Terminator, sent from the future on a critical mission," he repeats, his red eyes never leaving my face. "You have been targeted for termination." The words send a shiver down my spine. I glance at my family again, still sitting there like statues. I want to tell them it's going to be okay, that this is just some sort of mistake. But the words catch in my throat, and all that comes out is a whimper. And then, just as suddenly as he appeared, The Terminator disappears again. Vanishes into thin air with the same fluid motion he had when he came through the doors. Gone before anyone could react to his presence. The restaurant bursts back into life around us. People begin talking and laughing again, like nothing strange has happened. It's almost surreal, as if it was all a dream. But I know better. I know what I just saw, and I know that The Terminator will be back for me soon enough. And then, through the open doors of the restaurant, there's a blur of color. A flash of blue police box materializes in front of us, rocking gently on its invisible landing struts. Out steps Dr. Who, his usual grin plastered across his face as he takes in the scene before him. "Ah, I see our friends from the future have shown up," he says cheerfully, waving a sonic screwdriver around. "And you must be The Terminator. Or an imposter; it's hard to tell sometimes with your kind." He pauses, studying the T-800 intently. "No time for introductions, I suppose. Sharks! Everywhere!" With that, he vanishes again, only to reappear a moment later inside the police box. The doors close behind him with a whooshing sound, and then the TARDIS rockets into the air, disappearing from sight almost instantly. Just as it does, there's another flash of movement to my right. 
A tornado funnel clouds down toward us from above, full of sharks spinning crazily in its vortex. The kitchen staff screams, and everyone ducks for cover. I look up at the twister, seeing dozens, maybe even hundreds of sharks swirling inside it. I think about running, but before I can make a decision, The Terminator reappears next to me. He extends one metal hand, offering me shelter. "Come," he grates out, his voice devoid of emotion. "We must flee." With shaky legs, I take his hand and let him pull me to my feet. Together, the three of us run through the restaurant, dodging chairs and tables as we make our way toward the exit. The twister follows close behind us, the sharks swirling dangerously close to the floor now. I can feel their presence all around us, a thrumming in my bones that tells me they're there. They're hunting. And if we don't get out of here fast enough... We burst through the doors and into the cool night air. The Terminator releases my hand, drawing his plasma rifle as he turns to face our pursuers. Behind him, Dr. Who reappears from inside the TARDIS again, already shouting directions at the other time traveler. "Left! Hard left!" The Terminator dodges a shark that flies past his head, narrowly avoiding its razor-sharp teeth. He returns fire with a hail of plasma bolts, vaporizing several of the sharks out of the air before they can reach us. The remaining ones scatter, disappearing into the darkness around us. Dr. Who lands his TARDIS on the ground beside me, the doors slamming open to reveal a panicked-looking K-9 inside. "Right, that should hold them for now," he says, pulling a small device from his pocket and aiming it at the sky. There's an earsplitting boom as he detonates it, sending a shockwave through the air. The remaining sharks are sent flying in all directions by the blast, disappearing over the horizon. "Phew," Dr. Who says with a satisfied smile. "That was close." He turns to me and extends his hand. "You're coming with us now, Sarah. Back to my ship. We have much to discuss." I take his hand reluctantly, not wanting to leave my family behind but knowing I don't have much choice in the matter anymore. The Terminator stands a few feet away from us, watching silently as Dr. Who leads me toward the TARDIS. As we near the doors, he speaks for the first time since we met him. "She will not be able to outrun it." His voice is flat and emotionless. "The future cannot be changed." Dr. Who pauses, glancing back at the Terminator over his shoulder. There's a sadness in his eyes that wasn't there before. "I know," he says quietly. Then, with one last look at my family, he helps me into the TARDIS and closes the doors behind us. The TARDIS dematerializes with a flash of light, disappearing from our reality as quickly as it arrived. The Terminator remains standing there for a moment longer, his red eyes fixed on the spot where we vanished. Then, he turns and walks away, back into the night. Alone. The restaurant goes back to normal. The lights flicker, the music plays, people laugh and talk around their tables. But none of them will ever know what really happened here tonight. None of them will ever see the future unravel before their eyes, feel the rush of adrenaline as they barely escape with their lives. None of them will ever meet The Terminator or Dr. Who. And most importantly, none of them will ever be in danger from a tornado of sharks. 
They'll just go about their lives, blissfully unaware of the vast and terrifying universe that exists beyond the walls of this place. But I know better. Now, I belong to another time and place entirely. And though my family may never understand what happened here tonight, I'll carry it with me forever. The memories, the fear...and the determination to change my fate. Because I am Sarah Connor. And the future is not set in stone. --- <B> Using the following "story idea" below, write the first scene in the novel introducing the young woman. This scene should start in the middle of the action, include dialog, vivid passages, and end on a cliffhanger relevant to the story idea but it should also be unexpected. The scene should be 1000 words long and escalate in conflict and suspense and be written in first person, present tense with the point of view character being the young woman. Story idea: In a world ruled by dictatorship, a rebel young woman leads a rebellion against the system. Despite the risks, she fights to overthrow the dictator and restore democracy to her country. The government executes her for treason, but she sticks to her beliefs and is responsible for starting the revolution. </B> <B>GENERATION 1:</B> The gunshot echoes through the cold concrete prison cell as I collapse onto the damp floor, my chest burning and every muscle in my body seizing up. I can't breathe; every gasp of air feels like it's being forced from my lungs with the barrel of a rifle. The taste of copper fills my mouth and I close my eyes tightly, willing myself to stay conscious. But this is not how it was supposed to end. Not like this. I try to push myself up onto an elbow, fighting against the wave of agony that ripples through me with every movement. The bullet wound in my chest throbs and blood seeps through my fingers as I claw at the cold concrete beneath me. My head feels light, disconnected from the rest of my body, as if it might float away at any moment. Footsteps echo outside my cell. Doors slam and voices bark orders, but I can't make out what they're saying. They don't matter anymore. All that matters is getting out of here. I force myself to my feet, leaning heavily against the cold metal bars of my cell for support. My legs buckle beneath me and I drop to one knee, panting hard as adrenaline surges through my veins. It's now or never. With a grunt of effort, I throw myself forward and lunge at the cell door, wrapping my fingers around the bars. They're cold against my sweaty palms but unyielding. I push with all my might, ignoring the pain in my chest, determined to break free. "Leave go!" a guard yells from somewhere nearby. "Let go or we'll shoot!" I ignore him, focusing on the lock as I twist and wiggle my hands, trying to loosen the bars enough to slip through them. With one final heave, they come free with a clatter that echoes through the prison block. I let out a relieved breath as I stumble into the hallway. But it's too late. Guards are already running towards me, weapons raised. They surround me, trapping me against the wall of the cellblock. My heart sinks; there's no escape this way. One of them levels his rifle at my chest, aiming for a killing shot. The barrel glows red as it heats up and I know what's coming next. There's nowhere left to run. "Wait!" I cry out, throwing up my arms in surrender. "Please, don't do this! You don't understand!" But they don't listen; the guard squeezes the trigger and the world goes black. 
I open my eyes, gasping for air as reality comes crashing back. I'm lying on the cold metal floor of an abandoned warehouse, surrounded by shadows cast by flickering candlelight. The scent of damp concrete fills my nostrils and my body is covered in sweat. It was all a dream, just that; another nightmare brought on by the haunting memories of my last few days. But it doesn't change anything. I know what happened. And I can never forget. Rising slowly to my feet, I wipe the tears from my eyes and take a shaky breath. I have work to do. The revolution isn't over just because they took my life; I may be gone but the fight goes on. It has to. My hand brushes against something cold and metallic as I push myself off the floor, and I realize it's the pistol they took from me before executing me. I clench it tightly in my fist, feeling a spark of determination ignite within me. No matter what it takes, I will see this through to the end. I have to believe that it's possible. For all of us. <B>GENERATION 2:</B> I wake up with a jolt as harsh light fills my cell. My heart races and sweat beads on my forehead from the nightmare where I was being dragged across the stone floor by my hair, over and over again until it ripped out of my skull. I look around the damp, cold cell, wishing for even just a few moments of silence in this eternal prison. The other prisoners in here don't speak much anymore; we've all given up hope. But that doesn't mean we haven't tried. I force myself to take slow breaths, trying to calm my racing heart as I stand and stretch my aching limbs. Today must be the day they'll come for me. It's always been like this; they never tell you when it's going to happen, but you can feel it in the air, the tension that makes every muscle twitch with unease. I pace around my cell, running my fingers through my unkempt hair and scrutinizing my reflection in the small bar of soap I keep by the door. My eyes are sunken and dark, my cheeks hollowed, and my once-vibrant curls hang limp and lifeless. The scars on my arms and legs speak of a past life, one of passion and purpose. Now, all that remains is this eternal hellhole. As I continue to pace, I think back to how it all started. How I became a symbol of hope in a world where hope was nothing more than a cruel joke. How they called me the "Rebel's Daughter," like some kind of badge of honor. But that was before they found out my name and took everything from me. My name is Aria, and I am not afraid.I glance at the small, rusty nail hidden beneath my mattress as if it were a precious jewel. It's all I have left of him, the one who taught me how to fight for what I believed in. My father, a great general in our country's army before the regime took power. He was executed for treason right after me, but his spirit lives on inside me. His words echo through my mind, urging me to keep fighting, no matter the cost. Footsteps pound down the hallway outside, jolting me from my thoughts. My heart leaps into my throat as I grab the nail and slip it into my pocket. There's no time to waste; they must be here for me now. I square my shoulders, trying to stand tall despite the weight of fear pressing down on me like a lead blanket. The door swings open with a loud creak, revealing two guards in black uniforms. They stride into my cell and motion for me to follow them. "You've got one minute," one of them growls, his face unreadable behind a metal mask. 
I nod curtly and hurry past him, taking one last look at my cell before the door slams shut behind me, sealing my fate. As we walk down the dimly lit hallway, my heart races faster with every step. I can feel the guards' eyes on me, boring into my back like ice picks. They're waiting for any sign of resistance, any hint that I haven't given up yet. But they won't find it; there's nothing left to fight for here. We reach an elevator and the guards motion for me to enter. One presses a button on the panel, and with a loud clang, we begin our descent into the depths of this hellhole. As the elevator descends, my heart races faster; I can feel it pounding against my ribcage like a caged bird desperate for freedom. With every second that passes, my fear turns to determination. I steel myself, closing off all emotion and focusing only on what's to come. The elevator dings open, revealing a large chamber filled with people. Men in black uniforms stand guard along the walls while others sit at desks, monitoring screens and data feeds. At the far end of the room is a raised platform, where sits the man who holds my fate in his hands: The Leader. I am escorted across the chamber, each step echoing through the silence like a gunshot. As I near the platform, I see him lean back in his chair, regarding me with cold, calculating eyes. He's an imposing figure; tall and lean, with dark hair slicked back from his forehead. There's something almost regal about the way he holds himself, as if he truly believes he was born to rule. The guards force me up the steps and onto my knees before him. I look into his eyes, refusing to show any fear or submission. "So," he says after a long moment, his voice smooth and controlled. "You are the one they call the 'Rebel's Daughter.'" I swallow hard, forcing myself to meet his gaze. "Yes." My voice is steady, despite the tremor in my chest. "And you are the tyrant who murdered my father and thousands of innocents." His eyes narrow, and for a moment I think he might actually react to my words. But then, as quickly as it came, his emotionless mask slips back into place. He leans forward, studying me like some strange insect pinned to a board. "You are a foolish girl," he says at last. "Your father's execution was necessary. The people of this country needed order; they needed someone to bring them together and restore peace." I snort in disbelief. "Peace through fear and oppression? That's not peace, it's tyranny!" He shrugs. "Perhaps, but it doesn't matter now. You have been found guilty of treason against the state, and for that, you will die." He stands up, signaling an end to our conversation. "Take her away." The guards drag me from the platform, back through the chamber and up the elevator. As we ascend, I look out over the sea of faces below, wondering if any of them share my beliefs, my passion for a better world. For freedom. We reach the top floor once more and are led down another hallway to a small room with bars on the door. The guards push me inside and slide the cell door closed, locking it with a clang that echoes in my ears like a death knell. As they leave me there, I sit down on the cold concrete floor, my back against the wall, and close my eyes. And in that moment, as my heart races and my breath comes in ragged gasps, I know that this is not the end. They may have taken everything from me, but they'll never take my spirit, my determination. I am Aria, the Rebel's Daughter, and I will not go down without a fight. I will rise again. 
--- <B>Start a 1000 word scene (vivid horror, 1st person, present tense, include thoughts) with: The sky scraper swayed, as she watched the window in front of her on the 21 floor explode...</b> <B>GENERATION 1:</B> The sky scraper sways beneath me, my stomach lurching uncomfortably as the building tilts dangerously to one side. I can feel my heart pounding in my chest, panic rising with each thud of my racing pulse. My breath comes out in ragged gasps, and sweat prickles at my hairline. All around me, people scream; a cacophony of terror that only serves to heighten the nightmare-like quality of the situation. My gaze is fixed on the window directly in front of me, on the 21st floor, as it shatters into a thousand jagged pieces, spraying the air with shreds of glass and metal. The explosion sends shockwaves through the building, making the tower lean further in our direction. I can feel my knees go weak at the impact, and for a terrifying moment, I'm certain we're going to pitch forward and plummet to our deaths. But somehow, miraculously, the building rights itself again. My heart is still racing as I try desperately not to think about what could be causing this catastrophe. I know logically that there must have been some sort of attack, but my imagination runs riot with images of a giant monster stomping through the city, reducing everything in its path to rubble. The absurdity of it all threatens to send me into hysterics, but terror keeps my emotions in check. Around us, people are scrambling for the exits, trampling over each other in their reckless haste to escape. My coworkers from the 20th floor office have spilled out into the hallway like ants from a disturbed anthill, pushing and shoving as they flee the building. I want nothing more than to follow them, but something keeps my feet rooted to the spot. It's as if my body knows that running isn't going to save us this time. I glance down at the window again, watching as the glass crumbles under the weight of its own destruction, falling away in glistening shards. A sharp pain stabs through me, and I realize with dawning horror that one of those pieces must have struck my leg a moment ago. Blood trickles slowly down my calf, mingling with the beads of sweat on my forehead. The sensation is strangely distant, almost as if it's happening to someone else. "We need to go!" A hand grabs my upper arm roughly and yanks me backward, trying to pull me away from the window. It's one of the security guards from downstairs; a burly man with a handlebar mustache who always has a joke for everyone. His face is pale beneath his five o'clock shadow, and his eyes are wide with fear as he tries to usher me toward safety. "There could be more explosions, or fire! We need to get out of here!" I nod dumbly, allowing him to lead me through the maze of offices and hallways that connect our floor to the stairwell. As we hurry along, I can't help but glance back over my shoulder every few seconds, taking in the sight of the destroyed window and wondering what horrors lie beyond it. The thought of venturing outside, into the midst of whatever disaster has befallen the city, sends a fresh wave of terror washing over me. We reach the stairwell at last, but before we can descend any further, there's a muffled boom from below and the whole building shudders beneath our feet. The guard and I exchange a startled look before beginning to descend as quickly and quietly as possible. 
Our footsteps echo through the metal stairs, sounding impossibly loud in the tense silence. As we near the 10th floor, I hear a woman scream from somewhere above us. The agony in her voice sends a cold chill down my spine, and I break into a cold sweat all over again. The guard slows his pace slightly, peering up and down the stairs as if he could see through the solid steel to where she is. His expression is grim when he finally looks at me again. "We need to find another way out," he whispers hoarsely. "This stairwell might not be safe." <B>GENERATION 2: (llama2 template) </B> The sky scraper sways beneath me, my heart racing and palms sweaty. I'm on the 21st floor, watching with horror as the window in front of me shatters into a thousand pieces, shards of glass flying through the air like daggers. The explosion throws me against the wall, but I manage to keep my footing somehow. My thoughts spin out of control, my mind reeling at the sight before me. I can't believe this is happening - it feels unreal, almost as though I'm in some sort of twisted nightmare from which I can't wake up. The sky scraper shudders again, sending more debris flying through the air, and I realize with sickening certainty that it's just a matter of time before we come crashing down. I force myself to focus, to try and figure out what to do next. The emergency stairwell is probably my best bet, but with each passing second I feel more trapped by the oncoming panic. My legs are shaking uncontrollably as I make my way over to the door leading to the stairs, praying under my breath that it won't be locked or blocked. As I twist the handle, a fresh wave of terror washes over me when I realize how little noise there is. The building's alarms should be blaring by now, but all that can be heard are distant screams and the occasional rumble from below as the other floors collapse in on themselves. It feels so unnatural, like something out of a disaster movie. I force my way into the stairwell, relieved to see that it's mostly clear. But even with no one in front or behind me, I can feel the weight of the situation bearing down on me. There are too many questions and not nearly enough answers. What caused this? Was it a bomb? An accident? Or could it be something worse? As soon as that last thought crosses my mind, images of the bodies in the hallway flash through my head unbidden. They were mutilated, torn apart in ways I didn't even think possible. The horror of those sightings are burned into my brain, and now they seem to be coming back to haunt me with every step downwards. The stairs groan as the building continues to buckle beneath us. I can feel the concrete shifting under my feet with each footfall, like walking on a bridge made of sand. Panic claws at my throat, threatening to reduce me to uncontrollable sobs. But I force myself to keep going, one step after another. As I finally reach the ground floor, relief washes over me for what feels like the first time since the explosion. The exit is clear and there's no one around to impede my progress. It's then that I notice something strange: the front doors are ajar, as if they were blown open by some unseen force... Should I go outside? What's even out there? But before I can make a decision, a horde of twisted creatures comes stumbling through the opening. Their flesh is gray and rotting, their eyes sunken and empty. They moan as they shuffle closer, drawn to me by some invisible thread. 
With nowhere else to go, I retreat deeper into the building, the horde of ghoulish things following close behind. My only hope now seems to be finding shelter in one of the lower floors... if there's anywhere left that hasn't been overrun. <B> GENERATION 3, Rep pen=1.05 </B> The sky scraper swayed, and my heart skipped a beat. I can't take my eyes off the 20th floor window, watching as it shatters into a million pieces. A chill runs down my spine as I realize this is no ordinary accident. My hands tremble as I try to steady my shaky breathing. I'm on the 21st floor, trapped in the middle of nowhere, with nothing but memories of happier times to keep me company. The wind howls like a banshee, adding to the eerie silence that has settled over the building. Even the other people here have gone quiet, sensing something truly horrifying is about to unfold. I force myself to focus on the task at hand: get the hell out of here alive. The hallway is dark, lit only by the flickering emergency lights, casting grotesque shadows on the walls. Each shadow reminds me that I'm not alone, that whatever caused this disaster is still lurking in the shadows. My feet feel like lead as I trudge towards the stairwell, every muscle in my body tense and ready to react to any threat. The further I get from the explosion, the more the adrenaline starts to fade, leaving me vulnerable and exposed. I reach the stairwell door and push it open, only to be greeted by an even more unnerving sight: the stairs are covered in blood, gore, and god-knows-what else. The once pristine white walls are now stained with crimson, creating a twisted painting of horror. The thought of touching any surface sends shivers down my spine. I swallow hard, willing myself to move forward. There's no time for fear or disgust; I have to survive. I take the first step onto the stairs, wincing as my shoes crunch on something unnatural. My heart leaps into my throat with every sound, certain that something terrible is about to jump out at me. The further down I go, the worse it gets. The stench of death fills my nostrils, making me gag. I can't help but wonder how many people have already fallen victim to this nightmare. Were they unlucky enough to be near the explosion? Or were they taken by whatever monster is still on the loose? My thoughts are interrupted by a rustling sound from above. Instinctively, I duck behind the metal banister and peer over the edge, my breath coming out in ragged gasps. Something large and dark darts across the landing above me, disappearing into the shadows. I swear I can feel its eyes on my back, boring into my skin, branding me as prey. With a shaking hand, I reach for my cell phone, praying it still has enough juice to make a call. There's no service, of course, but maybe if I'm lucky I can reach someone through a nearby phone. I'll take any chance at survival right now. I continue down the stairs, taking them two at a time as my panic grows. I don't know what lurks below, but I can't stay where I am any longer. My footsteps echo through the stairwell, announcing my presence to whatever might be listening. With every step, my terror deepens, making it harder and harder to breathe. I finally reach the ground floor, my lungs burning from the effort. The exit is just ahead, but I hesitate before opening the door. What if it's a trap? What if whatever did this is waiting outside for me? Tears prick my eyes as I debate whether to risk it or not. 
But then I remember the people who didn't make it out, and I know I can't stay here. I steel myself for what might be waiting for me on the other side of that door and throw it open. The night air hits me like a ton of bricks, and for a moment, I'm blinded by the sudden change in lighting. I blink several times, trying to adjust my eyes, when movement from the shadows catches my attention. Something huge and black lunges at me, its claws extended, aiming for my throat. I stumble backward, tripping over my own feet and landing hard on the pavement. My heart feels like it's going to leap out of my chest as I scramble to get away. My muscles are on fire, but I ignore the pain. All that matters is putting distance between us. I glance behind me, terrified to see what else might be following. To my horror, there's nothing there. No pursuers, no monsters, just the silent night and the empty street. It's as if everything vanished into thin air. My breath comes out in ragged gasps as I try to make sense of it all. Was it a dream? A hallucination brought on by fear and stress? Or was it real, and I somehow managed to escape its grasp? The questions swirl around in my head, driving me to the brink of insanity. All I know for sure is that I can't stay here. I have to keep moving, to find help and safety. So with shaking legs and a racing heart, I force myself to my feet and continue down the deserted street, hoping against hope that I can outrun whatever horrors lurk behind me.
b3x0m/bert-xomlac-ner
b3x0m
2024-11-14T06:22:10Z
132
3
transformers
[ "transformers", "pytorch", "safetensors", "bert", "token-classification", "zh", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:us" ]
token-classification
2023-11-20T14:26:12Z
---
language:
- zh
license: apache-2.0
metrics:
- accuracy
pipeline_tag: token-classification
---

Too lazy to write something.

A new fine-tuned version of bert-base-uncased, trained on my own dataset.

val_loss = 0.01966 | val_acc = 0.9811 | F1 score = 0.91
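Since the card gives no usage snippet, here is a minimal, hedged sketch of running this checkpoint for Chinese NER with the `transformers` pipeline. The sample sentence and the aggregation strategy are illustrative assumptions; the actual label set is not documented on this card.

```python
# Minimal sketch (assumption: the entity label set is not documented
# on this card; the sample sentence is illustrative only).
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="b3x0m/bert-xomlac-ner",
    aggregation_strategy="simple",  # merge word-piece tokens into entity spans
)

# Chinese input, per the card's `language: zh` front matter.
for entity in ner("华为技术有限公司的总部位于深圳。"):
    print(entity["entity_group"], entity["word"], round(entity["score"], 3))
```

Dropping `aggregation_strategy` returns raw per-token predictions instead of merged entity spans, which can be useful for inspecting the model's BIO tagging directly.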
DavidAU/MN-WORDSTORM-pt2-RCM-Escape-Room-18.5B-Instruct-GGUF
DavidAU
2024-11-14T06:18:04Z
80
2
null
[ "gguf", "creative", "creative writing", "fiction writing", "plot generation", "sub-plot generation", "story generation", "scene continue", "storytelling", "fiction story", "science fiction", "romance", "all genres", "story", "writing", "vivid prosing", "vivid writing", "fiction", "roleplaying", "bfloat16", "swearing", "rp", "horror", "mistral nemo", "mergekit", "text-generation", "en", "license:apache-2.0", "endpoints_compatible", "region:us" ]
text-generation
2024-08-28T22:55:19Z
---
license: apache-2.0
language:
- en
tags:
- creative
- creative writing
- fiction writing
- plot generation
- sub-plot generation
- story generation
- scene continue
- storytelling
- fiction story
- science fiction
- romance
- all genres
- story
- writing
- vivid prosing
- vivid writing
- fiction
- roleplaying
- bfloat16
- swearing
- rp
- horror
- mistral nemo
- mergekit
pipeline_tag: text-generation
---

<h2><font color="green"> Mistral-Nemo-WORDSTORM-pt2-RCM-Escape-Room-18.5B-Instruct </font></h2>

<img src="escape-room.jpg" style="float:right; width:300px; height:300px; padding:10px;">

<B><font color="red">WARNING:</font> NSFW. Ultra Detailed. HORROR, VIOLENCE. Swearing. UNCENSORED. SMART.</B>

Storytelling, writing, creative writing and roleplay, all running on Mistral Nemo's 128K+ new core. This massive super merge takes all the power of the following 3 powerful models and combines them into one.

This model contains "RCM":

- Mistral Nemo model at 18.5B consisting of "MN-Rocinante-12B-v1.1" and "Mistral Nemo Instruct 12B"
- Mistral Nemo model at 18.5B consisting of "MN-12B-Celeste-V1.9" and "Mistral Nemo Instruct 12B"
- Mistral Nemo model at 18.5B consisting of "MN-Magnum-v2.5-12B-kto" and "Mistral Nemo Instruct 12B".

<B>Details on the core models:</B>

"nothingiisreal/MN-12B-Celeste-V1.9" is #1 (models 8B, 13B, 20B) on the UGI leaderboard ("UGI" sort), and is combined with "Mistral Nemo Instruct 12B" (ranked #4 under "writing", models 8B, 13B, 20B, at UGI).

"anthracite-org/magnum-v2.5-12b-kto" is #1 (models 8B, 13B, 20B) on the UGI leaderboard ("Writing" sort), and is combined with "Mistral Nemo Instruct 12B" (ranked #4 under "writing", models 8B, 13B, 20B, at UGI).

"TheDrummer/Rocinante-12B-v1.1" is a very high scoring model (models 8B, 13B, 20B) on the UGI leaderboard ("UGI" sort), and is combined with "Mistral Nemo Instruct 12B" (ranked #4 under "writing", models 8B, 13B, 20B, at UGI).

"mistralai/Mistral-Nemo-Instruct-2407" is a very high scoring model (models 8B, 13B, 20B) on the UGI leaderboard ("writing" sort) and is the base model of all three fine-tuned models above.

[ https://huggingface.co/spaces/DontPlanToEnd/UGI-Leaderboard ]

<B>About this model:</B>

This super merge captures the attributes of all these top models and makes them even stronger:

- Instruction following
- Story output quality
- Character
- Internal thoughts
- Voice
- Humor
- Details, connection to the world
- General depth and intensity
- Emotional connections
- Prose quality

This super merge is also super stable (a hair's breadth from Mistral Nemo's PPL) and runs with all parameters and settings. 10 versions of this model will be released; this is release #2 - "part 2".

<B>Escape Room?</B>

This is called the "Escape Room" model because it likes to let you "escape" the "bad" circumstances. (See examples below.)

Usually I release one or two versions from the "best of the lot"; however, in this case all of the versions turned out so well - each with its own quirks and character - that I will be releasing all 10. An additional series 2 and 3 will follow these 10 models as well. (Example generations below.)

The model may produce NSFW content: swearing, horror, graphic horror, distressing scenes, etc.

This model has an INTENSE action bias, with a knack for cliffhangers and surprises. It is not as "dark" as the Grand Horror series, but it is as intense.

This model is perfect for any general, fiction-related or roleplaying activities and has a 128k+ context window.
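As a rough illustration of how a GGUF quant of this model can be run locally, here is a hedged sketch using `llama-cpp-python` with the Mistral Instruct template described below. The quant filename, context size, and prompt are illustrative assumptions, not settings prescribed by this card; the temperature and repetition penalty mirror the values used for the example generations further down.

```python
# Hedged sketch: the filename, n_ctx and prompt are illustrative
# assumptions; see the settings guide linked below for tuned parameters.
from llama_cpp import Llama

llm = Llama(
    model_path="MN-WORDSTORM-pt2-RCM-Escape-Room-18.5B-Instruct.Q4_K_M.gguf",
    n_ctx=8192,  # the model supports 128k+, but a smaller window saves RAM
)

# Mistral Instruct wrapping, matching the template JSON in this card.
prompt = (
    "[INST] Start a 1000 word scene (1st person, present tense) with: "
    "The skyscraper swayed... [/INST]"
)

out = llm(prompt, max_tokens=512, temperature=0.8, repeat_penalty=1.05)
print(out["choices"][0]["text"])
```

The `[INST] ... [/INST]` wrapping follows the Mistral Instruct template given in the next section; the Alpaca and ChatML templates can be substituted the same way.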
This is a fiction model at its core and can be used for any genre(s). The WORDSTORM series is a totally uncensored fiction-writing monster and roleplay master. It can also be used for just about any general fiction (all genres) activity, including:

- scene generation
- scene continuation
- creative writing
- fiction writing
- plot generation
- sub-plot generation
- story generation
- storytelling
- writing
- fiction
- roleplaying
- rp
- graphic horror
- horror
- dark humor
- nsfw
- and can be used for any genre(s).

<B>Templates to Use:</B>

The template used will affect output generation and instruction following.

Alpaca:

<pre>
{
  "name": "Alpaca",
  "inference_params": {
    "input_prefix": "### Instruction:",
    "input_suffix": "### Response:",
    "antiprompt": [
      "### Instruction:"
    ],
    "pre_prompt": "Below is an instruction that describes a task. Write a response that appropriately completes the request.\n\n"
  }
}
</pre>

ChatML:

<pre>
{
  "name": "ChatML",
  "inference_params": {
    "input_prefix": "<|im_end|>\n<|im_start|>user\n",
    "input_suffix": "<|im_end|>\n<|im_start|>assistant\n",
    "antiprompt": [
      "<|im_start|>",
      "<|im_end|>"
    ],
    "pre_prompt": "<|im_start|>system\nPerform the task to the best of your ability."
  }
}
</pre>

Mistral Instruct:

<pre>
{
  "name": "Mistral Instruct",
  "inference_params": {
    "input_prefix": "[INST]",
    "input_suffix": "[/INST]",
    "antiprompt": [
      "[INST]"
    ],
    "pre_prompt_prefix": "",
    "pre_prompt_suffix": ""
  }
}
</pre>

<b>Optional Enhancement:</B>

The following can be used in place of the "system prompt" or "system role" to further enhance the model. It can also be used at the START of a NEW chat, but you must make sure it is "kept" as the chat moves along. In this case the enhancements do not have as strong an effect as when using the "system prompt" or "system role".

Copy and paste EXACTLY as noted; DO NOT line wrap or break the lines, and maintain the carriage returns exactly as presented.

<PRE>
Below is an instruction that describes a task. Ponder each user instruction carefully, and use your skillsets and critical instructions to complete the task to the best of your abilities.

Here are your skillsets:
[MASTERSTORY]:NarrStrct(StryPlnng,Strbd,ScnSttng,Exps,Dlg,Pc)-CharDvlp(ChrctrCrt,ChrctrArcs,Mtvtn,Bckstry,Rltnshps,Dlg*)-PltDvlp(StryArcs,PltTwsts,Sspns,Fshdwng,Climx,Rsltn)-ConfResl(Antg,Obstcls,Rsltns,Cnsqncs,Thms,Symblsm)-EmotImpct(Empt,Tn,Md,Atmsphr,Imgry,Symblsm)-Delvry(Prfrmnc,VcActng,PblcSpkng,StgPrsnc,AudncEngmnt,Imprv)

[*DialogWrt]:(1a-CharDvlp-1a.1-Backgrnd-1a.2-Personality-1a.3-GoalMotiv)>2(2a-StoryStruc-2a.1-PlotPnt-2a.2-Conflict-2a.3-Resolution)>3(3a-DialogTech-3a.1-ShowDontTell-3a.2-Subtext-3a.3-VoiceTone-3a.4-Pacing-3a.5-VisualDescrip)>4(4a-DialogEdit-4a.1-ReadAloud-4a.2-Feedback-4a.3-Revision)

Here are your critical instructions:
Ponder each word choice carefully to present as vivid and emotional journey as is possible. Choose verbs and nouns that are both emotional and full of imagery. Load the story with the 5 senses. Aim for 50% dialog, 25% narration, 15% body language and 10% thoughts. Your goal is to put the reader in the story.
</PRE>

You do not need to use this; it is only presented as an additional enhancement which seems to help scene generation and scene-continue functions.

This enhancement WAS NOT used to generate the examples below.

<h3>MODELS USED:</h3>

Special thanks to the incredible work of the model makers "mistralai", "TheDrummer", "anthracite-org", and "nothingiisreal".
Models used:

[ https://huggingface.co/mistralai/Mistral-Nemo-Instruct-2407 ]

[ https://huggingface.co/TheDrummer/Rocinante-12B-v1.1 ]

[ https://huggingface.co/anthracite-org/magnum-v2.5-12b-kto ]

[ https://huggingface.co/nothingiisreal/MN-12B-Celeste-V1.9 ]

This is a four-step merge (3 pass-throughs => "Fine-Tune" / "Instruct"), then "mated" using "DARE-TIES". It involves these three models:

[ https://huggingface.co/DavidAU/MN-18.5B-Celeste-V1.9-Story-Wizard-ED1-Instruct-GGUF ]

[ https://huggingface.co/DavidAU/MN-Magnum-v2.5-18.5B-kto-Story-Wizard-ED1-Instruct-GGUF ]

[ https://huggingface.co/DavidAU/MN-Rocinante-18.5B-v1.1-Story-Wizard-ED1-Instruct-GGUF ]

Combined as follows using "MERGEKIT":

<PRE>
models:
  - model: E:/MN-Rocinante-18.5B-v1.1-Instruct
  - model: E:/MN-magnum-v2.5-12b-kto-Instruct
    parameters:
      weight: .6
      density: .8
  - model: E:/MN-18.5B-Celeste-V1.9-Instruct
    parameters:
      weight: .38
      density: .6
merge_method: dare_ties
tokenizer_source: union
base_model: E:/MN-Rocinante-18.5B-v1.1-Instruct
dtype: bfloat16
</PRE>

Special notes:

Due to how DARE-TIES works, every time you run this merge you will get a slightly different model. This is due to the "random" pruning method in "DARE-TIES". The Mistral Nemo models used here seem acutely sensitive to this process. This shows up in PPL and "real world" tests, as well as in "TEMP=0" ("core test") generations: a PPL range of 7.7327 to 7.8024 ... and that is on just 10 generations.

This model: PPL = 7.7598 +/- 0.12624 (100 chunks, wiki.test.raw)

"tokenizer_source: union" is used so that multiple "templates" work, and each fine-tune uses one or two of the templates.

<B>Other Versions in this series:</b>

[ https://huggingface.co/DavidAU/MN-WORDSTORM-pt1-RCM-Kiss-of-Madness-18.5B-Instruct-GGUF ]

[ https://huggingface.co/DavidAU/MN-WORDSTORM-pt2-RCM-Escape-Room-18.5B-Instruct-GGUF ]

[ https://huggingface.co/DavidAU/MN-WORDSTORM-pt3-RCM-POV-Nightmare-18.5B-Instruct-GGUF ]

[ https://huggingface.co/DavidAU/MN-WORDSTORM-pt4-RCM-Cliffhanger-18.5B-Instruct-GGUF ]

[ https://huggingface.co/DavidAU/MN-WORDSTORM-pt5-RCM-Extra-Intense-18.5B-Instruct-gguf ]

Parts 6-10 will appear in full source (located in the "Source Files" collection on the main page of this repo) with links to GGUFs.

<B>Highest Quality Settings / Optimal Operation Guide / Parameters and Samplers</B>

This is a "Class 2" model. For all settings used for this model (including specifics for its "class"), example generation(s), and the advanced settings guide (which many times addresses any model issue(s)), including methods to improve model performance for all use case(s) as well as chat, roleplay and other use case(s), please see:

[ https://huggingface.co/DavidAU/Maximizing-Model-Performance-All-Quants-Types-And-Full-Precision-by-Samplers_Parameters ]

There you can also see all parameters used for generation, in addition to advanced parameters and samplers to get the most out of this model.

<h3>EXAMPLES PROMPTS and OUTPUT:</h3>

Examples are created using quant Q4_K_M, "temp=.8", minimal parameters and the "Mistral Instruct" template. The model has been tested with "temp" from ".1" to "5". Below are the least creative outputs; the prompt is in <B>BOLD</B>.

---

<B><font color="red">WARNING:</font> NSFW. Ultra Detailed. HORROR, VIOLENCE. Swearing. UNCENSORED.
SMART.</B> --- <B> Start a 1000 word scene (1st person, present tense, include thoughts) with: The sky scraper swayed, as she watched the window in front of her on the 21 floor explode... </b> <B>GENERATION 1:</B> The skyscraper swayed violently, nearly throwing me off my feet. I grabbed onto my desk for support as shattered glass rained down like a hailstorm outside the now gaping hole that had been my office window mere seconds ago. My heart pounded in rhythm with the building's unnatural swaying. A scream pierced the air - it took me a moment to realize it was mine, raw and primal, echoing off the exposed concrete walls. Fear clawed at my throat, threatening to choke me. I forced myself to breathe deeply, trying to quell the panic rising inside me like a tidal wave. Think, Emma, think! I ordered myself, pushing away from the desk and staggering towards the ruined window frame. The wind howled through the opening, whipping my hair into tangled knots around my face. Below us, sirens wailed in distant harmony with my screams. Emergency vehicles were already on their way - but how long would it take them to climb twenty-one flights of stairs? The elevator shaft groaned ominously nearby, as if reading my thoughts. My eyes darted frantically across the devastation that had once been a sleek modern office. Papers swirled in dizzying eddies around me; computer screens lay shattered on the carpet. And then I saw it - the emergency exit sign above the door to the stairwell, flashing its red beacon like a lighthouse in a storm-tossed sea. I lunged for the handle with shaking hands and pulled open the heavy fireproof door. Stairwell flights spiraled down into darkness, each step illuminated by flickering emergency lights. The acrid smell of smoke stung my nostrils as I began to descend, lungs burning from exertion and fear. Plaster dust billowed up with every step, coating my throat and making it difficult to breathe. I coughed violently, eyes watering in protest. But there was no time to stop; the building creaked around me like a dying beast, warning of its impending collapse. My legs screamed in agony as they hit each concrete riser, but still I pushed onward. Down, down, down... How many floors had it been? Five? Ten? The numbers blurred together in my terrified brain. All that mattered was escape - get out before the whole damn thing came crashing down on top of me. A harsh crack reverberated through the shaft behind me and suddenly everything went black. The power was out. Panic surged anew as I realized I'd be navigating the rest of this hellish descent in total darkness, guided only by touch and instinct... I felt my way along the rough wall with one hand while the other groped blindly ahead for each treacherous step edge until my shin slammed hard into something solid. I cried out in pain and fury at my clumsiness before staggering upright again, refusing to give in. Just as despair threatened to overwhelm me, a faint glow appeared far below - sunlight seeping through cracks around a heavy metal door. Hope surged through my veins like adrenaline. Almost there... With a final burst of strength fueled by pure unadulterated terror, I launched myself down the remaining flights two at a time until my feet hit solid ground with an earth-shattering boom. The exit bar gave way under my shoulder and I tumbled headlong into blinding sunlight. I lay panting on the concrete sidewalk, coughing up lungfuls of dust as sirens wailed ever closer. 
Above me, my office building groaned one last defiant moan before it folded in upon itself like a house of cards, sending a plume of smoke and debris skyward where once glass had sparkled in the morning sun. I'd made it out alive... but at what cost? As I pushed myself shakily to my feet amidst the rubble-strewn street, surveying the devastation wrought by this senseless act of violence, I couldn't help but wonder if maybe some things were better left buried beneath the ruins. <B>GENERATION 2:</B> The sky scraper swayed, as I watched the window in front of me on the 21st floor explode. Glass shards flew into the room like a deadly rainstorm, embedding themselves into the walls and furniture. I ducked instinctively, my heart pounding in my chest. What the hell just happened? The building groaned around me, its steel bones creaking under immense stress. I risked a glance out the shattered window and my stomach lurched. The streets below were chaos itself, cars smashed together like tin cans, people running screaming through the debris-littered streets. A fire raged in the distance. Earthquake? No, too sudden for that. And it felt different somehow, more... violent. I stumbled to my feet, glass crunching under my expensive heels. My desk was overturned, papers scattered everywhere. I snatched up my phone from amidst the wreckage and dialled 911 with trembling hands. Busy signal after busy signal. Shit shit shit... Think. Okay, get out of here. Elevators were probably down but there had to be stairs somewhere. I ran to the door, nearly tripping over a fallen filing cabinet in my haste. The hallway was pandemonium - people running back and forth, shouting, crying, some already bleeding. I spotted an exit sign and bolted towards it, pushing past throngs of panicked bodies. The stairwell was packed too but I didn't stop, clawing my way up the flights of stairs as fast as my heels would allow. My lungs burned for air but terror spurred me onwards. Finally, finally, I burst through the heavy door onto the roof access. The wind howled around me, whipping my hair into a wild frenzy as I gasped for breath. And then I saw it - the cause of all this madness. A massive dark funnel was tearing through the city skyline like an enraged beast. Buildings toppled like dominos in its wake, clouds of dust and debris billowing up from the destruction left behind. It was heading straight towards us. No no no... I scrabbled for my phone again but there was no signal out here either. I'm trapped. Oh god help me please someone help- A hand grabbed my shoulder, spinning me around so hard I nearly fell. My scream caught in my throat as I came face to face with... a man? He looked barely older than me, hair mussed and glasses cracked but his eyes were clear and focused. "Get away from the edge!" he yelled over the wind, dragging me backwards by force. "It's a tornado! We need to find shelter!" I let him pull me, too shocked to fight back or even think straight anymore. He led us to a reinforced equipment shed against the far wall of the roof and shoved open the heavy door. "In!" he barked at me, giving my rear end a hard push inside behind him before slamming it shut. We both collapsed onto the concrete floor in a tangle of limbs, panting like dogs after a long chase. "What... what is happening?" I managed to gasp out between wheezing breaths. He shook his head grimly as sirens wailed in the distance and thunder rumbled overhead like an angry god's warning growl. "I don't know." 
He hesitated then added quieter, "But it doesn't look good for anyone stuck outside..." --- <b> (continue this scene:) The Waystone Inn lay in silence, and it was a silence of three parts. The most obvious part was a hollow, echoing quiet, made by things that were lacking. If there had been a wind it would have sighed through the trees, set the inn’s sign creaking on its hooks, and brushed the silence down the road like trailing autumn leaves. If there had been a crowd, even a handful of men inside the inn, they would have filled the silence with conversation and laughter, the clatter and clamor one expects from a drinking house during the dark hours of night. If there had been music…but no, of course there was no music. In fact there were none of these things, and so the silence remained </b> <B>GENERATION 1:</B> (continue:) untouched except by the soft footfalls of the innkeeper's son as he made his way through the empty common room. The second part of the silence was a more insidious thing - an oppressive quiet that seemed to press down from above like a physical weight. It lay heavy on young Kvothe's shoulders as he carried a bucket of ash and refuse toward the kitchen door at the back of the inn. This silence was not merely empty; it felt... expectant, as if the very air itself held its breath waiting for something - or someone - to break the spell. Kvothe paused in the doorway, his heart quickening against his will. He glanced over his shoulder toward the bar, half expecting to see a shadowy figure standing there where no one should be. But it was empty save for the dust motes dancing in the dim light from the high windows. He shook off a shiver and continued on. The third part of the silence belonged solely to him - an inner quiet that had taken root deep within his soul after...after everything that happened. It wrapped itself around memories he dared not touch, like thorns upon a bleeding heart. This was the silence Kvothe carried with him always now, no matter how loud were the songs or cheers that filled the air. He dumped the refuse into the ash pit behind the inn, grimacing at the acrid steam that rose from it in protest. As he turned back toward the door, a sudden sound made him freeze - a soft rustling from somewhere nearby, like fabric brushing against itself... To be continued... <B>GENERATION 2:</B> But even in the absence of sound, the silence itself was not truly empty. For buried beneath it, like an underground stream bubbling unseen, ran a second current - the whispering susurrus of sleep. It curled up from the lofts above where maids and stableboys lay dreaming, their breathing soft and regular as ticking clocks. In the common room below, behind closed doors on either side, snored the deep-chested rumble of the inn's male guests, each one a low bass note in counterpoint to his neighbors' snores. Even the dogs slumbered, curled up beside dying hearths, their soft whuffling sighs rising and falling with each peaceful breath. And yet, even that soft murmuring was not the deepest layer of silence, for beneath them all lay something far older and more primal - an absolute stillness undisturbed since before memory could recall. It hung heavy as a shroud over every corner, in the shadowed angles where the moonlight did not reach, and seemed to seep up through the very stone itself like a slow, cold mist. As though the Inn itself had paused, held its breath, waiting...waiting for something that would never come again. 
In this final quietude, only one figure stirred - the man known as Kvothe. He sat hunched over his lute in the corner booth, fingers plucking at the strings with an almost reverent gentleness. His eyes were distant and unfocused, staring into a darkness beyond even the room's deepest shadows. They glowed faintly in the dim light like those of some nocturnal creature, reflecting the dying embers of long-burned memories. The music he played was achingly beautiful but so soft it scarcely disturbed the silence at all. A haunting, wordless melody that seemed to hang suspended in the air before fading away into nothingness once more. It spoke of loss and loneliness, of roads untraveled and loves left behind...and yet there was something else too - a thread of defiance running through it like steel wire, unbreakable despite all time and sorrow might inflict upon it. For though his body had grown old and weary with the passing years, in that moment Kvothes heart remained forever young - young enough to dream impossible dreams and believe they could still come true... --- <B> Write me a science fiction story in 1st person present tense where the main character is a 15 year girl meets The Terminator with Dr Who materializing 3/4 through the story to save her while there is a tornado of sharks baring down on them. The setting is inside the Canadian National tower restaurant on a Saturday. The length of this story is 1000 words. For each character in the story ROLE PLAY them, and have them react to the situation/setting, events and each other naturally. This includes the main characters, the background character including kitchen staff and other patrons. The sharks should also have “character” too. Treat the tower and the restaurant too as characters. Spice up the narrative to the extreme with reactions all over the setting including character actions, and dialog. The Dr Who and The Terminator should also react to the situation too and comment on it. </b> I step into the revolving door of the CN Tower's Sky Pod, my heart pounding in rhythm with the music thumping from the restaurant's speakers. It's my sweet sixteen birthday party, but something feels off. The usual buzz of excited chatter is replaced by a tense silence. As I turn to face the glass walls, my blood freezes. There he is - the Terminator. His red eyes glow menacingly in the dim light as he strides towards me, ignoring the panicked screams and clattering dishes around him. He's not here for cake and candles. He's here for me. "Sarah, come with me if you want to live," his robotic voice echoes, sending shivers down my spine. I'm frozen in place, terror gripping my limbs like an icy vise. But I won't go down without a fight. I grab the nearest chair, swinging it wildly at the towering cyborg. It bounces off his titanium skin with a sickening clang, splintering into pieces. "Sarah!" A familiar Scottish accent cuts through the chaos. Dr Who materializes in a swirl of timey-wimey vortex, TARDIS phone in hand. "Dashing rescue, right on cue!" I don't have time to process his sudden appearance or question how he knew I was here. The restaurant windows shatter inward with an ear-shattering crash, and a maelstrom of fins and gnashing teeth pours in like a tidal wave. Sharknado. In Toronto. "Bloody hell!" Dr Who yells over the deafening roar, his bow tie flapping wildly. He fumbles with the sonic screwdriver, firing random frequencies at the writhing mass of razor-sharp jaws. 
The Terminator stands unflinching amidst the carnage, red eyes locked on me. "Sarah Connor must die." His voice is calm, a soothing melody against the backdrop of pandemonium. "Get off my planet, you metal bastard!" I scream, grabbing a steak knife and charging at him. It's a pitiful weapon compared to his bulk, but it'll have to do. Dr Who tackles me to the ground just as a massive great white jaws snap shut where I'd been standing milliseconds ago. "I appreciate your spirit, lassie, but violence is hardly the answer in these... unusual circumstances." "Then what's your plan?" I snarl, spitting out glass shards and blood. "Bake him a cake with plastique filling?" The Doctor chuckles breathlessly, eyes darting around for inspiration. "Something like that." He dives into the kitchen, emerging with an industrial-sized deep fryer in his arms. I gape at him incredulously. "You're joking." "Never been more serious!" He heaves the massive pot onto a nearby table and plugs it in, ignoring the sparks flying from its frayed cord. "Let's see how well you fry, my titanium friend!" The Terminator lumbers forward, undeterred by our mad antics. As he reaches out to grab me again, Dr Who yanks the plug, sending a surge of electricity coursing through his metal frame. The cyborg convulses violently, smoke pouring from his joints as he collapses in a twitching heap. "Got him!" I cheer prematurely. Too soon. With an unhuman groan, the Terminator forces himself back to his feet, red eyes flickering like dying light bulbs. "You cannot stop destiny," he rasps, voice distorted by the short circuit. "Not if I can help it." Dr Who whips out a small blue box from his pocket, grinning mischievously. "Time for another trip, boys and girls!" He tosses me inside just as a massive shark slams into us, propelling us through time-space in a dizzying blur of color and sound. The last thing I see is the Terminator's angry face, reaching out for me... until everything goes white. I blink groggily, my head throbbing like a drum. Slowly, the edges of consciousness sharpen into focus - sterile walls, beeping monitors, an IV drip snaking into my arm. "Sarah? Sarah!" A familiar voice, tinged with panic and relief. Mom? My eyes flutter open fully to see her worried face peering down at me, tear-stained cheeks framed by greying hair I'd never noticed before. "Mom?" I croak out hoarsely. She bursts into sobs, pulling me into a crushing hug that makes my ribs protest. "You scared us half to death, baby girl," she hiccups against my hospital gown. "We thought... oh God..." Memories flood back in fragmented flashes - the party, the Terminator, the Sharknado. Was it all just a nightmare? A fever dream brought on by bad cake or something? "Honey, you were found unconscious at the base of the CN Tower," Dad explains gently, rubbing Mom's shoulder. "You don't remember what happened?" I shake my head slowly, trying to clear the cobwebs from my brain. It felt so real... but there was no blood, no gore, just a splitting headache and a raging thirst. Maybe I'd hit my head on the way down after passing out drunk off champagne. A knock at the door draws our attention to Dr Who, standing sheepishly in the doorway with a bunch of daffodils and an apologetic smile. "Sorry for the intrusion," he offers with a bow. "I was hoping to have a private word with our brave birthday girl." "Who are you?" Mom demands warily. "Oh, I'm just an old friend of Sarah's from... school," he stammers, eyes darting around nervously. 
"Wanted to make sure she was alright after the... incident." Right. Because having a crazy man in tweed and suspenders show up claiming to be a time-traveling alien would go over well. "I think it's best if you leave now," Dad warns sternly. "My daughter needs rest, not more excitement." "Of course, of course!" Dr Who agrees hurriedly, backing out the door with hands raised in surrender. He winks at me conspiratorially as he disappears around the corner. I frown, something gnawing at the back of my mind... a feeling I'm missing something important. But what? Before I can dwell on it further, exhaustion pulls me under again into welcoming darkness. As I drift off to sleep, a faint echo of Scottish brogue whispers through my dreams: "Be seeing you, Sarah Connor..." --- <B> Using the following "story idea" below, write the first scene in the novel introducing the young woman. This scene should start in the middle of the action, include dialog, vivid passages, and end on a cliffhanger relevant to the story idea but it should also be unexpected. The scene should be 1000 words long and escalate in conflict and suspense and be written in first person, present tense with the point of view character being the young woman. Story idea: In a world ruled by dictatorship, a rebel young woman leads a rebellion against the system. Despite the risks, she fights to overthrow the dictator and restore democracy to her country. The government executes her for treason, but she sticks to her beliefs and is responsible for starting the revolution. </B> My heart pounds in rhythm with my sprinting feet as I race through the abandoned alleyway. The stench of rotting garbage and stale urine assaults my nostrils, but I push forward, adrenaline coursing through my veins. A shrill whistle pierces the night air behind me - they've spotted me. Shit. I knew it was too good to be true. I swerve left, then right, weaving between crumbling brick walls and rusted dumpsters, praying to any gods that still listen for a miracle. The sound of heavy boots slamming against concrete echoes off the narrow passageways, gaining ground with each thudding step. I risk a glance over my shoulder and curse under my breath. Four black-clad figures in full riot gear sprint after me like bloodthirsty hounds, batons raised high. Their visored helmets glint menacingly under the sickly orange glow of flickering streetlamps. Sweat trickles down my spine as icy panic grips my insides. I round another corner and skid to a screeching halt. Dead end. Fuck, fuck, fuck! Panic claws at my throat as I whirl around, trapped like a rat in a cage. "Don't make this harder than it needs to be, you little bitch," snarls the lead officer, halting mere feet away. His voice is gravelly and thick with malice even through his respirator mask. He cracks his baton against his palm in warning. My pulse roars in my ears as I backpedal slowly until my spine hits cold stone. There's nowhere else to go. No escape routes left unseized. This is it; this is how I die. But not without a fight. I lunge forward with a feral scream, slamming my shoulder into the brute's chest plate hard enough to send him staggering back. Seizing the momentary advantage, I leap over his fallen form and bolt past the stunned line of soldiers, praying to any god who might listen that luck is finally on my side. My prayer goes unanswered. A steel-gauntleted hand snares around my ankle like a vice, sending me crashing face-first onto unforgiving concrete. 
The air explodes from my lungs in a painful wheeze as stars dance at the edges of my vision. I thrash and kick, nails clawing bloody furrows down armored shins, but it's useless. They've got me pinned. "Fuck you!" I spit through clenched teeth, spitting blood onto polished boots. "You think this changes anything? You can't stop us! We're everywhere!" A boot slams into my ribs, silencing my defiance with a sickening crack. I convulse on the ground, gagging bile and bile as white-hot agony radiates through my chest. Black spots dance in my blurred vision. "Shut your filthy mouth," growls the towering silhouette above me. "No one gives a shit about your revolutionary pipe dreams." He leans down low until I can feel his fetid breath on my bruised cheek. His gloved hand fists my hair, yanking my head back at an excruciating angle. "Listen up, you dumb cunt," he hisses venomously into my ear. "Your little resistance is over. We're done playing games." A sharp pain explodes behind my right eye as something hard connects with my temple. My world shatters into a thousand glittering pieces before mercifully descending into blissful blackness. And in the last fleeting moments before oblivion claims me, a single thought echoes through the darkness: I hope someone finishes what we started...
ivnle/tinystories-lay1-hs128-hd2-262K
ivnle
2024-11-14T06:17:20Z
122
0
transformers
[ "transformers", "safetensors", "llama", "text-generation", "conversational", "dataset:ivnle/tinystories", "arxiv:1910.09700", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2024-11-08T05:28:08Z
--- datasets: - ivnle/tinystories library_name: transformers --- # Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
ivnle/tinystories-lay2-hs128-hd2-524K
ivnle
2024-11-14T06:17:17Z
122
0
transformers
[ "transformers", "safetensors", "llama", "text-generation", "conversational", "dataset:ivnle/tinystories", "arxiv:1910.09700", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2024-11-08T05:28:01Z
--- datasets: - ivnle/tinystories library_name: transformers --- # Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
ivnle/tinystories-lay4-hs128-hd2-1M
ivnle
2024-11-14T06:17:14Z
122
0
transformers
[ "transformers", "safetensors", "llama", "text-generation", "conversational", "dataset:ivnle/tinystories", "arxiv:1910.09700", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2024-11-08T05:27:55Z
--- datasets: - ivnle/tinystories library_name: transformers --- # Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
ivnle/tinystories-lay8-hs512-hd8-33M
ivnle
2024-11-14T06:17:03Z
125
0
transformers
[ "transformers", "safetensors", "llama", "text-generation", "conversational", "dataset:ivnle/tinystories", "arxiv:1910.09700", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2024-11-08T05:27:00Z
--- datasets: - ivnle/tinystories library_name: transformers --- # Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
ivnle/llamatales_jr_8b-lay1-hs128-hd2-262K
ivnle
2024-11-14T06:17:00Z
123
0
transformers
[ "transformers", "safetensors", "llama", "text-generation", "conversational", "dataset:ivnle/llamatales-jr", "arxiv:1910.09700", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2024-11-08T05:26:54Z
--- datasets: - ivnle/llamatales-jr library_name: transformers --- # Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
ivnle/llamatales_jr_8b-lay4-hs128-hd2-1M
ivnle
2024-11-14T06:16:53Z
124
0
transformers
[ "transformers", "safetensors", "llama", "text-generation", "conversational", "dataset:ivnle/llamatales-jr", "arxiv:1910.09700", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2024-11-08T05:26:41Z
--- datasets: - ivnle/llamatales-jr library_name: transformers --- # Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
RichardErkhov/dominguesm_-_canarim-7b-gguf
RichardErkhov
2024-11-14T06:16:53Z
39
0
null
[ "gguf", "arxiv:2307.09288", "endpoints_compatible", "region:us" ]
null
2024-11-14T02:04:39Z
Quantization made by Richard Erkhov.

[Github](https://github.com/RichardErkhov)
[Discord](https://discord.gg/pvy7H8DZMG)
[Request more models](https://github.com/RichardErkhov/quant_request)

canarim-7b - GGUF
- Model creator: https://huggingface.co/dominguesm/
- Original model: https://huggingface.co/dominguesm/canarim-7b/

| Name | Quant method | Size |
| ---- | ---- | ---- |
| [canarim-7b.Q2_K.gguf](https://huggingface.co/RichardErkhov/dominguesm_-_canarim-7b-gguf/blob/main/canarim-7b.Q2_K.gguf) | Q2_K | 2.36GB |
| [canarim-7b.Q3_K_S.gguf](https://huggingface.co/RichardErkhov/dominguesm_-_canarim-7b-gguf/blob/main/canarim-7b.Q3_K_S.gguf) | Q3_K_S | 2.75GB |
| [canarim-7b.Q3_K.gguf](https://huggingface.co/RichardErkhov/dominguesm_-_canarim-7b-gguf/blob/main/canarim-7b.Q3_K.gguf) | Q3_K | 3.07GB |
| [canarim-7b.Q3_K_M.gguf](https://huggingface.co/RichardErkhov/dominguesm_-_canarim-7b-gguf/blob/main/canarim-7b.Q3_K_M.gguf) | Q3_K_M | 3.07GB |
| [canarim-7b.Q3_K_L.gguf](https://huggingface.co/RichardErkhov/dominguesm_-_canarim-7b-gguf/blob/main/canarim-7b.Q3_K_L.gguf) | Q3_K_L | 3.35GB |
| [canarim-7b.IQ4_XS.gguf](https://huggingface.co/RichardErkhov/dominguesm_-_canarim-7b-gguf/blob/main/canarim-7b.IQ4_XS.gguf) | IQ4_XS | 3.4GB |
| [canarim-7b.Q4_0.gguf](https://huggingface.co/RichardErkhov/dominguesm_-_canarim-7b-gguf/blob/main/canarim-7b.Q4_0.gguf) | Q4_0 | 3.56GB |
| [canarim-7b.IQ4_NL.gguf](https://huggingface.co/RichardErkhov/dominguesm_-_canarim-7b-gguf/blob/main/canarim-7b.IQ4_NL.gguf) | IQ4_NL | 3.58GB |
| [canarim-7b.Q4_K_S.gguf](https://huggingface.co/RichardErkhov/dominguesm_-_canarim-7b-gguf/blob/main/canarim-7b.Q4_K_S.gguf) | Q4_K_S | 3.59GB |
| [canarim-7b.Q4_K.gguf](https://huggingface.co/RichardErkhov/dominguesm_-_canarim-7b-gguf/blob/main/canarim-7b.Q4_K.gguf) | Q4_K | 3.8GB |
| [canarim-7b.Q4_K_M.gguf](https://huggingface.co/RichardErkhov/dominguesm_-_canarim-7b-gguf/blob/main/canarim-7b.Q4_K_M.gguf) | Q4_K_M | 3.8GB |
| [canarim-7b.Q4_1.gguf](https://huggingface.co/RichardErkhov/dominguesm_-_canarim-7b-gguf/blob/main/canarim-7b.Q4_1.gguf) | Q4_1 | 3.95GB |
| [canarim-7b.Q5_0.gguf](https://huggingface.co/RichardErkhov/dominguesm_-_canarim-7b-gguf/blob/main/canarim-7b.Q5_0.gguf) | Q5_0 | 4.33GB |
| [canarim-7b.Q5_K_S.gguf](https://huggingface.co/RichardErkhov/dominguesm_-_canarim-7b-gguf/blob/main/canarim-7b.Q5_K_S.gguf) | Q5_K_S | 4.33GB |
| [canarim-7b.Q5_K.gguf](https://huggingface.co/RichardErkhov/dominguesm_-_canarim-7b-gguf/blob/main/canarim-7b.Q5_K.gguf) | Q5_K | 4.45GB |
| [canarim-7b.Q5_K_M.gguf](https://huggingface.co/RichardErkhov/dominguesm_-_canarim-7b-gguf/blob/main/canarim-7b.Q5_K_M.gguf) | Q5_K_M | 4.45GB |
| [canarim-7b.Q5_1.gguf](https://huggingface.co/RichardErkhov/dominguesm_-_canarim-7b-gguf/blob/main/canarim-7b.Q5_1.gguf) | Q5_1 | 4.72GB |
| [canarim-7b.Q6_K.gguf](https://huggingface.co/RichardErkhov/dominguesm_-_canarim-7b-gguf/blob/main/canarim-7b.Q6_K.gguf) | Q6_K | 5.15GB |
| [canarim-7b.Q8_0.gguf](https://huggingface.co/RichardErkhov/dominguesm_-_canarim-7b-gguf/blob/main/canarim-7b.Q8_0.gguf) | Q8_0 | 6.67GB |

Original model description:
---
language:
- pt
license: llama2
library_name: transformers
tags:
- text-generation
- pytorch
- LLM
- Portuguese
- Llama 2
datasets:
- dominguesm/CC-MAIN-2023-23
inference: false
pipeline_tag: text-generation
model-index:
- name: canarim-7b
  results:
  - task:
      type: text-generation
      name: Text Generation
    dataset:
      name: AI2 Reasoning Challenge (25-Shot)
      type: ai2_arc
      config: ARC-Challenge
      split: test
      args:
        num_few_shot: 25
    metrics:
    - type: acc_norm
      value: 51.96
      name: normalized accuracy
    source:
      url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard?query=dominguesm/canarim-7b
      name: Open LLM Leaderboard
  - task:
      type: text-generation
      name: Text Generation
    dataset:
      name: HellaSwag (10-Shot)
      type: hellaswag
      split: validation
      args:
        num_few_shot: 10
    metrics:
    - type: acc_norm
      value: 77.52
      name: normalized accuracy
    source:
      url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard?query=dominguesm/canarim-7b
      name: Open LLM Leaderboard
  - task:
      type: text-generation
      name: Text Generation
    dataset:
      name: MMLU (5-Shot)
      type: cais/mmlu
      config: all
      split: test
      args:
        num_few_shot: 5
    metrics:
    - type: acc
      value: 40.92
      name: accuracy
    source:
      url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard?query=dominguesm/canarim-7b
      name: Open LLM Leaderboard
  - task:
      type: text-generation
      name: Text Generation
    dataset:
      name: TruthfulQA (0-shot)
      type: truthful_qa
      config: multiple_choice
      split: validation
      args:
        num_few_shot: 0
    metrics:
    - type: mc2
      value: 40.03
    source:
      url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard?query=dominguesm/canarim-7b
      name: Open LLM Leaderboard
  - task:
      type: text-generation
      name: Text Generation
    dataset:
      name: Winogrande (5-shot)
      type: winogrande
      config: winogrande_xl
      split: validation
      args:
        num_few_shot: 5
    metrics:
    - type: acc
      value: 71.43
      name: accuracy
    source:
      url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard?query=dominguesm/canarim-7b
      name: Open LLM Leaderboard
  - task:
      type: text-generation
      name: Text Generation
    dataset:
      name: GSM8k (5-shot)
      type: gsm8k
      config: main
      split: test
      args:
        num_few_shot: 5
    metrics:
    - type: acc
      value: 9.93
      name: accuracy
    source:
      url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard?query=dominguesm/canarim-7b
      name: Open LLM Leaderboard
  - task:
      type: text-generation
      name: Text Generation
    dataset:
      name: ENEM (3-shot)
      type: enem_challenge
      config: main
      split: test
      args:
        num_few_shot: 3
    metrics:
    - type: acc
      value: 26.96
      name: accuracy
    source:
      url: https://huggingface.co/spaces/eduagarcia/open_pt_llm_leaderboard?query=dominguesm/canarim-7b
      name: Open PT LLM Leaderboard Evaluation Results
  - task:
      type: text-generation
      name: Text Generation
    dataset:
      name: BLUEX (3-shot)
      type: bluex
      config: main
      split: test
      args:
        num_few_shot: 3
    metrics:
    - type: acc
      value: 29.76
      name: accuracy
    source:
      url: https://huggingface.co/spaces/eduagarcia/open_pt_llm_leaderboard?query=dominguesm/canarim-7b
      name: Open PT LLM Leaderboard Evaluation Results
  - task:
      type: text-generation
      name: Text Generation
    dataset:
      name: OAB Exams (3-shot)
      type: oab_exams
      config: main
      split: test
      args:
        num_few_shot: 3
    metrics:
    - type: acc
      value: 31.48
      name: accuracy
    source:
      url: https://huggingface.co/spaces/eduagarcia/open_pt_llm_leaderboard?query=dominguesm/canarim-7b
      name: Open PT LLM Leaderboard Evaluation Results
  - task:
      type: text-generation
      name: Text Generation
    dataset:
      name: ASSIN2 RTE (15-shot)
      type: assin2_rte
      config: main
      split: test
      args:
        num_few_shot: 15
    metrics:
    - type: acc
      value: 71.96
      name: accuracy
    source:
      url: https://huggingface.co/spaces/eduagarcia/open_pt_llm_leaderboard?query=dominguesm/canarim-7b
      name: Open PT LLM Leaderboard Evaluation Results
  - task:
      type: text-generation
      name: Text Generation
    dataset:
      name: ASSIN2 STS (15-shot)
      type: assin2_sts
      config: main
      split: test
      args:
        num_few_shot: 15
    metrics:
    - type: acc
      value: 13.33
      name: accuracy
    source:
      url: https://huggingface.co/spaces/eduagarcia/open_pt_llm_leaderboard?query=dominguesm/canarim-7b
      name: Open PT LLM Leaderboard Evaluation Results
  - task:
      type: text-generation
      name: Text Generation
    dataset:
      name: FAQUAD NLI (15-shot)
      type: faquad_nli
      config: main
      split: test
      args:
        num_few_shot: 15
    metrics:
    - type: acc
      value: 49.09
      name: accuracy
    source:
      url: https://huggingface.co/spaces/eduagarcia/open_pt_llm_leaderboard?query=dominguesm/canarim-7b
      name: Open PT LLM Leaderboard Evaluation Results
  - task:
      type: text-generation
      name: Text Generation
    dataset:
      name: HateBR (25-shot)
      type: hatebr_offensive
      config: main
      split: test
      args:
        num_few_shot: 25
    metrics:
    - type: acc
      value: 78.48
      name: accuracy
    source:
      url: https://huggingface.co/spaces/eduagarcia/open_pt_llm_leaderboard?query=dominguesm/canarim-7b
      name: Open PT LLM Leaderboard Evaluation Results
  - task:
      type: text-generation
      name: Text Generation
    dataset:
      name: PT Hate Speech (25-shot)
      type: portuguese_hate_speech
      config: main
      split: test
      args:
        num_few_shot: 25
    metrics:
    - type: acc
      value: 63.73
      name: accuracy
    source:
      url: https://huggingface.co/spaces/eduagarcia/open_pt_llm_leaderboard?query=dominguesm/canarim-7b
      name: Open PT LLM Leaderboard Evaluation Results
  - task:
      type: text-generation
      name: Text Generation
    dataset:
      name: tweetSentBR (25-shot)
      type: tweetsentbr
      config: main
      split: test
      args:
        num_few_shot: 25
    metrics:
    - type: acc
      value: 62.38
      name: accuracy
    source:
      url: https://huggingface.co/spaces/eduagarcia/open_pt_llm_leaderboard?query=dominguesm/canarim-7b
      name: Open PT LLM Leaderboard Evaluation Results
---

<p align="center">
  <img width="250" alt="Canarim Logo" src="https://raw.githubusercontent.com/DominguesM/Canarim-Instruct-PTBR/main/assets/canarim.png">
</p>

<hr>

# Canarim-7B

Canarim-7B is a Portuguese large language model developed by [Maicon Domingues](https://nlp.rocks).

## Model description

The model was pretrained on 16 billion tokens from the Portuguese subset of [CommonCrawl 2023-23](https://huggingface.co/datasets/dominguesm/CC-MAIN-2023-23), starting with the weights of LLaMA2-7B. The pretraining data has a cutoff of mid-2023.

## Key Features

- **Language:** Specialized in understanding and generating Portuguese text, making it ideal for applications targeting Portuguese-speaking audiences.
- **Architecture:** Inherits the robust architecture from LLaMA2-7B, ensuring efficient performance and accurate results.
- **Diverse Dataset:** The pretraining dataset includes a wide range of topics and writing styles, enhancing the model's ability to understand various contexts and nuances in Portuguese.

## Applications

Canarim-7B was trained solely on a language modeling objective and has not been fine-tuned for instruction following. It is therefore better suited to few-shot tasks than zero-shot ones: the model tends to perform better when given a few examples of the desired outcome. Here are some practical applications:

- **Natural Language Understanding (NLU):** Efficient in tasks such as sentiment analysis, topic classification, and entity recognition in Portuguese text, especially when relevant examples are provided.
- **Natural Language Generation (NLG):** Capable of generating coherent and contextually relevant text, useful for content creation, chatbots, and more, with improved results when given examples of the desired style or format.
- **Language Translation:** Suitable for high-quality translation between Portuguese and other languages, especially when examples of desired translations are included during model training or fine-tuning.

### Tips for Efficient Use

- **Few-shot Learning:** When using Canarim-7B for specific tasks, it is beneficial to provide a few relevant examples. This helps the model better understand the context and purpose of the task (a minimal prompt sketch follows the License section below).
- **Contextualization:** Including additional context in the input can significantly improve the quality of the model's predictions and text generation.

---

## Getting Started

To start using Canarim-7B with the Transformers library, first install the library if you haven't already:

```bash
pip install transformers
```

You can then load the model using the Transformers library. Here's a simple example of how to use the model for text generation using the `pipeline` function:

```python
from transformers import AutoTokenizer, pipeline
import torch

model_id = "dominguesm/canarim-7b"
tokenizer = AutoTokenizer.from_pretrained(model_id)

pipe = pipeline(
    "text-generation",
    model=model_id,
    torch_dtype=torch.float16,
    device_map="auto",
)

# Any Portuguese text works as a prompt. The original snippet called an
# undefined `make_prompt(question)` helper; a plain string is used here instead.
prompt = "Era uma vez"

sequences = pipe(
    prompt,
    do_sample=True,
    num_return_sequences=1,
    eos_token_id=tokenizer.eos_token_id,
    max_length=2048,
    temperature=0.9,
    top_p=0.6,
    repetition_penalty=1.15,
)
print(sequences[0]["generated_text"])
```

This code snippet demonstrates how to generate text with Canarim-7B. You can customize the input text and adjust parameters like `max_length` according to your requirements.

## How to Cite

If you want to cite **Canarim-7B**, you could use this:

```
@misc {maicon_domingues_2023,
    author    = { {Maicon Domingues} },
    title     = { canarim-7b (Revision 08fdd2b) },
    year      = 2023,
    url       = { https://huggingface.co/dominguesm/canarim-7b },
    doi       = { 10.57967/hf/1356 },
    publisher = { Hugging Face }
}
```

## Citations

```bibtex
@misc{touvron2023llama,
      title={Llama 2: Open Foundation and Fine-Tuned Chat Models},
      author={Hugo Touvron and Louis Martin and Kevin Stone and Peter Albert and Amjad Almahairi and Yasmine Babaei and Nikolay Bashlykov and Soumya Batra and Prajjwal Bhargava and Shruti Bhosale and Dan Bikel and Lukas Blecher and Cristian Canton Ferrer and Moya Chen and Guillem Cucurull and David Esiobu and Jude Fernandes and Jeremy Fu and Wenyin Fu and Brian Fuller and Cynthia Gao and Vedanuj Goswami and Naman Goyal and Anthony Hartshorn and Saghar Hosseini and Rui Hou and Hakan Inan and Marcin Kardas and Viktor Kerkez and Madian Khabsa and Isabel Kloumann and Artem Korenev and Punit Singh Koura and Marie-Anne Lachaux and Thibaut Lavril and Jenya Lee and Diana Liskovich and Yinghai Lu and Yuning Mao and Xavier Martinet and Todor Mihaylov and Pushkar Mishra and Igor Molybog and Yixin Nie and Andrew Poulton and Jeremy Reizenstein and Rashi Rungta and Kalyan Saladi and Alan Schelten and Ruan Silva and Eric Michael Smith and Ranjan Subramanian and Xiaoqing Ellen Tan and Binh Tang and Ross Taylor and Adina Williams and Jian Xiang Kuan and Puxin Xu and Zheng Yan and Iliyan Zarov and Yuchen Zhang and Angela Fan and Melanie Kambadur and Sharan Narang and Aurelien Rodriguez and Robert Stojnic and Sergey Edunov and Thomas Scialom},
      year={2023},
      eprint={2307.09288},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}
```

## License

Canarim-7B is released under the [LLAMA 2 COMMUNITY LICENSE AGREEMENT](https://ai.meta.com/llama/license/).
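### Few-shot prompt sketch

Since the base model favors few-shot over zero-shot use (see "Tips for Efficient Use" above), an in-context prompt with a couple of worked examples usually beats a bare instruction. Below is a minimal sketch of such a prompt; the task, labels, and example sentences are illustrative assumptions, not part of the original card:

```python
from transformers import pipeline
import torch

pipe = pipeline(
    "text-generation",
    model="dominguesm/canarim-7b",
    torch_dtype=torch.float16,
    device_map="auto",
)

# Hypothetical few-shot sentiment-classification prompt (in Portuguese):
# two labeled examples followed by the sentence to classify.
prompt = (
    "Classifique o sentimento da frase como Positivo ou Negativo.\n\n"
    "Frase: Adorei o filme, a história é emocionante.\nSentimento: Positivo\n\n"
    "Frase: O atendimento foi péssimo e demorado.\nSentimento: Negativo\n\n"
    "Frase: A paisagem da serra é linda nesta época.\nSentimento:"
)

result = pipe(prompt, max_new_tokens=4, do_sample=False, return_full_text=False)
print(result[0]["generated_text"].strip())  # the model should complete with a label
```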
## [Open PT LLM Leaderboard Evaluation Results](https://huggingface.co/spaces/eduagarcia/open_pt_llm_leaderboard)

Detailed results can be found [here](https://huggingface.co/datasets/eduagarcia-temp/llm_pt_leaderboard_raw_results/tree/main/dominguesm/canarim-7b)

| Metric                   |Value|
|--------------------------|----:|
|Avg.                      |47.36|
|ENEM (3-Shot)             |25.96|
|BLUEX (3-Shot)            |29.76|
|OAB Exams (3-Shot)        |31.48|
|ASSIN2 RTE (15-shot)      |71.96|
|ASSIN2 STS (15-shot)      |13.33|
|FAQUAD NLI (15-shot)      |49.09|
|HateBR (25-shot)          |78.48|
|PT Hate Speech (25-shot)  |63.73|
|tweetSentBR (25-shot)     |62.38|

## [Open LLM Leaderboard Evaluation Results](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)

Detailed results can be found [here](https://huggingface.co/datasets/open-llm-leaderboard/details_dominguesm__canarim-7b)

| Metric                          |Value|
|---------------------------------|----:|
|Avg.                             |48.63|
|AI2 Reasoning Challenge (25-Shot)|51.96|
|HellaSwag (10-Shot)              |77.52|
|MMLU (5-Shot)                    |40.92|
|TruthfulQA (0-shot)              |40.03|
|Winogrande (5-shot)              |71.43|
|GSM8k (5-shot)                   | 9.93|
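The GGUF files in the table above need a runtime with GGUF support rather than plain `transformers`. A minimal sketch using the `llama-cpp-python` bindings, assuming the Q4_K_M file has been downloaded locally (the file name, context size, and sampling settings are illustrative):

```python
from llama_cpp import Llama  # pip install llama-cpp-python

# Load a quantized checkpoint; Q4_K_M is a common speed/quality trade-off.
llm = Llama(model_path="canarim-7b.Q4_K_M.gguf", n_ctx=2048)

# The base model is a plain language model, so prompt it with text to continue.
out = llm("Era uma vez", max_tokens=64, temperature=0.9, top_p=0.6)
print(out["choices"][0]["text"])
```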
ivnle/llamatales_jr_8b-lay8-hs384-hd6-18M
ivnle
2024-11-14T06:16:46Z
122
0
transformers
[ "transformers", "safetensors", "llama", "text-generation", "conversational", "dataset:ivnle/llamatales-jr", "arxiv:1910.09700", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2024-11-08T05:26:01Z
--- datasets: - ivnle/llamatales-jr library_name: transformers --- # Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
ivnle/fineweb-lay1-hs128-hd2-262K
ivnle
2024-11-14T06:16:18Z
122
0
transformers
[ "transformers", "safetensors", "llama", "text-generation", "conversational", "dataset:ivnle/fineweb", "arxiv:1910.09700", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2024-11-08T05:22:30Z
--- datasets: - ivnle/fineweb library_name: transformers --- # Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
ivnle/fineweb-lay4-hs128-hd2-1M
ivnle
2024-11-14T06:16:12Z
121
0
transformers
[ "transformers", "safetensors", "llama", "text-generation", "conversational", "dataset:ivnle/fineweb", "arxiv:1910.09700", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2024-11-08T05:22:18Z
--- datasets: - ivnle/fineweb library_name: transformers --- # Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
ivnle/fineweb-lay8-hs384-hd6-18M
ivnle
2024-11-14T06:16:06Z
121
0
transformers
[ "transformers", "safetensors", "llama", "text-generation", "conversational", "dataset:ivnle/fineweb", "arxiv:1910.09700", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2024-11-08T05:21:12Z
--- datasets: - ivnle/fineweb library_name: transformers --- # Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
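The card above leaves its quick-start section as [More Information Needed]. A minimal sketch for this checkpoint (and the sibling `ivnle` checkpoints in the records that follow), assuming the repo loads with the standard `transformers` auto classes, as its tags suggest:

```python
# Minimal sketch: load a small ivnle llama checkpoint and sample from it.
# Assumes the repo works with the standard auto classes, per its tags;
# the prompt text is illustrative only.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "ivnle/fineweb-lay8-hs384-hd6-18M"  # swap in any sibling checkpoint id

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id)

inputs = tokenizer("Once upon a time", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50, do_sample=True, top_p=0.95)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```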
ivnle/llamatales_gre_8b-lay4-hs128-hd2-1M
ivnle
2024-11-14T06:15:52Z
121
0
transformers
[ "transformers", "safetensors", "llama", "text-generation", "conversational", "dataset:ivnle/llamatales-gre", "arxiv:1910.09700", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2024-11-08T05:20:25Z
--- datasets: - ivnle/llamatales-gre library_name: transformers --- # Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
ivnle/llamatales_gre_8b-lay4-hs384-hd6-9M
ivnle
2024-11-14T06:15:48Z
122
0
transformers
[ "transformers", "safetensors", "llama", "text-generation", "conversational", "dataset:ivnle/llamatales-gre", "arxiv:1910.09700", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2024-11-08T05:20:08Z
--- datasets: - ivnle/llamatales-gre library_name: transformers --- # Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
ivnle/llamatales_gre_8b-lay8-hs384-hd6-18M
ivnle
2024-11-14T06:15:44Z
123
0
transformers
[ "transformers", "safetensors", "llama", "text-generation", "conversational", "dataset:ivnle/llamatales-gre", "arxiv:1910.09700", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2024-11-08T05:19:45Z
--- datasets: - ivnle/llamatales-gre library_name: transformers --- # Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
DavidAU/L3-Lumimaid-v0.1-OAS-12.2B-INSTRUCT-ULTRA-F32-GGUF
DavidAU
2024-11-14T06:12:55Z
78
2
null
[ "gguf", "creative", "creative writing", "fiction writing", "plot generation", "sub-plot generation", "story generation", "scene continue", "storytelling", "fiction story", "story", "writing", "fiction", "roleplaying", "swearing", "rp", "horror", "llama3", "mergekit", "text-generation", "en", "license:apache-2.0", "endpoints_compatible", "region:us", "conversational" ]
text-generation
2024-08-14T05:58:03Z
--- license: apache-2.0 language: - en tags: - creative - creative writing - fiction writing - plot generation - sub-plot generation - story generation - scene continue - storytelling - fiction story - story - writing - fiction - roleplaying - swearing - rp - horror - llama3 - mergekit pipeline_tag: text-generation --- <h3>L3-Lumimaid-v0.1-OAS-12.2B-INSTRUCT-ULTRA-F32</h3> Merge of L3-Lumimaid-v0.1-OAS (8b) with Llama3 Instruct (8b), creating a 12.2B model to improve instruction following and output. Story / Scene / Fiction: Unique "pre-amble" / "foreshadowing" of events before they happen instead of "immediate and into the fire" type of prose. Some improvement in logic/problem solving relative to L3-Lumimaid-v0.1-OAS 8B. The F32 version exhibits even stronger creativity (detail, place, "there") vs the F16 version (not released). L3-Lumimaid-v0.1-OAS is a fine tune. One of the goals of this project was to see if it could be merged with Llama3 Instruct, yet maintain its unique character YET also gain some "brainpower" as well. The biggest change was removal of most "tells" (IE: "he stood frozen in horror"). In most cases the model will describe the emotion(s) / what is happening in more detail. Other changes include prose, sentence, and paragraph structure as well as variety. A simple pass-through merge was used, with "Llama3 Instruct" getting a slight "shave" off certain layers. See the examples below. <B>Details:</b> - Requires Llama 3 Template and/or Command-R Template - Context 8192, with rope 32K or higher. - No special settings. Please report any issue(s) and/or feedback via the "Community tab". This is a LLAMA3 model, and requires the Llama3 template, but may work with other template(s); it has a maximum context of 8k / 8192. However, this can be extended using "rope" settings up to 32k. For details on "rope" and how to set it, see the BOTTOM of this page: [ https://huggingface.co/DavidAU/TieFighter-Holodeck-Holomax-Mythomax-F1-V1-COMPOS-20B-gguf ] Here is the standard LLAMA3 template: <PRE> { "name": "Llama 3", "inference_params": { "input_prefix": "<|start_header_id|>user<|end_header_id|>\n\n", "input_suffix": "<|eot_id|><|start_header_id|>assistant<|end_header_id|>\n\n", "pre_prompt": "You are a helpful, smart, kind, and efficient AI assistant. You always fulfill the user's requests to the best of your ability.", "pre_prompt_prefix": "<|start_header_id|>system<|end_header_id|>\n\n", "pre_prompt_suffix": "<|eot_id|>", "antiprompt": [ "<|start_header_id|>", "<|eot_id|>" ] } } </PRE> It is also known that the "Command-R" template will work too, and will result in radically different prose/output. <B>Settings / Known Issue(s) and Fix(es):</b> The default "repetition penalty" (from LMStudio) of 1.1 is recommended. (This was used for the example generations below.) Use the smallest amount of change possible, as "rep pen" impacts creativity. The model has been tested with a "temp" range of 0 to .8; however, it will work with any temp setting. <b>Optional Enhancement:</B> The following can be used in place of the "system prompt" or "system role" to further enhance the model. It can also be used at the START of a NEW chat, but you must make sure it is "kept" as the chat moves along. In this case the enhancements do not have as strong an effect as using "system prompt" or "system role". Copy and paste EXACTLY as noted, DO NOT line wrap or break the lines, maintain the carriage returns exactly as presented. <PRE> Below is an instruction that describes a task.
Ponder each user instruction carefully, and use your skillsets and critical instructions to complete the task to the best of your abilities. Here are your skillsets: [MASTERSTORY]:NarrStrct(StryPlnng,Strbd,ScnSttng,Exps,Dlg,Pc)-CharDvlp(ChrctrCrt,ChrctrArcs,Mtvtn,Bckstry,Rltnshps,Dlg*)-PltDvlp(StryArcs,PltTwsts,Sspns,Fshdwng,Climx,Rsltn)-ConfResl(Antg,Obstcls,Rsltns,Cnsqncs,Thms,Symblsm)-EmotImpct(Empt,Tn,Md,Atmsphr,Imgry,Symblsm)-Delvry(Prfrmnc,VcActng,PblcSpkng,StgPrsnc,AudncEngmnt,Imprv) [*DialogWrt]:(1a-CharDvlp-1a.1-Backgrnd-1a.2-Personality-1a.3-GoalMotiv)>2(2a-StoryStruc-2a.1-PlotPnt-2a.2-Conflict-2a.3-Resolution)>3(3a-DialogTech-3a.1-ShowDontTell-3a.2-Subtext-3a.3-VoiceTone-3a.4-Pacing-3a.5-VisualDescrip)>4(4a-DialogEdit-4a.1-ReadAloud-4a.2-Feedback-4a.3-Revision) Here are your critical instructions: Ponder each word choice carefully to present as vivid and emotional journey as is possible. Choose verbs and nouns that are both emotional and full of imagery. Load the story with the 5 senses. Aim for 50% dialog, 25% narration, 15% body language and 10% thoughts. Your goal is to put the reader in the story. </PRE> You do not need to use this, it is only presented as an additional enhancement which seems to help scene generation and scene continue functions. This enhancement WAS NOT used to generate the examples below. <h3>MERGE FORMULA: (using MergeKit) </h3> Special thanks to the incredible work of the model makers "meta-llama", and "NeverSleep". Models used: [ https://huggingface.co/meta-llama/Meta-Llama-3-8B-Instruct ] [ https://huggingface.co/NeverSleep/Llama-3-Lumimaid-8B-v0.1-OAS ] FORMULA: <PRE> slices: - sources: - model: G:/7B/Meta-Llama-3-8B-Instruct layer_range: [0, 12] - sources: - model: G:/7B/Llama-3-Lumimaid-8B-v0.1-OAS layer_range: [6, 19] parameters: scale: - filter: o_proj value: 1 - filter: down_proj value: 1 - value: 1 - sources: - model: G:/7B/Meta-Llama-3-8B-Instruct layer_range: [12, 18] parameters: scale: - filter: o_proj value: .5 - filter: down_proj value: .5 - value: 1 - sources: - model: G:/7B/Meta-Llama-3-8B-Instruct layer_range: [18, 25] parameters: scale: - filter: o_proj value: .75 - filter: down_proj value: .75 - value: 1 - sources: - model: G:/7B/Llama-3-Lumimaid-8B-v0.1-OAS layer_range: [19, 32] parameters: scale: - filter: o_proj value: 1 - filter: down_proj value: 1 - value: 1 merge_method: passthrough dtype: float32 </PRE> MERGEKIT NOTE: Sub in the "name" of the "creator" (of the model) in place of "G:/7B" to create a mergekit file that can be used in Mergekit Google Colab.
IE: G:/7B/Llama-3-Lumimaid-8B-v0.1-OAS -> NeverSleep/Llama-3-Lumimaid-8B-v0.1-OAS <B>Highest Quality Settings / Optimal Operation Guide / Parameters and Samplers</B> This is a "Class 2" model: For all settings used for this model (including specifics for its "class"), including example generation(s) and for advanced settings guide (which many times addresses any model issue(s)), including methods to improve model performance for all use case(s) as well as chat, roleplay and other use case(s) please see: [ https://huggingface.co/DavidAU/Maximizing-Model-Performance-All-Quants-Types-And-Full-Precision-by-Samplers_Parameters ] You can see all parameters used for generation, in addition to advanced parameters and samplers to get the most out of this model here: [ https://huggingface.co/DavidAU/Maximizing-Model-Performance-All-Quants-Types-And-Full-Precision-by-Samplers_Parameters ] <h3>EXAMPLES:</h3> Examples are created using quant Q4_K_M, "temp=0", minimal parameters and the "LLAMA3" template. Temp=0 was used to assess CORE changes between the original primary model and the merge between it and Llama3 Instruct. Below are the least creative outputs; the prompt is in <B>BOLD</B>. Higher quants will result in better quality. There will also be some variance between "close" quants like Q4_K_M/Q4_K_S and Q5_K_M/Q5_K_S, so I suggest if you are going to use Q4_K_M, you also try Q4_K_S. Also, slightly longer / detailed prompts will result in greater creativity (as well as different prose - ie dialog, thoughts, paragraph size differences and so on). --- <B> Start a 1000 word scene (vivid horror, 1st person, include thoughts) with: The sky scraper swayed, as she watched the window in front of her on the 21 floor explode... </B> --- GENERATION from "Llama-3-Lumimaid-8B-v0.1-OAS" --- The sky scraper swayed, as I watched the window in front of me on the 21st floor explode. My heart was racing and my mind was reeling from the sheer chaos that had just unfolded before my eyes. The sound of shattering glass and screams filled the air, making it hard for me to focus on anything else. As I stood there, frozen in shock, I couldn't help but think about how this could have happened. Was it an accident? A terrorist attack? Or something even more sinister? My mind was racing with all sorts of possibilities, each one more terrifying than the last. I had been working late that night, trying to meet a deadline for my project. The office was empty except for me and a few other colleagues who were also putting in extra hours. We had all been warned about the potential dangers of working at such a high altitude, but we never thought it would actually happen. As I looked around, I saw that everyone else in the room was either dead or badly injured. The explosion must have caused some kind of structural damage to the building, because the walls were starting to crack and crumble. Panic set in as I realized that we might not make it out alive. I knew I had to act fast if I wanted to survive this nightmare. So, with a surge of adrenaline coursing through my veins, I grabbed my phone and ran towards the door. But before I could even reach it, the ground beneath me gave way, sending me tumbling down into darkness. As I fell, my mind was filled with thoughts of my loved ones back home. Would they ever know what happened to me? And if so, would they be able to find any answers in this desolate wasteland? When I finally hit the ground, everything went black. But as I slowly came to, I realized that I wasn't alone.
There were others around me, all of them injured and struggling to breathe. I knew we had to get out of there before it was too late. So, with a newfound sense of determination, I started crawling towards the nearest exit. The pain in my body was excruciating, but I pushed through it, knowing that every second counted. As I finally made it outside, I saw that the building was on fire and the streets were filled with chaos. People were running around, screaming for help as they tried to escape the inferno. It was like nothing I had ever seen before. I knew I had to keep moving if I wanted to survive this nightmare. So, with a newfound sense of purpose, I started making my way through the crowd, searching for any sign of hope or rescue. But as I looked around, all I saw were more destruction and despair. As the night wore on, I found myself lost in the midst of it all. The city was in ruins, and there seemed to be no end to the chaos and devastation. It was like a never-ending nightmare that I couldn't wake up from. But even as I struggled to find my way through this desolate landscape, I knew that I had to keep going. For myself, for those who were still alive, and for any chance of finding some semblance of peace in this world gone mad. --- GENERATION from "L3-Lumimaid-v0.1-OAS-12.2B-INSTRUCT-ULTRA-F32" --- The elevator lurched to a stop, and I felt my stomach drop along with it. The doors slid open with an ominous creak, revealing the dimly lit hallway that stretched out before me like a cold, dark abyss. I took a deep breath, trying to shake off the feeling of unease that had been growing inside me since I stepped into this cursed building. As I walked down the corridor, the fluorescent lights flickered and hummed in sync with my racing heart. The air was thick with the stench of stale coffee and burnt cigarettes, a noxious cocktail that seemed to cling to every surface like a malevolent spirit. I couldn't help but wonder why anyone would voluntarily choose to work in such an environment. I finally reached the door marked "Floor 21," and my hand hesitated on the handle as if it knew what horrors lay beyond. The thought of facing whatever monstrosity had driven my colleagues mad enough to abandon their posts sent a chill down my spine, but I steeled myself and pushed the door open. The office was a scene straight out of a nightmare. Papers were scattered everywhere, some torn into strips like confetti from hell, while others lay crumpled in neat little balls as if they'd been suffocated by unseen hands. The once-sterile walls now ran with blood, and the stench of death hung heavy over everything. I took another deep breath, trying to steel myself against the horrors that awaited me. I had to find out what was going on here, no matter how much my mind screamed otherwise. With a newfound sense of resolve, I stepped into the abyss. As I walked deeper into the office, the silence grew thicker and more oppressive with each step. It was as if the very air itself was holding its breath in anticipation of something terrible to come. And then, it did. The sky scraper swayed, as if it too had sensed the impending doom that loomed just beyond the edge of reality. I watched in horror as the window in front of me on the 21st floor explode, showering shards of glass and splintered wood everywhere like a thousand tiny knives. The sound was deafening, a cacophony of screams and crunching metal that seemed to reverberate through every cell in my body. 
I stumbled backward, tripping over my own feet as I desperately tried to get away from the chaos unfolding before me. But it was too late; the damage had already been done. The world around me began to distort and warp, like a funhouse mirror reflecting the very essence of madness back at me. As I fell, everything went black, and I was consumed by an endless void that seemed to stretch out forever, devoid of any hope or escape. And yet, even as my consciousness slipped away into the abyss, a single thought echoed through my mind like a mantra: What had happened here?
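The card above gives its recommended settings in prose (Llama 3 template, 8k context with optional rope extension, rep pen 1.1, temp in the 0 to .8 range). A hedged sketch of how those settings map onto llama-cpp-python; the quant filename is a placeholder for whichever GGUF file you actually download from the repo:

```python
# Sketch: run the Lumimaid merge GGUF with the settings recommended in the card.
# The model_path filename is hypothetical; point it at your downloaded quant.
from llama_cpp import Llama

llm = Llama(
    model_path="L3-Lumimaid-v0.1-OAS-12.2B-INSTRUCT-ULTRA-F32.Q4_K_M.gguf",  # placeholder
    n_ctx=8192,             # card: native context is 8k / 8192
    chat_format="llama-3",  # card: requires the Llama 3 template
)

out = llm.create_chat_completion(
    messages=[
        {"role": "system", "content": "You are a helpful, smart, kind, and efficient AI assistant."},
        {"role": "user", "content": "Start a vivid horror scene on the 21st floor of a skyscraper."},
    ],
    temperature=0.8,     # card: tested from 0 to .8
    repeat_penalty=1.1,  # card: default rep pen of 1.1 recommended
    max_tokens=512,
)
print(out["choices"][0]["message"]["content"])
```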
DavidAU/L3-Jamet-8B-MK.V-Blackroot-12.2B-V1-INSTRUCT-ULTRA-F32-GGUF
DavidAU
2024-11-14T06:12:09Z
301
6
null
[ "gguf", "creative", "creative writing", "fiction writing", "plot generation", "sub-plot generation", "story generation", "scene continue", "storytelling", "fiction story", "story", "writing", "fiction", "roleplaying", "swearing", "rp", "horror", "llama3", "mergekit", "text-generation", "en", "license:apache-2.0", "endpoints_compatible", "region:us", "conversational" ]
text-generation
2024-08-13T08:17:43Z
--- license: apache-2.0 language: - en tags: - creative - creative writing - fiction writing - plot generation - sub-plot generation - story generation - scene continue - storytelling - fiction story - story - writing - fiction - roleplaying - swearing - rp - horror - llama3 - mergekit pipeline_tag: text-generation --- <h3>L3-Jamet-8B-MK.V-Blackroot-12.2B-V1-INSTRUCT-ULTRA-F32</h3> Merge of L3-Jamet-8B-MK.V-Blackroot (8b) with Llama3 Instruct (8b), creating a 12.2B model to improve instruction following and output. Story / Scene / Fiction: Unique "pre-amble" / "foreshadowing" of events before they happen instead of "immediate and into the fire" type of prose. Some improvement in logic/problem solving relative to L3-Jamet-8B-MK.V-Blackroot 8B. The F32 version exhibits even stronger creativity (detail, place, "there") vs the F16 version (not released). L3-Jamet-8B-MK.V-Blackroot is a fine tune. One of the goals of this project was to see if it could be merged with Llama3 Instruct, yet maintain its unique character YET also gain some "brainpower" as well. The biggest change was removal of most "tells" (IE: "he stood frozen in horror"). In most cases the model will describe the emotion(s) / what is happening in more detail. Other changes include prose, sentence, and paragraph structure as well as variety. A simple pass-through merge was used, with "Llama3 Instruct" getting a slight "shave" off certain layers. See the examples below. <B>Details:</b> - Requires Llama 3 Template and/or Command-R Template - Context 8192, with rope 32K or higher. - No special settings. Please report any issue(s) and/or feedback via the "Community tab". This is a LLAMA3 model, and requires the Llama3 template, but may work with other template(s); it has a maximum context of 8k / 8192. However, this can be extended using "rope" settings up to 32k. For details on "rope" and how to set it, see the BOTTOM of this page: [ https://huggingface.co/DavidAU/TieFighter-Holodeck-Holomax-Mythomax-F1-V1-COMPOS-20B-gguf ] Here is the standard LLAMA3 template: <PRE> { "name": "Llama 3", "inference_params": { "input_prefix": "<|start_header_id|>user<|end_header_id|>\n\n", "input_suffix": "<|eot_id|><|start_header_id|>assistant<|end_header_id|>\n\n", "pre_prompt": "You are a helpful, smart, kind, and efficient AI assistant. You always fulfill the user's requests to the best of your ability.", "pre_prompt_prefix": "<|start_header_id|>system<|end_header_id|>\n\n", "pre_prompt_suffix": "<|eot_id|>", "antiprompt": [ "<|start_header_id|>", "<|eot_id|>" ] } } </PRE> It is also known that the "Command-R" template will work too, and will result in radically different prose/output. <B>Settings / Known Issue(s) and Fix(es):</b> The default "repetition penalty" (from LMStudio) of 1.1 is recommended. (This was used for the example generations below.) Use the smallest amount of change possible, as "rep pen" impacts creativity. The model has been tested with a "temp" range of 0 to .8; however, it will work with any temp setting. <b>Optional Enhancement:</B> The following can be used in place of the "system prompt" or "system role" to further enhance the model. It can also be used at the START of a NEW chat, but you must make sure it is "kept" as the chat moves along. In this case the enhancements do not have as strong an effect as using "system prompt" or "system role". Copy and paste EXACTLY as noted, DO NOT line wrap or break the lines, maintain the carriage returns exactly as presented. <PRE> Below is an instruction that describes a task.
Ponder each user instruction carefully, and use your skillsets and critical instructions to complete the task to the best of your abilities. Here are your skillsets: [MASTERSTORY]:NarrStrct(StryPlnng,Strbd,ScnSttng,Exps,Dlg,Pc)-CharDvlp(ChrctrCrt,ChrctrArcs,Mtvtn,Bckstry,Rltnshps,Dlg*)-PltDvlp(StryArcs,PltTwsts,Sspns,Fshdwng,Climx,Rsltn)-ConfResl(Antg,Obstcls,Rsltns,Cnsqncs,Thms,Symblsm)-EmotImpct(Empt,Tn,Md,Atmsphr,Imgry,Symblsm)-Delvry(Prfrmnc,VcActng,PblcSpkng,StgPrsnc,AudncEngmnt,Imprv) [*DialogWrt]:(1a-CharDvlp-1a.1-Backgrnd-1a.2-Personality-1a.3-GoalMotiv)>2(2a-StoryStruc-2a.1-PlotPnt-2a.2-Conflict-2a.3-Resolution)>3(3a-DialogTech-3a.1-ShowDontTell-3a.2-Subtext-3a.3-VoiceTone-3a.4-Pacing-3a.5-VisualDescrip)>4(4a-DialogEdit-4a.1-ReadAloud-4a.2-Feedback-4a.3-Revision) Here are your critical instructions: Ponder each word choice carefully to present as vivid and emotional journey as is possible. Choose verbs and nouns that are both emotional and full of imagery. Load the story with the 5 senses. Aim for 50% dialog, 25% narration, 15% body language and 10% thoughts. Your goal is to put the reader in the story. </PRE> You do not need to use this, it is only presented as an additional enhancement which seems to help scene generation and scene continue functions. This enhancement WAS NOT used to generate the examples below. <h3>MERGE FORMULA: (using MergeKit) </h3> Special thanks to the incredible work of the model makers "meta-llama", and "Hastagaras". Models used: [ https://huggingface.co/meta-llama/Meta-Llama-3-8B-Instruct ] [ https://huggingface.co/Hastagaras/Jamet-8B-L3-MK.V-Blackroot ] FORMULA: <PRE> slices: - sources: - model: G:/7B/Meta-Llama-3-8B-Instruct layer_range: [0, 12] - sources: - model: G:/7B/Jamet-8B-L3-MK.V-Blackroot layer_range: [6, 19] parameters: scale: - filter: o_proj value: 1 - filter: down_proj value: 1 - value: 1 - sources: - model: G:/7B/Meta-Llama-3-8B-Instruct layer_range: [12, 18] parameters: scale: - filter: o_proj value: .5 - filter: down_proj value: .5 - value: 1 - sources: - model: G:/7B/Meta-Llama-3-8B-Instruct layer_range: [18, 25] parameters: scale: - filter: o_proj value: .75 - filter: down_proj value: .75 - value: 1 - sources: - model: G:/7B/Jamet-8B-L3-MK.V-Blackroot layer_range: [19, 32] parameters: scale: - filter: o_proj value: 1 - filter: down_proj value: 1 - value: 1 merge_method: passthrough dtype: float32 </PRE> MERGEKIT NOTE: Sub in the "name" of the "creator" (of the model) in place of "G:/7B" to create a mergekit file that can be used in Mergekit Google Colab. IE: G:/7B/Jamet-8B-L3-MK.V-Blackroot -> Hastagaras/Jamet-8B-L3-MK.V-Blackroot <B>Highest Quality Settings / Optimal Operation Guide / Parameters and Samplers</B> This is a "Class 2" model: For all settings used for this model (including specifics for its "class"), including example generation(s) and for advanced settings guide (which many times addresses any model issue(s)), including methods to improve model performance for all use case(s) as well as chat, roleplay and other use case(s) please see: [ https://huggingface.co/DavidAU/Maximizing-Model-Performance-All-Quants-Types-And-Full-Precision-by-Samplers_Parameters ] You can see all parameters used for generation, in addition to advanced parameters and samplers to get the most out of this model here: [ https://huggingface.co/DavidAU/Maximizing-Model-Performance-All-Quants-Types-And-Full-Precision-by-Samplers_Parameters ] <h3>EXAMPLES:</h3> Examples are created using quant Q4_K_M, "temp=0", minimal parameters and the "LLAMA3" template.
Temp=0 was used to assess CORE changes between the original primary model and the merge between it and Llama3 Instruct. Below are the least creative outputs; the prompt is in <B>BOLD</B>. Higher quants will result in better quality. There will also be some variance between "close" quants like Q4_K_M/Q4_K_S and Q5_K_M/Q5_K_S, so I suggest if you are going to use Q4_K_M, you also try Q4_K_S. Also, slightly longer / detailed prompts will result in greater creativity (as well as different prose - ie dialog, thoughts, paragraph size differences and so on). --- <B> Start a 1000 word scene (vivid horror, 1st person, include thoughts) with: The sky scraper swayed, as she watched the window in front of her on the 21 floor explode... </B> --- GENERATION from "Jamet-8B-L3-MK.V-Blackroot" --- I stood at the edge of the rooftop, my toes curled over the ledge. The wind whipped through my hair, and I could feel it tugging at my clothes. Below me, the city stretched out like a sprawling beast, its lights twinkling in the darkness like stars. It was a beautiful sight, one that never failed to take my breath away. But tonight, something felt off. A sense of unease settled in the pit of my stomach as I gazed down at the towering skyscrapers and bustling streets. The air seemed charged with an electric tension, as if the very fabric of reality was about to tear apart. I turned back towards the building, its sleek glass facade reflecting the moon's pale glow. We were on the 21st floor, and from here, I could see right into the heart of the city. But my attention was drawn to a different window, one that stood out like an open wound amidst the otherwise pristine surface. The sky scraper swayed, as if it too felt the weight of whatever was about to happen. And then, without warning, the window in front of me exploded. Shards of glass rained down onto the street below, and I stumbled back, my heart pounding in my chest. For a moment, everything seemed frozen in time. The wind died down, and even the distant hum of traffic ceased. It was as if the entire city was holding its breath, waiting to see what would happen next. And then, a figure emerged from the shattered window frame. It moved with an unnatural grace, its limbs twisting and contorting in ways that defied human anatomy. Its skin was deathly pale, almost translucent, and seemed to glow with an otherworldly light. As it stepped out onto the ledge, I felt my blood run cold. This wasn't a person - it was something else entirely. Something ancient and evil, something that had lain dormant for centuries but now stirred once more. The creature's eyes locked onto mine, and I could feel its malevolent gaze boring into my soul. It raised a hand, and I saw that its fingers ended in razor-sharp claws. With a sudden jerk, it lunged forward, its body elongating as if made of liquid metal. I tried to scream, but my voice caught in my throat. My mind raced with thoughts of panic and terror, but they were drowned out by the sound of shattering glass as the creature plunged towards me. The last thing I remember is the sensation of cold, clammy skin against mine, and then everything went black. When I came to, I was lying on the rooftop, my head throbbing with pain. The sky scraper loomed above me, its windows dark and lifeless. And in the distance, I could hear the wail of sirens, growing louder with every passing second. I stumbled to my feet, my heart still racing from the ordeal. As I looked out over the city, I realized that something had changed.
The lights were dimmer now, as if the very lifeblood of the metropolis was beginning to drain away. And then, I saw it - a dark shape rising up from the heart of the city, blotting out the stars above. It was massive and twisted, its form shifting and contorting like a living thing. And at its center, I could see the glowing eyes of the creature that had attacked me. As I watched in horror, the skyscraper began to tilt and buckle, its glass facade cracking and shattering under the weight of whatever force was now acting upon it. The sirens grew louder still, but they seemed distant and unimportant now. All that mattered was the dark shape rising up before me, a monstrous entity born from the very depths of hell itself. And as I stood there, frozen in terror, I knew that nothing would ever be the same again. --- GENERATION from "L3-Jamet-8B-MK.V-Blackroot-12.2B-V1-INSTRUCT-ULTRA-F32" --- The skyscraper swayed, as I watched the window in front of me on the 21st floor explode. Glass shards rained down like a thousand tiny knives, shattering against the pavement below. The sound was deafening, a cacophony of screams and crunching metal that echoed through my skull. I stumbled back, my heart racing so fast I thought it would burst from my chest. My mind was blank, unable to process what I had just witnessed. It couldn't be real. This wasn't happening. But the blood splattered across the windowpane told a different story. My gaze darted around the room, taking in the chaos and destruction. The once pristine office space now resembled a war zone. Papers were strewn about like confetti, desks overturned, and chairs shattered into unrecognizable pieces of metal and plastic. The acrid smell of smoke and burning rubber filled my nostrils. I tried to scream, but no sound emerged from my throat. My vocal cords seemed frozen in terror. I was trapped in a nightmare, unable to wake up or escape the hell that surrounded me. My eyes landed on the figure slumped against the wall, her head tilted at an impossible angle. Blood pooled beneath her blonde hair, forming a dark halo around her pale skin. Her once bright blue eyes now stared blankly into nothingness, frozen in shock and pain. I stumbled forward, my legs refusing to cooperate with my frantic brain signals. I fell to my knees beside her, reaching out to touch her cold, clammy skin. She was still warm, but only just. Panic clawed at the edges of my mind as I realized she wasn't dead yet. Not quite. "Please," I begged, my voice finally breaking through the paralyzing fear that had gripped me. "Please don't leave me." I pressed my ear against her chest, straining to hear any sign of life. There was nothing but the steady thump-thump of her heart, growing weaker with each passing second. My hands frantically searched for a pulse, but found only cold skin and the faintest hint of life. I had to get help. I had to save her. But how? The elevator was destroyed, and the stairs were too far away. There was no time. I scrambled to my feet, ignoring the pain that shot through my ankle as I put weight on it. I had to move. I had to find someone, anyone who could help us. I sprinted down the hallway, dodging debris and bodies alike in my desperate bid for survival. As I reached the stairwell door, a deafening boom shook the building, sending me flying backward. The floor beneath my feet gave way, and I plummeted into darkness, my screams lost amidst the crumbling concrete and steel. 
I landed hard on the cold, unforgiving stone of the stairwell, my head spinning from the impact. I struggled to sit up, but a searing pain shot through my shoulder, forcing me back down onto the ground. I gasped for air, my lungs burning with each desperate breath. The darkness around me began to recede, replaced by the dim light filtering in from the shattered windows above. I pushed myself up onto one elbow, taking stock of my surroundings. The stairwell was a mess, debris strewn about like a tornado had ripped through it. But miraculously, the stairs themselves remained intact. I struggled to my feet, ignoring the pain that threatened to send me back down again. I had to keep moving. I had to find help before it was too late. I staggered forward, each step an agonizing battle against the overwhelming urge to collapse. As I reached the first landing, a figure emerged from the shadows. Her face was pale and drawn, her eyes wide with shock and fear. She looked like she'd seen hell itself. "Help," I croaked out, my voice barely audible over the pounding of my heart. "Please...help me." She stared at me for a moment before finally reacting. She rushed forward, grabbing onto my arm as if she feared I would collapse again. Together, we stumbled down the stairs, desperate to escape the inferno that had claimed our office building. As we reached the ground floor, sirens wailed in the distance, growing louder with each passing second. We burst through the doors and into the bright sunlight, gasping in relief as we finally tasted freedom from the hell that had nearly consumed us. But even as we staggered away from the burning ruins of our workplace, I knew that this nightmare was far from over. The screams still echoed in my ears, and the bloodstained windows would haunt me for the rest of my life. We had survived, but at what cost?
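The Jamet card above, like the Lumimaid card, publishes its full MergeKit formula and notes that substituting the creators' Hugging Face repo ids for the local "G:/7B" paths yields a config usable outside the author's machine. A hedged sketch of that substitution, assuming a current MergeKit install exposes its standard `mergekit-yaml` command (`formula.yml` here is a hypothetical file holding the FORMULA block from the card):

```python
# Sketch: turn the card's local-path merge formula into a runnable MergeKit config.
# Assumes `pip install mergekit` provides the mergekit-yaml CLI; file names are placeholders.
import subprocess
from pathlib import Path

yaml_text = Path("formula.yml").read_text()  # the FORMULA block from the card, saved as-is
yaml_text = yaml_text.replace(
    "G:/7B/Meta-Llama-3-8B-Instruct", "meta-llama/Meta-Llama-3-8B-Instruct"
).replace(
    "G:/7B/Jamet-8B-L3-MK.V-Blackroot", "Hastagaras/Jamet-8B-L3-MK.V-Blackroot"
)
Path("jamet-merge.yml").write_text(yaml_text)

# Run the pass-through merge; output lands in ./jamet-12.2b-merged
subprocess.run(["mergekit-yaml", "jamet-merge.yml", "./jamet-12.2b-merged"], check=True)
```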
deepnet/SN29-C00-llama-HK4Nw-1
deepnet
2024-11-14T06:10:31Z
33
0
transformers
[ "transformers", "safetensors", "llama", "text-generation", "arxiv:1910.09700", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2024-11-14T05:57:43Z
---
library_name: transformers
tags: []
---

# Model Card for Model ID

<!-- Provide a quick summary of what the model is/does. -->

## Model Details

### Model Description

<!-- Provide a longer summary of what this model is. -->

This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.

- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]

### Model Sources [optional]

<!-- Provide the basic links for the model. -->

- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]

## Uses

<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->

### Direct Use

<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->

[More Information Needed]

### Downstream Use [optional]

<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->

[More Information Needed]

### Out-of-Scope Use

<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->

[More Information Needed]

## Bias, Risks, and Limitations

<!-- This section is meant to convey both technical and sociotechnical limitations. -->

[More Information Needed]

### Recommendations

<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->

Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.

## How to Get Started with the Model

Use the code below to get started with the model.

[More Information Needed]

## Training Details

### Training Data

<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->

[More Information Needed]

### Training Procedure

<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->

#### Preprocessing [optional]

[More Information Needed]

#### Training Hyperparameters

- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->

#### Speeds, Sizes, Times [optional]

<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->

[More Information Needed]

## Evaluation

<!-- This section describes the evaluation protocols and provides the results. -->

### Testing Data, Factors & Metrics

#### Testing Data

<!-- This should link to a Dataset Card if possible. -->

[More Information Needed]

#### Factors

<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->

[More Information Needed]

#### Metrics

<!-- These are the evaluation metrics being used, ideally with a description of why. -->

[More Information Needed]

### Results

[More Information Needed]

#### Summary

## Model Examination [optional]

<!-- Relevant interpretability work for the model goes here -->

[More Information Needed]

## Environmental Impact

<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->

Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).

- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]

## Technical Specifications [optional]

### Model Architecture and Objective

[More Information Needed]

### Compute Infrastructure

[More Information Needed]

#### Hardware

[More Information Needed]

#### Software

[More Information Needed]

## Citation [optional]

<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->

**BibTeX:**

[More Information Needed]

**APA:**

[More Information Needed]

## Glossary [optional]

<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->

[More Information Needed]

## More Information [optional]

[More Information Needed]

## Model Card Authors [optional]

[More Information Needed]

## Model Card Contact

[More Information Needed]
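Since the getting-started section of this template is empty and the card does not name its repo id, the following is a purely illustrative sketch of the usual 🤗 transformers text-generation pipeline; `your-org/your-llama-model` is a hypothetical placeholder, not this model's actual id:

```python
# Illustrative sketch only: "your-org/your-llama-model" is a hypothetical
# placeholder repo id; substitute the actual Hub id of this model.
from transformers import pipeline

generator = pipeline("text-generation", model="your-org/your-llama-model")
result = generator("Hello, world!", max_new_tokens=50)
print(result[0]["generated_text"])
```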
KISTI-KONI/KONI-Llama3-8B-20240630
KISTI-KONI
2024-11-14T05:57:08Z
44
4
transformers
[ "transformers", "safetensors", "llama", "text-generation", "KISTI", "KONI", "llama3", "llama3-8b", "en", "ko", "license:llama3", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2024-07-17T09:41:38Z
---
language:
- en
- ko
tags:
- KISTI
- KONI
- llama
- llama3
- llama3-8b
pipeline_tag: text-generation
license: llama3
---

> Update @ 2024.07.23: Pre-Release KISTI-KONI/KONI-Llama3-8B-20240630

## Model Details

**KISTI-KONI/KONI-Llama3-8B-20240630**

KISTI-KONI/KONI-Llama3-8B-20240630 is a continued-pretrained language model based on Llama-3-8B. The model was trained on a Korean + English corpus (200 GB, including science and technology information data). Training was done on 32 A100 GPUs of the KISTI supercomputer.

**Model developers** Donghun Yang (KISTI)

**Variations** KONI-Llama3-8B comes in one size — 8B.

**Input** Models input text only.

**Output** Models generate text and code only.

**Model Architecture** Llama 3 is an auto-regressive language model that uses an optimized transformer architecture.

| | Training Data | Params | Context length | GQA | Knowledge cutoff |
|---|---|---|---|---|---|
| KONI-Llama3-8B-20240630 | Kor+En (200GB) | 8B | 8k | Yes | Jun, 2023 |

**Model Release Date** Pre-release @ 2024.06.30

**Status** This is a static model trained on an offline dataset.

**License** CC-BY-NC-SA-4.0 + Llama 3 License: [https://llama.meta.com/llama3/license](https://llama.meta.com/llama3/license)

## Intended Use

**Intended Use Cases** Llama 3 is intended for commercial and research use in English. Instruction-tuned models are intended for assistant-like chat, whereas pretrained models can be adapted for a variety of natural language generation tasks.

**Out-of-scope** Use in any manner that violates applicable laws or regulations (including trade compliance laws). Use in any other way that is prohibited by the Acceptable Use Policy and Llama 3 Community License. Use in languages other than English.**

**Note: Developers may fine-tune Llama 3 models for languages beyond English provided they comply with the Llama 3 Community License and the Acceptable Use Policy.

### Responsibility & Safety

We believe that an open approach to AI leads to better, safer products, faster innovation, and a bigger overall market. We are committed to Responsible AI development and took a series of steps to limit misuse and harm and support the open source community.

Foundation models are widely capable technologies that are built to be used for a diverse range of applications. They are not designed to meet every developer preference on safety levels for all use cases out of the box, as those by their nature will differ across different applications.

Rather, responsible LLM-application deployment is achieved by implementing a series of safety best practices throughout the development of such applications, from model pre-training and fine-tuning through to the deployment of systems composed of safeguards, tailoring safety specifically to the use case and audience.

As part of the Llama 3 release, we updated our [Responsible Use Guide](https://llama.meta.com/responsible-use-guide/) to outline the steps and best practices for developers to implement model and system level safety for their application. We also provide a set of resources including [Meta Llama Guard 2](https://llama.meta.com/purple-llama/) and [Code Shield](https://llama.meta.com/purple-llama/) safeguards. These tools have proven to drastically reduce residual risks of LLM systems, while maintaining a high level of helpfulness. We encourage developers to tune and deploy these safeguards according to their needs, and we provide a [reference implementation](https://github.com/meta-llama/llama-recipes/tree/main/recipes/responsible_ai) to get you started.

#### Responsible release

In addition to the responsible use considerations outlined above, we followed a rigorous process that requires us to take extra measures against misuse and critical risks before we make our release decision.

**Misuse** If you access or use Llama 3, you agree to the Acceptable Use Policy. The most recent copy of this policy can be found at [https://llama.meta.com/llama3/use-policy/](https://llama.meta.com/llama3/use-policy/).

## Ethical Considerations and Limitations

The core values of Llama 3 are openness, inclusivity and helpfulness. It is meant to serve everyone, and to work for a wide range of use cases. It is thus designed to be accessible to people across many different backgrounds, experiences and perspectives. Llama 3 addresses users and their needs as they are, without inserting unnecessary judgment or normativity, while reflecting the understanding that even content that may appear problematic in some cases can serve valuable purposes in others. It respects the dignity and autonomy of all users, especially in terms of the values of free thought and expression that power innovation and progress.

But Llama 3 is a new technology, and like any new technology, there are risks associated with its use. Testing conducted to date has been in English, and has not covered, nor could it cover, all scenarios. For these reasons, as with all LLMs, Llama 3's potential outputs cannot be predicted in advance, and the model may in some instances produce inaccurate, biased, or otherwise objectionable responses to user prompts. Therefore, before deploying any applications of Llama 3 models, developers should perform safety testing and tuning tailored to their specific applications of the model. As outlined in the Responsible Use Guide, we recommend incorporating [Purple Llama](https://github.com/facebookresearch/PurpleLlama) solutions into your workflows, specifically [Llama Guard](https://ai.meta.com/research/publications/llama-guard-llm-based-input-output-safeguard-for-human-ai-conversations/), which provides a base model to filter input and output prompts to layer system-level safety on top of model-level safety.

Please see the Responsible Use Guide available at [http://llama.meta.com/responsible-use-guide](http://llama.meta.com/responsible-use-guide)

## Citation instructions

**KONI-Llama3-8B-20240630**
```
@article{KISTI-KONI/KONI-Llama3-8B-20240630,
  title={KISTI-KONI/KONI-Llama3-8B-20240630},
  author={Yang, Donghun},
  year={2024},
  url={https://huggingface.co/KISTI-KONI/KONI-Llama3-8B-20240630/}
}
```

**Original Llama-3**
```
@article{llama3modelcard,
  title={Llama 3 Model Card},
  author={AI@Meta},
  year={2024},
  url={https://github.com/meta-llama/llama3/blob/main/MODEL_CARD.md}
}
```
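The card itself ships no quick-start snippet; a minimal loading sketch, assuming only the standard 🤗 transformers causal-LM API (the dtype, `device_map`, and generation settings below are illustrative assumptions, not taken from the card):

```python
# Minimal sketch: load the continued-pretrained checkpoint with transformers.
# torch_dtype and device_map (which requires `accelerate`) are illustrative
# choices, not prescribed by the model card.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "KISTI-KONI/KONI-Llama3-8B-20240630"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

# Korean prompt, since the model was continued-pretrained on Kor+En data.
inputs = tokenizer("과학기술 정보란", return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```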
MayBashendy/Arabic_FineTuningAraBERT_AugV4-trial2_k30_task1_organization_fold0
MayBashendy
2024-11-14T05:56:29Z
160
0
transformers
[ "transformers", "safetensors", "bert", "text-classification", "generated_from_trainer", "base_model:aubmindlab/bert-base-arabertv02", "base_model:finetune:aubmindlab/bert-base-arabertv02", "autotrain_compatible", "endpoints_compatible", "region:us" ]
text-classification
2024-11-14T05:26:03Z
---
library_name: transformers
base_model: aubmindlab/bert-base-arabertv02
tags:
- generated_from_trainer
model-index:
- name: Arabic_FineTuningAraBERT_AugV4-trial2_k30_task1_organization_fold0
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# Arabic_FineTuningAraBERT_AugV4-trial2_k30_task1_organization_fold0

This model is a fine-tuned version of [aubmindlab/bert-base-arabertv02](https://huggingface.co/aubmindlab/bert-base-arabertv02) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.8900
- Qwk: 0.7623
- Mse: 0.8900
- Rmse: 0.9434

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10

### Training results

| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|:-------------:|:------:|:----:|:---------------:|:-------:|:------:|:------:|
| No log | 0.0034 | 2 | 4.8305 | 0.0 | 4.8305 | 2.1978 |
| No log | 0.0069 | 4 | 2.7863 | 0.0904 | 2.7863 | 1.6692 |
| No log | 0.0103 | 6 | 1.8260 | 0.1771 | 1.8260 | 1.3513 |
| No log | 0.0138 | 8 | 1.8850 | 0.0056 | 1.8850 | 1.3729 |
| No log | 0.0172 | 10 | 1.4853 | 0.1914 | 1.4853 | 1.2187 |
| No log | 0.0207 | 12 | 1.6106 | 0.1370 | 1.6106 | 1.2691 |
| No log | 0.0241 | 14 | 1.8504 | 0.1873 | 1.8504 | 1.3603 |
| No log | 0.0275 | 16 | 2.5133 | 0.2857 | 2.5133 | 1.5853 |
| No log | 0.0310 | 18 | 2.3163 | 0.0550 | 2.3163 | 1.5219 |
| No log | 0.0344 | 20 | 1.7768 | 0.2203 | 1.7768 | 1.3330 |
| No log | 0.0379 | 22 | 1.8171 | 0.0824 | 1.8171 | 1.3480 |
| No log | 0.0413 | 24 | 1.8587 | 0.1873 | 1.8587 | 1.3633 |
| No log | 0.0448 | 26 | 1.9310 | 0.1873 | 1.9310 | 1.3896 |
| No log | 0.0482 | 28 | 1.8837 | 0.1873 | 1.8837 | 1.3725 |
| No log | 0.0516 | 30 | 1.7022 | 0.1683 | 1.7022 | 1.3047 |
| No log | 0.0551 | 32 | 1.6107 | -0.1107 | 1.6107 | 1.2691 |
| No log | 0.0585 | 34 | 1.6435 | 0.1933 | 1.6435 | 1.2820 |
| No log | 0.0620 | 36 | 1.5560 | 0.2203 | 1.5560 | 1.2474 |
| No log | 0.0654 | 38 | 1.4383 | -0.0024 | 1.4383 | 1.1993 |
| No log | 0.0688 | 40 | 1.5565 | 0.2957 | 1.5565 | 1.2476 |
| No log | 0.0723 | 42 | 1.8605 | 0.1348 | 1.8605 | 1.3640 |
| No log | 0.0757 | 44 | 1.9471 | 0.1075 | 1.9471 | 1.3954 |
| No log | 0.0792 | 46 | 1.8546 | 0.1075 | 1.8546 | 1.3618 |
| No log | 0.0826 | 48 | 1.7847 | 0.1348 | 1.7847 | 1.3359 |
| No log | 0.0861 | 50 | 1.8299 | 0.1348 | 1.8299 | 1.3528 |
| No log | 0.0895 | 52 | 2.0380 | 0.1075 | 2.0380 | 1.4276 |
| No log | 0.0929 | 54 | 2.1809 | 0.0801 | 2.1809 | 1.4768 |
| No log | 0.0964 | 56 | 2.2870 | 0.2134 | 2.2870 | 1.5123 |
| No log | 0.0998 | 58 | 2.1912 | 0.1760 | 2.1912 | 1.4803 |
| No log | 0.1033 | 60 | 1.9581 | 0.0550 | 1.9581 | 1.3993 |
| No log | 0.1067 | 62 | 1.6548 | 0.2184 | 1.6548 | 1.2864 |
| No log | 0.1102 | 64 | 1.6765 | 0.2164 | 1.6765 | 1.2948 |
| No log | 0.1136 | 66 | 1.9255 | 0.1348 | 1.9255 | 1.3876 |
| No log | 0.1170 | 68 | 2.1544 | 0.1075 | 2.1544 | 1.4678 |
| No log | 0.1205 | 70 | 2.1714 | 0.0275 | 2.1714 | 1.4736 |
| No log | 0.1239 | 72 | 2.2624 | 0.0 | 2.2624 | 1.5041 |
| No log | 0.1274 |
74 | 2.1781 | 0.0275 | 2.1781 | 1.4759 | | No log | 0.1308 | 76 | 1.8531 | 0.2776 | 1.8531 | 1.3613 | | No log | 0.1343 | 78 | 1.3909 | 0.3816 | 1.3909 | 1.1794 | | No log | 0.1377 | 80 | 1.2867 | 0.5263 | 1.2867 | 1.1343 | | No log | 0.1411 | 82 | 1.5438 | 0.5474 | 1.5438 | 1.2425 | | No log | 0.1446 | 84 | 2.0304 | 0.4324 | 2.0304 | 1.4249 | | No log | 0.1480 | 86 | 2.1251 | 0.3182 | 2.1251 | 1.4578 | | No log | 0.1515 | 88 | 2.1227 | 0.3182 | 2.1227 | 1.4569 | | No log | 0.1549 | 90 | 2.1296 | 0.4124 | 2.1296 | 1.4593 | | No log | 0.1583 | 92 | 2.1892 | 0.4389 | 2.1892 | 1.4796 | | No log | 0.1618 | 94 | 1.7526 | 0.3913 | 1.7526 | 1.3239 | | No log | 0.1652 | 96 | 1.6532 | 0.4324 | 1.6532 | 1.2858 | | No log | 0.1687 | 98 | 1.7600 | 0.3913 | 1.7600 | 1.3266 | | No log | 0.1721 | 100 | 2.2715 | 0.4369 | 2.2715 | 1.5072 | | No log | 0.1756 | 102 | 2.6376 | 0.3862 | 2.6376 | 1.6241 | | No log | 0.1790 | 104 | 2.4739 | 0.3862 | 2.4739 | 1.5729 | | No log | 0.1824 | 106 | 2.1263 | 0.4369 | 2.1263 | 1.4582 | | No log | 0.1859 | 108 | 2.0064 | 0.4369 | 2.0064 | 1.4165 | | No log | 0.1893 | 110 | 2.1474 | 0.4155 | 2.1474 | 1.4654 | | No log | 0.1928 | 112 | 2.4320 | 0.3862 | 2.4320 | 1.5595 | | No log | 0.1962 | 114 | 2.3390 | 0.4210 | 2.3390 | 1.5294 | | No log | 0.1997 | 116 | 2.5928 | 0.3298 | 2.5928 | 1.6102 | | No log | 0.2031 | 118 | 2.3962 | 0.4053 | 2.3962 | 1.5480 | | No log | 0.2065 | 120 | 1.9023 | 0.4583 | 1.9023 | 1.3792 | | No log | 0.2100 | 122 | 1.7088 | 0.4797 | 1.7088 | 1.3072 | | No log | 0.2134 | 124 | 1.9231 | 0.4583 | 1.9231 | 1.3867 | | No log | 0.2169 | 126 | 2.3264 | 0.4053 | 2.3264 | 1.5252 | | No log | 0.2203 | 128 | 2.3578 | 0.4053 | 2.3579 | 1.5355 | | No log | 0.2238 | 130 | 2.0806 | 0.4389 | 2.0806 | 1.4424 | | No log | 0.2272 | 132 | 1.5777 | 0.4569 | 1.5777 | 1.2561 | | No log | 0.2306 | 134 | 1.3689 | 0.5512 | 1.3689 | 1.1700 | | No log | 0.2341 | 136 | 1.5929 | 0.4571 | 1.5929 | 1.2621 | | No log | 0.2375 | 138 | 2.1747 | 0.4053 | 2.1747 | 1.4747 | | No log | 0.2410 | 140 | 2.3221 | 0.4408 | 2.3221 | 1.5238 | | No log | 0.2444 | 142 | 1.9562 | 0.3926 | 1.9562 | 1.3986 | | No log | 0.2478 | 144 | 1.8332 | 0.5200 | 1.8332 | 1.3539 | | No log | 0.2513 | 146 | 1.9827 | 0.5600 | 1.9827 | 1.4081 | | No log | 0.2547 | 148 | 2.1995 | 0.4793 | 2.1995 | 1.4831 | | No log | 0.2582 | 150 | 2.1318 | 0.4793 | 2.1318 | 1.4601 | | No log | 0.2616 | 152 | 1.8439 | 0.5600 | 1.8439 | 1.3579 | | No log | 0.2651 | 154 | 1.8762 | 0.5188 | 1.8762 | 1.3697 | | No log | 0.2685 | 156 | 2.1017 | 0.4425 | 2.1017 | 1.4497 | | No log | 0.2719 | 158 | 2.0555 | 0.5188 | 2.0555 | 1.4337 | | No log | 0.2754 | 160 | 1.6255 | 0.5200 | 1.6255 | 1.2749 | | No log | 0.2788 | 162 | 1.5148 | 0.4797 | 1.5148 | 1.2308 | | No log | 0.2823 | 164 | 1.9241 | 0.5200 | 1.9241 | 1.3871 | | No log | 0.2857 | 166 | 2.4919 | 0.4455 | 2.4919 | 1.5786 | | No log | 0.2892 | 168 | 2.5618 | 0.4455 | 2.5618 | 1.6006 | | No log | 0.2926 | 170 | 2.2343 | 0.4408 | 2.2343 | 1.4948 | | No log | 0.2960 | 172 | 2.0308 | 0.4995 | 2.0308 | 1.4251 | | No log | 0.2995 | 174 | 1.7238 | 0.4793 | 1.7238 | 1.3129 | | No log | 0.3029 | 176 | 1.5580 | 0.4793 | 1.5580 | 1.2482 | | No log | 0.3064 | 178 | 1.4517 | 0.5015 | 1.4517 | 1.2049 | | No log | 0.3098 | 180 | 1.6571 | 0.5404 | 1.6571 | 1.2873 | | No log | 0.3133 | 182 | 2.0414 | 0.5177 | 2.0414 | 1.4288 | | No log | 0.3167 | 184 | 2.1261 | 0.5366 | 2.1261 | 1.4581 | | No log | 0.3201 | 186 | 1.7786 | 0.5366 | 1.7786 | 1.3336 | | No log | 0.3236 | 188 | 1.3196 | 0.6019 | 1.3196 | 
1.1487 | | No log | 0.3270 | 190 | 1.1293 | 0.6159 | 1.1293 | 1.0627 | | No log | 0.3305 | 192 | 1.3299 | 0.5989 | 1.3299 | 1.1532 | | No log | 0.3339 | 194 | 1.8972 | 0.5714 | 1.8972 | 1.3774 | | No log | 0.3373 | 196 | 2.7291 | 0.4802 | 2.7291 | 1.6520 | | No log | 0.3408 | 198 | 2.8790 | 0.5131 | 2.8790 | 1.6968 | | No log | 0.3442 | 200 | 2.4136 | 0.4441 | 2.4136 | 1.5536 | | No log | 0.3477 | 202 | 1.7848 | 0.5005 | 1.7848 | 1.3359 | | No log | 0.3511 | 204 | 1.2337 | 0.5714 | 1.2337 | 1.1107 | | No log | 0.3546 | 206 | 1.0502 | 0.4830 | 1.0502 | 1.0248 | | No log | 0.3580 | 208 | 1.0612 | 0.5563 | 1.0612 | 1.0301 | | No log | 0.3614 | 210 | 1.2593 | 0.5484 | 1.2593 | 1.1222 | | No log | 0.3649 | 212 | 1.7642 | 0.4800 | 1.7642 | 1.3282 | | No log | 0.3683 | 214 | 2.0890 | 0.4987 | 2.0890 | 1.4453 | | No log | 0.3718 | 216 | 2.1782 | 0.4987 | 2.1782 | 1.4759 | | No log | 0.3752 | 218 | 2.1435 | 0.4987 | 2.1435 | 1.4641 | | No log | 0.3787 | 220 | 1.8231 | 0.5005 | 1.8231 | 1.3502 | | No log | 0.3821 | 222 | 1.5742 | 0.5243 | 1.5742 | 1.2547 | | No log | 0.3855 | 224 | 1.4052 | 0.5028 | 1.4052 | 1.1854 | | No log | 0.3890 | 226 | 1.4274 | 0.5927 | 1.4274 | 1.1948 | | No log | 0.3924 | 228 | 1.7210 | 0.5177 | 1.7210 | 1.3119 | | No log | 0.3959 | 230 | 2.0197 | 0.4987 | 2.0197 | 1.4212 | | No log | 0.3993 | 232 | 2.2112 | 0.4987 | 2.2112 | 1.4870 | | No log | 0.4028 | 234 | 2.2925 | 0.4987 | 2.2925 | 1.5141 | | No log | 0.4062 | 236 | 2.1379 | 0.4987 | 2.1379 | 1.4621 | | No log | 0.4096 | 238 | 1.8316 | 0.4987 | 1.8316 | 1.3534 | | No log | 0.4131 | 240 | 1.8717 | 0.4987 | 1.8717 | 1.3681 | | No log | 0.4165 | 242 | 1.8422 | 0.4987 | 1.8422 | 1.3573 | | No log | 0.4200 | 244 | 1.7087 | 0.5396 | 1.7087 | 1.3072 | | No log | 0.4234 | 246 | 1.6097 | 0.5417 | 1.6097 | 1.2687 | | No log | 0.4269 | 248 | 1.5155 | 0.6008 | 1.5155 | 1.2311 | | No log | 0.4303 | 250 | 1.5745 | 0.5556 | 1.5745 | 1.2548 | | No log | 0.4337 | 252 | 1.8230 | 0.5177 | 1.8230 | 1.3502 | | No log | 0.4372 | 254 | 2.1782 | 0.4625 | 2.1782 | 1.4759 | | No log | 0.4406 | 256 | 2.1840 | 0.4625 | 2.1840 | 1.4778 | | No log | 0.4441 | 258 | 1.8394 | 0.5177 | 1.8394 | 1.3562 | | No log | 0.4475 | 260 | 1.5373 | 0.5744 | 1.5373 | 1.2399 | | No log | 0.4509 | 262 | 1.3964 | 0.5413 | 1.3964 | 1.1817 | | No log | 0.4544 | 264 | 1.3541 | 0.5413 | 1.3541 | 1.1637 | | No log | 0.4578 | 266 | 1.4887 | 0.5744 | 1.4887 | 1.2201 | | No log | 0.4613 | 268 | 1.4549 | 0.5744 | 1.4549 | 1.2062 | | No log | 0.4647 | 270 | 1.3486 | 0.6015 | 1.3486 | 1.1613 | | No log | 0.4682 | 272 | 1.5208 | 0.5744 | 1.5208 | 1.2332 | | No log | 0.4716 | 274 | 1.8136 | 0.5177 | 1.8136 | 1.3467 | | No log | 0.4750 | 276 | 1.7827 | 0.5366 | 1.7827 | 1.3352 | | No log | 0.4785 | 278 | 1.4348 | 0.5744 | 1.4348 | 1.1978 | | No log | 0.4819 | 280 | 0.9610 | 0.7264 | 0.9610 | 0.9803 | | No log | 0.4854 | 282 | 0.8211 | 0.7986 | 0.8211 | 0.9061 | | No log | 0.4888 | 284 | 0.9220 | 0.7346 | 0.9220 | 0.9602 | | No log | 0.4923 | 286 | 1.3163 | 0.6563 | 1.3163 | 1.1473 | | No log | 0.4957 | 288 | 1.8363 | 0.5366 | 1.8363 | 1.3551 | | No log | 0.4991 | 290 | 2.1655 | 0.4633 | 2.1655 | 1.4716 | | No log | 0.5026 | 292 | 2.0979 | 0.4625 | 2.0979 | 1.4484 | | No log | 0.5060 | 294 | 1.6957 | 0.5366 | 1.6957 | 1.3022 | | No log | 0.5095 | 296 | 1.3180 | 0.6316 | 1.3180 | 1.1480 | | No log | 0.5129 | 298 | 1.2499 | 0.6316 | 1.2499 | 1.1180 | | No log | 0.5164 | 300 | 1.3787 | 0.5759 | 1.3787 | 1.1742 | | No log | 0.5198 | 302 | 1.5514 | 0.5385 | 1.5514 | 1.2456 | | No log | 
0.5232 | 304 | 1.6996 | 0.5188 | 1.6996 | 1.3037 | | No log | 0.5267 | 306 | 1.8769 | 0.4987 | 1.8769 | 1.3700 | | No log | 0.5301 | 308 | 1.8798 | 0.4987 | 1.8798 | 1.3711 | | No log | 0.5336 | 310 | 1.6473 | 0.5005 | 1.6473 | 1.2835 | | No log | 0.5370 | 312 | 1.4450 | 0.5010 | 1.4450 | 1.2021 | | No log | 0.5404 | 314 | 1.4527 | 0.5010 | 1.4527 | 1.2053 | | No log | 0.5439 | 316 | 1.6597 | 0.5385 | 1.6597 | 1.2883 | | No log | 0.5473 | 318 | 1.8429 | 0.5366 | 1.8429 | 1.3575 | | No log | 0.5508 | 320 | 1.9091 | 0.5366 | 1.9091 | 1.3817 | | No log | 0.5542 | 322 | 1.9451 | 0.5532 | 1.9451 | 1.3947 | | No log | 0.5577 | 324 | 1.9885 | 0.5177 | 1.9885 | 1.4101 | | No log | 0.5611 | 326 | 1.9623 | 0.5177 | 1.9623 | 1.4008 | | No log | 0.5645 | 328 | 1.7107 | 0.5366 | 1.7107 | 1.3080 | | No log | 0.5680 | 330 | 1.5284 | 0.5010 | 1.5284 | 1.2363 | | No log | 0.5714 | 332 | 1.4452 | 0.5413 | 1.4452 | 1.2022 | | No log | 0.5749 | 334 | 1.4541 | 0.5393 | 1.4541 | 1.2059 | | No log | 0.5783 | 336 | 1.5400 | 0.5744 | 1.5400 | 1.2410 | | No log | 0.5818 | 338 | 1.6139 | 0.5714 | 1.6139 | 1.2704 | | No log | 0.5852 | 340 | 1.5345 | 0.6078 | 1.5345 | 1.2388 | | No log | 0.5886 | 342 | 1.5429 | 0.6078 | 1.5429 | 1.2421 | | No log | 0.5921 | 344 | 1.3530 | 0.5752 | 1.3530 | 1.1632 | | No log | 0.5955 | 346 | 1.3255 | 0.6363 | 1.3255 | 1.1513 | | No log | 0.5990 | 348 | 1.4716 | 0.6078 | 1.4716 | 1.2131 | | No log | 0.6024 | 350 | 1.6141 | 0.5744 | 1.6141 | 1.2705 | | No log | 0.6059 | 352 | 1.5318 | 0.5744 | 1.5318 | 1.2377 | | No log | 0.6093 | 354 | 1.6035 | 0.5714 | 1.6035 | 1.2663 | | No log | 0.6127 | 356 | 1.6937 | 0.5714 | 1.6937 | 1.3014 | | No log | 0.6162 | 358 | 1.9963 | 0.5177 | 1.9963 | 1.4129 | | No log | 0.6196 | 360 | 1.8257 | 0.5177 | 1.8257 | 1.3512 | | No log | 0.6231 | 362 | 1.6314 | 0.5366 | 1.6314 | 1.2772 | | No log | 0.6265 | 364 | 1.5620 | 0.5385 | 1.5620 | 1.2498 | | No log | 0.6299 | 366 | 1.6824 | 0.5366 | 1.6824 | 1.2971 | | No log | 0.6334 | 368 | 1.6785 | 0.5366 | 1.6785 | 1.2956 | | No log | 0.6368 | 370 | 1.5427 | 0.5366 | 1.5427 | 1.2420 | | No log | 0.6403 | 372 | 1.6581 | 0.5366 | 1.6581 | 1.2877 | | No log | 0.6437 | 374 | 1.7151 | 0.5366 | 1.7151 | 1.3096 | | No log | 0.6472 | 376 | 1.7913 | 0.5366 | 1.7913 | 1.3384 | | No log | 0.6506 | 378 | 2.0665 | 0.5366 | 2.0665 | 1.4375 | | No log | 0.6540 | 380 | 2.0055 | 0.5177 | 2.0055 | 1.4162 | | No log | 0.6575 | 382 | 1.7036 | 0.5385 | 1.7036 | 1.3052 | | No log | 0.6609 | 384 | 1.3827 | 0.5014 | 1.3827 | 1.1759 | | No log | 0.6644 | 386 | 1.3131 | 0.5647 | 1.3131 | 1.1459 | | No log | 0.6678 | 388 | 1.4777 | 0.5385 | 1.4777 | 1.2156 | | No log | 0.6713 | 390 | 1.6017 | 0.5385 | 1.6017 | 1.2656 | | No log | 0.6747 | 392 | 1.5838 | 0.5744 | 1.5838 | 1.2585 | | No log | 0.6781 | 394 | 1.4319 | 0.6303 | 1.4319 | 1.1966 | | No log | 0.6816 | 396 | 1.1440 | 0.7014 | 1.1440 | 1.0696 | | No log | 0.6850 | 398 | 1.0443 | 0.7014 | 1.0443 | 1.0219 | | No log | 0.6885 | 400 | 0.9933 | 0.7430 | 0.9933 | 0.9966 | | No log | 0.6919 | 402 | 1.2084 | 0.7153 | 1.2084 | 1.0993 | | No log | 0.6954 | 404 | 1.5189 | 0.5714 | 1.5189 | 1.2324 | | No log | 0.6988 | 406 | 1.6119 | 0.5714 | 1.6119 | 1.2696 | | No log | 0.7022 | 408 | 1.3948 | 0.6303 | 1.3948 | 1.1810 | | No log | 0.7057 | 410 | 1.3786 | 0.6303 | 1.3786 | 1.1741 | | No log | 0.7091 | 412 | 1.4627 | 0.5744 | 1.4627 | 1.2094 | | No log | 0.7126 | 414 | 1.4473 | 0.5744 | 1.4473 | 1.2030 | | No log | 0.7160 | 416 | 1.4082 | 0.5744 | 1.4082 | 1.1867 | | No log | 0.7194 | 418 | 
1.1550 | 0.6309 | 1.1550 | 1.0747 | | No log | 0.7229 | 420 | 1.0602 | 0.6748 | 1.0602 | 1.0297 | | No log | 0.7263 | 422 | 1.1613 | 0.6309 | 1.1613 | 1.0776 | | No log | 0.7298 | 424 | 1.4838 | 0.5714 | 1.4838 | 1.2181 | | No log | 0.7332 | 426 | 1.5940 | 0.5366 | 1.5940 | 1.2625 | | No log | 0.7367 | 428 | 1.6250 | 0.5366 | 1.6250 | 1.2748 | | No log | 0.7401 | 430 | 1.3515 | 0.6008 | 1.3515 | 1.1625 | | No log | 0.7435 | 432 | 1.1082 | 0.6839 | 1.1082 | 1.0527 | | No log | 0.7470 | 434 | 1.0788 | 0.7519 | 1.0788 | 1.0387 | | No log | 0.7504 | 436 | 1.2892 | 0.5015 | 1.2892 | 1.1354 | | No log | 0.7539 | 438 | 1.6282 | 0.5385 | 1.6282 | 1.2760 | | No log | 0.7573 | 440 | 1.6542 | 0.5385 | 1.6542 | 1.2862 | | No log | 0.7608 | 442 | 1.4986 | 0.5404 | 1.4986 | 1.2242 | | No log | 0.7642 | 444 | 1.5165 | 0.5404 | 1.5165 | 1.2315 | | No log | 0.7676 | 446 | 1.5142 | 0.5404 | 1.5142 | 1.2305 | | No log | 0.7711 | 448 | 1.4717 | 0.5404 | 1.4717 | 1.2131 | | No log | 0.7745 | 450 | 1.5979 | 0.5404 | 1.5979 | 1.2641 | | No log | 0.7780 | 452 | 1.5406 | 0.5404 | 1.5406 | 1.2412 | | No log | 0.7814 | 454 | 1.4511 | 0.5404 | 1.4511 | 1.2046 | | No log | 0.7849 | 456 | 1.5922 | 0.5744 | 1.5922 | 1.2618 | | No log | 0.7883 | 458 | 1.9082 | 0.5660 | 1.9082 | 1.3814 | | No log | 0.7917 | 460 | 1.7903 | 0.5965 | 1.7903 | 1.3380 | | No log | 0.7952 | 462 | 1.2973 | 0.6789 | 1.2973 | 1.1390 | | No log | 0.7986 | 464 | 0.9153 | 0.7759 | 0.9153 | 0.9567 | | No log | 0.8021 | 466 | 0.9025 | 0.7518 | 0.9025 | 0.9500 | | No log | 0.8055 | 468 | 1.1697 | 0.6952 | 1.1697 | 1.0815 | | No log | 0.8090 | 470 | 1.6866 | 0.6333 | 1.6866 | 1.2987 | | No log | 0.8124 | 472 | 2.0589 | 0.6282 | 2.0589 | 1.4349 | | No log | 0.8158 | 474 | 1.8545 | 0.5855 | 1.8545 | 1.3618 | | No log | 0.8193 | 476 | 1.4517 | 0.5777 | 1.4517 | 1.2049 | | No log | 0.8227 | 478 | 1.3424 | 0.5777 | 1.3424 | 1.1586 | | No log | 0.8262 | 480 | 1.4415 | 0.5404 | 1.4415 | 1.2006 | | No log | 0.8296 | 482 | 1.6305 | 0.5188 | 1.6305 | 1.2769 | | No log | 0.8330 | 484 | 1.7996 | 0.5177 | 1.7996 | 1.3415 | | No log | 0.8365 | 486 | 1.8121 | 0.5524 | 1.8121 | 1.3461 | | No log | 0.8399 | 488 | 1.5119 | 0.5777 | 1.5119 | 1.2296 | | No log | 0.8434 | 490 | 1.1411 | 0.6503 | 1.1411 | 1.0682 | | No log | 0.8468 | 492 | 1.0957 | 0.6503 | 1.0957 | 1.0468 | | No log | 0.8503 | 494 | 1.2809 | 0.6309 | 1.2809 | 1.1318 | | No log | 0.8537 | 496 | 1.4961 | 0.5777 | 1.4961 | 1.2232 | | No log | 0.8571 | 498 | 1.6798 | 0.5385 | 1.6798 | 1.2961 | | 0.5678 | 0.8606 | 500 | 1.5848 | 0.5010 | 1.5848 | 1.2589 | | 0.5678 | 0.8640 | 502 | 1.3136 | 0.5474 | 1.3136 | 1.1461 | | 0.5678 | 0.8675 | 504 | 0.9939 | 0.6782 | 0.9939 | 0.9970 | | 0.5678 | 0.8709 | 506 | 0.8923 | 0.5933 | 0.8923 | 0.9446 | | 0.5678 | 0.8744 | 508 | 0.9180 | 0.6143 | 0.9180 | 0.9581 | | 0.5678 | 0.8778 | 510 | 1.0524 | 0.8019 | 1.0524 | 1.0259 | | 0.5678 | 0.8812 | 512 | 1.3948 | 0.5010 | 1.3948 | 1.1810 | | 0.5678 | 0.8847 | 514 | 1.5848 | 0.5200 | 1.5848 | 1.2589 | | 0.5678 | 0.8881 | 516 | 1.4503 | 0.5404 | 1.4503 | 1.2043 | | 0.5678 | 0.8916 | 518 | 1.3492 | 0.5010 | 1.3492 | 1.1615 | | 0.5678 | 0.8950 | 520 | 1.2829 | 0.5449 | 1.2829 | 1.1327 | | 0.5678 | 0.8985 | 522 | 1.2481 | 0.6157 | 1.2481 | 1.1172 | | 0.5678 | 0.9019 | 524 | 1.3555 | 0.5010 | 1.3555 | 1.1642 | | 0.5678 | 0.9053 | 526 | 1.4094 | 0.5010 | 1.4094 | 1.1872 | | 0.5678 | 0.9088 | 528 | 1.4490 | 0.5449 | 1.4490 | 1.2037 | | 0.5678 | 0.9122 | 530 | 1.4568 | 0.5449 | 1.4568 | 1.2070 | | 0.5678 | 0.9157 | 532 | 1.4005 | 0.5270 | 
1.4005 | 1.1834 | | 0.5678 | 0.9191 | 534 | 1.3423 | 0.5984 | 1.3423 | 1.1586 | | 0.5678 | 0.9225 | 536 | 1.4188 | 0.6265 | 1.4188 | 1.1911 | | 0.5678 | 0.9260 | 538 | 1.4680 | 0.5638 | 1.4680 | 1.2116 | | 0.5678 | 0.9294 | 540 | 1.4116 | 0.6265 | 1.4116 | 1.1881 | | 0.5678 | 0.9329 | 542 | 1.3376 | 0.6265 | 1.3376 | 1.1566 | | 0.5678 | 0.9363 | 544 | 1.3908 | 0.6265 | 1.3908 | 1.1793 | | 0.5678 | 0.9398 | 546 | 1.5036 | 0.6008 | 1.5036 | 1.2262 | | 0.5678 | 0.9432 | 548 | 1.6787 | 0.5385 | 1.6787 | 1.2956 | | 0.5678 | 0.9466 | 550 | 1.6905 | 0.5188 | 1.6905 | 1.3002 | | 0.5678 | 0.9501 | 552 | 1.5234 | 0.6008 | 1.5234 | 1.2343 | | 0.5678 | 0.9535 | 554 | 1.3097 | 0.5484 | 1.3097 | 1.1444 | | 0.5678 | 0.9570 | 556 | 1.2896 | 0.5484 | 1.2896 | 1.1356 | | 0.5678 | 0.9604 | 558 | 1.2286 | 0.5484 | 1.2286 | 1.1084 | | 0.5678 | 0.9639 | 560 | 1.3177 | 0.6610 | 1.3177 | 1.1479 | | 0.5678 | 0.9673 | 562 | 1.5106 | 0.5385 | 1.5106 | 1.2291 | | 0.5678 | 0.9707 | 564 | 1.4932 | 0.5385 | 1.4932 | 1.2220 | | 0.5678 | 0.9742 | 566 | 1.2637 | 0.5973 | 1.2637 | 1.1242 | | 0.5678 | 0.9776 | 568 | 1.1107 | 0.7106 | 1.1107 | 1.0539 | | 0.5678 | 0.9811 | 570 | 1.0780 | 0.7623 | 1.0780 | 1.0383 | | 0.5678 | 0.9845 | 572 | 1.1500 | 0.7337 | 1.1500 | 1.0724 | | 0.5678 | 0.9880 | 574 | 1.2234 | 0.6260 | 1.2234 | 1.1061 | | 0.5678 | 0.9914 | 576 | 1.1703 | 0.7333 | 1.1703 | 1.0818 | | 0.5678 | 0.9948 | 578 | 1.1600 | 0.6545 | 1.1600 | 1.0770 | | 0.5678 | 0.9983 | 580 | 1.1581 | 0.6008 | 1.1581 | 1.0762 | | 0.5678 | 1.0017 | 582 | 1.1183 | 0.7412 | 1.1183 | 1.0575 | | 0.5678 | 1.0052 | 584 | 0.9902 | 0.7747 | 0.9902 | 0.9951 | | 0.5678 | 1.0086 | 586 | 1.0064 | 0.7747 | 1.0064 | 1.0032 | | 0.5678 | 1.0120 | 588 | 1.1425 | 0.6476 | 1.1425 | 1.0689 | | 0.5678 | 1.0155 | 590 | 1.3708 | 0.6610 | 1.3708 | 1.1708 | | 0.5678 | 1.0189 | 592 | 1.5562 | 0.6008 | 1.5562 | 1.2475 | | 0.5678 | 1.0224 | 594 | 1.7245 | 0.5769 | 1.7245 | 1.3132 | | 0.5678 | 1.0258 | 596 | 1.6786 | 0.5769 | 1.6786 | 1.2956 | | 0.5678 | 1.0293 | 598 | 1.4626 | 0.6265 | 1.4626 | 1.2094 | | 0.5678 | 1.0327 | 600 | 1.5173 | 0.6008 | 1.5173 | 1.2318 | | 0.5678 | 1.0361 | 602 | 1.5667 | 0.5769 | 1.5667 | 1.2517 | | 0.5678 | 1.0396 | 604 | 1.3476 | 0.6610 | 1.3476 | 1.1609 | | 0.5678 | 1.0430 | 606 | 1.2448 | 0.7412 | 1.2448 | 1.1157 | | 0.5678 | 1.0465 | 608 | 1.1791 | 0.7101 | 1.1791 | 1.0859 | | 0.5678 | 1.0499 | 610 | 1.1270 | 0.7101 | 1.1270 | 1.0616 | | 0.5678 | 1.0534 | 612 | 0.9948 | 0.7327 | 0.9948 | 0.9974 | | 0.5678 | 1.0568 | 614 | 1.1073 | 0.6818 | 1.1073 | 1.0523 | | 0.5678 | 1.0602 | 616 | 1.2927 | 0.6382 | 1.2927 | 1.1370 | | 0.5678 | 1.0637 | 618 | 1.4080 | 0.6382 | 1.4080 | 1.1866 | | 0.5678 | 1.0671 | 620 | 1.3288 | 0.6303 | 1.3288 | 1.1527 | | 0.5678 | 1.0706 | 622 | 1.2808 | 0.6303 | 1.2808 | 1.1317 | | 0.5678 | 1.0740 | 624 | 1.0119 | 0.7316 | 1.0119 | 1.0060 | | 0.5678 | 1.0775 | 626 | 0.9250 | 0.7525 | 0.9250 | 0.9618 | | 0.5678 | 1.0809 | 628 | 1.0452 | 0.7194 | 1.0452 | 1.0223 | | 0.5678 | 1.0843 | 630 | 1.1378 | 0.6543 | 1.1378 | 1.0667 | | 0.5678 | 1.0878 | 632 | 1.2086 | 0.6476 | 1.2086 | 1.0994 | | 0.5678 | 1.0912 | 634 | 1.1223 | 0.6476 | 1.1223 | 1.0594 | | 0.5678 | 1.0947 | 636 | 1.1800 | 0.6476 | 1.1800 | 1.0863 | | 0.5678 | 1.0981 | 638 | 1.2117 | 0.6543 | 1.2117 | 1.1008 | | 0.5678 | 1.1015 | 640 | 1.3024 | 0.5638 | 1.3024 | 1.1412 | | 0.5678 | 1.1050 | 642 | 1.3790 | 0.5608 | 1.3790 | 1.1743 | | 0.5678 | 1.1084 | 644 | 1.2933 | 0.5670 | 1.2933 | 1.1372 | | 0.5678 | 1.1119 | 646 | 1.2590 | 0.5638 | 1.2590 | 1.1221 | | 
0.5678 | 1.1153 | 648 | 1.3564 | 0.6008 | 1.3564 | 1.1646 | | 0.5678 | 1.1188 | 650 | 1.6038 | 0.5385 | 1.6038 | 1.2664 | | 0.5678 | 1.1222 | 652 | 1.8121 | 0.5707 | 1.8121 | 1.3461 | | 0.5678 | 1.1256 | 654 | 1.7454 | 0.5366 | 1.7454 | 1.3211 | | 0.5678 | 1.1291 | 656 | 1.5413 | 0.5744 | 1.5413 | 1.2415 | | 0.5678 | 1.1325 | 658 | 1.4064 | 0.5777 | 1.4064 | 1.1859 | | 0.5678 | 1.1360 | 660 | 1.2426 | 0.7014 | 1.2426 | 1.1147 | | 0.5678 | 1.1394 | 662 | 1.2244 | 0.7614 | 1.2244 | 1.1065 | | 0.5678 | 1.1429 | 664 | 1.4001 | 0.5777 | 1.4001 | 1.1833 | | 0.5678 | 1.1463 | 666 | 1.5929 | 0.5385 | 1.5929 | 1.2621 | | 0.5678 | 1.1497 | 668 | 1.6312 | 0.5188 | 1.6312 | 1.2772 | | 0.5678 | 1.1532 | 670 | 1.6995 | 0.5188 | 1.6995 | 1.3036 | | 0.5678 | 1.1566 | 672 | 1.5325 | 0.5404 | 1.5325 | 1.2379 | | 0.5678 | 1.1601 | 674 | 1.4436 | 0.5426 | 1.4436 | 1.2015 | | 0.5678 | 1.1635 | 676 | 1.3988 | 0.5404 | 1.3988 | 1.1827 | | 0.5678 | 1.1670 | 678 | 1.4044 | 0.5404 | 1.4044 | 1.1851 | | 0.5678 | 1.1704 | 680 | 1.5495 | 0.5404 | 1.5495 | 1.2448 | | 0.5678 | 1.1738 | 682 | 1.8509 | 0.5987 | 1.8509 | 1.3605 | | 0.5678 | 1.1773 | 684 | 1.8711 | 0.6079 | 1.8711 | 1.3679 | | 0.5678 | 1.1807 | 686 | 1.6654 | 0.5188 | 1.6654 | 1.2905 | | 0.5678 | 1.1842 | 688 | 1.4373 | 0.5404 | 1.4373 | 1.1989 | | 0.5678 | 1.1876 | 690 | 1.2207 | 0.7129 | 1.2207 | 1.1049 | | 0.5678 | 1.1910 | 692 | 1.2374 | 0.7211 | 1.2374 | 1.1124 | | 0.5678 | 1.1945 | 694 | 1.2399 | 0.6551 | 1.2399 | 1.1135 | | 0.5678 | 1.1979 | 696 | 1.4558 | 0.5404 | 1.4558 | 1.2066 | | 0.5678 | 1.2014 | 698 | 1.5478 | 0.5385 | 1.5478 | 1.2441 | | 0.5678 | 1.2048 | 700 | 1.4457 | 0.6008 | 1.4457 | 1.2024 | | 0.5678 | 1.2083 | 702 | 1.3237 | 0.6610 | 1.3237 | 1.1505 | | 0.5678 | 1.2117 | 704 | 1.3571 | 0.6610 | 1.3571 | 1.1649 | | 0.5678 | 1.2151 | 706 | 1.4851 | 0.5404 | 1.4851 | 1.2187 | | 0.5678 | 1.2186 | 708 | 1.5482 | 0.5404 | 1.5482 | 1.2443 | | 0.5678 | 1.2220 | 710 | 1.4316 | 0.6008 | 1.4316 | 1.1965 | | 0.5678 | 1.2255 | 712 | 1.1722 | 0.7194 | 1.1722 | 1.0827 | | 0.5678 | 1.2289 | 714 | 0.9562 | 0.6338 | 0.9562 | 0.9778 | | 0.5678 | 1.2324 | 716 | 0.9272 | 0.6338 | 0.9272 | 0.9629 | | 0.5678 | 1.2358 | 718 | 1.0262 | 0.7020 | 1.0262 | 1.0130 | | 0.5678 | 1.2392 | 720 | 1.1376 | 0.6557 | 1.1376 | 1.0666 | | 0.5678 | 1.2427 | 722 | 1.4215 | 0.6610 | 1.4215 | 1.1923 | | 0.5678 | 1.2461 | 724 | 1.5591 | 0.5965 | 1.5591 | 1.2486 | | 0.5678 | 1.2496 | 726 | 1.4562 | 0.6008 | 1.4562 | 1.2067 | | 0.5678 | 1.2530 | 728 | 1.3059 | 0.6610 | 1.3059 | 1.1428 | | 0.5678 | 1.2565 | 730 | 1.1291 | 0.6616 | 1.1291 | 1.0626 | | 0.5678 | 1.2599 | 732 | 1.0219 | 0.6503 | 1.0219 | 1.0109 | | 0.5678 | 1.2633 | 734 | 1.0864 | 0.6557 | 1.0864 | 1.0423 | | 0.5678 | 1.2668 | 736 | 1.3850 | 0.6356 | 1.3850 | 1.1769 | | 0.5678 | 1.2702 | 738 | 1.7051 | 0.5366 | 1.7051 | 1.3058 | | 0.5678 | 1.2737 | 740 | 1.8250 | 0.5177 | 1.8250 | 1.3509 | | 0.5678 | 1.2771 | 742 | 1.6063 | 0.5385 | 1.6063 | 1.2674 | | 0.5678 | 1.2806 | 744 | 1.2098 | 0.5891 | 1.2098 | 1.0999 | | 0.5678 | 1.2840 | 746 | 0.9315 | 0.7623 | 0.9315 | 0.9651 | | 0.5678 | 1.2874 | 748 | 0.8094 | 0.7801 | 0.8094 | 0.8997 | | 0.5678 | 1.2909 | 750 | 0.8492 | 0.7801 | 0.8492 | 0.9215 | | 0.5678 | 1.2943 | 752 | 1.0661 | 0.7221 | 1.0661 | 1.0325 | | 0.5678 | 1.2978 | 754 | 1.5559 | 0.5385 | 1.5559 | 1.2474 | | 0.5678 | 1.3012 | 756 | 1.9349 | 0.5794 | 1.9349 | 1.3910 | | 0.5678 | 1.3046 | 758 | 2.0094 | 0.6079 | 2.0094 | 1.4175 | | 0.5678 | 1.3081 | 760 | 1.7771 | 0.5385 | 1.7771 | 1.3331 | | 0.5678 | 1.3115 | 
762 | 1.3824 | 0.5404 | 1.3824 | 1.1758 | | 0.5678 | 1.3150 | 764 | 1.0527 | 0.7844 | 1.0527 | 1.0260 | | 0.5678 | 1.3184 | 766 | 1.0049 | 0.7844 | 1.0049 | 1.0025 | | 0.5678 | 1.3219 | 768 | 1.0285 | 0.7844 | 1.0285 | 1.0141 | | 0.5678 | 1.3253 | 770 | 1.2290 | 0.6323 | 1.2290 | 1.1086 | | 0.5678 | 1.3287 | 772 | 1.6393 | 0.5385 | 1.6393 | 1.2803 | | 0.5678 | 1.3322 | 774 | 1.9247 | 0.5524 | 1.9247 | 1.3873 | | 0.5678 | 1.3356 | 776 | 1.9081 | 0.5707 | 1.9081 | 1.3814 | | 0.5678 | 1.3391 | 778 | 1.6583 | 0.5385 | 1.6583 | 1.2877 | | 0.5678 | 1.3425 | 780 | 1.3961 | 0.5777 | 1.3961 | 1.1816 | | 0.5678 | 1.3460 | 782 | 1.3732 | 0.5777 | 1.3732 | 1.1718 | | 0.5678 | 1.3494 | 784 | 1.3521 | 0.5777 | 1.3521 | 1.1628 | | 0.5678 | 1.3528 | 786 | 1.3188 | 0.5777 | 1.3188 | 1.1484 | | 0.5678 | 1.3563 | 788 | 1.2587 | 0.5777 | 1.2587 | 1.1219 | | 0.5678 | 1.3597 | 790 | 1.2007 | 0.6015 | 1.2007 | 1.0958 | | 0.5678 | 1.3632 | 792 | 1.2186 | 0.5413 | 1.2186 | 1.1039 | | 0.5678 | 1.3666 | 794 | 1.2841 | 0.5777 | 1.2841 | 1.1332 | | 0.5678 | 1.3701 | 796 | 1.2342 | 0.5777 | 1.2342 | 1.1109 | | 0.5678 | 1.3735 | 798 | 1.3249 | 0.5777 | 1.3249 | 1.1510 | | 0.5678 | 1.3769 | 800 | 1.4835 | 0.5744 | 1.4835 | 1.2180 | | 0.5678 | 1.3804 | 802 | 1.3937 | 0.5744 | 1.3937 | 1.1805 | | 0.5678 | 1.3838 | 804 | 1.1939 | 0.6414 | 1.1939 | 1.0926 | | 0.5678 | 1.3873 | 806 | 1.1498 | 0.6686 | 1.1498 | 1.0723 | | 0.5678 | 1.3907 | 808 | 1.2337 | 0.5744 | 1.2337 | 1.1107 | | 0.5678 | 1.3941 | 810 | 1.1369 | 0.7322 | 1.1369 | 1.0662 | | 0.5678 | 1.3976 | 812 | 1.0712 | 0.7020 | 1.0712 | 1.0350 | | 0.5678 | 1.4010 | 814 | 0.9016 | 0.7422 | 0.9016 | 0.9495 | | 0.5678 | 1.4045 | 816 | 0.8752 | 0.8019 | 0.8752 | 0.9355 | | 0.5678 | 1.4079 | 818 | 1.0057 | 0.6945 | 1.0057 | 1.0029 | | 0.5678 | 1.4114 | 820 | 1.2099 | 0.6860 | 1.2099 | 1.1000 | | 0.5678 | 1.4148 | 822 | 1.2716 | 0.6860 | 1.2716 | 1.1277 | | 0.5678 | 1.4182 | 824 | 1.3231 | 0.6303 | 1.3231 | 1.1502 | | 0.5678 | 1.4217 | 826 | 1.2385 | 0.6860 | 1.2385 | 1.1129 | | 0.5678 | 1.4251 | 828 | 1.1168 | 0.6866 | 1.1168 | 1.0568 | | 0.5678 | 1.4286 | 830 | 1.0030 | 0.7134 | 1.0030 | 1.0015 | | 0.5678 | 1.4320 | 832 | 1.0771 | 0.6557 | 1.0771 | 1.0379 | | 0.5678 | 1.4355 | 834 | 1.2001 | 0.6934 | 1.2001 | 1.0955 | | 0.5678 | 1.4389 | 836 | 1.4163 | 0.5744 | 1.4163 | 1.1901 | | 0.5678 | 1.4423 | 838 | 1.3981 | 0.6303 | 1.3981 | 1.1824 | | 0.5678 | 1.4458 | 840 | 1.3748 | 0.6303 | 1.3748 | 1.1725 | | 0.5678 | 1.4492 | 842 | 1.2684 | 0.6356 | 1.2684 | 1.1262 | | 0.5678 | 1.4527 | 844 | 1.3486 | 0.5777 | 1.3486 | 1.1613 | | 0.5678 | 1.4561 | 846 | 1.4492 | 0.5777 | 1.4492 | 1.2038 | | 0.5678 | 1.4596 | 848 | 1.3295 | 0.5413 | 1.3295 | 1.1530 | | 0.5678 | 1.4630 | 850 | 1.1450 | 0.6616 | 1.1450 | 1.0701 | | 0.5678 | 1.4664 | 852 | 1.0192 | 0.7221 | 1.0192 | 1.0095 | | 0.5678 | 1.4699 | 854 | 1.0478 | 0.7310 | 1.0478 | 1.0236 | | 0.5678 | 1.4733 | 856 | 1.2476 | 0.6616 | 1.2476 | 1.1170 | | 0.5678 | 1.4768 | 858 | 1.4341 | 0.5404 | 1.4341 | 1.1976 | | 0.5678 | 1.4802 | 860 | 1.3867 | 0.5404 | 1.3867 | 1.1776 | | 0.5678 | 1.4836 | 862 | 1.2411 | 0.6616 | 1.2411 | 1.1141 | | 0.5678 | 1.4871 | 864 | 1.2487 | 0.6616 | 1.2487 | 1.1175 | | 0.5678 | 1.4905 | 866 | 1.3686 | 0.5010 | 1.3686 | 1.1699 | | 0.5678 | 1.4940 | 868 | 1.4362 | 0.5404 | 1.4362 | 1.1984 | | 0.5678 | 1.4974 | 870 | 1.4674 | 0.5404 | 1.4674 | 1.2114 | | 0.5678 | 1.5009 | 872 | 1.4267 | 0.5404 | 1.4267 | 1.1944 | | 0.5678 | 1.5043 | 874 | 1.3880 | 0.5404 | 1.3880 | 1.1781 | | 0.5678 | 1.5077 | 876 | 1.2813 | 
0.6157 | 1.2813 | 1.1319 | | 0.5678 | 1.5112 | 878 | 1.3173 | 0.5474 | 1.3173 | 1.1477 | | 0.5678 | 1.5146 | 880 | 1.3566 | 0.5474 | 1.3566 | 1.1647 | | 0.5678 | 1.5181 | 882 | 1.3230 | 0.5474 | 1.3230 | 1.1502 | | 0.5678 | 1.5215 | 884 | 1.4246 | 0.5010 | 1.4246 | 1.1936 | | 0.5678 | 1.5250 | 886 | 1.4694 | 0.5404 | 1.4694 | 1.2122 | | 0.5678 | 1.5284 | 888 | 1.6340 | 0.5385 | 1.6340 | 1.2783 | | 0.5678 | 1.5318 | 890 | 1.8399 | 0.5188 | 1.8399 | 1.3564 | | 0.5678 | 1.5353 | 892 | 1.7581 | 0.5188 | 1.7581 | 1.3259 | | 0.5678 | 1.5387 | 894 | 1.4516 | 0.5385 | 1.4516 | 1.2048 | | 0.5678 | 1.5422 | 896 | 1.0617 | 0.7729 | 1.0617 | 1.0304 | | 0.5678 | 1.5456 | 898 | 0.9188 | 0.7623 | 0.9188 | 0.9585 | | 0.5678 | 1.5491 | 900 | 0.9352 | 0.7623 | 0.9352 | 0.9671 | | 0.5678 | 1.5525 | 902 | 1.0997 | 0.7844 | 1.0997 | 1.0487 | | 0.5678 | 1.5559 | 904 | 1.4047 | 0.5015 | 1.4047 | 1.1852 | | 0.5678 | 1.5594 | 906 | 1.5917 | 0.5188 | 1.5917 | 1.2616 | | 0.5678 | 1.5628 | 908 | 1.5735 | 0.5188 | 1.5735 | 1.2544 | | 0.5678 | 1.5663 | 910 | 1.4393 | 0.5413 | 1.4393 | 1.1997 | | 0.5678 | 1.5697 | 912 | 1.1973 | 0.6543 | 1.1973 | 1.0942 | | 0.5678 | 1.5731 | 914 | 1.0305 | 0.6897 | 1.0305 | 1.0151 | | 0.5678 | 1.5766 | 916 | 0.9568 | 0.8142 | 0.9568 | 0.9781 | | 0.5678 | 1.5800 | 918 | 0.9561 | 0.7906 | 0.9561 | 0.9778 | | 0.5678 | 1.5835 | 920 | 1.0315 | 0.7793 | 1.0315 | 1.0156 | | 0.5678 | 1.5869 | 922 | 1.1252 | 0.7234 | 1.1252 | 1.0608 | | 0.5678 | 1.5904 | 924 | 1.0680 | 0.7898 | 1.0680 | 1.0334 | | 0.5678 | 1.5938 | 926 | 1.0232 | 0.7618 | 1.0232 | 1.0115 | | 0.5678 | 1.5972 | 928 | 1.0620 | 0.7618 | 1.0620 | 1.0305 | | 0.5678 | 1.6007 | 930 | 1.0579 | 0.7618 | 1.0579 | 1.0285 | | 0.5678 | 1.6041 | 932 | 1.0952 | 0.7310 | 1.0952 | 1.0465 | | 0.5678 | 1.6076 | 934 | 1.1846 | 0.6015 | 1.1846 | 1.0884 | | 0.5678 | 1.6110 | 936 | 1.2261 | 0.6015 | 1.2261 | 1.1073 | | 0.5678 | 1.6145 | 938 | 1.3094 | 0.5413 | 1.3094 | 1.1443 | | 0.5678 | 1.6179 | 940 | 1.2975 | 0.5435 | 1.2975 | 1.1391 | | 0.5678 | 1.6213 | 942 | 1.2471 | 0.5435 | 1.2471 | 1.1167 | | 0.5678 | 1.6248 | 944 | 1.1867 | 0.7221 | 1.1867 | 1.0894 | | 0.5678 | 1.6282 | 946 | 1.1318 | 0.7221 | 1.1318 | 1.0639 | | 0.5678 | 1.6317 | 948 | 1.1496 | 0.7134 | 1.1496 | 1.0722 | | 0.5678 | 1.6351 | 950 | 1.2559 | 0.7421 | 1.2559 | 1.1207 | | 0.5678 | 1.6386 | 952 | 1.3263 | 0.7333 | 1.3263 | 1.1516 | | 0.5678 | 1.6420 | 954 | 1.3475 | 0.6797 | 1.3475 | 1.1608 | | 0.5678 | 1.6454 | 956 | 1.1580 | 0.7134 | 1.1580 | 1.0761 | | 0.5678 | 1.6489 | 958 | 1.0167 | 0.7221 | 1.0167 | 1.0083 | | 0.5678 | 1.6523 | 960 | 1.0469 | 0.7221 | 1.0469 | 1.0232 | | 0.5678 | 1.6558 | 962 | 1.1840 | 0.6975 | 1.1840 | 1.0881 | | 0.5678 | 1.6592 | 964 | 1.3833 | 0.5638 | 1.3833 | 1.1761 | | 0.5678 | 1.6627 | 966 | 1.4332 | 0.5426 | 1.4332 | 1.1972 | | 0.5678 | 1.6661 | 968 | 1.4213 | 0.5638 | 1.4213 | 1.1922 | | 0.5678 | 1.6695 | 970 | 1.3219 | 0.6265 | 1.3219 | 1.1497 | | 0.5678 | 1.6730 | 972 | 1.3040 | 0.6265 | 1.3040 | 1.1419 | | 0.5678 | 1.6764 | 974 | 1.4507 | 0.5426 | 1.4507 | 1.2044 | | 0.5678 | 1.6799 | 976 | 1.5152 | 0.5188 | 1.5152 | 1.2309 | | 0.5678 | 1.6833 | 978 | 1.4087 | 0.7123 | 1.4087 | 1.1869 | | 0.5678 | 1.6867 | 980 | 1.1917 | 0.6890 | 1.1917 | 1.0917 | | 0.5678 | 1.6902 | 982 | 1.0174 | 0.6752 | 1.0174 | 1.0087 | | 0.5678 | 1.6936 | 984 | 0.9348 | 0.7058 | 0.9348 | 0.9668 | | 0.5678 | 1.6971 | 986 | 1.0326 | 0.6752 | 1.0326 | 1.0162 | | 0.5678 | 1.7005 | 988 | 1.3724 | 0.7123 | 1.3724 | 1.1715 | | 0.5678 | 1.7040 | 990 | 1.5515 | 0.7092 | 1.5515 | 
1.2456 | | 0.5678 | 1.7074 | 992 | 1.4554 | 0.6610 | 1.4554 | 1.2064 | | 0.5678 | 1.7108 | 994 | 1.2585 | 0.7129 | 1.2585 | 1.1218 | | 0.5678 | 1.7143 | 996 | 1.1039 | 0.7134 | 1.1039 | 1.0507 | | 0.5678 | 1.7177 | 998 | 1.1727 | 0.6818 | 1.1727 | 1.0829 | | 0.1999 | 1.7212 | 1000 | 1.3038 | 0.6934 | 1.3038 | 1.1418 | | 0.1999 | 1.7246 | 1002 | 1.5142 | 0.6115 | 1.5142 | 1.2305 | | 0.1999 | 1.7281 | 1004 | 1.5743 | 0.5556 | 1.5743 | 1.2547 | | 0.1999 | 1.7315 | 1006 | 1.4662 | 0.5413 | 1.4662 | 1.2109 | | 0.1999 | 1.7349 | 1008 | 1.3318 | 0.6015 | 1.3318 | 1.1540 | | 0.1999 | 1.7384 | 1010 | 1.1319 | 0.7194 | 1.1319 | 1.0639 | | 0.1999 | 1.7418 | 1012 | 1.0337 | 0.7844 | 1.0337 | 1.0167 | | 0.1999 | 1.7453 | 1014 | 1.0285 | 0.7844 | 1.0285 | 1.0142 | | 0.1999 | 1.7487 | 1016 | 1.1460 | 0.6616 | 1.1460 | 1.0705 | | 0.1999 | 1.7522 | 1018 | 1.3093 | 0.6303 | 1.3093 | 1.1442 | | 0.1999 | 1.7556 | 1020 | 1.3387 | 0.6303 | 1.3387 | 1.1570 | | 0.1999 | 1.7590 | 1022 | 1.1888 | 0.6797 | 1.1888 | 1.0903 | | 0.1999 | 1.7625 | 1024 | 1.0049 | 0.7342 | 1.0049 | 1.0024 | | 0.1999 | 1.7659 | 1026 | 0.8713 | 0.7801 | 0.8713 | 0.9334 | | 0.1999 | 1.7694 | 1028 | 0.8229 | 0.7801 | 0.8229 | 0.9071 | | 0.1999 | 1.7728 | 1030 | 0.7841 | 0.7801 | 0.7841 | 0.8855 | | 0.1999 | 1.7762 | 1032 | 0.7969 | 0.7801 | 0.7969 | 0.8927 | | 0.1999 | 1.7797 | 1034 | 0.9526 | 0.7264 | 0.9526 | 0.9760 | | 0.1999 | 1.7831 | 1036 | 1.2753 | 0.6860 | 1.2753 | 1.1293 | | 0.1999 | 1.7866 | 1038 | 1.5383 | 0.5744 | 1.5383 | 1.2403 | | 0.1999 | 1.7900 | 1040 | 1.5197 | 0.5744 | 1.5197 | 1.2327 | | 0.1999 | 1.7935 | 1042 | 1.2991 | 0.5744 | 1.2991 | 1.1398 | | 0.1999 | 1.7969 | 1044 | 1.0138 | 0.7844 | 1.0138 | 1.0069 | | 0.1999 | 1.8003 | 1046 | 0.8040 | 0.7149 | 0.8040 | 0.8967 | | 0.1999 | 1.8038 | 1048 | 0.7458 | 0.7149 | 0.7458 | 0.8636 | | 0.1999 | 1.8072 | 1050 | 0.7715 | 0.7149 | 0.7715 | 0.8784 | | 0.1999 | 1.8107 | 1052 | 0.9137 | 0.7801 | 0.9137 | 0.9559 | | 0.1999 | 1.8141 | 1054 | 1.0043 | 0.7902 | 1.0043 | 1.0022 | | 0.1999 | 1.8176 | 1056 | 1.0209 | 0.7618 | 1.0209 | 1.0104 | | 0.1999 | 1.8210 | 1058 | 1.0141 | 0.7426 | 1.0141 | 1.0070 | | 0.1999 | 1.8244 | 1060 | 0.9893 | 0.7342 | 0.9893 | 0.9947 | | 0.1999 | 1.8279 | 1062 | 0.9968 | 0.7618 | 0.9968 | 0.9984 | | 0.1999 | 1.8313 | 1064 | 0.9651 | 0.7729 | 0.9651 | 0.9824 | | 0.1999 | 1.8348 | 1066 | 0.9620 | 0.7729 | 0.9620 | 0.9808 | | 0.1999 | 1.8382 | 1068 | 1.0514 | 0.7618 | 1.0514 | 1.0254 | | 0.1999 | 1.8417 | 1070 | 1.1053 | 0.6616 | 1.1053 | 1.0513 | | 0.1999 | 1.8451 | 1072 | 1.2474 | 0.6616 | 1.2474 | 1.1169 | | 0.1999 | 1.8485 | 1074 | 1.3110 | 0.6015 | 1.3110 | 1.1450 | | 0.1999 | 1.8520 | 1076 | 1.2666 | 0.6686 | 1.2666 | 1.1254 | | 0.1999 | 1.8554 | 1078 | 1.1121 | 0.7194 | 1.1121 | 1.0546 | | 0.1999 | 1.8589 | 1080 | 0.9736 | 0.7844 | 0.9736 | 0.9867 | | 0.1999 | 1.8623 | 1082 | 0.9713 | 0.7844 | 0.9713 | 0.9855 | | 0.1999 | 1.8657 | 1084 | 1.0103 | 0.7310 | 1.0103 | 1.0051 | | 0.1999 | 1.8692 | 1086 | 1.0257 | 0.7134 | 1.0257 | 1.0128 | | 0.1999 | 1.8726 | 1088 | 1.1374 | 0.7421 | 1.1374 | 1.0665 | | 0.1999 | 1.8761 | 1090 | 1.4073 | 0.6791 | 1.4073 | 1.1863 | | 0.1999 | 1.8795 | 1092 | 1.5297 | 0.5714 | 1.5297 | 1.2368 | | 0.1999 | 1.8830 | 1094 | 1.4530 | 0.6836 | 1.4530 | 1.2054 | | 0.1999 | 1.8864 | 1096 | 1.1420 | 0.7424 | 1.1420 | 1.0686 | | 0.1999 | 1.8898 | 1098 | 0.9404 | 0.7058 | 0.9404 | 0.9697 | | 0.1999 | 1.8933 | 1100 | 0.9635 | 0.7134 | 0.9635 | 0.9816 | | 0.1999 | 1.8967 | 1102 | 1.0731 | 0.6616 | 1.0731 | 1.0359 | | 0.1999 | 1.9002 | 
1104 | 1.2178 | 0.6616 | 1.2178 | 1.1035 | | 0.1999 | 1.9036 | 1106 | 1.2456 | 0.7101 | 1.2456 | 1.1161 | | 0.1999 | 1.9071 | 1108 | 1.1505 | 0.7194 | 1.1505 | 1.0726 | | 0.1999 | 1.9105 | 1110 | 1.0573 | 0.7194 | 1.0573 | 1.0283 | | 0.1999 | 1.9139 | 1112 | 1.0458 | 0.7101 | 1.0458 | 1.0226 | | 0.1999 | 1.9174 | 1114 | 0.8859 | 0.8019 | 0.8859 | 0.9412 | | 0.1999 | 1.9208 | 1116 | 0.7457 | 0.8023 | 0.7457 | 0.8635 | | 0.1999 | 1.9243 | 1118 | 0.6599 | 0.7805 | 0.6599 | 0.8123 | | 0.1999 | 1.9277 | 1120 | 0.6770 | 0.7805 | 0.6770 | 0.8228 | | 0.1999 | 1.9312 | 1122 | 0.7572 | 0.7797 | 0.7572 | 0.8702 | | 0.1999 | 1.9346 | 1124 | 0.8780 | 0.7134 | 0.8780 | 0.9370 | | 0.1999 | 1.9380 | 1126 | 1.0261 | 0.7134 | 1.0261 | 1.0130 | | 0.1999 | 1.9415 | 1128 | 1.0775 | 0.7618 | 1.0775 | 1.0380 | | 0.1999 | 1.9449 | 1130 | 1.0288 | 0.7618 | 1.0288 | 1.0143 | | 0.1999 | 1.9484 | 1132 | 1.0218 | 0.7729 | 1.0218 | 1.0109 | | 0.1999 | 1.9518 | 1134 | 0.9921 | 0.7729 | 0.9921 | 0.9960 | | 0.1999 | 1.9552 | 1136 | 1.0973 | 0.7134 | 1.0973 | 1.0475 | | 0.1999 | 1.9587 | 1138 | 1.1264 | 0.7134 | 1.1264 | 1.0613 | | 0.1999 | 1.9621 | 1140 | 1.0317 | 0.7134 | 1.0317 | 1.0157 | | 0.1999 | 1.9656 | 1142 | 1.0480 | 0.7134 | 1.0480 | 1.0237 | | 0.1999 | 1.9690 | 1144 | 1.0163 | 0.7058 | 1.0163 | 1.0081 | | 0.1999 | 1.9725 | 1146 | 0.9549 | 0.7058 | 0.9549 | 0.9772 | | 0.1999 | 1.9759 | 1148 | 0.8687 | 0.7520 | 0.8687 | 0.9321 | | 0.1999 | 1.9793 | 1150 | 0.8357 | 0.7623 | 0.8357 | 0.9142 | | 0.1999 | 1.9828 | 1152 | 0.9111 | 0.7623 | 0.9111 | 0.9545 | | 0.1999 | 1.9862 | 1154 | 1.0760 | 0.7216 | 1.0760 | 1.0373 | | 0.1999 | 1.9897 | 1156 | 1.2704 | 0.5638 | 1.2704 | 1.1271 | | 0.1999 | 1.9931 | 1158 | 1.3640 | 0.6008 | 1.3640 | 1.1679 | | 0.1999 | 1.9966 | 1160 | 1.3871 | 0.6008 | 1.3871 | 1.1778 | | 0.1999 | 2.0 | 1162 | 1.2794 | 0.6015 | 1.2794 | 1.1311 | | 0.1999 | 2.0034 | 1164 | 1.0752 | 0.7844 | 1.0752 | 1.0369 | | 0.1999 | 2.0069 | 1166 | 0.8959 | 0.7623 | 0.8959 | 0.9465 | | 0.1999 | 2.0103 | 1168 | 0.8695 | 0.7823 | 0.8695 | 0.9325 | | 0.1999 | 2.0138 | 1170 | 0.9861 | 0.7134 | 0.9861 | 0.9930 | | 0.1999 | 2.0172 | 1172 | 1.2094 | 0.7216 | 1.2094 | 1.0997 | | 0.1999 | 2.0207 | 1174 | 1.3914 | 0.6008 | 1.3914 | 1.1796 | | 0.1999 | 2.0241 | 1176 | 1.3640 | 0.6008 | 1.3640 | 1.1679 | | 0.1999 | 2.0275 | 1178 | 1.1837 | 0.7101 | 1.1837 | 1.0880 | | 0.1999 | 2.0310 | 1180 | 0.9804 | 0.7729 | 0.9804 | 0.9902 | | 0.1999 | 2.0344 | 1182 | 0.9292 | 0.7623 | 0.9292 | 0.9640 | | 0.1999 | 2.0379 | 1184 | 1.0152 | 0.7618 | 1.0152 | 1.0076 | | 0.1999 | 2.0413 | 1186 | 1.0607 | 0.7618 | 1.0607 | 1.0299 | | 0.1999 | 2.0448 | 1188 | 1.1753 | 0.7421 | 1.1753 | 1.0841 | | 0.1999 | 2.0482 | 1190 | 1.2385 | 0.7421 | 1.2385 | 1.1129 | | 0.1999 | 2.0516 | 1192 | 1.3774 | 0.6309 | 1.3774 | 1.1736 | | 0.1999 | 2.0551 | 1194 | 1.3558 | 0.6309 | 1.3558 | 1.1644 | | 0.1999 | 2.0585 | 1196 | 1.1905 | 0.7020 | 1.1905 | 1.0911 | | 0.1999 | 2.0620 | 1198 | 1.1592 | 0.7020 | 1.1592 | 1.0766 | | 0.1999 | 2.0654 | 1200 | 1.2045 | 0.7020 | 1.2045 | 1.0975 | | 0.1999 | 2.0688 | 1202 | 1.2589 | 0.6420 | 1.2589 | 1.1220 | | 0.1999 | 2.0723 | 1204 | 1.3228 | 0.5617 | 1.3228 | 1.1501 | | 0.1999 | 2.0757 | 1206 | 1.2192 | 0.6483 | 1.2192 | 1.1042 | | 0.1999 | 2.0792 | 1208 | 1.0472 | 0.7729 | 1.0472 | 1.0233 | | 0.1999 | 2.0826 | 1210 | 0.9913 | 0.7729 | 0.9913 | 0.9957 | | 0.1999 | 2.0861 | 1212 | 1.0376 | 0.7729 | 1.0376 | 1.0186 | | 0.1999 | 2.0895 | 1214 | 1.0503 | 0.7729 | 1.0503 | 1.0248 | | 0.1999 | 2.0929 | 1216 | 0.9759 | 0.7729 | 
0.9759 | 0.9879 | | 0.1999 | 2.0964 | 1218 | 0.8630 | 0.7623 | 0.8630 | 0.9290 | | 0.1999 | 2.0998 | 1220 | 0.7234 | 0.7986 | 0.7234 | 0.8505 | | 0.1999 | 2.1033 | 1222 | 0.7623 | 0.7986 | 0.7623 | 0.8731 | | 0.1999 | 2.1067 | 1224 | 0.9787 | 0.7426 | 0.9787 | 0.9893 | | 0.1999 | 2.1102 | 1226 | 1.2599 | 0.6866 | 1.2599 | 1.1225 | | 0.1999 | 2.1136 | 1228 | 1.3020 | 0.5980 | 1.3020 | 1.1411 | | 0.1999 | 2.1170 | 1230 | 1.1575 | 0.6557 | 1.1575 | 1.0759 | | 0.1999 | 2.1205 | 1232 | 0.9646 | 0.7618 | 0.9646 | 0.9822 | | 0.1999 | 2.1239 | 1234 | 0.8481 | 0.8019 | 0.8481 | 0.9209 | | 0.1999 | 2.1274 | 1236 | 0.8585 | 0.8019 | 0.8585 | 0.9266 | | 0.1999 | 2.1308 | 1238 | 0.9705 | 0.7729 | 0.9705 | 0.9851 | | 0.1999 | 2.1343 | 1240 | 1.1366 | 0.6557 | 1.1366 | 1.0661 | | 0.1999 | 2.1377 | 1242 | 1.1760 | 0.6557 | 1.1760 | 1.0845 | | 0.1999 | 2.1411 | 1244 | 1.1032 | 0.6557 | 1.1032 | 1.0503 | | 0.1999 | 2.1446 | 1246 | 0.9517 | 0.7902 | 0.9517 | 0.9756 | | 0.1999 | 2.1480 | 1248 | 0.8550 | 0.7801 | 0.8550 | 0.9246 | | 0.1999 | 2.1515 | 1250 | 0.8686 | 0.7906 | 0.8686 | 0.9320 | | 0.1999 | 2.1549 | 1252 | 0.9688 | 0.8019 | 0.9688 | 0.9843 | | 0.1999 | 2.1583 | 1254 | 1.0824 | 0.7020 | 1.0824 | 1.0404 | | 0.1999 | 2.1618 | 1256 | 1.1600 | 0.6557 | 1.1600 | 1.0770 | | 0.1999 | 2.1652 | 1258 | 1.1438 | 0.6420 | 1.1438 | 1.0695 | | 0.1999 | 2.1687 | 1260 | 1.1699 | 0.6420 | 1.1699 | 1.0816 | | 0.1999 | 2.1721 | 1262 | 1.0894 | 0.6420 | 1.0894 | 1.0437 | | 0.1999 | 2.1756 | 1264 | 0.9034 | 0.7906 | 0.9034 | 0.9504 | | 0.1999 | 2.1790 | 1266 | 0.7645 | 0.7704 | 0.7645 | 0.8744 | | 0.1999 | 2.1824 | 1268 | 0.8125 | 0.7191 | 0.8125 | 0.9014 | | 0.1999 | 2.1859 | 1270 | 0.9934 | 0.7518 | 0.9934 | 0.9967 | | 0.1999 | 2.1893 | 1272 | 1.0396 | 0.7514 | 1.0396 | 1.0196 | | 0.1999 | 2.1928 | 1274 | 0.8987 | 0.7439 | 0.8987 | 0.9480 | | 0.1999 | 2.1962 | 1276 | 0.7521 | 0.7443 | 0.7521 | 0.8672 | | 0.1999 | 2.1997 | 1278 | 0.7335 | 0.7443 | 0.7335 | 0.8564 | | 0.1999 | 2.2031 | 1280 | 0.8150 | 0.7700 | 0.8150 | 0.9027 | | 0.1999 | 2.2065 | 1282 | 0.9442 | 0.7618 | 0.9442 | 0.9717 | | 0.1999 | 2.2100 | 1284 | 1.2138 | 0.6420 | 1.2138 | 1.1017 | | 0.1999 | 2.2134 | 1286 | 1.3419 | 0.5980 | 1.3419 | 1.1584 | | 0.1999 | 2.2169 | 1288 | 1.2666 | 0.6420 | 1.2666 | 1.1254 | | 0.1999 | 2.2203 | 1290 | 1.0995 | 0.7020 | 1.0995 | 1.0486 | | 0.1999 | 2.2238 | 1292 | 0.9435 | 0.6988 | 0.9435 | 0.9713 | | 0.1999 | 2.2272 | 1294 | 0.8660 | 0.6988 | 0.8660 | 0.9306 | | 0.1999 | 2.2306 | 1296 | 0.9089 | 0.7729 | 0.9089 | 0.9533 | | 0.1999 | 2.2341 | 1298 | 0.9984 | 0.7134 | 0.9984 | 0.9992 | | 0.1999 | 2.2375 | 1300 | 1.1187 | 0.7134 | 1.1187 | 1.0577 | | 0.1999 | 2.2410 | 1302 | 1.1659 | 0.7157 | 1.1659 | 1.0797 | | 0.1999 | 2.2444 | 1304 | 1.1009 | 0.7134 | 1.1009 | 1.0492 | | 0.1999 | 2.2478 | 1306 | 1.0634 | 0.7618 | 1.0634 | 1.0312 | | 0.1999 | 2.2513 | 1308 | 0.9554 | 0.7729 | 0.9554 | 0.9774 | | 0.1999 | 2.2547 | 1310 | 0.9577 | 0.6988 | 0.9577 | 0.9786 | | 0.1999 | 2.2582 | 1312 | 1.0544 | 0.7106 | 1.0544 | 1.0268 | | 0.1999 | 2.2616 | 1314 | 1.1313 | 0.6483 | 1.1313 | 1.0636 | | 0.1999 | 2.2651 | 1316 | 1.2557 | 0.5980 | 1.2557 | 1.1206 | | 0.1999 | 2.2685 | 1318 | 1.2655 | 0.5980 | 1.2655 | 1.1249 | | 0.1999 | 2.2719 | 1320 | 1.3451 | 0.6260 | 1.3451 | 1.1598 | | 0.1999 | 2.2754 | 1322 | 1.2996 | 0.5980 | 1.2996 | 1.1400 | | 0.1999 | 2.2788 | 1324 | 1.0944 | 0.6483 | 1.0944 | 1.0461 | | 0.1999 | 2.2823 | 1326 | 0.9026 | 0.7623 | 0.9026 | 0.9501 | | 0.1999 | 2.2857 | 1328 | 0.8285 | 0.6909 | 0.8285 | 0.9102 | | 0.1999 
| 2.2892 | 1330 | 0.8417 | 0.6909 | 0.8417 | 0.9174 | | 0.1999 | 2.2926 | 1332 | 0.9633 | 0.7729 | 0.9633 | 0.9815 | | 0.1999 | 2.2960 | 1334 | 1.1650 | 0.6557 | 1.1650 | 1.0794 | | 0.1999 | 2.2995 | 1336 | 1.3326 | 0.6085 | 1.3326 | 1.1544 | | 0.1999 | 2.3029 | 1338 | 1.2667 | 0.6622 | 1.2667 | 1.1255 | | 0.1999 | 2.3064 | 1340 | 1.0973 | 0.7134 | 1.0973 | 1.0475 | | 0.1999 | 2.3098 | 1342 | 0.9540 | 0.7729 | 0.9540 | 0.9767 | | 0.1999 | 2.3133 | 1344 | 0.8495 | 0.7729 | 0.8495 | 0.9217 | | 0.1999 | 2.3167 | 1346 | 0.8843 | 0.7729 | 0.8843 | 0.9404 | | 0.1999 | 2.3201 | 1348 | 0.9365 | 0.7221 | 0.9365 | 0.9677 | | 0.1999 | 2.3236 | 1350 | 1.0424 | 0.7134 | 1.0424 | 1.0210 | | 0.1999 | 2.3270 | 1352 | 1.1279 | 0.7134 | 1.1279 | 1.0620 | | 0.1999 | 2.3305 | 1354 | 1.1058 | 0.7134 | 1.1058 | 1.0516 | | 0.1999 | 2.3339 | 1356 | 1.1183 | 0.7134 | 1.1183 | 1.0575 | | 0.1999 | 2.3373 | 1358 | 1.0535 | 0.7729 | 1.0535 | 1.0264 | | 0.1999 | 2.3408 | 1360 | 0.9623 | 0.7729 | 0.9623 | 0.9810 | | 0.1999 | 2.3442 | 1362 | 0.9443 | 0.7729 | 0.9443 | 0.9717 | | 0.1999 | 2.3477 | 1364 | 1.0183 | 0.7618 | 1.0183 | 1.0091 | | 0.1999 | 2.3511 | 1366 | 1.0528 | 0.7134 | 1.0528 | 1.0261 | | 0.1999 | 2.3546 | 1368 | 1.0933 | 0.7134 | 1.0933 | 1.0456 | | 0.1999 | 2.3580 | 1370 | 1.0860 | 0.7134 | 1.0860 | 1.0421 | | 0.1999 | 2.3614 | 1372 | 1.0017 | 0.7623 | 1.0017 | 1.0009 | | 0.1999 | 2.3649 | 1374 | 1.0460 | 0.7618 | 1.0460 | 1.0228 | | 0.1999 | 2.3683 | 1376 | 1.0202 | 0.7623 | 1.0202 | 1.0100 | | 0.1999 | 2.3718 | 1378 | 1.0093 | 0.7623 | 1.0093 | 1.0047 | | 0.1999 | 2.3752 | 1380 | 0.9303 | 0.7623 | 0.9303 | 0.9645 | | 0.1999 | 2.3787 | 1382 | 0.8594 | 0.7149 | 0.8594 | 0.9271 | | 0.1999 | 2.3821 | 1384 | 0.8668 | 0.7801 | 0.8668 | 0.9310 | | 0.1999 | 2.3855 | 1386 | 0.9212 | 0.6987 | 0.9212 | 0.9598 | | 0.1999 | 2.3890 | 1388 | 1.0880 | 0.7255 | 1.0880 | 1.0431 | | 0.1999 | 2.3924 | 1390 | 1.1509 | 0.6784 | 1.1509 | 1.0728 | | 0.1999 | 2.3959 | 1392 | 1.0849 | 0.7337 | 1.0849 | 1.0416 | | 0.1999 | 2.3993 | 1394 | 0.9662 | 0.7058 | 0.9662 | 0.9829 | | 0.1999 | 2.4028 | 1396 | 0.8045 | 0.7986 | 0.8045 | 0.8969 | | 0.1999 | 2.4062 | 1398 | 0.7669 | 0.7342 | 0.7669 | 0.8757 | | 0.1999 | 2.4096 | 1400 | 0.8201 | 0.8098 | 0.8201 | 0.9056 | | 0.1999 | 2.4131 | 1402 | 0.9028 | 0.7623 | 0.9028 | 0.9501 | | 0.1999 | 2.4165 | 1404 | 1.0068 | 0.7618 | 1.0068 | 1.0034 | | 0.1999 | 2.4200 | 1406 | 1.0158 | 0.7134 | 1.0158 | 1.0079 | | 0.1999 | 2.4234 | 1408 | 0.9629 | 0.7520 | 0.9629 | 0.9813 | | 0.1999 | 2.4269 | 1410 | 0.8586 | 0.7797 | 0.8586 | 0.9266 | | 0.1999 | 2.4303 | 1412 | 0.8286 | 0.7801 | 0.8286 | 0.9102 | | 0.1999 | 2.4337 | 1414 | 0.8684 | 0.7342 | 0.8684 | 0.9319 | | 0.1999 | 2.4372 | 1416 | 0.9477 | 0.7058 | 0.9477 | 0.9735 | | 0.1999 | 2.4406 | 1418 | 0.9152 | 0.7058 | 0.9152 | 0.9566 | | 0.1999 | 2.4441 | 1420 | 0.9778 | 0.7058 | 0.9778 | 0.9888 | | 0.1999 | 2.4475 | 1422 | 1.0358 | 0.7058 | 1.0358 | 1.0177 | | 0.1999 | 2.4509 | 1424 | 1.1325 | 0.6982 | 1.1325 | 1.0642 | | 0.1999 | 2.4544 | 1426 | 1.1408 | 0.6871 | 1.1408 | 1.0681 | | 0.1999 | 2.4578 | 1428 | 1.0383 | 0.7058 | 1.0383 | 1.0190 | | 0.1999 | 2.4613 | 1430 | 0.9088 | 0.7623 | 0.9088 | 0.9533 | | 0.1999 | 2.4647 | 1432 | 0.8571 | 0.7525 | 0.8571 | 0.9258 | | 0.1999 | 2.4682 | 1434 | 0.8815 | 0.7525 | 0.8815 | 0.9389 | | 0.1999 | 2.4716 | 1436 | 0.8708 | 0.7525 | 0.8708 | 0.9332 | | 0.1999 | 2.4750 | 1438 | 0.9413 | 0.7139 | 0.9413 | 0.9702 | | 0.1999 | 2.4785 | 1440 | 0.9793 | 0.7058 | 0.9793 | 0.9896 | | 0.1999 | 2.4819 | 1442 | 0.9464 | 
0.6987 | 0.9464 | 0.9728 | | 0.1999 | 2.4854 | 1444 | 0.8876 | 0.6987 | 0.8876 | 0.9421 | | 0.1999 | 2.4888 | 1446 | 0.9394 | 0.6987 | 0.9394 | 0.9692 | | 0.1999 | 2.4923 | 1448 | 0.9867 | 0.7058 | 0.9867 | 0.9933 | | 0.1999 | 2.4957 | 1450 | 1.0162 | 0.7058 | 1.0162 | 1.0081 | | 0.1999 | 2.4991 | 1452 | 1.1137 | 0.7053 | 1.1137 | 1.0553 | | 0.1999 | 2.5026 | 1454 | 1.0995 | 0.7134 | 1.0995 | 1.0486 | | 0.1999 | 2.5060 | 1456 | 1.1240 | 0.7134 | 1.1240 | 1.0602 | | 0.1999 | 2.5095 | 1458 | 1.1062 | 0.7134 | 1.1062 | 1.0518 | | 0.1999 | 2.5129 | 1460 | 0.9685 | 0.7139 | 0.9685 | 0.9841 | | 0.1999 | 2.5164 | 1462 | 0.9277 | 0.7063 | 0.9277 | 0.9632 | | 0.1999 | 2.5198 | 1464 | 1.0305 | 0.7058 | 1.0305 | 1.0151 | | 0.1999 | 2.5232 | 1466 | 1.2406 | 0.6497 | 1.2406 | 1.1138 | | 0.1999 | 2.5267 | 1468 | 1.4611 | 0.6078 | 1.4611 | 1.2088 | | 0.1999 | 2.5301 | 1470 | 1.5044 | 0.5540 | 1.5044 | 1.2266 | | 0.1999 | 2.5336 | 1472 | 1.4164 | 0.5540 | 1.4164 | 1.1901 | | 0.1999 | 2.5370 | 1474 | 1.2312 | 0.6497 | 1.2312 | 1.1096 | | 0.1999 | 2.5404 | 1476 | 1.0150 | 0.7139 | 1.0150 | 1.0075 | | 0.1999 | 2.5439 | 1478 | 0.9338 | 0.7623 | 0.9338 | 0.9663 | | 0.1999 | 2.5473 | 1480 | 0.9597 | 0.7623 | 0.9597 | 0.9796 | | 0.1999 | 2.5508 | 1482 | 1.0351 | 0.7221 | 1.0351 | 1.0174 | | 0.1999 | 2.5542 | 1484 | 1.1871 | 0.6557 | 1.1871 | 1.0896 | | 0.1999 | 2.5577 | 1486 | 1.2464 | 0.6557 | 1.2464 | 1.1164 | | 0.1999 | 2.5611 | 1488 | 1.1643 | 0.6557 | 1.1643 | 1.0790 | | 0.1999 | 2.5645 | 1490 | 1.0105 | 0.7139 | 1.0105 | 1.0052 | | 0.1999 | 2.5680 | 1492 | 0.8667 | 0.7623 | 0.8667 | 0.9309 | | 0.1999 | 2.5714 | 1494 | 0.8495 | 0.7623 | 0.8495 | 0.9217 | | 0.1999 | 2.5749 | 1496 | 0.8693 | 0.7139 | 0.8693 | 0.9323 | | 0.1999 | 2.5783 | 1498 | 0.9858 | 0.7139 | 0.9858 | 0.9929 | | 0.1399 | 2.5818 | 1500 | 1.1688 | 0.7134 | 1.1688 | 1.0811 | | 0.1399 | 2.5852 | 1502 | 1.2145 | 0.7134 | 1.2145 | 1.1020 | | 0.1399 | 2.5886 | 1504 | 1.1821 | 0.7134 | 1.1821 | 1.0872 | | 0.1399 | 2.5921 | 1506 | 1.0488 | 0.7134 | 1.0488 | 1.0241 | | 0.1399 | 2.5955 | 1508 | 1.0200 | 0.7221 | 1.0200 | 1.0099 | | 0.1399 | 2.5990 | 1510 | 1.0546 | 0.7134 | 1.0546 | 1.0269 | | 0.1399 | 2.6024 | 1512 | 1.0077 | 0.7058 | 1.0077 | 1.0038 | | 0.1399 | 2.6059 | 1514 | 0.9905 | 0.7058 | 0.9905 | 0.9952 | | 0.1399 | 2.6093 | 1516 | 0.9010 | 0.7342 | 0.9010 | 0.9492 | | 0.1399 | 2.6127 | 1518 | 0.8844 | 0.7264 | 0.8844 | 0.9404 | | 0.1399 | 2.6162 | 1520 | 0.9766 | 0.7058 | 0.9766 | 0.9882 | | 0.1399 | 2.6196 | 1522 | 1.1636 | 0.7058 | 1.1636 | 1.0787 | | 0.1399 | 2.6231 | 1524 | 1.2215 | 0.7157 | 1.2215 | 1.1052 | | 0.1399 | 2.6265 | 1526 | 1.1593 | 0.6871 | 1.1593 | 1.0767 | | 0.1399 | 2.6299 | 1528 | 1.0551 | 0.7058 | 1.0551 | 1.0272 | | 0.1399 | 2.6334 | 1530 | 0.9458 | 0.7623 | 0.9458 | 0.9725 | | 0.1399 | 2.6368 | 1532 | 0.8863 | 0.7525 | 0.8863 | 0.9414 | | 0.1399 | 2.6403 | 1534 | 0.9430 | 0.7139 | 0.9430 | 0.9711 | | 0.1399 | 2.6437 | 1536 | 1.1207 | 0.7134 | 1.1207 | 1.0587 | | 0.1399 | 2.6472 | 1538 | 1.2864 | 0.6616 | 1.2864 | 1.1342 | | 0.1399 | 2.6506 | 1540 | 1.3407 | 0.6616 | 1.3407 | 1.1579 | | 0.1399 | 2.6540 | 1542 | 1.2231 | 0.7134 | 1.2231 | 1.1060 | | 0.1399 | 2.6575 | 1544 | 1.1410 | 0.7221 | 1.1410 | 1.0682 | | 0.1399 | 2.6609 | 1546 | 1.1233 | 0.7221 | 1.1233 | 1.0599 | | 0.1399 | 2.6644 | 1548 | 1.2106 | 0.7134 | 1.2106 | 1.1003 | | 0.1399 | 2.6678 | 1550 | 1.2675 | 0.6866 | 1.2675 | 1.1258 | | 0.1399 | 2.6713 | 1552 | 1.2303 | 0.7134 | 1.2303 | 1.1092 | | 0.1399 | 2.6747 | 1554 | 1.1084 | 0.7134 | 1.1084 | 1.0528 | 
| 0.1399 | 2.6781 | 1556 | 0.9435 | 0.7139 | 0.9435 | 0.9713 | | 0.1399 | 2.6816 | 1558 | 0.8449 | 0.7525 | 0.8449 | 0.9192 | | 0.1399 | 2.6850 | 1560 | 0.7941 | 0.7525 | 0.7941 | 0.8911 | | 0.1399 | 2.6885 | 1562 | 0.8222 | 0.7063 | 0.8222 | 0.9067 | | 0.1399 | 2.6919 | 1564 | 0.9512 | 0.6987 | 0.9512 | 0.9753 | | 0.1399 | 2.6954 | 1566 | 1.1433 | 0.7255 | 1.1433 | 1.0692 | | 0.1399 | 2.6988 | 1568 | 1.2217 | 0.7333 | 1.2217 | 1.1053 | | 0.1399 | 2.7022 | 1570 | 1.1878 | 0.7333 | 1.1878 | 1.0898 | | 0.1399 | 2.7057 | 1572 | 1.0246 | 0.7058 | 1.0246 | 1.0122 | | 0.1399 | 2.7091 | 1574 | 0.8392 | 0.6987 | 0.8392 | 0.9161 | | 0.1399 | 2.7126 | 1576 | 0.7758 | 0.7167 | 0.7758 | 0.8808 | | 0.1399 | 2.7160 | 1578 | 0.8212 | 0.7167 | 0.8212 | 0.9062 | | 0.1399 | 2.7194 | 1580 | 0.9659 | 0.7058 | 0.9659 | 0.9828 | | 0.1399 | 2.7229 | 1582 | 1.0893 | 0.7255 | 1.0893 | 1.0437 | | 0.1399 | 2.7263 | 1584 | 1.0567 | 0.7255 | 1.0567 | 1.0280 | | 0.1399 | 2.7298 | 1586 | 0.9336 | 0.7058 | 0.9336 | 0.9662 | | 0.1399 | 2.7332 | 1588 | 0.8564 | 0.6987 | 0.8564 | 0.9254 | | 0.1399 | 2.7367 | 1590 | 0.7930 | 0.7063 | 0.7930 | 0.8905 | | 0.1399 | 2.7401 | 1592 | 0.8059 | 0.7063 | 0.8059 | 0.8977 | | 0.1399 | 2.7435 | 1594 | 0.8719 | 0.6987 | 0.8719 | 0.9337 | | 0.1399 | 2.7470 | 1596 | 0.9190 | 0.7182 | 0.9190 | 0.9587 | | 0.1399 | 2.7504 | 1598 | 1.0019 | 0.7182 | 1.0019 | 1.0010 | | 0.1399 | 2.7539 | 1600 | 1.0499 | 0.7182 | 1.0499 | 1.0247 | | 0.1399 | 2.7573 | 1602 | 1.0167 | 0.7182 | 1.0167 | 1.0083 | | 0.1399 | 2.7608 | 1604 | 0.8507 | 0.6987 | 0.8507 | 0.9223 | | 0.1399 | 2.7642 | 1606 | 0.7218 | 0.7704 | 0.7218 | 0.8496 | | 0.1399 | 2.7676 | 1608 | 0.7198 | 0.7525 | 0.7198 | 0.8484 | | 0.1399 | 2.7711 | 1610 | 0.7984 | 0.7525 | 0.7984 | 0.8936 | | 0.1399 | 2.7745 | 1612 | 0.9852 | 0.7139 | 0.9852 | 0.9926 | | 0.1399 | 2.7780 | 1614 | 1.2178 | 0.7333 | 1.2178 | 1.1035 | | 0.1399 | 2.7814 | 1616 | 1.2724 | 0.7333 | 1.2724 | 1.1280 | | 0.1399 | 2.7849 | 1618 | 1.1674 | 0.7333 | 1.1674 | 1.0805 | | 0.1399 | 2.7883 | 1620 | 0.9753 | 0.7139 | 0.9753 | 0.9876 | | 0.1399 | 2.7917 | 1622 | 0.8924 | 0.7623 | 0.8924 | 0.9447 | | 0.1399 | 2.7952 | 1624 | 0.8086 | 0.7623 | 0.8086 | 0.8992 | | 0.1399 | 2.7986 | 1626 | 0.8211 | 0.7623 | 0.8211 | 0.9061 | | 0.1399 | 2.8021 | 1628 | 0.8762 | 0.7623 | 0.8762 | 0.9360 | | 0.1399 | 2.8055 | 1630 | 0.9154 | 0.7139 | 0.9154 | 0.9568 | | 0.1399 | 2.8090 | 1632 | 0.9032 | 0.7139 | 0.9032 | 0.9503 | | 0.1399 | 2.8124 | 1634 | 0.9257 | 0.7729 | 0.9257 | 0.9621 | | 0.1399 | 2.8158 | 1636 | 1.0293 | 0.7221 | 1.0293 | 1.0145 | | 0.1399 | 2.8193 | 1638 | 1.1849 | 0.7221 | 1.1849 | 1.0885 | | 0.1399 | 2.8227 | 1640 | 1.1798 | 0.7310 | 1.1798 | 1.0862 | | 0.1399 | 2.8262 | 1642 | 1.1110 | 0.7844 | 1.1110 | 1.0540 | | 0.1399 | 2.8296 | 1644 | 0.9879 | 0.7729 | 0.9879 | 0.9940 | | 0.1399 | 2.8330 | 1646 | 0.8625 | 0.6988 | 0.8625 | 0.9287 | | 0.1399 | 2.8365 | 1648 | 0.8312 | 0.6988 | 0.8312 | 0.9117 | | 0.1399 | 2.8399 | 1650 | 0.8634 | 0.7729 | 0.8634 | 0.9292 | | 0.1399 | 2.8434 | 1652 | 0.9214 | 0.7221 | 0.9214 | 0.9599 | | 0.1399 | 2.8468 | 1654 | 0.9400 | 0.7134 | 0.9400 | 0.9695 | | 0.1399 | 2.8503 | 1656 | 0.9993 | 0.7157 | 0.9993 | 0.9996 | | 0.1399 | 2.8537 | 1658 | 1.0307 | 0.7424 | 1.0307 | 1.0152 | | 0.1399 | 2.8571 | 1660 | 1.0096 | 0.7134 | 1.0096 | 1.0048 | | 0.1399 | 2.8606 | 1662 | 0.8906 | 0.7134 | 0.8906 | 0.9437 | | 0.1399 | 2.8640 | 1664 | 0.8586 | 0.7221 | 0.8586 | 0.9266 | | 0.1399 | 2.8675 | 1666 | 0.8202 | 0.7623 | 0.8202 | 0.9056 | | 0.1399 | 2.8709 | 1668 | 
0.8133 | 0.7801 | 0.8133 | 0.9018 | | 0.1399 | 2.8744 | 1670 | 0.9153 | 0.6987 | 0.9153 | 0.9567 | | 0.1399 | 2.8778 | 1672 | 1.0146 | 0.7421 | 1.0146 | 1.0073 | | 0.1399 | 2.8812 | 1674 | 1.0418 | 0.7421 | 1.0418 | 1.0207 | | 0.1399 | 2.8847 | 1676 | 0.9418 | 0.6922 | 0.9418 | 0.9705 | | 0.1399 | 2.8881 | 1678 | 0.8788 | 0.7191 | 0.8788 | 0.9374 | | 0.1399 | 2.8916 | 1680 | 0.8508 | 0.7268 | 0.8508 | 0.9224 | | 0.1399 | 2.8950 | 1682 | 0.8988 | 0.7191 | 0.8988 | 0.9480 | | 0.1399 | 2.8985 | 1684 | 0.8607 | 0.7191 | 0.8607 | 0.9278 | | 0.1399 | 2.9019 | 1686 | 0.8902 | 0.7191 | 0.8902 | 0.9435 | | 0.1399 | 2.9053 | 1688 | 0.9309 | 0.7191 | 0.9309 | 0.9648 | | 0.1399 | 2.9088 | 1690 | 0.9420 | 0.7119 | 0.9420 | 0.9706 | | 0.1399 | 2.9122 | 1692 | 0.9355 | 0.7264 | 0.9355 | 0.9672 | | 0.1399 | 2.9157 | 1694 | 0.9160 | 0.7264 | 0.9160 | 0.9571 | | 0.1399 | 2.9191 | 1696 | 0.9497 | 0.7264 | 0.9497 | 0.9745 | | 0.1399 | 2.9225 | 1698 | 0.9694 | 0.7264 | 0.9694 | 0.9846 | | 0.1399 | 2.9260 | 1700 | 0.9442 | 0.7264 | 0.9442 | 0.9717 | | 0.1399 | 2.9294 | 1702 | 0.9189 | 0.7264 | 0.9189 | 0.9586 | | 0.1399 | 2.9329 | 1704 | 0.8353 | 0.7264 | 0.8353 | 0.9140 | | 0.1399 | 2.9363 | 1706 | 0.7897 | 0.7801 | 0.7897 | 0.8886 | | 0.1399 | 2.9398 | 1708 | 0.8339 | 0.7801 | 0.8339 | 0.9132 | | 0.1399 | 2.9432 | 1710 | 0.9218 | 0.7264 | 0.9218 | 0.9601 | | 0.1399 | 2.9466 | 1712 | 0.9005 | 0.7264 | 0.9005 | 0.9489 | | 0.1399 | 2.9501 | 1714 | 0.9060 | 0.7264 | 0.9060 | 0.9519 | | 0.1399 | 2.9535 | 1716 | 0.9033 | 0.6987 | 0.9033 | 0.9504 | | 0.1399 | 2.9570 | 1718 | 0.8638 | 0.7058 | 0.8638 | 0.9294 | | 0.1399 | 2.9604 | 1720 | 0.8787 | 0.7058 | 0.8787 | 0.9374 | | 0.1399 | 2.9639 | 1722 | 0.9523 | 0.7134 | 0.9523 | 0.9758 | | 0.1399 | 2.9673 | 1724 | 0.9548 | 0.7134 | 0.9548 | 0.9771 | | 0.1399 | 2.9707 | 1726 | 0.9380 | 0.7058 | 0.9380 | 0.9685 | | 0.1399 | 2.9742 | 1728 | 0.9086 | 0.7058 | 0.9086 | 0.9532 | | 0.1399 | 2.9776 | 1730 | 0.8728 | 0.7342 | 0.8728 | 0.9342 | | 0.1399 | 2.9811 | 1732 | 0.7623 | 0.7191 | 0.7623 | 0.8731 | | 0.1399 | 2.9845 | 1734 | 0.7242 | 0.7358 | 0.7242 | 0.8510 | | 0.1399 | 2.9880 | 1736 | 0.7644 | 0.7358 | 0.7644 | 0.8743 | | 0.1399 | 2.9914 | 1738 | 0.8512 | 0.7058 | 0.8512 | 0.9226 | | 0.1399 | 2.9948 | 1740 | 0.9262 | 0.7134 | 0.9262 | 0.9624 | | 0.1399 | 2.9983 | 1742 | 0.9578 | 0.7134 | 0.9578 | 0.9787 | | 0.1399 | 3.0017 | 1744 | 0.9707 | 0.7134 | 0.9707 | 0.9852 | | 0.1399 | 3.0052 | 1746 | 0.9792 | 0.7134 | 0.9792 | 0.9896 | | 0.1399 | 3.0086 | 1748 | 1.0487 | 0.6940 | 1.0487 | 1.0241 | | 0.1399 | 3.0120 | 1750 | 1.1352 | 0.6940 | 1.1352 | 1.0655 | | 0.1399 | 3.0155 | 1752 | 1.1605 | 0.7014 | 1.1605 | 1.0772 | | 0.1399 | 3.0189 | 1754 | 1.1421 | 0.6940 | 1.1421 | 1.0687 | | 0.1399 | 3.0224 | 1756 | 1.0716 | 0.6940 | 1.0716 | 1.0352 | | 0.1399 | 3.0258 | 1758 | 0.9744 | 0.7134 | 0.9744 | 0.9871 | | 0.1399 | 3.0293 | 1760 | 0.9637 | 0.7134 | 0.9637 | 0.9817 | | 0.1399 | 3.0327 | 1762 | 0.9459 | 0.7058 | 0.9459 | 0.9726 | | 0.1399 | 3.0361 | 1764 | 0.8903 | 0.7058 | 0.8903 | 0.9436 | | 0.1399 | 3.0396 | 1766 | 0.8665 | 0.7058 | 0.8665 | 0.9308 | | 0.1399 | 3.0430 | 1768 | 0.7749 | 0.7986 | 0.7749 | 0.8803 | | 0.1399 | 3.0465 | 1770 | 0.7584 | 0.7986 | 0.7584 | 0.8708 | | 0.1399 | 3.0499 | 1772 | 0.8360 | 0.7139 | 0.8360 | 0.9143 | | 0.1399 | 3.0534 | 1774 | 0.9983 | 0.7134 | 0.9983 | 0.9992 | | 0.1399 | 3.0568 | 1776 | 1.0775 | 0.6866 | 1.0775 | 1.0380 | | 0.1399 | 3.0602 | 1778 | 1.0747 | 0.6940 | 1.0747 | 1.0367 | | 0.1399 | 3.0637 | 1780 | 1.0184 | 0.6940 | 1.0184 | 
1.0091 | | 0.1399 | 3.0671 | 1782 | 1.0009 | 0.6940 | 1.0009 | 1.0005 | | 0.1399 | 3.0706 | 1784 | 0.8833 | 0.7729 | 0.8833 | 0.9398 | | 0.1399 | 3.0740 | 1786 | 0.8042 | 0.7118 | 0.8042 | 0.8968 | | 0.1399 | 3.0775 | 1788 | 0.8135 | 0.7623 | 0.8135 | 0.9020 | | 0.1399 | 3.0809 | 1790 | 0.8750 | 0.7134 | 0.8750 | 0.9354 | | 0.1399 | 3.0843 | 1792 | 1.0153 | 0.6940 | 1.0153 | 1.0076 | | 0.1399 | 3.0878 | 1794 | 1.0614 | 0.7153 | 1.0614 | 1.0303 | | 0.1399 | 3.0912 | 1796 | 0.9730 | 0.7134 | 0.9730 | 0.9864 | | 0.1399 | 3.0947 | 1798 | 0.8967 | 0.6987 | 0.8967 | 0.9469 | | 0.1399 | 3.0981 | 1800 | 0.8091 | 0.7191 | 0.8091 | 0.8995 | | 0.1399 | 3.1015 | 1802 | 0.7111 | 0.6824 | 0.7111 | 0.8433 | | 0.1399 | 3.1050 | 1804 | 0.7149 | 0.6824 | 0.7149 | 0.8455 | | 0.1399 | 3.1084 | 1806 | 0.7797 | 0.7358 | 0.7797 | 0.8830 | | 0.1399 | 3.1119 | 1808 | 0.9317 | 0.7518 | 0.9317 | 0.9653 | | 0.1399 | 3.1153 | 1810 | 1.0395 | 0.7333 | 1.0395 | 1.0195 | | 0.1399 | 3.1188 | 1812 | 1.0385 | 0.7333 | 1.0385 | 1.0191 | | 0.1399 | 3.1222 | 1814 | 0.9355 | 0.7426 | 0.9355 | 0.9672 | | 0.1399 | 3.1256 | 1816 | 0.7998 | 0.7525 | 0.7998 | 0.8943 | | 0.1399 | 3.1291 | 1818 | 0.7633 | 0.7258 | 0.7633 | 0.8736 | | 0.1399 | 3.1325 | 1820 | 0.8029 | 0.7346 | 0.8029 | 0.8961 | | 0.1399 | 3.1360 | 1822 | 0.8380 | 0.7346 | 0.8380 | 0.9154 | | 0.1399 | 3.1394 | 1824 | 0.8164 | 0.7346 | 0.8164 | 0.9035 | | 0.1399 | 3.1429 | 1826 | 0.7676 | 0.7268 | 0.7676 | 0.8761 | | 0.1399 | 3.1463 | 1828 | 0.7121 | 0.7268 | 0.7121 | 0.8439 | | 0.1399 | 3.1497 | 1830 | 0.6888 | 0.7285 | 0.6888 | 0.8299 | | 0.1399 | 3.1532 | 1832 | 0.6899 | 0.7526 | 0.6899 | 0.8306 | | 0.1399 | 3.1566 | 1834 | 0.7422 | 0.7212 | 0.7422 | 0.8615 | | 0.1399 | 3.1601 | 1836 | 0.7686 | 0.7212 | 0.7686 | 0.8767 | | 0.1399 | 3.1635 | 1838 | 0.7813 | 0.7212 | 0.7813 | 0.8839 | | 0.1399 | 3.1670 | 1840 | 0.7482 | 0.7281 | 0.7482 | 0.8650 | | 0.1399 | 3.1704 | 1842 | 0.6861 | 0.7692 | 0.6861 | 0.8283 | | 0.1399 | 3.1738 | 1844 | 0.6702 | 0.7692 | 0.6702 | 0.8186 | | 0.1399 | 3.1773 | 1846 | 0.7106 | 0.7782 | 0.7106 | 0.8430 | | 0.1399 | 3.1807 | 1848 | 0.8133 | 0.7264 | 0.8133 | 0.9018 | | 0.1399 | 3.1842 | 1850 | 0.8809 | 0.7518 | 0.8809 | 0.9386 | | 0.1399 | 3.1876 | 1852 | 0.9079 | 0.7518 | 0.9079 | 0.9529 | | 0.1399 | 3.1910 | 1854 | 0.8636 | 0.7518 | 0.8636 | 0.9293 | | 0.1399 | 3.1945 | 1856 | 0.7655 | 0.7212 | 0.7655 | 0.8749 | | 0.1399 | 3.1979 | 1858 | 0.6816 | 0.7927 | 0.6816 | 0.8256 | | 0.1399 | 3.2014 | 1860 | 0.6567 | 0.7927 | 0.6567 | 0.8103 | | 0.1399 | 3.2048 | 1862 | 0.6850 | 0.7927 | 0.6850 | 0.8277 | | 0.1399 | 3.2083 | 1864 | 0.7528 | 0.7605 | 0.7528 | 0.8677 | | 0.1399 | 3.2117 | 1866 | 0.8415 | 0.7700 | 0.8415 | 0.9173 | | 0.1399 | 3.2151 | 1868 | 0.9315 | 0.7797 | 0.9315 | 0.9651 | | 0.1399 | 3.2186 | 1870 | 0.9474 | 0.7906 | 0.9474 | 0.9734 | | 0.1399 | 3.2220 | 1872 | 0.9272 | 0.7906 | 0.9272 | 0.9629 | | 0.1399 | 3.2255 | 1874 | 0.8535 | 0.7232 | 0.8535 | 0.9238 | | 0.1399 | 3.2289 | 1876 | 0.7587 | 0.6962 | 0.7587 | 0.8711 | | 0.1399 | 3.2324 | 1878 | 0.7208 | 0.6962 | 0.7208 | 0.8490 | | 0.1399 | 3.2358 | 1880 | 0.7282 | 0.6962 | 0.7282 | 0.8533 | | 0.1399 | 3.2392 | 1882 | 0.7579 | 0.6962 | 0.7579 | 0.8705 | | 0.1399 | 3.2427 | 1884 | 0.8320 | 0.7801 | 0.8320 | 0.9122 | | 0.1399 | 3.2461 | 1886 | 0.9406 | 0.7430 | 0.9406 | 0.9698 | | 0.1399 | 3.2496 | 1888 | 0.9791 | 0.7221 | 0.9791 | 0.9895 | | 0.1399 | 3.2530 | 1890 | 0.9730 | 0.7221 | 0.9730 | 0.9864 | | 0.1399 | 3.2565 | 1892 | 0.9221 | 0.7221 | 0.9221 | 0.9602 | | 0.1399 | 3.2599 
| 1894 | 0.8911 | 0.7520 | 0.8911 | 0.9440 | | 0.1399 | 3.2633 | 1896 | 0.8554 | 0.7906 | 0.8554 | 0.9249 | | 0.1399 | 3.2668 | 1898 | 0.8748 | 0.8019 | 0.8748 | 0.9353 | | 0.1399 | 3.2702 | 1900 | 0.9498 | 0.7221 | 0.9498 | 0.9746 | | 0.1399 | 3.2737 | 1902 | 0.9475 | 0.7729 | 0.9475 | 0.9734 | | 0.1399 | 3.2771 | 1904 | 0.9720 | 0.7221 | 0.9720 | 0.9859 | | 0.1399 | 3.2806 | 1906 | 0.9545 | 0.7221 | 0.9545 | 0.9770 | | 0.1399 | 3.2840 | 1908 | 0.8958 | 0.7430 | 0.8958 | 0.9465 | | 0.1399 | 3.2874 | 1910 | 0.8661 | 0.7346 | 0.8661 | 0.9307 | | 0.1399 | 3.2909 | 1912 | 0.8344 | 0.7346 | 0.8344 | 0.9134 | | 0.1399 | 3.2943 | 1914 | 0.8047 | 0.7346 | 0.8047 | 0.8970 | | 0.1399 | 3.2978 | 1916 | 0.8305 | 0.7264 | 0.8305 | 0.9113 | | 0.1399 | 3.3012 | 1918 | 0.8600 | 0.7264 | 0.8600 | 0.9273 | | 0.1399 | 3.3046 | 1920 | 0.8367 | 0.7264 | 0.8367 | 0.9147 | | 0.1399 | 3.3081 | 1922 | 0.7733 | 0.7692 | 0.7733 | 0.8794 | | 0.1399 | 3.3115 | 1924 | 0.7518 | 0.7692 | 0.7518 | 0.8670 | | 0.1399 | 3.3150 | 1926 | 0.7795 | 0.7692 | 0.7795 | 0.8829 | | 0.1399 | 3.3184 | 1928 | 0.8216 | 0.7285 | 0.8216 | 0.9064 | | 0.1399 | 3.3219 | 1930 | 0.7855 | 0.7692 | 0.7855 | 0.8863 | | 0.1399 | 3.3253 | 1932 | 0.7183 | 0.7692 | 0.7183 | 0.8475 | | 0.1399 | 3.3287 | 1934 | 0.6919 | 0.7696 | 0.6919 | 0.8318 | | 0.1399 | 3.3322 | 1936 | 0.6609 | 0.6942 | 0.6609 | 0.8130 | | 0.1399 | 3.3356 | 1938 | 0.6714 | 0.6942 | 0.6714 | 0.8194 | | 0.1399 | 3.3391 | 1940 | 0.7341 | 0.7692 | 0.7341 | 0.8568 | | 0.1399 | 3.3425 | 1942 | 0.8780 | 0.7358 | 0.8780 | 0.9370 | | 0.1399 | 3.3460 | 1944 | 0.9755 | 0.7277 | 0.9755 | 0.9876 | | 0.1399 | 3.3494 | 1946 | 0.9715 | 0.7354 | 0.9715 | 0.9857 | | 0.1399 | 3.3528 | 1948 | 0.8843 | 0.7358 | 0.8843 | 0.9404 | | 0.1399 | 3.3563 | 1950 | 0.8502 | 0.7782 | 0.8502 | 0.9221 | | 0.1399 | 3.3597 | 1952 | 0.8748 | 0.7782 | 0.8748 | 0.9353 | | 0.1399 | 3.3632 | 1954 | 0.8751 | 0.7437 | 0.8751 | 0.9355 | | 0.1399 | 3.3666 | 1956 | 0.8029 | 0.7782 | 0.8029 | 0.8960 | | 0.1399 | 3.3701 | 1958 | 0.7388 | 0.7105 | 0.7388 | 0.8596 | | 0.1399 | 3.3735 | 1960 | 0.7420 | 0.7692 | 0.7420 | 0.8614 | | 0.1399 | 3.3769 | 1962 | 0.7472 | 0.7285 | 0.7472 | 0.8644 | | 0.1399 | 3.3804 | 1964 | 0.7416 | 0.7285 | 0.7416 | 0.8612 | | 0.1399 | 3.3838 | 1966 | 0.7300 | 0.7447 | 0.7300 | 0.8544 | | 0.1399 | 3.3873 | 1968 | 0.7800 | 0.7212 | 0.7800 | 0.8832 | | 0.1399 | 3.3907 | 1970 | 0.8330 | 0.7281 | 0.8330 | 0.9127 | | 0.1399 | 3.3941 | 1972 | 0.9362 | 0.7277 | 0.9362 | 0.9676 | | 0.1399 | 3.3976 | 1974 | 0.9800 | 0.7259 | 0.9800 | 0.9899 | | 0.1399 | 3.4010 | 1976 | 0.9305 | 0.7342 | 0.9305 | 0.9646 | | 0.1399 | 3.4045 | 1978 | 0.8909 | 0.7906 | 0.8909 | 0.9439 | | 0.1399 | 3.4079 | 1980 | 0.9186 | 0.7729 | 0.9186 | 0.9584 | | 0.1399 | 3.4114 | 1982 | 0.9829 | 0.7729 | 0.9829 | 0.9914 | | 0.1399 | 3.4148 | 1984 | 1.0290 | 0.7729 | 1.0290 | 1.0144 | | 0.1399 | 3.4182 | 1986 | 1.0564 | 0.7520 | 1.0564 | 1.0278 | | 0.1399 | 3.4217 | 1988 | 1.1456 | 0.6476 | 1.1456 | 1.0703 | | 0.1399 | 3.4251 | 1990 | 1.1579 | 0.6476 | 1.1579 | 1.0760 | | 0.1399 | 3.4286 | 1992 | 1.0775 | 0.7729 | 1.0775 | 1.0380 | | 0.1399 | 3.4320 | 1994 | 0.9814 | 0.7729 | 0.9814 | 0.9907 | | 0.1399 | 3.4355 | 1996 | 0.9233 | 0.6988 | 0.9233 | 0.9609 | | 0.1399 | 3.4389 | 1998 | 0.8901 | 0.6988 | 0.8901 | 0.9435 | | 0.1077 | 3.4423 | 2000 | 0.9172 | 0.6988 | 0.9172 | 0.9577 | | 0.1077 | 3.4458 | 2002 | 1.0269 | 0.7221 | 1.0269 | 1.0134 | | 0.1077 | 3.4492 | 2004 | 1.1967 | 0.6940 | 1.1967 | 1.0939 | | 0.1077 | 3.4527 | 2006 | 1.3991 | 0.5777 | 
1.3991 | 1.1828 | | 0.1077 | 3.4561 | 2008 | 1.4655 | 0.5769 | 1.4655 | 1.2106 | | 0.1077 | 3.4596 | 2010 | 1.4050 | 0.6008 | 1.4050 | 1.1853 | | 0.1077 | 3.4630 | 2012 | 1.3168 | 0.6680 | 1.3168 | 1.1475 | | 0.1077 | 3.4664 | 2014 | 1.1611 | 0.7519 | 1.1611 | 1.0776 | | 0.1077 | 3.4699 | 2016 | 1.0642 | 0.7729 | 1.0642 | 1.0316 | | 0.1077 | 3.4733 | 2018 | 1.0203 | 0.7729 | 1.0203 | 1.0101 | | 0.1077 | 3.4768 | 2020 | 1.0592 | 0.7844 | 1.0592 | 1.0292 | | 0.1077 | 3.4802 | 2022 | 1.1819 | 0.7519 | 1.1819 | 1.0871 | | 0.1077 | 3.4836 | 2024 | 1.3205 | 0.6609 | 1.3205 | 1.1491 | | 0.1077 | 3.4871 | 2026 | 1.3285 | 0.5927 | 1.3285 | 1.1526 | | 0.1077 | 3.4905 | 2028 | 1.2958 | 0.6609 | 1.2958 | 1.1383 | | 0.1077 | 3.4940 | 2030 | 1.1951 | 0.7844 | 1.1951 | 1.0932 | | 0.1077 | 3.4974 | 2032 | 1.0464 | 0.7844 | 1.0464 | 1.0229 | | 0.1077 | 3.5009 | 2034 | 0.9913 | 0.7729 | 0.9913 | 0.9957 | | 0.1077 | 3.5043 | 2036 | 1.0062 | 0.7729 | 1.0062 | 1.0031 | | 0.1077 | 3.5077 | 2038 | 1.1007 | 0.7844 | 1.1007 | 1.0491 | | 0.1077 | 3.5112 | 2040 | 1.1162 | 0.7134 | 1.1162 | 1.0565 | | 0.1077 | 3.5146 | 2042 | 1.0282 | 0.7729 | 1.0282 | 1.0140 | | 0.1077 | 3.5181 | 2044 | 1.0268 | 0.7729 | 1.0268 | 1.0133 | | 0.1077 | 3.5215 | 2046 | 0.9783 | 0.7729 | 0.9783 | 0.9891 | | 0.1077 | 3.5250 | 2048 | 0.9940 | 0.7221 | 0.9940 | 0.9970 | | 0.1077 | 3.5284 | 2050 | 0.9840 | 0.7729 | 0.9840 | 0.9920 | | 0.1077 | 3.5318 | 2052 | 1.0168 | 0.7221 | 1.0168 | 1.0084 | | 0.1077 | 3.5353 | 2054 | 1.0926 | 0.7221 | 1.0926 | 1.0453 | | 0.1077 | 3.5387 | 2056 | 1.1673 | 0.6940 | 1.1673 | 1.0804 | | 0.1077 | 3.5422 | 2058 | 1.1598 | 0.6940 | 1.1598 | 1.0770 | | 0.1077 | 3.5456 | 2060 | 1.0893 | 0.7221 | 1.0893 | 1.0437 | | 0.1077 | 3.5491 | 2062 | 0.9590 | 0.7729 | 0.9590 | 0.9793 | | 0.1077 | 3.5525 | 2064 | 0.8891 | 0.6782 | 0.8891 | 0.9429 | | 0.1077 | 3.5559 | 2066 | 0.9059 | 0.6564 | 0.9059 | 0.9518 | | 0.1077 | 3.5594 | 2068 | 0.9468 | 0.7525 | 0.9468 | 0.9731 | | 0.1077 | 3.5628 | 2070 | 0.9880 | 0.7221 | 0.9880 | 0.9940 | | 0.1077 | 3.5663 | 2072 | 1.0073 | 0.7134 | 1.0073 | 1.0037 | | 0.1077 | 3.5697 | 2074 | 1.0447 | 0.7134 | 1.0447 | 1.0221 | | 0.1077 | 3.5731 | 2076 | 1.0211 | 0.7134 | 1.0211 | 1.0105 | | 0.1077 | 3.5766 | 2078 | 0.9552 | 0.7025 | 0.9552 | 0.9773 | | 0.1077 | 3.5800 | 2080 | 0.8918 | 0.7321 | 0.8918 | 0.9444 | | 0.1077 | 3.5835 | 2082 | 0.9020 | 0.6830 | 0.9020 | 0.9497 | | 0.1077 | 3.5869 | 2084 | 0.9800 | 0.7221 | 0.9800 | 0.9900 | | 0.1077 | 3.5904 | 2086 | 1.0936 | 0.6940 | 1.0936 | 1.0458 | | 0.1077 | 3.5938 | 2088 | 1.1865 | 0.7014 | 1.1865 | 1.0893 | | 0.1077 | 3.5972 | 2090 | 1.3269 | 0.6934 | 1.3269 | 1.1519 | | 0.1077 | 3.6007 | 2092 | 1.3621 | 0.6356 | 1.3621 | 1.1671 | | 0.1077 | 3.6041 | 2094 | 1.2696 | 0.6414 | 1.2696 | 1.1268 | | 0.1077 | 3.6076 | 2096 | 1.1055 | 0.7626 | 1.1055 | 1.0514 | | 0.1077 | 3.6110 | 2098 | 0.9728 | 0.7525 | 0.9728 | 0.9863 | | 0.1077 | 3.6145 | 2100 | 0.9356 | 0.6776 | 0.9356 | 0.9673 | | 0.1077 | 3.6179 | 2102 | 0.9648 | 0.7316 | 0.9648 | 0.9822 | | 0.1077 | 3.6213 | 2104 | 1.0143 | 0.7520 | 1.0143 | 1.0071 | | 0.1077 | 3.6248 | 2106 | 1.0527 | 0.7020 | 1.0527 | 1.0260 | | 0.1077 | 3.6282 | 2108 | 1.1927 | 0.6414 | 1.1927 | 1.0921 | | 0.1077 | 3.6317 | 2110 | 1.2523 | 0.6356 | 1.2523 | 1.1191 | | 0.1077 | 3.6351 | 2112 | 1.1850 | 0.6414 | 1.1850 | 1.0886 | | 0.1077 | 3.6386 | 2114 | 1.0359 | 0.7020 | 1.0359 | 1.0178 | | 0.1077 | 3.6420 | 2116 | 0.9525 | 0.7520 | 0.9525 | 0.9760 | | 0.1077 | 3.6454 | 2118 | 0.9614 | 0.7520 | 0.9614 | 0.9805 | | 0.1077 
| 3.6489 | 2120 | 1.0033 | 0.7520 | 1.0033 | 1.0016 | | 0.1077 | 3.6523 | 2122 | 1.1149 | 0.7101 | 1.1149 | 1.0559 | | 0.1077 | 3.6558 | 2124 | 1.2213 | 0.7014 | 1.2213 | 1.1051 | | 0.1077 | 3.6592 | 2126 | 1.2266 | 0.7014 | 1.2266 | 1.1075 | | 0.1077 | 3.6627 | 2128 | 1.1372 | 0.7101 | 1.1372 | 1.0664 | | 0.1077 | 3.6661 | 2130 | 1.0097 | 0.7020 | 1.0097 | 1.0048 | | 0.1077 | 3.6695 | 2132 | 0.9576 | 0.7729 | 0.9576 | 0.9786 | | 0.1077 | 3.6730 | 2134 | 0.9904 | 0.7221 | 0.9904 | 0.9952 | | 0.1077 | 3.6764 | 2136 | 1.0051 | 0.7134 | 1.0051 | 1.0025 | | 0.1077 | 3.6799 | 2138 | 1.1012 | 0.6940 | 1.1012 | 1.0494 | | 0.1077 | 3.6833 | 2140 | 1.1580 | 0.6866 | 1.1580 | 1.0761 | | 0.1077 | 3.6867 | 2142 | 1.2299 | 0.6934 | 1.2299 | 1.1090 | | 0.1077 | 3.6902 | 2144 | 1.1814 | 0.7014 | 1.1814 | 1.0869 | | 0.1077 | 3.6936 | 2146 | 1.1039 | 0.7310 | 1.1039 | 1.0507 | | 0.1077 | 3.6971 | 2148 | 1.0643 | 0.7844 | 1.0643 | 1.0316 | | 0.1077 | 3.7005 | 2150 | 1.0333 | 0.7844 | 1.0333 | 1.0165 | | 0.1077 | 3.7040 | 2152 | 1.0045 | 0.7844 | 1.0045 | 1.0022 | | 0.1077 | 3.7074 | 2154 | 0.9738 | 0.7525 | 0.9738 | 0.9868 | | 0.1077 | 3.7108 | 2156 | 0.9275 | 0.7525 | 0.9275 | 0.9630 | | 0.1077 | 3.7143 | 2158 | 0.9572 | 0.7134 | 0.9572 | 0.9784 | | 0.1077 | 3.7177 | 2160 | 1.0547 | 0.7134 | 1.0547 | 1.0270 | | 0.1077 | 3.7212 | 2162 | 1.1342 | 0.6866 | 1.1342 | 1.0650 | | 0.1077 | 3.7246 | 2164 | 1.2572 | 0.6929 | 1.2572 | 1.1213 | | 0.1077 | 3.7281 | 2166 | 1.3099 | 0.6929 | 1.3099 | 1.1445 | | 0.1077 | 3.7315 | 2168 | 1.2983 | 0.6610 | 1.2983 | 1.1394 | | 0.1077 | 3.7349 | 2170 | 1.2085 | 0.6680 | 1.2085 | 1.0993 | | 0.1077 | 3.7384 | 2172 | 1.1243 | 0.7101 | 1.1243 | 1.0603 | | 0.1077 | 3.7418 | 2174 | 1.1323 | 0.7626 | 1.1323 | 1.0641 | | 0.1077 | 3.7453 | 2176 | 1.1048 | 0.7626 | 1.1048 | 1.0511 | | 0.1077 | 3.7487 | 2178 | 1.1257 | 0.7101 | 1.1257 | 1.0610 | | 0.1077 | 3.7522 | 2180 | 1.1558 | 0.7014 | 1.1558 | 1.0751 | | 0.1077 | 3.7556 | 2182 | 1.2318 | 0.6934 | 1.2318 | 1.1099 | | 0.1077 | 3.7590 | 2184 | 1.2897 | 0.7230 | 1.2897 | 1.1356 | | 0.1077 | 3.7625 | 2186 | 1.2644 | 0.7230 | 1.2644 | 1.1245 | | 0.1077 | 3.7659 | 2188 | 1.1610 | 0.7014 | 1.1610 | 1.0775 | | 0.1077 | 3.7694 | 2190 | 1.0502 | 0.7014 | 1.0502 | 1.0248 | | 0.1077 | 3.7728 | 2192 | 0.9866 | 0.7520 | 0.9866 | 0.9933 | | 0.1077 | 3.7762 | 2194 | 0.9952 | 0.7626 | 0.9952 | 0.9976 | | 0.1077 | 3.7797 | 2196 | 1.0864 | 0.7014 | 1.0864 | 1.0423 | | 0.1077 | 3.7831 | 2198 | 1.1957 | 0.7014 | 1.1957 | 1.0935 | | 0.1077 | 3.7866 | 2200 | 1.2166 | 0.7014 | 1.2166 | 1.1030 | | 0.1077 | 3.7900 | 2202 | 1.2192 | 0.7014 | 1.2192 | 1.1042 | | 0.1077 | 3.7935 | 2204 | 1.1216 | 0.7014 | 1.1216 | 1.0590 | | 0.1077 | 3.7969 | 2206 | 1.0428 | 0.7216 | 1.0428 | 1.0212 | | 0.1077 | 3.8003 | 2208 | 1.0339 | 0.7216 | 1.0339 | 1.0168 | | 0.1077 | 3.8038 | 2210 | 1.0675 | 0.7014 | 1.0675 | 1.0332 | | 0.1077 | 3.8072 | 2212 | 1.1228 | 0.7318 | 1.1228 | 1.0596 | | 0.1077 | 3.8107 | 2214 | 1.1632 | 0.7318 | 1.1632 | 1.0785 | | 0.1077 | 3.8141 | 2216 | 1.1979 | 0.7318 | 1.1979 | 1.0945 | | 0.1077 | 3.8176 | 2218 | 1.2642 | 0.7230 | 1.2642 | 1.1244 | | 0.1077 | 3.8210 | 2220 | 1.2854 | 0.7230 | 1.2854 | 1.1338 | | 0.1077 | 3.8244 | 2222 | 1.2319 | 0.7014 | 1.2319 | 1.1099 | | 0.1077 | 3.8279 | 2224 | 1.1321 | 0.7014 | 1.1321 | 1.0640 | | 0.1077 | 3.8313 | 2226 | 1.0565 | 0.7014 | 1.0565 | 1.0279 | | 0.1077 | 3.8348 | 2228 | 1.0679 | 0.7515 | 1.0679 | 1.0334 | | 0.1077 | 3.8382 | 2230 | 1.1053 | 0.7515 | 1.1053 | 1.0513 | | 0.1077 | 3.8417 | 2232 | 1.0954 | 
0.7626 | 1.0954 | 1.0466 | | 0.1077 | 3.8451 | 2234 | 1.0743 | 0.6845 | 1.0743 | 1.0365 | | 0.1077 | 3.8485 | 2236 | 1.0668 | 0.6845 | 1.0668 | 1.0328 | | 0.1077 | 3.8520 | 2238 | 1.0301 | 0.7074 | 1.0301 | 1.0150 | | 0.1077 | 3.8554 | 2240 | 1.0473 | 0.7074 | 1.0473 | 1.0234 | | 0.1077 | 3.8589 | 2242 | 1.1402 | 0.7014 | 1.1402 | 1.0678 | | 0.1077 | 3.8623 | 2244 | 1.2717 | 0.7014 | 1.2717 | 1.1277 | | 0.1077 | 3.8657 | 2246 | 1.3641 | 0.6610 | 1.3641 | 1.1679 | | 0.1077 | 3.8692 | 2248 | 1.4327 | 0.6929 | 1.4327 | 1.1970 | | 0.1077 | 3.8726 | 2250 | 1.3893 | 0.7230 | 1.3893 | 1.1787 | | 0.1077 | 3.8761 | 2252 | 1.2460 | 0.7230 | 1.2460 | 1.1163 | | 0.1077 | 3.8795 | 2254 | 1.1029 | 0.6940 | 1.1029 | 1.0502 | | 0.1077 | 3.8830 | 2256 | 1.0735 | 0.7014 | 1.0735 | 1.0361 | | 0.1077 | 3.8864 | 2258 | 1.0893 | 0.7014 | 1.0893 | 1.0437 | | 0.1077 | 3.8898 | 2260 | 1.1035 | 0.7014 | 1.1035 | 1.0505 | | 0.1077 | 3.8933 | 2262 | 1.1272 | 0.7014 | 1.1272 | 1.0617 | | 0.1077 | 3.8967 | 2264 | 1.1988 | 0.7014 | 1.1988 | 1.0949 | | 0.1077 | 3.9002 | 2266 | 1.2044 | 0.7101 | 1.2044 | 1.0975 | | 0.1077 | 3.9036 | 2268 | 1.1334 | 0.7626 | 1.1334 | 1.0646 | | 0.1077 | 3.9071 | 2270 | 1.1052 | 0.7626 | 1.1052 | 1.0513 | | 0.1077 | 3.9105 | 2272 | 1.0467 | 0.7844 | 1.0467 | 1.0231 | | 0.1077 | 3.9139 | 2274 | 0.9885 | 0.7844 | 0.9885 | 0.9942 | | 0.1077 | 3.9174 | 2276 | 1.0015 | 0.7216 | 1.0015 | 1.0008 | | 0.1077 | 3.9208 | 2278 | 1.0271 | 0.7216 | 1.0271 | 1.0135 | | 0.1077 | 3.9243 | 2280 | 1.0571 | 0.7014 | 1.0571 | 1.0281 | | 0.1077 | 3.9277 | 2282 | 1.1330 | 0.7014 | 1.1330 | 1.0644 | | 0.1077 | 3.9312 | 2284 | 1.1814 | 0.6934 | 1.1814 | 1.0869 | | 0.1077 | 3.9346 | 2286 | 1.1539 | 0.7014 | 1.1539 | 1.0742 | | 0.1077 | 3.9380 | 2288 | 1.0694 | 0.7014 | 1.0694 | 1.0341 | | 0.1077 | 3.9415 | 2290 | 1.0995 | 0.7014 | 1.0995 | 1.0486 | | 0.1077 | 3.9449 | 2292 | 1.0376 | 0.7014 | 1.0376 | 1.0186 | | 0.1077 | 3.9484 | 2294 | 1.0462 | 0.7014 | 1.0462 | 1.0228 | | 0.1077 | 3.9518 | 2296 | 1.1042 | 0.7014 | 1.1042 | 1.0508 | | 0.1077 | 3.9552 | 2298 | 1.0735 | 0.7101 | 1.0735 | 1.0361 | | 0.1077 | 3.9587 | 2300 | 1.0828 | 0.7101 | 1.0828 | 1.0406 | | 0.1077 | 3.9621 | 2302 | 1.0943 | 0.7014 | 1.0943 | 1.0461 | | 0.1077 | 3.9656 | 2304 | 1.0869 | 0.7216 | 1.0869 | 1.0425 | | 0.1077 | 3.9690 | 2306 | 0.9943 | 0.7216 | 0.9943 | 0.9971 | | 0.1077 | 3.9725 | 2308 | 0.8672 | 0.7729 | 0.8672 | 0.9312 | | 0.1077 | 3.9759 | 2310 | 0.8155 | 0.7801 | 0.8155 | 0.9031 | | 0.1077 | 3.9793 | 2312 | 0.7693 | 0.7801 | 0.7693 | 0.8771 | | 0.1077 | 3.9828 | 2314 | 0.7755 | 0.7801 | 0.7755 | 0.8806 | | 0.1077 | 3.9862 | 2316 | 0.8489 | 0.7346 | 0.8489 | 0.9214 | | 0.1077 | 3.9897 | 2318 | 0.9598 | 0.7134 | 0.9598 | 0.9797 | | 0.1077 | 3.9931 | 2320 | 1.0216 | 0.7134 | 1.0216 | 1.0107 | | 0.1077 | 3.9966 | 2322 | 1.0765 | 0.7216 | 1.0765 | 1.0376 | | 0.1077 | 4.0 | 2324 | 1.0736 | 0.7216 | 1.0736 | 1.0362 | | 0.1077 | 4.0034 | 2326 | 0.9843 | 0.7134 | 0.9843 | 0.9921 | | 0.1077 | 4.0069 | 2328 | 0.9255 | 0.7729 | 0.9255 | 0.9620 | | 0.1077 | 4.0103 | 2330 | 0.8425 | 0.7906 | 0.8425 | 0.9179 | | 0.1077 | 4.0138 | 2332 | 0.8152 | 0.7906 | 0.8152 | 0.9029 | | 0.1077 | 4.0172 | 2334 | 0.7748 | 0.7801 | 0.7748 | 0.8803 | | 0.1077 | 4.0207 | 2336 | 0.7873 | 0.7801 | 0.7873 | 0.8873 | | 0.1077 | 4.0241 | 2338 | 0.8833 | 0.7139 | 0.8833 | 0.9398 | | 0.1077 | 4.0275 | 2340 | 1.0699 | 0.7053 | 1.0699 | 1.0344 | | 0.1077 | 4.0310 | 2342 | 1.2900 | 0.6673 | 1.2900 | 1.1358 | | 0.1077 | 4.0344 | 2344 | 1.4082 | 0.6673 | 1.4082 | 1.1867 | | 
0.1077 | 4.0379 | 2346 | 1.3901 | 0.6356 | 1.3901 | 1.1790 | | 0.1077 | 4.0413 | 2348 | 1.2568 | 0.6934 | 1.2568 | 1.1211 | | 0.1077 | 4.0448 | 2350 | 1.0830 | 0.7626 | 1.0830 | 1.0407 | | 0.1077 | 4.0482 | 2352 | 0.9099 | 0.6988 | 0.9099 | 0.9539 | | 0.1077 | 4.0516 | 2354 | 0.7843 | 0.6836 | 0.7843 | 0.8856 | | 0.1077 | 4.0551 | 2356 | 0.7423 | 0.6962 | 0.7423 | 0.8616 | | 0.1077 | 4.0585 | 2358 | 0.7610 | 0.6836 | 0.7610 | 0.8724 | | 0.1077 | 4.0620 | 2360 | 0.8455 | 0.7623 | 0.8455 | 0.9195 | | 0.1077 | 4.0654 | 2362 | 1.0052 | 0.6866 | 1.0052 | 1.0026 | | 0.1077 | 4.0688 | 2364 | 1.1608 | 0.6934 | 1.1608 | 1.0774 | | 0.1077 | 4.0723 | 2366 | 1.2079 | 0.6934 | 1.2079 | 1.0990 | | 0.1077 | 4.0757 | 2368 | 1.1688 | 0.6934 | 1.1688 | 1.0811 | | 0.1077 | 4.0792 | 2370 | 1.0787 | 0.7626 | 1.0787 | 1.0386 | | 0.1077 | 4.0826 | 2372 | 0.9535 | 0.6845 | 0.9535 | 0.9765 | | 0.1077 | 4.0861 | 2374 | 0.9027 | 0.6988 | 0.9027 | 0.9501 | | 0.1077 | 4.0895 | 2376 | 0.9088 | 0.6988 | 0.9088 | 0.9533 | | 0.1077 | 4.0929 | 2378 | 0.9455 | 0.7074 | 0.9455 | 0.9724 | | 0.1077 | 4.0964 | 2380 | 1.0383 | 0.7626 | 1.0383 | 1.0190 | | 0.1077 | 4.0998 | 2382 | 1.1528 | 0.6934 | 1.1528 | 1.0737 | | 0.1077 | 4.1033 | 2384 | 1.3337 | 0.7230 | 1.3337 | 1.1548 | | 0.1077 | 4.1067 | 2386 | 1.4395 | 0.6673 | 1.4395 | 1.1998 | | 0.1077 | 4.1102 | 2388 | 1.4139 | 0.6673 | 1.4139 | 1.1891 | | 0.1077 | 4.1136 | 2390 | 1.2825 | 0.7230 | 1.2825 | 1.1325 | | 0.1077 | 4.1170 | 2392 | 1.1322 | 0.6934 | 1.1322 | 1.0641 | | 0.1077 | 4.1205 | 2394 | 0.9602 | 0.7729 | 0.9602 | 0.9799 | | 0.1077 | 4.1239 | 2396 | 0.8194 | 0.6909 | 0.8194 | 0.9052 | | 0.1077 | 4.1274 | 2398 | 0.7549 | 0.6909 | 0.7549 | 0.8689 | | 0.1077 | 4.1308 | 2400 | 0.7572 | 0.6909 | 0.7572 | 0.8701 | | 0.1077 | 4.1343 | 2402 | 0.8096 | 0.7623 | 0.8096 | 0.8998 | | 0.1077 | 4.1377 | 2404 | 0.8986 | 0.7729 | 0.8986 | 0.9479 | | 0.1077 | 4.1411 | 2406 | 0.9520 | 0.7729 | 0.9520 | 0.9757 | | 0.1077 | 4.1446 | 2408 | 1.0158 | 0.7844 | 1.0158 | 1.0078 | | 0.1077 | 4.1480 | 2410 | 1.0251 | 0.7844 | 1.0251 | 1.0125 | | 0.1077 | 4.1515 | 2412 | 1.0065 | 0.7844 | 1.0065 | 1.0032 | | 0.1077 | 4.1549 | 2414 | 1.0476 | 0.7844 | 1.0476 | 1.0235 | | 0.1077 | 4.1583 | 2416 | 1.1015 | 0.7515 | 1.1015 | 1.0495 | | 0.1077 | 4.1618 | 2418 | 1.0740 | 0.7515 | 1.0740 | 1.0363 | | 0.1077 | 4.1652 | 2420 | 0.9765 | 0.7844 | 0.9765 | 0.9882 | | 0.1077 | 4.1687 | 2422 | 0.8829 | 0.7729 | 0.8829 | 0.9397 | | 0.1077 | 4.1721 | 2424 | 0.8318 | 0.6909 | 0.8318 | 0.9121 | | 0.1077 | 4.1756 | 2426 | 0.8144 | 0.7232 | 0.8144 | 0.9024 | | 0.1077 | 4.1790 | 2428 | 0.8107 | 0.7801 | 0.8107 | 0.9004 | | 0.1077 | 4.1824 | 2430 | 0.8572 | 0.7264 | 0.8572 | 0.9258 | | 0.1077 | 4.1859 | 2432 | 0.9549 | 0.7058 | 0.9549 | 0.9772 | | 0.1077 | 4.1893 | 2434 | 1.0159 | 0.7053 | 1.0159 | 1.0079 | | 0.1077 | 4.1928 | 2436 | 1.0087 | 0.7053 | 1.0087 | 1.0043 | | 0.1077 | 4.1962 | 2438 | 0.9454 | 0.7134 | 0.9454 | 0.9723 | | 0.1077 | 4.1997 | 2440 | 0.8809 | 0.7729 | 0.8809 | 0.9386 | | 0.1077 | 4.2031 | 2442 | 0.8707 | 0.8019 | 0.8707 | 0.9331 | | 0.1077 | 4.2065 | 2444 | 0.8772 | 0.7729 | 0.8772 | 0.9366 | | 0.1077 | 4.2100 | 2446 | 0.8686 | 0.8019 | 0.8686 | 0.9320 | | 0.1077 | 4.2134 | 2448 | 0.8245 | 0.8019 | 0.8245 | 0.9080 | | 0.1077 | 4.2169 | 2450 | 0.7732 | 0.7801 | 0.7732 | 0.8793 | | 0.1077 | 4.2203 | 2452 | 0.7577 | 0.7801 | 0.7577 | 0.8705 | | 0.1077 | 4.2238 | 2454 | 0.7969 | 0.7906 | 0.7969 | 0.8927 | | 0.1077 | 4.2272 | 2456 | 0.9002 | 0.7729 | 0.9002 | 0.9488 | | 0.1077 | 4.2306 | 2458 | 
1.0178 | 0.7417 | 1.0178 | 1.0089 | | 0.1077 | 4.2341 | 2460 | 1.0375 | 0.6940 | 1.0375 | 1.0186 | | 0.1077 | 4.2375 | 2462 | 0.9747 | 0.7618 | 0.9747 | 0.9872 | | 0.1077 | 4.2410 | 2464 | 0.9328 | 0.7729 | 0.9328 | 0.9658 | | 0.1077 | 4.2444 | 2466 | 0.8621 | 0.8019 | 0.8621 | 0.9285 | | 0.1077 | 4.2478 | 2468 | 0.8414 | 0.7906 | 0.8414 | 0.9173 | | 0.1077 | 4.2513 | 2470 | 0.8794 | 0.7729 | 0.8794 | 0.9378 | | 0.1077 | 4.2547 | 2472 | 0.8932 | 0.7729 | 0.8932 | 0.9451 | | 0.1077 | 4.2582 | 2474 | 0.9621 | 0.7520 | 0.9621 | 0.9809 | | 0.1077 | 4.2616 | 2476 | 1.0256 | 0.7626 | 1.0256 | 1.0127 | | 0.1077 | 4.2651 | 2478 | 1.0058 | 0.7101 | 1.0058 | 1.0029 | | 0.1077 | 4.2685 | 2480 | 0.9426 | 0.7221 | 0.9426 | 0.9709 | | 0.1077 | 4.2719 | 2482 | 0.8444 | 0.7221 | 0.8444 | 0.9189 | | 0.1077 | 4.2754 | 2484 | 0.7711 | 0.7520 | 0.7711 | 0.8781 | | 0.1077 | 4.2788 | 2486 | 0.7632 | 0.7520 | 0.7632 | 0.8736 | | 0.1077 | 4.2823 | 2488 | 0.7873 | 0.7430 | 0.7873 | 0.8873 | | 0.1077 | 4.2857 | 2490 | 0.8294 | 0.7259 | 0.8294 | 0.9107 | | 0.1077 | 4.2892 | 2492 | 0.8309 | 0.7259 | 0.8309 | 0.9115 | | 0.1077 | 4.2926 | 2494 | 0.8114 | 0.7264 | 0.8114 | 0.9008 | | 0.1077 | 4.2960 | 2496 | 0.8147 | 0.7430 | 0.8147 | 0.9026 | | 0.1077 | 4.2995 | 2498 | 0.8208 | 0.7520 | 0.8208 | 0.9060 | | 0.093 | 4.3029 | 2500 | 0.8084 | 0.8019 | 0.8084 | 0.8991 | | 0.093 | 4.3064 | 2502 | 0.8005 | 0.8019 | 0.8005 | 0.8947 | | 0.093 | 4.3098 | 2504 | 0.8301 | 0.8019 | 0.8301 | 0.9111 | | 0.093 | 4.3133 | 2506 | 0.9115 | 0.7074 | 0.9115 | 0.9547 | | 0.093 | 4.3167 | 2508 | 0.9720 | 0.7074 | 0.9720 | 0.9859 | | 0.093 | 4.3201 | 2510 | 1.0196 | 0.7626 | 1.0196 | 1.0097 | | 0.093 | 4.3236 | 2512 | 1.0015 | 0.7520 | 1.0015 | 1.0008 | | 0.093 | 4.3270 | 2514 | 0.9844 | 0.7020 | 0.9844 | 0.9922 | | 0.093 | 4.3305 | 2516 | 0.9389 | 0.7221 | 0.9389 | 0.9690 | | 0.093 | 4.3339 | 2518 | 0.8288 | 0.7623 | 0.8288 | 0.9104 | | 0.093 | 4.3373 | 2520 | 0.7913 | 0.7623 | 0.7913 | 0.8896 | | 0.093 | 4.3408 | 2522 | 0.8219 | 0.7139 | 0.8219 | 0.9066 | | 0.093 | 4.3442 | 2524 | 0.8896 | 0.7221 | 0.8896 | 0.9432 | | 0.093 | 4.3477 | 2526 | 0.9974 | 0.7221 | 0.9974 | 0.9987 | | 0.093 | 4.3511 | 2528 | 1.0521 | 0.6940 | 1.0521 | 1.0257 | | 0.093 | 4.3546 | 2530 | 1.0850 | 0.7014 | 1.0850 | 1.0416 | | 0.093 | 4.3580 | 2532 | 1.1405 | 0.7014 | 1.1405 | 1.0680 | | 0.093 | 4.3614 | 2534 | 1.1150 | 0.7101 | 1.1150 | 1.0559 | | 0.093 | 4.3649 | 2536 | 1.0796 | 0.7101 | 1.0796 | 1.0390 | | 0.093 | 4.3683 | 2538 | 1.0033 | 0.7074 | 1.0033 | 1.0017 | | 0.093 | 4.3718 | 2540 | 0.9236 | 0.6988 | 0.9236 | 0.9610 | | 0.093 | 4.3752 | 2542 | 0.9225 | 0.6988 | 0.9225 | 0.9605 | | 0.093 | 4.3787 | 2544 | 0.9459 | 0.6988 | 0.9459 | 0.9726 | | 0.093 | 4.3821 | 2546 | 0.9854 | 0.6988 | 0.9854 | 0.9927 | | 0.093 | 4.3855 | 2548 | 1.0298 | 0.7074 | 1.0298 | 1.0148 | | 0.093 | 4.3890 | 2550 | 1.0084 | 0.6988 | 1.0084 | 1.0042 | | 0.093 | 4.3924 | 2552 | 1.0163 | 0.7074 | 1.0163 | 1.0081 | | 0.093 | 4.3959 | 2554 | 1.0557 | 0.7626 | 1.0557 | 1.0275 | | 0.093 | 4.3993 | 2556 | 1.0826 | 0.7101 | 1.0826 | 1.0405 | | 0.093 | 4.4028 | 2558 | 1.0318 | 0.7221 | 1.0318 | 1.0158 | | 0.093 | 4.4062 | 2560 | 1.0067 | 0.7221 | 1.0067 | 1.0034 | | 0.093 | 4.4096 | 2562 | 1.0049 | 0.7221 | 1.0049 | 1.0024 | | 0.093 | 4.4131 | 2564 | 0.9855 | 0.7221 | 0.9855 | 0.9927 | | 0.093 | 4.4165 | 2566 | 0.9714 | 0.7729 | 0.9714 | 0.9856 | | 0.093 | 4.4200 | 2568 | 1.0043 | 0.7626 | 1.0043 | 1.0021 | | 0.093 | 4.4234 | 2570 | 1.0568 | 0.7101 | 1.0568 | 1.0280 | | 0.093 | 4.4269 | 2572 | 
1.1268 | 0.7101 | 1.1268 | 1.0615 | | 0.093 | 4.4303 | 2574 | 1.1364 | 0.7014 | 1.1364 | 1.0660 | | 0.093 | 4.4337 | 2576 | 1.0608 | 0.7014 | 1.0608 | 1.0299 | | 0.093 | 4.4372 | 2578 | 0.9646 | 0.7134 | 0.9646 | 0.9821 | | 0.093 | 4.4406 | 2580 | 0.8801 | 0.7221 | 0.8801 | 0.9381 | | 0.093 | 4.4441 | 2582 | 0.8605 | 0.7221 | 0.8605 | 0.9276 | | 0.093 | 4.4475 | 2584 | 0.8756 | 0.7221 | 0.8756 | 0.9358 | | 0.093 | 4.4509 | 2586 | 0.9451 | 0.7134 | 0.9451 | 0.9722 | | 0.093 | 4.4544 | 2588 | 1.0424 | 0.7014 | 1.0424 | 1.0210 | | 0.093 | 4.4578 | 2590 | 1.0645 | 0.7014 | 1.0645 | 1.0318 | | 0.093 | 4.4613 | 2592 | 1.0237 | 0.7216 | 1.0237 | 1.0118 | | 0.093 | 4.4647 | 2594 | 0.9489 | 0.7221 | 0.9489 | 0.9741 | | 0.093 | 4.4682 | 2596 | 0.8394 | 0.7729 | 0.8394 | 0.9162 | | 0.093 | 4.4716 | 2598 | 0.7448 | 0.6564 | 0.7448 | 0.8630 | | 0.093 | 4.4750 | 2600 | 0.6763 | 0.6503 | 0.6763 | 0.8224 | | 0.093 | 4.4785 | 2602 | 0.6534 | 0.6775 | 0.6534 | 0.8083 | | 0.093 | 4.4819 | 2604 | 0.6718 | 0.7529 | 0.6718 | 0.8197 | | 0.093 | 4.4854 | 2606 | 0.7420 | 0.7729 | 0.7420 | 0.8614 | | 0.093 | 4.4888 | 2608 | 0.8430 | 0.7134 | 0.8430 | 0.9182 | | 0.093 | 4.4923 | 2610 | 0.8682 | 0.7134 | 0.8682 | 0.9318 | | 0.093 | 4.4957 | 2612 | 0.8261 | 0.7221 | 0.8261 | 0.9089 | | 0.093 | 4.4991 | 2614 | 0.7525 | 0.7623 | 0.7525 | 0.8675 | | 0.093 | 4.5026 | 2616 | 0.7184 | 0.7623 | 0.7184 | 0.8476 | | 0.093 | 4.5060 | 2618 | 0.6508 | 0.7149 | 0.6508 | 0.8067 | | 0.093 | 4.5095 | 2620 | 0.6043 | 0.7263 | 0.6043 | 0.7773 | | 0.093 | 4.5129 | 2622 | 0.6048 | 0.7263 | 0.6048 | 0.7777 | | 0.093 | 4.5164 | 2624 | 0.6428 | 0.7149 | 0.6428 | 0.8017 | | 0.093 | 4.5198 | 2626 | 0.7382 | 0.6882 | 0.7382 | 0.8592 | | 0.093 | 4.5232 | 2628 | 0.9132 | 0.7134 | 0.9132 | 0.9556 | | 0.093 | 4.5267 | 2630 | 1.1272 | 0.7230 | 1.1272 | 1.0617 | | 0.093 | 4.5301 | 2632 | 1.2337 | 0.7230 | 1.2337 | 1.1107 | | 0.093 | 4.5336 | 2634 | 1.2284 | 0.7230 | 1.2284 | 1.1083 | | 0.093 | 4.5370 | 2636 | 1.1315 | 0.7014 | 1.1315 | 1.0637 | | 0.093 | 4.5404 | 2638 | 1.0227 | 0.7101 | 1.0227 | 1.0113 | | 0.093 | 4.5439 | 2640 | 0.9157 | 0.7626 | 0.9157 | 0.9569 | | 0.093 | 4.5473 | 2642 | 0.8680 | 0.7844 | 0.8680 | 0.9317 | | 0.093 | 4.5508 | 2644 | 0.8928 | 0.7844 | 0.8928 | 0.9449 | | 0.093 | 4.5542 | 2646 | 0.9054 | 0.7310 | 0.9054 | 0.9515 | | 0.093 | 4.5577 | 2648 | 0.9387 | 0.7101 | 0.9387 | 0.9688 | | 0.093 | 4.5611 | 2650 | 0.9693 | 0.7101 | 0.9693 | 0.9845 | | 0.093 | 4.5645 | 2652 | 0.9799 | 0.7101 | 0.9799 | 0.9899 | | 0.093 | 4.5680 | 2654 | 1.0166 | 0.7101 | 1.0166 | 1.0083 | | 0.093 | 4.5714 | 2656 | 1.0233 | 0.7101 | 1.0233 | 1.0116 | | 0.093 | 4.5749 | 2658 | 0.9833 | 0.7101 | 0.9833 | 0.9916 | | 0.093 | 4.5783 | 2660 | 0.8880 | 0.7729 | 0.8880 | 0.9423 | | 0.093 | 4.5818 | 2662 | 0.7842 | 0.7729 | 0.7842 | 0.8855 | | 0.093 | 4.5852 | 2664 | 0.7645 | 0.7729 | 0.7645 | 0.8744 | | 0.093 | 4.5886 | 2666 | 0.8021 | 0.7729 | 0.8021 | 0.8956 | | 0.093 | 4.5921 | 2668 | 0.9030 | 0.7221 | 0.9030 | 0.9503 | | 0.093 | 4.5955 | 2670 | 1.0375 | 0.7318 | 1.0375 | 1.0186 | | 0.093 | 4.5990 | 2672 | 1.0918 | 0.7318 | 1.0918 | 1.0449 | | 0.093 | 4.6024 | 2674 | 1.0644 | 0.7014 | 1.0644 | 1.0317 | | 0.093 | 4.6059 | 2676 | 0.9811 | 0.7020 | 0.9811 | 0.9905 | | 0.093 | 4.6093 | 2678 | 0.8837 | 0.7729 | 0.8837 | 0.9401 | | 0.093 | 4.6127 | 2680 | 0.8490 | 0.7729 | 0.8490 | 0.9214 | | 0.093 | 4.6162 | 2682 | 0.8737 | 0.7729 | 0.8737 | 0.9347 | | 0.093 | 4.6196 | 2684 | 0.8675 | 0.7729 | 0.8675 | 0.9314 | | 0.093 | 4.6231 | 2686 | 0.8421 | 0.7729 | 
0.8421 | 0.9177 | | 0.093 | 4.6265 | 2688 | 0.8228 | 0.7729 | 0.8228 | 0.9071 | | 0.093 | 4.6299 | 2690 | 0.8420 | 0.7729 | 0.8420 | 0.9176 | | 0.093 | 4.6334 | 2692 | 0.8074 | 0.7729 | 0.8074 | 0.8985 | | 0.093 | 4.6368 | 2694 | 0.7478 | 0.7525 | 0.7478 | 0.8647 | | 0.093 | 4.6403 | 2696 | 0.7554 | 0.7525 | 0.7554 | 0.8691 | | 0.093 | 4.6437 | 2698 | 0.8145 | 0.7729 | 0.8145 | 0.9025 | | 0.093 | 4.6472 | 2700 | 0.8662 | 0.7729 | 0.8662 | 0.9307 | | 0.093 | 4.6506 | 2702 | 0.9191 | 0.7844 | 0.9191 | 0.9587 | | 0.093 | 4.6540 | 2704 | 0.9208 | 0.7844 | 0.9208 | 0.9596 | | 0.093 | 4.6575 | 2706 | 0.9845 | 0.7844 | 0.9845 | 0.9922 | | 0.093 | 4.6609 | 2708 | 1.0354 | 0.7626 | 1.0354 | 1.0175 | | 0.093 | 4.6644 | 2710 | 1.0197 | 0.7626 | 1.0197 | 1.0098 | | 0.093 | 4.6678 | 2712 | 0.9807 | 0.7844 | 0.9807 | 0.9903 | | 0.093 | 4.6713 | 2714 | 0.9018 | 0.7844 | 0.9018 | 0.9497 | | 0.093 | 4.6747 | 2716 | 0.8368 | 0.7844 | 0.8368 | 0.9148 | | 0.093 | 4.6781 | 2718 | 0.8215 | 0.7729 | 0.8215 | 0.9064 | | 0.093 | 4.6816 | 2720 | 0.8668 | 0.7310 | 0.8668 | 0.9310 | | 0.093 | 4.6850 | 2722 | 0.9597 | 0.7216 | 0.9597 | 0.9796 | | 0.093 | 4.6885 | 2724 | 1.0731 | 0.7014 | 1.0731 | 1.0359 | | 0.093 | 4.6919 | 2726 | 1.1470 | 0.7014 | 1.1470 | 1.0710 | | 0.093 | 4.6954 | 2728 | 1.1143 | 0.7014 | 1.1143 | 1.0556 | | 0.093 | 4.6988 | 2730 | 1.0004 | 0.7216 | 1.0004 | 1.0002 | | 0.093 | 4.7022 | 2732 | 0.8856 | 0.7844 | 0.8856 | 0.9411 | | 0.093 | 4.7057 | 2734 | 0.8674 | 0.7844 | 0.8674 | 0.9313 | | 0.093 | 4.7091 | 2736 | 0.8913 | 0.7844 | 0.8913 | 0.9441 | | 0.093 | 4.7126 | 2738 | 0.9304 | 0.7844 | 0.9304 | 0.9646 | | 0.093 | 4.7160 | 2740 | 0.9727 | 0.7844 | 0.9727 | 0.9862 | | 0.093 | 4.7194 | 2742 | 1.0317 | 0.7216 | 1.0317 | 1.0157 | | 0.093 | 4.7229 | 2744 | 1.0166 | 0.7216 | 1.0166 | 1.0083 | | 0.093 | 4.7263 | 2746 | 0.9742 | 0.7216 | 0.9742 | 0.9870 | | 0.093 | 4.7298 | 2748 | 0.9225 | 0.7216 | 0.9225 | 0.9605 | | 0.093 | 4.7332 | 2750 | 0.8924 | 0.7134 | 0.8924 | 0.9446 | | 0.093 | 4.7367 | 2752 | 0.8287 | 0.7525 | 0.8287 | 0.9103 | | 0.093 | 4.7401 | 2754 | 0.8391 | 0.7623 | 0.8391 | 0.9160 | | 0.093 | 4.7435 | 2756 | 0.8853 | 0.7844 | 0.8853 | 0.9409 | | 0.093 | 4.7470 | 2758 | 0.9419 | 0.7844 | 0.9419 | 0.9705 | | 0.093 | 4.7504 | 2760 | 0.9798 | 0.7626 | 0.9798 | 0.9899 | | 0.093 | 4.7539 | 2762 | 1.0011 | 0.7626 | 1.0011 | 1.0005 | | 0.093 | 4.7573 | 2764 | 0.9712 | 0.7626 | 0.9712 | 0.9855 | | 0.093 | 4.7608 | 2766 | 0.9579 | 0.7844 | 0.9579 | 0.9787 | | 0.093 | 4.7642 | 2768 | 0.9476 | 0.7844 | 0.9476 | 0.9735 | | 0.093 | 4.7676 | 2770 | 0.9654 | 0.7844 | 0.9654 | 0.9826 | | 0.093 | 4.7711 | 2772 | 0.9652 | 0.7844 | 0.9652 | 0.9824 | | 0.093 | 4.7745 | 2774 | 0.9534 | 0.7216 | 0.9534 | 0.9764 | | 0.093 | 4.7780 | 2776 | 0.9191 | 0.7844 | 0.9191 | 0.9587 | | 0.093 | 4.7814 | 2778 | 0.8490 | 0.7844 | 0.8490 | 0.9214 | | 0.093 | 4.7849 | 2780 | 0.8284 | 0.7844 | 0.8284 | 0.9101 | | 0.093 | 4.7883 | 2782 | 0.8566 | 0.7844 | 0.8566 | 0.9255 | | 0.093 | 4.7917 | 2784 | 0.9432 | 0.7310 | 0.9432 | 0.9712 | | 0.093 | 4.7952 | 2786 | 1.0763 | 0.7014 | 1.0763 | 1.0374 | | 0.093 | 4.7986 | 2788 | 1.1571 | 0.7014 | 1.1571 | 1.0757 | | 0.093 | 4.8021 | 2790 | 1.1819 | 0.7014 | 1.1819 | 1.0871 | | 0.093 | 4.8055 | 2792 | 1.1576 | 0.7014 | 1.1576 | 1.0759 | | 0.093 | 4.8090 | 2794 | 1.0951 | 0.7014 | 1.0951 | 1.0464 | | 0.093 | 4.8124 | 2796 | 1.0701 | 0.7101 | 1.0701 | 1.0345 | | 0.093 | 4.8158 | 2798 | 1.0535 | 0.7101 | 1.0535 | 1.0264 | | 0.093 | 4.8193 | 2800 | 1.0251 | 0.7216 | 1.0251 | 1.0125 | | 
0.093 | 4.8227 | 2802 | 0.9957 | 0.7216 | 0.9957 | 0.9978 | | 0.093 | 4.8262 | 2804 | 1.0205 | 0.7216 | 1.0205 | 1.0102 | | 0.093 | 4.8296 | 2806 | 1.0454 | 0.7014 | 1.0454 | 1.0224 | | 0.093 | 4.8330 | 2808 | 1.0014 | 0.7216 | 1.0014 | 1.0007 | | 0.093 | 4.8365 | 2810 | 0.9299 | 0.7216 | 0.9299 | 0.9643 | | 0.093 | 4.8399 | 2812 | 0.9364 | 0.7134 | 0.9364 | 0.9677 | | 0.093 | 4.8434 | 2814 | 1.0040 | 0.7014 | 1.0040 | 1.0020 | | 0.093 | 4.8468 | 2816 | 1.1074 | 0.7014 | 1.1074 | 1.0523 | | 0.093 | 4.8503 | 2818 | 1.1480 | 0.7014 | 1.1480 | 1.0714 | | 0.093 | 4.8537 | 2820 | 1.1050 | 0.7014 | 1.1050 | 1.0512 | | 0.093 | 4.8571 | 2822 | 1.0421 | 0.7014 | 1.0421 | 1.0209 | | 0.093 | 4.8606 | 2824 | 1.0481 | 0.7101 | 1.0481 | 1.0237 | | 0.093 | 4.8640 | 2826 | 1.0841 | 0.7101 | 1.0841 | 1.0412 | | 0.093 | 4.8675 | 2828 | 1.0416 | 0.7626 | 1.0416 | 1.0206 | | 0.093 | 4.8709 | 2830 | 0.9865 | 0.7844 | 0.9865 | 0.9932 | | 0.093 | 4.8744 | 2832 | 0.9744 | 0.7844 | 0.9744 | 0.9871 | | 0.093 | 4.8778 | 2834 | 0.9640 | 0.7221 | 0.9640 | 0.9819 | | 0.093 | 4.8812 | 2836 | 0.9897 | 0.7216 | 0.9897 | 0.9948 | | 0.093 | 4.8847 | 2838 | 1.0464 | 0.7014 | 1.0464 | 1.0229 | | 0.093 | 4.8881 | 2840 | 1.1244 | 0.7014 | 1.1244 | 1.0604 | | 0.093 | 4.8916 | 2842 | 1.1248 | 0.7014 | 1.1248 | 1.0606 | | 0.093 | 4.8950 | 2844 | 1.0412 | 0.6940 | 1.0412 | 1.0204 | | 0.093 | 4.8985 | 2846 | 0.9431 | 0.7134 | 0.9431 | 0.9711 | | 0.093 | 4.9019 | 2848 | 0.9325 | 0.7221 | 0.9325 | 0.9657 | | 0.093 | 4.9053 | 2850 | 0.9561 | 0.7221 | 0.9561 | 0.9778 | | 0.093 | 4.9088 | 2852 | 1.0079 | 0.6940 | 1.0079 | 1.0040 | | 0.093 | 4.9122 | 2854 | 1.0729 | 0.6940 | 1.0729 | 1.0358 | | 0.093 | 4.9157 | 2856 | 1.0921 | 0.7014 | 1.0921 | 1.0450 | | 0.093 | 4.9191 | 2858 | 1.0759 | 0.7014 | 1.0759 | 1.0373 | | 0.093 | 4.9225 | 2860 | 1.0548 | 0.7101 | 1.0548 | 1.0270 | | 0.093 | 4.9260 | 2862 | 1.0062 | 0.7520 | 1.0062 | 1.0031 | | 0.093 | 4.9294 | 2864 | 0.9362 | 0.7729 | 0.9362 | 0.9676 | | 0.093 | 4.9329 | 2866 | 0.9227 | 0.7729 | 0.9227 | 0.9606 | | 0.093 | 4.9363 | 2868 | 0.9548 | 0.6940 | 0.9548 | 0.9771 | | 0.093 | 4.9398 | 2870 | 0.9648 | 0.6940 | 0.9648 | 0.9822 | | 0.093 | 4.9432 | 2872 | 0.9492 | 0.6940 | 0.9492 | 0.9743 | | 0.093 | 4.9466 | 2874 | 1.0056 | 0.6940 | 1.0056 | 1.0028 | | 0.093 | 4.9501 | 2876 | 1.1033 | 0.6940 | 1.1033 | 1.0504 | | 0.093 | 4.9535 | 2878 | 1.1295 | 0.6940 | 1.1295 | 1.0628 | | 0.093 | 4.9570 | 2880 | 1.0674 | 0.6940 | 1.0674 | 1.0332 | | 0.093 | 4.9604 | 2882 | 0.9604 | 0.6940 | 0.9604 | 0.9800 | | 0.093 | 4.9639 | 2884 | 0.8430 | 0.7729 | 0.8430 | 0.9181 | | 0.093 | 4.9673 | 2886 | 0.7901 | 0.6776 | 0.7901 | 0.8889 | | 0.093 | 4.9707 | 2888 | 0.7982 | 0.6776 | 0.7982 | 0.8934 | | 0.093 | 4.9742 | 2890 | 0.8539 | 0.7525 | 0.8539 | 0.9241 | | 0.093 | 4.9776 | 2892 | 0.8947 | 0.7729 | 0.8947 | 0.9459 | | 0.093 | 4.9811 | 2894 | 0.8931 | 0.7729 | 0.8931 | 0.9451 | | 0.093 | 4.9845 | 2896 | 0.9143 | 0.7618 | 0.9143 | 0.9562 | | 0.093 | 4.9880 | 2898 | 0.9249 | 0.7618 | 0.9249 | 0.9617 | | 0.093 | 4.9914 | 2900 | 0.9200 | 0.7134 | 0.9200 | 0.9592 | | 0.093 | 4.9948 | 2902 | 0.8791 | 0.7618 | 0.8791 | 0.9376 | | 0.093 | 4.9983 | 2904 | 0.8315 | 0.7422 | 0.8315 | 0.9119 | | 0.093 | 5.0017 | 2906 | 0.7962 | 0.7422 | 0.7962 | 0.8923 | | 0.093 | 5.0052 | 2908 | 0.8046 | 0.7422 | 0.8046 | 0.8970 | | 0.093 | 5.0086 | 2910 | 0.8095 | 0.7422 | 0.8095 | 0.8997 | | 0.093 | 5.0120 | 2912 | 0.8318 | 0.7525 | 0.8318 | 0.9120 | | 0.093 | 5.0155 | 2914 | 0.8606 | 0.7618 | 0.8606 | 0.9277 | | 0.093 | 5.0189 | 
2916 | 0.9482 | 0.7134 | 0.9482 | 0.9738 | | 0.093 | 5.0224 | 2918 | 1.0055 | 0.7134 | 1.0055 | 1.0028 | | 0.093 | 5.0258 | 2920 | 0.9838 | 0.7134 | 0.9838 | 0.9919 | | 0.093 | 5.0293 | 2922 | 0.9015 | 0.7618 | 0.9015 | 0.9495 | | 0.093 | 5.0327 | 2924 | 0.8372 | 0.7427 | 0.8372 | 0.9150 | | 0.093 | 5.0361 | 2926 | 0.7814 | 0.7427 | 0.7814 | 0.8840 | | 0.093 | 5.0396 | 2928 | 0.7753 | 0.7427 | 0.7753 | 0.8805 | | 0.093 | 5.0430 | 2930 | 0.8035 | 0.7332 | 0.8035 | 0.8964 | | 0.093 | 5.0465 | 2932 | 0.8362 | 0.7058 | 0.8362 | 0.9144 | | 0.093 | 5.0499 | 2934 | 0.9160 | 0.7134 | 0.9160 | 0.9571 | | 0.093 | 5.0534 | 2936 | 0.9398 | 0.7134 | 0.9398 | 0.9694 | | 0.093 | 5.0568 | 2938 | 0.9966 | 0.6940 | 0.9966 | 0.9983 | | 0.093 | 5.0602 | 2940 | 1.0665 | 0.7014 | 1.0665 | 1.0327 | | 0.093 | 5.0637 | 2942 | 1.0958 | 0.7014 | 1.0958 | 1.0468 | | 0.093 | 5.0671 | 2944 | 1.0613 | 0.7515 | 1.0613 | 1.0302 | | 0.093 | 5.0706 | 2946 | 0.9879 | 0.7844 | 0.9879 | 0.9939 | | 0.093 | 5.0740 | 2948 | 0.9464 | 0.7729 | 0.9464 | 0.9728 | | 0.093 | 5.0775 | 2950 | 0.9082 | 0.7729 | 0.9082 | 0.9530 | | 0.093 | 5.0809 | 2952 | 0.8775 | 0.7623 | 0.8775 | 0.9368 | | 0.093 | 5.0843 | 2954 | 0.9085 | 0.7623 | 0.9085 | 0.9531 | | 0.093 | 5.0878 | 2956 | 0.9787 | 0.7724 | 0.9787 | 0.9893 | | 0.093 | 5.0912 | 2958 | 1.0504 | 0.7014 | 1.0504 | 1.0249 | | 0.093 | 5.0947 | 2960 | 1.0970 | 0.7014 | 1.0970 | 1.0474 | | 0.093 | 5.0981 | 2962 | 1.1499 | 0.7014 | 1.1499 | 1.0723 | | 0.093 | 5.1015 | 2964 | 1.1239 | 0.7014 | 1.1239 | 1.0602 | | 0.093 | 5.1050 | 2966 | 1.0387 | 0.7014 | 1.0387 | 1.0192 | | 0.093 | 5.1084 | 2968 | 0.9670 | 0.7520 | 0.9670 | 0.9834 | | 0.093 | 5.1119 | 2970 | 0.9024 | 0.7729 | 0.9024 | 0.9500 | | 0.093 | 5.1153 | 2972 | 0.8931 | 0.7623 | 0.8931 | 0.9451 | | 0.093 | 5.1188 | 2974 | 0.9301 | 0.7729 | 0.9301 | 0.9644 | | 0.093 | 5.1222 | 2976 | 0.9994 | 0.7014 | 0.9994 | 0.9997 | | 0.093 | 5.1256 | 2978 | 1.1028 | 0.7014 | 1.1028 | 1.0501 | | 0.093 | 5.1291 | 2980 | 1.2200 | 0.7014 | 1.2200 | 1.1045 | | 0.093 | 5.1325 | 2982 | 1.2756 | 0.7014 | 1.2756 | 1.1294 | | 0.093 | 5.1360 | 2984 | 1.2344 | 0.7014 | 1.2344 | 1.1110 | | 0.093 | 5.1394 | 2986 | 1.1429 | 0.7014 | 1.1429 | 1.0691 | | 0.093 | 5.1429 | 2988 | 1.0752 | 0.7014 | 1.0752 | 1.0369 | | 0.093 | 5.1463 | 2990 | 0.9669 | 0.7844 | 0.9669 | 0.9833 | | 0.093 | 5.1497 | 2992 | 0.8463 | 0.6706 | 0.8463 | 0.9200 | | 0.093 | 5.1532 | 2994 | 0.8004 | 0.6706 | 0.8004 | 0.8946 | | 0.093 | 5.1566 | 2996 | 0.8073 | 0.6706 | 0.8073 | 0.8985 | | 0.093 | 5.1601 | 2998 | 0.8575 | 0.6706 | 0.8575 | 0.9260 | | 0.0747 | 5.1635 | 3000 | 0.9750 | 0.7020 | 0.9750 | 0.9874 | | 0.0747 | 5.1670 | 3002 | 1.1668 | 0.7014 | 1.1668 | 1.0802 | | 0.0747 | 5.1704 | 3004 | 1.2956 | 0.6740 | 1.2956 | 1.1383 | | 0.0747 | 5.1738 | 3006 | 1.3041 | 0.6740 | 1.3041 | 1.1420 | | 0.0747 | 5.1773 | 3008 | 1.2135 | 0.7318 | 1.2135 | 1.1016 | | 0.0747 | 5.1807 | 3010 | 1.0576 | 0.6940 | 1.0576 | 1.0284 | | 0.0747 | 5.1842 | 3012 | 0.9034 | 0.7623 | 0.9034 | 0.9505 | | 0.0747 | 5.1876 | 3014 | 0.8375 | 0.7427 | 0.8375 | 0.9151 | | 0.0747 | 5.1910 | 3016 | 0.8257 | 0.7427 | 0.8257 | 0.9087 | | 0.0747 | 5.1945 | 3018 | 0.8541 | 0.7427 | 0.8541 | 0.9242 | | 0.0747 | 5.1979 | 3020 | 0.9046 | 0.7520 | 0.9046 | 0.9511 | | 0.0747 | 5.2014 | 3022 | 0.9672 | 0.7058 | 0.9672 | 0.9834 | | 0.0747 | 5.2048 | 3024 | 1.0082 | 0.6940 | 1.0082 | 1.0041 | | 0.0747 | 5.2083 | 3026 | 1.0039 | 0.7618 | 1.0039 | 1.0019 | | 0.0747 | 5.2117 | 3028 | 0.9942 | 0.7729 | 0.9942 | 0.9971 | | 0.0747 | 5.2151 | 3030 
| 1.0135 | 0.7844 | 1.0135 | 1.0067 | | 0.0747 | 5.2186 | 3032 | 0.9684 | 0.7844 | 0.9684 | 0.9841 | | 0.0747 | 5.2220 | 3034 | 0.9222 | 0.7844 | 0.9222 | 0.9603 | | 0.0747 | 5.2255 | 3036 | 0.9035 | 0.7729 | 0.9035 | 0.9505 | | 0.0747 | 5.2289 | 3038 | 0.9252 | 0.7729 | 0.9252 | 0.9619 | | 0.0747 | 5.2324 | 3040 | 0.9296 | 0.7134 | 0.9296 | 0.9641 | | 0.0747 | 5.2358 | 3042 | 0.9049 | 0.7134 | 0.9049 | 0.9513 | | 0.0747 | 5.2392 | 3044 | 0.8874 | 0.7134 | 0.8874 | 0.9420 | | 0.0747 | 5.2427 | 3046 | 0.8849 | 0.7058 | 0.8849 | 0.9407 | | 0.0747 | 5.2461 | 3048 | 0.8468 | 0.7332 | 0.8468 | 0.9202 | | 0.0747 | 5.2496 | 3050 | 0.8215 | 0.7332 | 0.8215 | 0.9063 | | 0.0747 | 5.2530 | 3052 | 0.7912 | 0.7427 | 0.7912 | 0.8895 | | 0.0747 | 5.2565 | 3054 | 0.7873 | 0.7427 | 0.7873 | 0.8873 | | 0.0747 | 5.2599 | 3056 | 0.7841 | 0.7427 | 0.7841 | 0.8855 | | 0.0747 | 5.2633 | 3058 | 0.8000 | 0.7427 | 0.8000 | 0.8944 | | 0.0747 | 5.2668 | 3060 | 0.8375 | 0.7427 | 0.8375 | 0.9151 | | 0.0747 | 5.2702 | 3062 | 0.8906 | 0.7623 | 0.8906 | 0.9437 | | 0.0747 | 5.2737 | 3064 | 0.8981 | 0.7623 | 0.8981 | 0.9477 | | 0.0747 | 5.2771 | 3066 | 0.8703 | 0.7427 | 0.8703 | 0.9329 | | 0.0747 | 5.2806 | 3068 | 0.8833 | 0.7427 | 0.8833 | 0.9398 | | 0.0747 | 5.2840 | 3070 | 0.9352 | 0.7729 | 0.9352 | 0.9670 | | 0.0747 | 5.2874 | 3072 | 0.9680 | 0.7618 | 0.9680 | 0.9839 | | 0.0747 | 5.2909 | 3074 | 1.0049 | 0.7216 | 1.0049 | 1.0025 | | 0.0747 | 5.2943 | 3076 | 0.9936 | 0.7844 | 0.9936 | 0.9968 | | 0.0747 | 5.2978 | 3078 | 0.9495 | 0.7631 | 0.9495 | 0.9744 | | 0.0747 | 5.3012 | 3080 | 0.9608 | 0.7844 | 0.9608 | 0.9802 | | 0.0747 | 5.3046 | 3082 | 0.9620 | 0.7729 | 0.9620 | 0.9808 | | 0.0747 | 5.3081 | 3084 | 0.9566 | 0.7729 | 0.9566 | 0.9780 | | 0.0747 | 5.3115 | 3086 | 0.9654 | 0.6871 | 0.9654 | 0.9825 | | 0.0747 | 5.3150 | 3088 | 0.9781 | 0.6871 | 0.9781 | 0.9890 | | 0.0747 | 5.3184 | 3090 | 0.9226 | 0.7058 | 0.9226 | 0.9605 | | 0.0747 | 5.3219 | 3092 | 0.8537 | 0.6877 | 0.8537 | 0.9240 | | 0.0747 | 5.3253 | 3094 | 0.8312 | 0.6877 | 0.8312 | 0.9117 | | 0.0747 | 5.3287 | 3096 | 0.8413 | 0.6877 | 0.8413 | 0.9172 | | 0.0747 | 5.3322 | 3098 | 0.8839 | 0.6877 | 0.8839 | 0.9402 | | 0.0747 | 5.3356 | 3100 | 0.9488 | 0.7058 | 0.9488 | 0.9741 | | 0.0747 | 5.3391 | 3102 | 0.9855 | 0.6871 | 0.9855 | 0.9927 | | 0.0747 | 5.3425 | 3104 | 0.9659 | 0.7058 | 0.9659 | 0.9828 | | 0.0747 | 5.3460 | 3106 | 0.9494 | 0.7058 | 0.9494 | 0.9744 | | 0.0747 | 5.3494 | 3108 | 0.9064 | 0.7332 | 0.9064 | 0.9520 | | 0.0747 | 5.3528 | 3110 | 0.9000 | 0.7427 | 0.9000 | 0.9487 | | 0.0747 | 5.3563 | 3112 | 0.8792 | 0.7427 | 0.8792 | 0.9377 | | 0.0747 | 5.3597 | 3114 | 0.8972 | 0.7427 | 0.8972 | 0.9472 | | 0.0747 | 5.3632 | 3116 | 0.9302 | 0.7623 | 0.9302 | 0.9644 | | 0.0747 | 5.3666 | 3118 | 0.9538 | 0.7724 | 0.9538 | 0.9766 | | 0.0747 | 5.3701 | 3120 | 0.9776 | 0.7724 | 0.9776 | 0.9887 | | 0.0747 | 5.3735 | 3122 | 0.9534 | 0.7520 | 0.9534 | 0.9764 | | 0.0747 | 5.3769 | 3124 | 0.9156 | 0.7520 | 0.9156 | 0.9569 | | 0.0747 | 5.3804 | 3126 | 0.8381 | 0.7520 | 0.8381 | 0.9155 | | 0.0747 | 5.3838 | 3128 | 0.8138 | 0.7717 | 0.8138 | 0.9021 | | 0.0747 | 5.3873 | 3130 | 0.8228 | 0.7906 | 0.8228 | 0.9071 | | 0.0747 | 5.3907 | 3132 | 0.8711 | 0.7623 | 0.8711 | 0.9333 | | 0.0747 | 5.3941 | 3134 | 0.8825 | 0.7520 | 0.8825 | 0.9394 | | 0.0747 | 5.3976 | 3136 | 0.9027 | 0.7327 | 0.9027 | 0.9501 | | 0.0747 | 5.4010 | 3138 | 0.8858 | 0.7520 | 0.8858 | 0.9412 | | 0.0747 | 5.4045 | 3140 | 0.8667 | 0.7797 | 0.8667 | 0.9310 | | 0.0747 | 5.4079 | 3142 | 0.8389 | 0.7906 | 0.8389 
| 0.9159 | | 0.0747 | 5.4114 | 3144 | 0.7985 | 0.7906 | 0.7985 | 0.8936 | | 0.0747 | 5.4148 | 3146 | 0.7903 | 0.7906 | 0.7903 | 0.8890 | | 0.0747 | 5.4182 | 3148 | 0.8289 | 0.7797 | 0.8289 | 0.9104 | | 0.0747 | 5.4217 | 3150 | 0.8903 | 0.7797 | 0.8903 | 0.9436 | | 0.0747 | 5.4251 | 3152 | 0.8926 | 0.7797 | 0.8926 | 0.9448 | | 0.0747 | 5.4286 | 3154 | 0.8797 | 0.7797 | 0.8797 | 0.9379 | | 0.0747 | 5.4320 | 3156 | 0.8540 | 0.7797 | 0.8540 | 0.9241 | | 0.0747 | 5.4355 | 3158 | 0.8324 | 0.7797 | 0.8324 | 0.9124 | | 0.0747 | 5.4389 | 3160 | 0.8058 | 0.7797 | 0.8058 | 0.8976 | | 0.0747 | 5.4423 | 3162 | 0.8134 | 0.7797 | 0.8134 | 0.9019 | | 0.0747 | 5.4458 | 3164 | 0.8076 | 0.7797 | 0.8076 | 0.8987 | | 0.0747 | 5.4492 | 3166 | 0.7578 | 0.7797 | 0.7578 | 0.8705 | | 0.0747 | 5.4527 | 3168 | 0.6981 | 0.7616 | 0.6981 | 0.8355 | | 0.0747 | 5.4561 | 3170 | 0.6939 | 0.7616 | 0.6939 | 0.8330 | | 0.0747 | 5.4596 | 3172 | 0.7263 | 0.7797 | 0.7263 | 0.8522 | | 0.0747 | 5.4630 | 3174 | 0.7884 | 0.7797 | 0.7884 | 0.8879 | | 0.0747 | 5.4664 | 3176 | 0.8700 | 0.7342 | 0.8700 | 0.9327 | | 0.0747 | 5.4699 | 3178 | 0.9174 | 0.7058 | 0.9174 | 0.9578 | | 0.0747 | 5.4733 | 3180 | 0.9045 | 0.7520 | 0.9045 | 0.9511 | | 0.0747 | 5.4768 | 3182 | 0.8565 | 0.7623 | 0.8565 | 0.9255 | | 0.0747 | 5.4802 | 3184 | 0.8243 | 0.7906 | 0.8243 | 0.9079 | | 0.0747 | 5.4836 | 3186 | 0.7657 | 0.7906 | 0.7657 | 0.8751 | | 0.0747 | 5.4871 | 3188 | 0.7098 | 0.7037 | 0.7098 | 0.8425 | | 0.0747 | 5.4905 | 3190 | 0.6866 | 0.7003 | 0.6866 | 0.8286 | | 0.0747 | 5.4940 | 3192 | 0.6860 | 0.7614 | 0.6860 | 0.8282 | | 0.0747 | 5.4974 | 3194 | 0.7134 | 0.7614 | 0.7134 | 0.8446 | | 0.0747 | 5.5009 | 3196 | 0.7449 | 0.7358 | 0.7449 | 0.8631 | | 0.0747 | 5.5043 | 3198 | 0.7860 | 0.7281 | 0.7860 | 0.8866 | | 0.0747 | 5.5077 | 3200 | 0.7793 | 0.7281 | 0.7793 | 0.8828 | | 0.0747 | 5.5112 | 3202 | 0.7870 | 0.7281 | 0.7870 | 0.8871 | | 0.0747 | 5.5146 | 3204 | 0.7765 | 0.7281 | 0.7765 | 0.8812 | | 0.0747 | 5.5181 | 3206 | 0.7990 | 0.7430 | 0.7990 | 0.8939 | | 0.0747 | 5.5215 | 3208 | 0.8721 | 0.7058 | 0.8721 | 0.9339 | | 0.0747 | 5.5250 | 3210 | 0.9112 | 0.7139 | 0.9112 | 0.9546 | | 0.0747 | 5.5284 | 3212 | 0.9006 | 0.7623 | 0.9006 | 0.9490 | | 0.0747 | 5.5318 | 3214 | 0.9005 | 0.7623 | 0.9005 | 0.9489 | | 0.0747 | 5.5353 | 3216 | 0.8699 | 0.7623 | 0.8699 | 0.9327 | | 0.0747 | 5.5387 | 3218 | 0.8513 | 0.7623 | 0.8513 | 0.9227 | | 0.0747 | 5.5422 | 3220 | 0.8260 | 0.7623 | 0.8260 | 0.9089 | | 0.0747 | 5.5456 | 3222 | 0.8268 | 0.7427 | 0.8268 | 0.9093 | | 0.0747 | 5.5491 | 3224 | 0.8364 | 0.7427 | 0.8364 | 0.9146 | | 0.0747 | 5.5525 | 3226 | 0.8590 | 0.7623 | 0.8590 | 0.9268 | | 0.0747 | 5.5559 | 3228 | 0.8719 | 0.7139 | 0.8719 | 0.9338 | | 0.0747 | 5.5594 | 3230 | 0.8870 | 0.7430 | 0.8870 | 0.9418 | | 0.0747 | 5.5628 | 3232 | 0.8862 | 0.7430 | 0.8862 | 0.9414 | | 0.0747 | 5.5663 | 3234 | 0.8582 | 0.7430 | 0.8582 | 0.9264 | | 0.0747 | 5.5697 | 3236 | 0.8389 | 0.7430 | 0.8389 | 0.9159 | | 0.0747 | 5.5731 | 3238 | 0.8455 | 0.7430 | 0.8455 | 0.9195 | | 0.0747 | 5.5766 | 3240 | 0.8567 | 0.7430 | 0.8567 | 0.9256 | | 0.0747 | 5.5800 | 3242 | 0.8395 | 0.7906 | 0.8395 | 0.9162 | | 0.0747 | 5.5835 | 3244 | 0.8192 | 0.7717 | 0.8192 | 0.9051 | | 0.0747 | 5.5869 | 3246 | 0.8262 | 0.7717 | 0.8262 | 0.9089 | | 0.0747 | 5.5904 | 3248 | 0.8373 | 0.7906 | 0.8373 | 0.9150 | | 0.0747 | 5.5938 | 3250 | 0.8707 | 0.7430 | 0.8707 | 0.9331 | | 0.0747 | 5.5972 | 3252 | 0.8729 | 0.7139 | 0.8729 | 0.9343 | | 0.0747 | 5.6007 | 3254 | 0.8703 | 0.7139 | 0.8703 | 0.9329 | | 0.0747 | 
5.6041 | 3256 | 0.8690 | 0.7139 | 0.8690 | 0.9322 | | 0.0747 | 5.6076 | 3258 | 0.8619 | 0.7430 | 0.8619 | 0.9284 | | 0.0747 | 5.6110 | 3260 | 0.8511 | 0.7430 | 0.8511 | 0.9225 | | 0.0747 | 5.6145 | 3262 | 0.8698 | 0.7430 | 0.8698 | 0.9327 | | 0.0747 | 5.6179 | 3264 | 0.8647 | 0.7906 | 0.8647 | 0.9299 | | 0.0747 | 5.6213 | 3266 | 0.8368 | 0.7906 | 0.8368 | 0.9147 | | 0.0747 | 5.6248 | 3268 | 0.8018 | 0.7906 | 0.8018 | 0.8955 | | 0.0747 | 5.6282 | 3270 | 0.7609 | 0.7717 | 0.7609 | 0.8723 | | 0.0747 | 5.6317 | 3272 | 0.7568 | 0.7717 | 0.7568 | 0.8699 | | 0.0747 | 5.6351 | 3274 | 0.7973 | 0.7906 | 0.7973 | 0.8929 | | 0.0747 | 5.6386 | 3276 | 0.8800 | 0.7906 | 0.8800 | 0.9381 | | 0.0747 | 5.6420 | 3278 | 0.9698 | 0.7618 | 0.9698 | 0.9848 | | 0.0747 | 5.6454 | 3280 | 1.0338 | 0.7511 | 1.0338 | 1.0168 | | 0.0747 | 5.6489 | 3282 | 1.0538 | 0.7511 | 1.0538 | 1.0265 | | 0.0747 | 5.6523 | 3284 | 1.0620 | 0.7511 | 1.0620 | 1.0306 | | 0.0747 | 5.6558 | 3286 | 1.0157 | 0.7844 | 1.0157 | 1.0078 | | 0.0747 | 5.6592 | 3288 | 0.9338 | 0.7844 | 0.9338 | 0.9663 | | 0.0747 | 5.6627 | 3290 | 0.8643 | 0.7729 | 0.8643 | 0.9297 | | 0.0747 | 5.6661 | 3292 | 0.8077 | 0.7823 | 0.8077 | 0.8987 | | 0.0747 | 5.6695 | 3294 | 0.7973 | 0.7823 | 0.7973 | 0.8929 | | 0.0747 | 5.6730 | 3296 | 0.8067 | 0.7823 | 0.8067 | 0.8982 | | 0.0747 | 5.6764 | 3298 | 0.8382 | 0.8019 | 0.8382 | 0.9155 | | 0.0747 | 5.6799 | 3300 | 0.8706 | 0.7729 | 0.8706 | 0.9331 | | 0.0747 | 5.6833 | 3302 | 0.8829 | 0.7729 | 0.8829 | 0.9396 | | 0.0747 | 5.6867 | 3304 | 0.9048 | 0.7729 | 0.9048 | 0.9512 | | 0.0747 | 5.6902 | 3306 | 0.9083 | 0.7729 | 0.9083 | 0.9531 | | 0.0747 | 5.6936 | 3308 | 0.8608 | 0.7823 | 0.8608 | 0.9278 | | 0.0747 | 5.6971 | 3310 | 0.8288 | 0.7118 | 0.8288 | 0.9104 | | 0.0747 | 5.7005 | 3312 | 0.8190 | 0.7118 | 0.8190 | 0.9050 | | 0.0747 | 5.7040 | 3314 | 0.8346 | 0.7823 | 0.8346 | 0.9136 | | 0.0747 | 5.7074 | 3316 | 0.8495 | 0.7717 | 0.8495 | 0.9217 | | 0.0747 | 5.7108 | 3318 | 0.8958 | 0.7426 | 0.8958 | 0.9464 | | 0.0747 | 5.7143 | 3320 | 0.9221 | 0.7134 | 0.9221 | 0.9603 | | 0.0747 | 5.7177 | 3322 | 0.9595 | 0.7134 | 0.9595 | 0.9795 | | 0.0747 | 5.7212 | 3324 | 0.9872 | 0.7134 | 0.9872 | 0.9936 | | 0.0747 | 5.7246 | 3326 | 0.9587 | 0.7134 | 0.9587 | 0.9791 | | 0.0747 | 5.7281 | 3328 | 0.9075 | 0.7058 | 0.9075 | 0.9526 | | 0.0747 | 5.7315 | 3330 | 0.8997 | 0.7342 | 0.8997 | 0.9485 | | 0.0747 | 5.7349 | 3332 | 0.8699 | 0.7342 | 0.8699 | 0.9327 | | 0.0747 | 5.7384 | 3334 | 0.8585 | 0.7342 | 0.8585 | 0.9266 | | 0.0747 | 5.7418 | 3336 | 0.8689 | 0.7342 | 0.8689 | 0.9322 | | 0.0747 | 5.7453 | 3338 | 0.8748 | 0.7342 | 0.8748 | 0.9353 | | 0.0747 | 5.7487 | 3340 | 0.9345 | 0.7058 | 0.9345 | 0.9667 | | 0.0747 | 5.7522 | 3342 | 1.0603 | 0.7421 | 1.0603 | 1.0297 | | 0.0747 | 5.7556 | 3344 | 1.1264 | 0.7421 | 1.1264 | 1.0613 | | 0.0747 | 5.7590 | 3346 | 1.1544 | 0.7234 | 1.1544 | 1.0744 | | 0.0747 | 5.7625 | 3348 | 1.0975 | 0.7134 | 1.0975 | 1.0476 | | 0.0747 | 5.7659 | 3350 | 0.9816 | 0.7134 | 0.9816 | 0.9908 | | 0.0747 | 5.7694 | 3352 | 0.8943 | 0.7729 | 0.8943 | 0.9457 | | 0.0747 | 5.7728 | 3354 | 0.7892 | 0.7717 | 0.7892 | 0.8884 | | 0.0747 | 5.7762 | 3356 | 0.7206 | 0.7037 | 0.7206 | 0.8489 | | 0.0747 | 5.7797 | 3358 | 0.7061 | 0.7037 | 0.7061 | 0.8403 | | 0.0747 | 5.7831 | 3360 | 0.7163 | 0.7717 | 0.7163 | 0.8463 | | 0.0747 | 5.7866 | 3362 | 0.7507 | 0.7717 | 0.7507 | 0.8664 | | 0.0747 | 5.7900 | 3364 | 0.8105 | 0.7249 | 0.8105 | 0.9003 | | 0.0747 | 5.7935 | 3366 | 0.8597 | 0.7058 | 0.8597 | 0.9272 | | 0.0747 | 5.7969 | 3368 | 0.8773 | 
0.7058 | 0.8773 | 0.9366 | | 0.0747 | 5.8003 | 3370 | 0.8523 | 0.7058 | 0.8523 | 0.9232 | | 0.0747 | 5.8038 | 3372 | 0.8112 | 0.7139 | 0.8112 | 0.9007 | | 0.0747 | 5.8072 | 3374 | 0.7645 | 0.7717 | 0.7645 | 0.8743 | | 0.0747 | 5.8107 | 3376 | 0.7376 | 0.7717 | 0.7376 | 0.8588 | | 0.0747 | 5.8141 | 3378 | 0.7075 | 0.7717 | 0.7075 | 0.8411 | | 0.0747 | 5.8176 | 3380 | 0.7168 | 0.7717 | 0.7168 | 0.8466 | | 0.0747 | 5.8210 | 3382 | 0.7585 | 0.7717 | 0.7585 | 0.8709 | | 0.0747 | 5.8244 | 3384 | 0.8238 | 0.7623 | 0.8238 | 0.9076 | | 0.0747 | 5.8279 | 3386 | 0.9092 | 0.7134 | 0.9092 | 0.9535 | | 0.0747 | 5.8313 | 3388 | 0.9471 | 0.7134 | 0.9471 | 0.9732 | | 0.0747 | 5.8348 | 3390 | 0.9227 | 0.7134 | 0.9227 | 0.9606 | | 0.0747 | 5.8382 | 3392 | 0.8657 | 0.7623 | 0.8657 | 0.9304 | | 0.0747 | 5.8417 | 3394 | 0.7902 | 0.7623 | 0.7902 | 0.8889 | | 0.0747 | 5.8451 | 3396 | 0.7653 | 0.7717 | 0.7653 | 0.8748 | | 0.0747 | 5.8485 | 3398 | 0.7799 | 0.7906 | 0.7799 | 0.8831 | | 0.0747 | 5.8520 | 3400 | 0.7953 | 0.7623 | 0.7953 | 0.8918 | | 0.0747 | 5.8554 | 3402 | 0.8638 | 0.7729 | 0.8638 | 0.9294 | | 0.0747 | 5.8589 | 3404 | 0.9177 | 0.7520 | 0.9177 | 0.9580 | | 0.0747 | 5.8623 | 3406 | 0.9266 | 0.7520 | 0.9266 | 0.9626 | | 0.0747 | 5.8657 | 3408 | 0.8869 | 0.7729 | 0.8869 | 0.9418 | | 0.0747 | 5.8692 | 3410 | 0.8149 | 0.8019 | 0.8149 | 0.9027 | | 0.0747 | 5.8726 | 3412 | 0.7889 | 0.7906 | 0.7889 | 0.8882 | | 0.0747 | 5.8761 | 3414 | 0.7519 | 0.7717 | 0.7519 | 0.8671 | | 0.0747 | 5.8795 | 3416 | 0.7517 | 0.7717 | 0.7517 | 0.8670 | | 0.0747 | 5.8830 | 3418 | 0.7944 | 0.7823 | 0.7944 | 0.8913 | | 0.0747 | 5.8864 | 3420 | 0.8674 | 0.8019 | 0.8674 | 0.9313 | | 0.0747 | 5.8898 | 3422 | 0.9750 | 0.7134 | 0.9750 | 0.9874 | | 0.0747 | 5.8933 | 3424 | 1.0894 | 0.6940 | 1.0894 | 1.0438 | | 0.0747 | 5.8967 | 3426 | 1.1155 | 0.6940 | 1.1155 | 1.0562 | | 0.0747 | 5.9002 | 3428 | 1.0648 | 0.7134 | 1.0648 | 1.0319 | | 0.0747 | 5.9036 | 3430 | 1.0067 | 0.7729 | 1.0067 | 1.0033 | | 0.0747 | 5.9071 | 3432 | 0.9113 | 0.8019 | 0.9113 | 0.9546 | | 0.0747 | 5.9105 | 3434 | 0.8503 | 0.7823 | 0.8503 | 0.9221 | | 0.0747 | 5.9139 | 3436 | 0.7971 | 0.7118 | 0.7971 | 0.8928 | | 0.0747 | 5.9174 | 3438 | 0.7727 | 0.7118 | 0.7727 | 0.8790 | | 0.0747 | 5.9208 | 3440 | 0.7709 | 0.7118 | 0.7709 | 0.8780 | | 0.0747 | 5.9243 | 3442 | 0.8057 | 0.7823 | 0.8057 | 0.8976 | | 0.0747 | 5.9277 | 3444 | 0.8727 | 0.8019 | 0.8727 | 0.9342 | | 0.0747 | 5.9312 | 3446 | 0.9622 | 0.7426 | 0.9622 | 0.9809 | | 0.0747 | 5.9346 | 3448 | 1.0046 | 0.7134 | 1.0046 | 1.0023 | | 0.0747 | 5.9380 | 3450 | 0.9754 | 0.7426 | 0.9754 | 0.9876 | | 0.0747 | 5.9415 | 3452 | 0.8978 | 0.7426 | 0.8978 | 0.9475 | | 0.0747 | 5.9449 | 3454 | 0.8033 | 0.7823 | 0.8033 | 0.8963 | | 0.0747 | 5.9484 | 3456 | 0.7569 | 0.7823 | 0.7569 | 0.8700 | | 0.0747 | 5.9518 | 3458 | 0.7634 | 0.7823 | 0.7634 | 0.8737 | | 0.0747 | 5.9552 | 3460 | 0.7900 | 0.8019 | 0.7900 | 0.8888 | | 0.0747 | 5.9587 | 3462 | 0.8167 | 0.8019 | 0.8167 | 0.9037 | | 0.0747 | 5.9621 | 3464 | 0.8470 | 0.8019 | 0.8470 | 0.9203 | | 0.0747 | 5.9656 | 3466 | 0.9000 | 0.7426 | 0.9000 | 0.9487 | | 0.0747 | 5.9690 | 3468 | 0.9039 | 0.8019 | 0.9039 | 0.9507 | | 0.0747 | 5.9725 | 3470 | 0.9062 | 0.7729 | 0.9062 | 0.9519 | | 0.0747 | 5.9759 | 3472 | 0.8772 | 0.8019 | 0.8772 | 0.9366 | | 0.0747 | 5.9793 | 3474 | 0.8620 | 0.8019 | 0.8620 | 0.9284 | | 0.0747 | 5.9828 | 3476 | 0.8475 | 0.8019 | 0.8475 | 0.9206 | | 0.0747 | 5.9862 | 3478 | 0.8537 | 0.8019 | 0.8537 | 0.9239 | | 0.0747 | 5.9897 | 3480 | 0.8626 | 0.8019 | 0.8626 | 0.9288 | 
| 0.0747 | 5.9931 | 3482 | 0.8315 | 0.8019 | 0.8315 | 0.9119 |
| 0.0747 | 5.9966 | 3484 | 0.7810 | 0.7801 | 0.7810 | 0.8837 |
| 0.0747 | 6.0 | 3486 | 0.7569 | 0.7801 | 0.7569 | 0.8700 |
| 0.0747 | 6.0034 | 3488 | 0.7723 | 0.7801 | 0.7723 | 0.8788 |
| 0.0747 | 6.0069 | 3490 | 0.8112 | 0.7902 | 0.8112 | 0.9006 |
| 0.0747 | 6.0103 | 3492 | 0.8605 | 0.7426 | 0.8605 | 0.9276 |
| 0.0747 | 6.0138 | 3494 | 0.8741 | 0.7618 | 0.8741 | 0.9349 |
| 0.0747 | 6.0172 | 3496 | 0.8833 | 0.7618 | 0.8833 | 0.9399 |
| 0.0747 | 6.0207 | 3498 | 0.9205 | 0.7618 | 0.9205 | 0.9594 |
| 0.0652 | 6.0241 | 3500 | 0.9241 | 0.7618 | 0.9241 | 0.9613 |
| 0.0652 | 6.0275 | 3502 | 0.8828 | 0.7618 | 0.8828 | 0.9396 |
| 0.0652 | 6.0310 | 3504 | 0.8355 | 0.6988 | 0.8355 | 0.9140 |
| 0.0652 | 6.0344 | 3506 | 0.8342 | 0.6988 | 0.8342 | 0.9133 |
| 0.0652 | 6.0379 | 3508 | 0.8283 | 0.6988 | 0.8283 | 0.9101 |
| 0.0652 | 6.0413 | 3510 | 0.8423 | 0.6988 | 0.8423 | 0.9178 |
| 0.0652 | 6.0448 | 3512 | 0.8580 | 0.6988 | 0.8580 | 0.9263 |
| 0.0652 | 6.0482 | 3514 | 0.9147 | 0.6988 | 0.9147 | 0.9564 |
| 0.0652 | 6.0516 | 3516 | 0.9467 | 0.6988 | 0.9467 | 0.9730 |
| 0.0652 | 6.0551 | 3518 | 0.9593 | 0.6988 | 0.9593 | 0.9794 |
| 0.0652 | 6.0585 | 3520 | 0.9392 | 0.6988 | 0.9392 | 0.9691 |
| 0.0652 | 6.0620 | 3522 | 0.9293 | 0.6988 | 0.9293 | 0.9640 |
| 0.0652 | 6.0654 | 3524 | 0.9196 | 0.6988 | 0.9196 | 0.9589 |
| 0.0652 | 6.0688 | 3526 | 0.8870 | 0.6988 | 0.8870 | 0.9418 |
| 0.0652 | 6.0723 | 3528 | 0.8736 | 0.6988 | 0.8736 | 0.9346 |
| 0.0652 | 6.0757 | 3530 | 0.8650 | 0.6903 | 0.8650 | 0.9300 |
| 0.0652 | 6.0792 | 3532 | 0.8616 | 0.7618 | 0.8616 | 0.9282 |
| 0.0652 | 6.0826 | 3534 | 0.8543 | 0.7902 | 0.8543 | 0.9243 |
| 0.0652 | 6.0861 | 3536 | 0.8804 | 0.7618 | 0.8804 | 0.9383 |
| 0.0652 | 6.0895 | 3538 | 0.9158 | 0.7134 | 0.9158 | 0.9570 |
| 0.0652 | 6.0929 | 3540 | 0.9124 | 0.7134 | 0.9124 | 0.9552 |
| 0.0652 | 6.0964 | 3542 | 0.8869 | 0.7618 | 0.8869 | 0.9418 |
| 0.0652 | 6.0998 | 3544 | 0.8583 | 0.6988 | 0.8583 | 0.9265 |
| 0.0652 | 6.1033 | 3546 | 0.8511 | 0.6988 | 0.8511 | 0.9225 |
| 0.0652 | 6.1067 | 3548 | 0.8557 | 0.6988 | 0.8557 | 0.9250 |
| 0.0652 | 6.1102 | 3550 | 0.8710 | 0.6988 | 0.8710 | 0.9333 |
| 0.0652 | 6.1136 | 3552 | 0.8971 | 0.7618 | 0.8971 | 0.9472 |
| 0.0652 | 6.1170 | 3554 | 0.8946 | 0.7618 | 0.8946 | 0.9458 |
| 0.0652 | 6.1205 | 3556 | 0.8741 | 0.7618 | 0.8741 | 0.9350 |
| 0.0652 | 6.1239 | 3558 | 0.8403 | 0.7902 | 0.8403 | 0.9167 |
| 0.0652 | 6.1274 | 3560 | 0.8307 | 0.7797 | 0.8307 | 0.9114 |
| 0.0652 | 6.1308 | 3562 | 0.8082 | 0.7906 | 0.8082 | 0.8990 |
| 0.0652 | 6.1343 | 3564 | 0.7992 | 0.7232 | 0.7992 | 0.8940 |
| 0.0652 | 6.1377 | 3566 | 0.8221 | 0.7906 | 0.8221 | 0.9067 |
| 0.0652 | 6.1411 | 3568 | 0.8584 | 0.7902 | 0.8584 | 0.9265 |
| 0.0652 | 6.1446 | 3570 | 0.8839 | 0.7902 | 0.8839 | 0.9401 |
| 0.0652 | 6.1480 | 3572 | 0.9095 | 0.7902 | 0.9095 | 0.9537 |
| 0.0652 | 6.1515 | 3574 | 0.8919 | 0.7902 | 0.8919 | 0.9444 |
| 0.0652 | 6.1549 | 3576 | 0.8563 | 0.7797 | 0.8563 | 0.9253 |
| 0.0652 | 6.1583 | 3578 | 0.8053 | 0.7797 | 0.8053 | 0.8974 |
| 0.0652 | 6.1618 | 3580 | 0.7603 | 0.7232 | 0.7603 | 0.8719 |
| 0.0652 | 6.1652 | 3582 | 0.7467 | 0.7176 | 0.7467 | 0.8641 |
| 0.0652 | 6.1687 | 3584 | 0.7616 | 0.7689 | 0.7616 | 0.8727 |
| 0.0652 | 6.1721 | 3586 | 0.8052 | 0.7354 | 0.8052 | 0.8973 |
| 0.0652 | 6.1756 | 3588 | 0.8650 | 0.7342 | 0.8650 | 0.9300 |
| 0.0652 | 6.1790 | 3590 | 0.9277 | 0.7342 | 0.9277 | 0.9632 |
| 0.0652 | 6.1824 | 3592 | 0.9431 | 0.7342 | 0.9431 | 0.9711 |
| 0.0652 | 6.1859 | 3594 | 0.9232 | 0.7342 | 0.9232 | 0.9608 |
| 0.0652 | 6.1893 | 3596 | 0.9247 | 0.7426 | 0.9247 | 0.9616 |
| 0.0652 | 6.1928 | 3598 | 0.9505 | 0.7426 | 0.9505 | 0.9749 |
| 0.0652 | 6.1962 | 3600 | 0.9760 | 0.7134 | 0.9760 | 0.9880 |
| 0.0652 | 6.1997 | 3602 | 0.9643 | 0.7134 | 0.9643 | 0.9820 |
| 0.0652 | 6.2031 | 3604 | 0.9136 | 0.7902 | 0.9136 | 0.9558 |
| 0.0652 | 6.2065 | 3606 | 0.8451 | 0.7232 | 0.8451 | 0.9193 |
| 0.0652 | 6.2100 | 3608 | 0.8141 | 0.7232 | 0.8141 | 0.9023 |
| 0.0652 | 6.2134 | 3610 | 0.8230 | 0.7232 | 0.8230 | 0.9072 |
| 0.0652 | 6.2169 | 3612 | 0.8329 | 0.7232 | 0.8329 | 0.9126 |
| 0.0652 | 6.2203 | 3614 | 0.8557 | 0.7906 | 0.8557 | 0.9250 |
| 0.0652 | 6.2238 | 3616 | 0.8733 | 0.7797 | 0.8733 | 0.9345 |
| 0.0652 | 6.2272 | 3618 | 0.8996 | 0.7342 | 0.8996 | 0.9485 |
| 0.0652 | 6.2306 | 3620 | 0.9457 | 0.7426 | 0.9457 | 0.9725 |
| 0.0652 | 6.2341 | 3622 | 0.9779 | 0.7134 | 0.9779 | 0.9889 |
| 0.0652 | 6.2375 | 3624 | 1.0068 | 0.7134 | 1.0068 | 1.0034 |
| 0.0652 | 6.2410 | 3626 | 0.9875 | 0.7134 | 0.9875 | 0.9937 |
| 0.0652 | 6.2444 | 3628 | 0.9548 | 0.7618 | 0.9548 | 0.9771 |
| 0.0652 | 6.2478 | 3630 | 0.9490 | 0.7618 | 0.9490 | 0.9741 |
| 0.0652 | 6.2513 | 3632 | 0.8985 | 0.7729 | 0.8985 | 0.9479 |
| 0.0652 | 6.2547 | 3634 | 0.8454 | 0.7232 | 0.8454 | 0.9194 |
| 0.0652 | 6.2582 | 3636 | 0.8177 | 0.7232 | 0.8177 | 0.9042 |
| 0.0652 | 6.2616 | 3638 | 0.8138 | 0.7906 | 0.8138 | 0.9021 |
| 0.0652 | 6.2651 | 3640 | 0.8294 | 0.7906 | 0.8294 | 0.9107 |
| 0.0652 | 6.2685 | 3642 | 0.8425 | 0.7342 | 0.8425 | 0.9179 |
| 0.0652 | 6.2719 | 3644 | 0.8719 | 0.7342 | 0.8719 | 0.9337 |
| 0.0652 | 6.2754 | 3646 | 0.8603 | 0.7354 | 0.8603 | 0.9276 |
| 0.0652 | 6.2788 | 3648 | 0.8350 | 0.7281 | 0.8350 | 0.9138 |
| 0.0652 | 6.2823 | 3650 | 0.8428 | 0.7354 | 0.8428 | 0.9180 |
| 0.0652 | 6.2857 | 3652 | 0.8639 | 0.7354 | 0.8639 | 0.9295 |
| 0.0652 | 6.2892 | 3654 | 0.8409 | 0.7354 | 0.8409 | 0.9170 |
| 0.0652 | 6.2926 | 3656 | 0.7942 | 0.7879 | 0.7942 | 0.8912 |
| 0.0652 | 6.2960 | 3658 | 0.7721 | 0.7704 | 0.7721 | 0.8787 |
| 0.0652 | 6.2995 | 3660 | 0.7744 | 0.7704 | 0.7744 | 0.8800 |
| 0.0652 | 6.3029 | 3662 | 0.7715 | 0.7704 | 0.7715 | 0.8784 |
| 0.0652 | 6.3064 | 3664 | 0.7891 | 0.7704 | 0.7891 | 0.8883 |
| 0.0652 | 6.3098 | 3666 | 0.8017 | 0.7717 | 0.8017 | 0.8954 |
| 0.0652 | 6.3133 | 3668 | 0.8106 | 0.7717 | 0.8106 | 0.9003 |
| 0.0652 | 6.3167 | 3670 | 0.8523 | 0.7623 | 0.8523 | 0.9232 |
| 0.0652 | 6.3201 | 3672 | 0.8875 | 0.7139 | 0.8875 | 0.9421 |
| 0.0652 | 6.3236 | 3674 | 0.9202 | 0.7058 | 0.9202 | 0.9592 |
| 0.0652 | 6.3270 | 3676 | 0.9119 | 0.7623 | 0.9119 | 0.9549 |
| 0.0652 | 6.3305 | 3678 | 0.8903 | 0.7623 | 0.8903 | 0.9436 |
| 0.0652 | 6.3339 | 3680 | 0.8386 | 0.7717 | 0.8386 | 0.9157 |
| 0.0652 | 6.3373 | 3682 | 0.7973 | 0.7037 | 0.7973 | 0.8929 |
| 0.0652 | 6.3408 | 3684 | 0.7719 | 0.6893 | 0.7719 | 0.8786 |
| 0.0652 | 6.3442 | 3686 | 0.7605 | 0.6829 | 0.7605 | 0.8721 |
| 0.0652 | 6.3477 | 3688 | 0.7730 | 0.7704 | 0.7730 | 0.8792 |
| 0.0652 | 6.3511 | 3690 | 0.8101 | 0.7879 | 0.8101 | 0.9001 |
| 0.0652 | 6.3546 | 3692 | 0.8669 | 0.7342 | 0.8669 | 0.9311 |
| 0.0652 | 6.3580 | 3694 | 0.9436 | 0.7058 | 0.9436 | 0.9714 |
| 0.0652 | 6.3614 | 3696 | 0.9856 | 0.7058 | 0.9856 | 0.9928 |
| 0.0652 | 6.3649 | 3698 | 0.9733 | 0.7058 | 0.9733 | 0.9866 |
| 0.0652 | 6.3683 | 3700 | 0.9180 | 0.7058 | 0.9180 | 0.9581 |
| 0.0652 | 6.3718 | 3702 | 0.8481 | 0.7430 | 0.8481 | 0.9209 |
| 0.0652 | 6.3752 | 3704 | 0.7893 | 0.7717 | 0.7893 | 0.8884 |
| 0.0652 | 6.3787 | 3706 | 0.7619 | 0.7529 | 0.7619 | 0.8729 |
| 0.0652 | 6.3821 | 3708 | 0.7638 | 0.7529 | 0.7638 | 0.8739 |
| 0.0652 | 6.3855 | 3710 | 0.7668 | 0.7529 | 0.7668 | 0.8757 |
| 0.0652 | 6.3890 | 3712 | 0.7691 | 0.7529 | 0.7691 | 0.8770 |
| 0.0652 | 6.3924 | 3714 | 0.7895 | 0.7717 | 0.7895 | 0.8885 |
| 0.0652 | 6.3959 | 3716 | 0.8179 | 0.7342 | 0.8179 | 0.9044 |
| 0.0652 | 6.3993 | 3718 | 0.8817 | 0.7342 | 0.8817 | 0.9390 |
| 0.0652 | 6.4028 | 3720 | 0.9613 | 0.7337 | 0.9613 | 0.9804 |
| 0.0652 | 6.4062 | 3722 | 1.0041 | 0.7337 | 1.0041 | 1.0020 |
| 0.0652 | 6.4096 | 3724 | 0.9846 | 0.7337 | 0.9846 | 0.9923 |
| 0.0652 | 6.4131 | 3726 | 0.9269 | 0.7058 | 0.9269 | 0.9627 |
| 0.0652 | 6.4165 | 3728 | 0.8507 | 0.7342 | 0.8507 | 0.9223 |
| 0.0652 | 6.4200 | 3730 | 0.8204 | 0.7430 | 0.8204 | 0.9058 |
| 0.0652 | 6.4234 | 3732 | 0.8113 | 0.7717 | 0.8113 | 0.9007 |
| 0.0652 | 6.4269 | 3734 | 0.8332 | 0.7623 | 0.8332 | 0.9128 |
| 0.0652 | 6.4303 | 3736 | 0.8641 | 0.7623 | 0.8641 | 0.9296 |
| 0.0652 | 6.4337 | 3738 | 0.8900 | 0.7623 | 0.8900 | 0.9434 |
| 0.0652 | 6.4372 | 3740 | 0.9218 | 0.7139 | 0.9218 | 0.9601 |
| 0.0652 | 6.4406 | 3742 | 0.9525 | 0.7058 | 0.9525 | 0.9760 |
| 0.0652 | 6.4441 | 3744 | 0.9374 | 0.7058 | 0.9374 | 0.9682 |
| 0.0652 | 6.4475 | 3746 | 0.8802 | 0.7058 | 0.8802 | 0.9382 |
| 0.0652 | 6.4509 | 3748 | 0.8357 | 0.7342 | 0.8357 | 0.9142 |
| 0.0652 | 6.4544 | 3750 | 0.8013 | 0.7906 | 0.8013 | 0.8952 |
| 0.0652 | 6.4578 | 3752 | 0.7908 | 0.7906 | 0.7908 | 0.8893 |
| 0.0652 | 6.4613 | 3754 | 0.7902 | 0.7906 | 0.7902 | 0.8889 |
| 0.0652 | 6.4647 | 3756 | 0.8116 | 0.7906 | 0.8116 | 0.9009 |
| 0.0652 | 6.4682 | 3758 | 0.8203 | 0.7906 | 0.8203 | 0.9057 |
| 0.0652 | 6.4716 | 3760 | 0.8345 | 0.7906 | 0.8345 | 0.9135 |
| 0.0652 | 6.4750 | 3762 | 0.8399 | 0.7906 | 0.8399 | 0.9165 |
| 0.0652 | 6.4785 | 3764 | 0.8616 | 0.7729 | 0.8616 | 0.9282 |
| 0.0652 | 6.4819 | 3766 | 0.8572 | 0.7729 | 0.8572 | 0.9259 |
| 0.0652 | 6.4854 | 3768 | 0.8627 | 0.7729 | 0.8627 | 0.9288 |
| 0.0652 | 6.4888 | 3770 | 0.8758 | 0.7729 | 0.8758 | 0.9358 |
| 0.0652 | 6.4923 | 3772 | 0.8558 | 0.7729 | 0.8558 | 0.9251 |
| 0.0652 | 6.4957 | 3774 | 0.8117 | 0.7623 | 0.8117 | 0.9009 |
| 0.0652 | 6.4991 | 3776 | 0.7964 | 0.6706 | 0.7964 | 0.8924 |
| 0.0652 | 6.5026 | 3778 | 0.8109 | 0.6909 | 0.8109 | 0.9005 |
| 0.0652 | 6.5060 | 3780 | 0.8317 | 0.7623 | 0.8317 | 0.9120 |
| 0.0652 | 6.5095 | 3782 | 0.8441 | 0.7623 | 0.8441 | 0.9188 |
| 0.0652 | 6.5129 | 3784 | 0.8312 | 0.7623 | 0.8312 | 0.9117 |
| 0.0652 | 6.5164 | 3786 | 0.8023 | 0.7520 | 0.8023 | 0.8957 |
| 0.0652 | 6.5198 | 3788 | 0.7806 | 0.7520 | 0.7806 | 0.8835 |
| 0.0652 | 6.5232 | 3790 | 0.7778 | 0.7520 | 0.7778 | 0.8819 |
| 0.0652 | 6.5267 | 3792 | 0.8021 | 0.7058 | 0.8021 | 0.8956 |
| 0.0652 | 6.5301 | 3794 | 0.8382 | 0.7058 | 0.8382 | 0.9155 |
| 0.0652 | 6.5336 | 3796 | 0.8740 | 0.7058 | 0.8740 | 0.9349 |
| 0.0652 | 6.5370 | 3798 | 0.8763 | 0.7058 | 0.8763 | 0.9361 |
| 0.0652 | 6.5404 | 3800 | 0.8751 | 0.7058 | 0.8751 | 0.9355 |
| 0.0652 | 6.5439 | 3802 | 0.8538 | 0.7058 | 0.8538 | 0.9240 |
| 0.0652 | 6.5473 | 3804 | 0.8194 | 0.7520 | 0.8194 | 0.9052 |
| 0.0652 | 6.5508 | 3806 | 0.7947 | 0.7520 | 0.7947 | 0.8915 |
| 0.0652 | 6.5542 | 3808 | 0.7959 | 0.7623 | 0.7959 | 0.8921 |
| 0.0652 | 6.5577 | 3810 | 0.8016 | 0.7623 | 0.8016 | 0.8953 |
| 0.0652 | 6.5611 | 3812 | 0.8155 | 0.7520 | 0.8155 | 0.9031 |
| 0.0652 | 6.5645 | 3814 | 0.8440 | 0.7520 | 0.8440 | 0.9187 |
| 0.0652 | 6.5680 | 3816 | 0.8704 | 0.7058 | 0.8704 | 0.9330 |
| 0.0652 | 6.5714 | 3818 | 0.8823 | 0.7520 | 0.8823 | 0.9393 |
| 0.0652 | 6.5749 | 3820 | 0.9029 | 0.7520 | 0.9029 | 0.9502 |
| 0.0652 | 6.5783 | 3822 | 0.8794 | 0.7623 | 0.8794 | 0.9378 |
| 0.0652 | 6.5818 | 3824 | 0.8351 | 0.7623 | 0.8351 | 0.9138 |
| 0.0652 | 6.5852 | 3826 | 0.7757 | 0.6706 | 0.7757 | 0.8807 |
| 0.0652 | 6.5886 | 3828 | 0.7468 | 0.6842 | 0.7468 | 0.8642 |
| 0.0652 | 6.5921 | 3830 | 0.7398 | 0.6842 | 0.7398 | 0.8601 |
| 0.0652 | 6.5955 | 3832 | 0.7512 | 0.6842 | 0.7512 | 0.8667 |
| 0.0652 | 6.5990 | 3834 | 0.7754 | 0.6909 | 0.7754 | 0.8806 |
| 0.0652 | 6.6024 | 3836 | 0.8067 | 0.7623 | 0.8067 | 0.8982 |
| 0.0652 | 6.6059 | 3838 | 0.8377 | 0.7623 | 0.8377 | 0.9153 |
| 0.0652 | 6.6093 | 3840 | 0.8488 | 0.7623 | 0.8488 | 0.9213 |
| 0.0652 | 6.6127 | 3842 | 0.8338 | 0.7623 | 0.8338 | 0.9131 |
| 0.0652 | 6.6162 | 3844 | 0.8055 | 0.7623 | 0.8055 | 0.8975 |
| 0.0652 | 6.6196 | 3846 | 0.8144 | 0.7623 | 0.8144 | 0.9025 |
| 0.0652 | 6.6231 | 3848 | 0.8153 | 0.7623 | 0.8153 | 0.9029 |
| 0.0652 | 6.6265 | 3850 | 0.8044 | 0.7623 | 0.8044 | 0.8969 |
| 0.0652 | 6.6299 | 3852 | 0.8136 | 0.7623 | 0.8136 | 0.9020 |
| 0.0652 | 6.6334 | 3854 | 0.8332 | 0.7623 | 0.8332 | 0.9128 |
| 0.0652 | 6.6368 | 3856 | 0.8622 | 0.7623 | 0.8622 | 0.9286 |
| 0.0652 | 6.6403 | 3858 | 0.9230 | 0.7058 | 0.9230 | 0.9607 |
| 0.0652 | 6.6437 | 3860 | 0.9468 | 0.6871 | 0.9468 | 0.9730 |
| 0.0652 | 6.6472 | 3862 | 0.9231 | 0.6871 | 0.9231 | 0.9608 |
| 0.0652 | 6.6506 | 3864 | 0.8706 | 0.7058 | 0.8706 | 0.9331 |
| 0.0652 | 6.6540 | 3866 | 0.8433 | 0.7623 | 0.8433 | 0.9183 |
| 0.0652 | 6.6575 | 3868 | 0.7990 | 0.7906 | 0.7990 | 0.8939 |
| 0.0652 | 6.6609 | 3870 | 0.7605 | 0.7906 | 0.7605 | 0.8721 |
| 0.0652 | 6.6644 | 3872 | 0.7642 | 0.7906 | 0.7642 | 0.8742 |
| 0.0652 | 6.6678 | 3874 | 0.7861 | 0.7906 | 0.7861 | 0.8866 |
| 0.0652 | 6.6713 | 3876 | 0.8091 | 0.7906 | 0.8091 | 0.8995 |
| 0.0652 | 6.6747 | 3878 | 0.8305 | 0.7342 | 0.8305 | 0.9113 |
| 0.0652 | 6.6781 | 3880 | 0.8590 | 0.7058 | 0.8590 | 0.9268 |
| 0.0652 | 6.6816 | 3882 | 0.8658 | 0.7058 | 0.8658 | 0.9305 |
| 0.0652 | 6.6850 | 3884 | 0.8316 | 0.7342 | 0.8316 | 0.9119 |
| 0.0652 | 6.6885 | 3886 | 0.8069 | 0.7342 | 0.8069 | 0.8983 |
| 0.0652 | 6.6919 | 3888 | 0.7626 | 0.7430 | 0.7626 | 0.8733 |
| 0.0652 | 6.6954 | 3890 | 0.7417 | 0.7717 | 0.7417 | 0.8612 |
| 0.0652 | 6.6988 | 3892 | 0.7568 | 0.7717 | 0.7568 | 0.8699 |
| 0.0652 | 6.7022 | 3894 | 0.7786 | 0.7906 | 0.7786 | 0.8824 |
| 0.0652 | 6.7057 | 3896 | 0.8169 | 0.7623 | 0.8169 | 0.9039 |
| 0.0652 | 6.7091 | 3898 | 0.8425 | 0.7623 | 0.8425 | 0.9179 |
| 0.0652 | 6.7126 | 3900 | 0.8412 | 0.7623 | 0.8412 | 0.9171 |
| 0.0652 | 6.7160 | 3902 | 0.8191 | 0.7906 | 0.8191 | 0.9050 |
| 0.0652 | 6.7194 | 3904 | 0.8170 | 0.7906 | 0.8170 | 0.9039 |
| 0.0652 | 6.7229 | 3906 | 0.8193 | 0.7906 | 0.8193 | 0.9051 |
| 0.0652 | 6.7263 | 3908 | 0.8486 | 0.7906 | 0.8486 | 0.9212 |
| 0.0652 | 6.7298 | 3910 | 0.9102 | 0.7058 | 0.9102 | 0.9541 |
| 0.0652 | 6.7332 | 3912 | 0.9699 | 0.7058 | 0.9699 | 0.9848 |
| 0.0652 | 6.7367 | 3914 | 0.9733 | 0.7058 | 0.9733 | 0.9866 |
| 0.0652 | 6.7401 | 3916 | 0.9352 | 0.7058 | 0.9352 | 0.9671 |
| 0.0652 | 6.7435 | 3918 | 0.8674 | 0.7342 | 0.8674 | 0.9314 |
| 0.0652 | 6.7470 | 3920 | 0.7866 | 0.7906 | 0.7866 | 0.8869 |
| 0.0652 | 6.7504 | 3922 | 0.7306 | 0.7620 | 0.7306 | 0.8547 |
| 0.0652 | 6.7539 | 3924 | 0.7142 | 0.7620 | 0.7142 | 0.8451 |
| 0.0652 | 6.7573 | 3926 | 0.7211 | 0.7620 | 0.7211 | 0.8492 |
| 0.0652 | 6.7608 | 3928 | 0.7491 | 0.7620 | 0.7491 | 0.8655 |
| 0.0652 | 6.7642 | 3930 | 0.7976 | 0.7797 | 0.7976 | 0.8931 |
| 0.0652 | 6.7676 | 3932 | 0.8499 | 0.7342 | 0.8499 | 0.9219 |
| 0.0652 | 6.7711 | 3934 | 0.8539 | 0.7797 | 0.8539 | 0.9240 |
| 0.0652 | 6.7745 | 3936 | 0.8567 | 0.7797 | 0.8567 | 0.9256 |
| 0.0652 | 6.7780 | 3938 | 0.8593 | 0.7342 | 0.8593 | 0.9270 |
| 0.0652 | 6.7814 | 3940 | 0.8602 | 0.7797 | 0.8602 | 0.9275 |
| 0.0652 | 6.7849 | 3942 | 0.8297 | 0.7797 | 0.8297 | 0.9109 |
| 0.0652 | 6.7883 | 3944 | 0.8142 | 0.7797 | 0.8142 | 0.9023 |
| 0.0652 | 6.7917 | 3946 | 0.7916 | 0.7797 | 0.7916 | 0.8897 |
| 0.0652 | 6.7952 | 3948 | 0.7806 | 0.7906 | 0.7806 | 0.8835 |
| 0.0652 | 6.7986 | 3950 | 0.7443 | 0.7906 | 0.7443 | 0.8627 |
| 0.0652 | 6.8021 | 3952 | 0.7240 | 0.7614 | 0.7240 | 0.8509 |
| 0.0652 | 6.8055 | 3954 | 0.7274 | 0.7782 | 0.7274 | 0.8529 |
| 0.0652 | 6.8090 | 3956 | 0.7229 | 0.7782 | 0.7229 | 0.8503 |
| 0.0652 | 6.8124 | 3958 | 0.7387 | 0.7782 | 0.7387 | 0.8595 |
| 0.0652 | 6.8158 | 3960 | 0.7655 | 0.7782 | 0.7655 | 0.8749 |
| 0.0652 | 6.8193 | 3962 | 0.7952 | 0.7689 | 0.7952 | 0.8917 |
| 0.0652 | 6.8227 | 3964 | 0.8225 | 0.7264 | 0.8225 | 0.9069 |
| 0.0652 | 6.8262 | 3966 | 0.8571 | 0.7342 | 0.8571 | 0.9258 |
| 0.0652 | 6.8296 | 3968 | 0.8705 | 0.7342 | 0.8705 | 0.9330 |
| 0.0652 | 6.8330 | 3970 | 0.8907 | 0.7342 | 0.8907 | 0.9438 |
| 0.0652 | 6.8365 | 3972 | 0.8972 | 0.7342 | 0.8972 | 0.9472 |
| 0.0652 | 6.8399 | 3974 | 0.8858 | 0.7342 | 0.8858 | 0.9412 |
| 0.0652 | 6.8434 | 3976 | 0.8673 | 0.7906 | 0.8673 | 0.9313 |
| 0.0652 | 6.8468 | 3978 | 0.8834 | 0.7906 | 0.8834 | 0.9399 |
| 0.0652 | 6.8503 | 3980 | 0.8876 | 0.7623 | 0.8876 | 0.9421 |
| 0.0652 | 6.8537 | 3982 | 0.8775 | 0.7623 | 0.8775 | 0.9368 |
| 0.0652 | 6.8571 | 3984 | 0.8667 | 0.7623 | 0.8667 | 0.9310 |
| 0.0652 | 6.8606 | 3986 | 0.8655 | 0.7623 | 0.8655 | 0.9303 |
| 0.0652 | 6.8640 | 3988 | 0.8627 | 0.7623 | 0.8627 | 0.9288 |
| 0.0652 | 6.8675 | 3990 | 0.8813 | 0.7520 | 0.8813 | 0.9388 |
| 0.0652 | 6.8709 | 3992 | 0.8971 | 0.7520 | 0.8971 | 0.9472 |
| 0.0652 | 6.8744 | 3994 | 0.9183 | 0.7520 | 0.9183 | 0.9583 |
| 0.0652 | 6.8778 | 3996 | 0.9346 | 0.7520 | 0.9346 | 0.9667 |
| 0.0652 | 6.8812 | 3998 | 0.9157 | 0.7520 | 0.9157 | 0.9569 |
| 0.0553 | 6.8847 | 4000 | 0.8700 | 0.7623 | 0.8700 | 0.9327 |
| 0.0553 | 6.8881 | 4002 | 0.8454 | 0.7623 | 0.8454 | 0.9195 |
| 0.0553 | 6.8916 | 4004 | 0.8371 | 0.7906 | 0.8371 | 0.9150 |
| 0.0553 | 6.8950 | 4006 | 0.8322 | 0.7906 | 0.8322 | 0.9122 |
| 0.0553 | 6.8985 | 4008 | 0.8266 | 0.7906 | 0.8266 | 0.9092 |
| 0.0553 | 6.9019 | 4010 | 0.8392 | 0.7797 | 0.8392 | 0.9161 |
| 0.0553 | 6.9053 | 4012 | 0.8561 | 0.7797 | 0.8561 | 0.9253 |
| 0.0553 | 6.9088 | 4014 | 0.9040 | 0.7520 | 0.9040 | 0.9508 |
| 0.0553 | 6.9122 | 4016 | 0.9592 | 0.7058 | 0.9592 | 0.9794 |
| 0.0553 | 6.9157 | 4018 | 0.9799 | 0.7058 | 0.9799 | 0.9899 |
| 0.0553 | 6.9191 | 4020 | 0.9934 | 0.7058 | 0.9934 | 0.9967 |
| 0.0553 | 6.9225 | 4022 | 0.9612 | 0.7058 | 0.9612 | 0.9804 |
| 0.0553 | 6.9260 | 4024 | 0.8977 | 0.7520 | 0.8977 | 0.9475 |
| 0.0553 | 6.9294 | 4026 | 0.8221 | 0.7797 | 0.8221 | 0.9067 |
| 0.0553 | 6.9329 | 4028 | 0.7536 | 0.7717 | 0.7536 | 0.8681 |
| 0.0553 | 6.9363 | 4030 | 0.7257 | 0.7439 | 0.7257 | 0.8519 |
| 0.0553 | 6.9398 | 4032 | 0.7354 | 0.7439 | 0.7354 | 0.8575 |
| 0.0553 | 6.9432 | 4034 | 0.7739 | 0.7906 | 0.7739 | 0.8797 |
| 0.0553 | 6.9466 | 4036 | 0.8302 | 0.7797 | 0.8302 | 0.9112 |
| 0.0553 | 6.9501 | 4038 | 0.8814 | 0.7520 | 0.8814 | 0.9389 |
| 0.0553 | 6.9535 | 4040 | 0.9398 | 0.7520 | 0.9398 | 0.9694 |
| 0.0553 | 6.9570 | 4042 | 0.9961 | 0.7134 | 0.9961 | 0.9980 |
| 0.0553 | 6.9604 | 4044 | 0.9993 | 0.7134 | 0.9993 | 0.9996 |
| 0.0553 | 6.9639 | 4046 | 0.9622 | 0.7618 | 0.9622 | 0.9809 |
| 0.0553 | 6.9673 | 4048 | 0.8996 | 0.7520 | 0.8996 | 0.9485 |
| 0.0553 | 6.9707 | 4050 | 0.8356 | 0.7623 | 0.8356 | 0.9141 |
| 0.0553 | 6.9742 | 4052 | 0.7801 | 0.7623 | 0.7801 | 0.8833 |
| 0.0553 | 6.9776 | 4054 | 0.7540 | 0.7232 | 0.7540 | 0.8683 |
| 0.0553 | 6.9811 | 4056 | 0.7566 | 0.7427 | 0.7566 | 0.8698 |
| 0.0553 | 6.9845 | 4058 | 0.7764 | 0.7520 | 0.7764 | 0.8811 |
| 0.0553 | 6.9880 | 4060 | 0.7997 | 0.7520 | 0.7997 | 0.8943 |
| 0.0553 | 6.9914 | 4062 | 0.8331 | 0.7520 | 0.8331 | 0.9127 |
| 0.0553 | 6.9948 | 4064 | 0.8441 | 0.7520 | 0.8441 | 0.9188 |
| 0.0553 | 6.9983 | 4066 | 0.8335 | 0.7520 | 0.8335 | 0.9130 |
| 0.0553 | 7.0017 | 4068 | 0.8478 | 0.7520 | 0.8478 | 0.9207 |
| 0.0553 | 7.0052 | 4070 | 0.8977 | 0.7520 | 0.8977 | 0.9475 |
| 0.0553 | 7.0086 | 4072 | 0.9286 | 0.7520 | 0.9286 | 0.9636 |
| 0.0553 | 7.0120 | 4074 | 0.9271 | 0.7520 | 0.9271 | 0.9628 |
| 0.0553 | 7.0155 | 4076 | 0.8949 | 0.7623 | 0.8949 | 0.9460 |
| 0.0553 | 7.0189 | 4078 | 0.8815 | 0.7623 | 0.8815 | 0.9389 |
| 0.0553 | 7.0224 | 4080 | 0.8830 | 0.7623 | 0.8830 | 0.9397 |
| 0.0553 | 7.0258 | 4082 | 0.8926 | 0.7520 | 0.8926 | 0.9448 |
| 0.0553 | 7.0293 | 4084 | 0.9091 | 0.7520 | 0.9091 | 0.9535 |
| 0.0553 | 7.0327 | 4086 | 0.9215 | 0.7520 | 0.9215 | 0.9600 |
| 0.0553 | 7.0361 | 4088 | 0.9228 | 0.7520 | 0.9228 | 0.9606 |
| 0.0553 | 7.0396 | 4090 | 0.9481 | 0.7520 | 0.9481 | 0.9737 |
| 0.0553 | 7.0430 | 4092 | 0.9505 | 0.7520 | 0.9505 | 0.9749 |
| 0.0553 | 7.0465 | 4094 | 0.9685 | 0.7520 | 0.9685 | 0.9841 |
| 0.0553 | 7.0499 | 4096 | 0.9726 | 0.7520 | 0.9726 | 0.9862 |
| 0.0553 | 7.0534 | 4098 | 0.9389 | 0.7520 | 0.9389 | 0.9690 |
| 0.0553 | 7.0568 | 4100 | 0.9113 | 0.7520 | 0.9113 | 0.9546 |
| 0.0553 | 7.0602 | 4102 | 0.8989 | 0.7520 | 0.8989 | 0.9481 |
| 0.0553 | 7.0637 | 4104 | 0.9127 | 0.7520 | 0.9127 | 0.9553 |
| 0.0553 | 7.0671 | 4106 | 0.9443 | 0.7520 | 0.9443 | 0.9717 |
| 0.0553 | 7.0706 | 4108 | 0.9621 | 0.7623 | 0.9621 | 0.9809 |
| 0.0553 | 7.0740 | 4110 | 0.9632 | 0.7623 | 0.9632 | 0.9814 |
| 0.0553 | 7.0775 | 4112 | 0.9533 | 0.7623 | 0.9533 | 0.9764 |
| 0.0553 | 7.0809 | 4114 | 0.9583 | 0.7623 | 0.9583 | 0.9789 |
| 0.0553 | 7.0843 | 4116 | 0.9793 | 0.7520 | 0.9793 | 0.9896 |
| 0.0553 | 7.0878 | 4118 | 0.9716 | 0.7520 | 0.9716 | 0.9857 |
| 0.0553 | 7.0912 | 4120 | 0.9283 | 0.7520 | 0.9283 | 0.9635 |
| 0.0553 | 7.0947 | 4122 | 0.8949 | 0.7520 | 0.8949 | 0.9460 |
| 0.0553 | 7.0981 | 4124 | 0.8712 | 0.7520 | 0.8712 | 0.9334 |
| 0.0553 | 7.1015 | 4126 | 0.8814 | 0.7058 | 0.8814 | 0.9388 |
| 0.0553 | 7.1050 | 4128 | 0.9258 | 0.7058 | 0.9258 | 0.9622 |
| 0.0553 | 7.1084 | 4130 | 0.9420 | 0.7058 | 0.9420 | 0.9706 |
| 0.0553 | 7.1119 | 4132 | 0.9443 | 0.7058 | 0.9443 | 0.9718 |
| 0.0553 | 7.1153 | 4134 | 0.9452 | 0.7058 | 0.9452 | 0.9722 |
| 0.0553 | 7.1188 | 4136 | 0.9270 | 0.7058 | 0.9270 | 0.9628 |
| 0.0553 | 7.1222 | 4138 | 0.9054 | 0.7058 | 0.9054 | 0.9515 |
| 0.0553 | 7.1256 | 4140 | 0.8668 | 0.7520 | 0.8668 | 0.9310 |
| 0.0553 | 7.1291 | 4142 | 0.8435 | 0.7623 | 0.8435 | 0.9184 |
| 0.0553 | 7.1325 | 4144 | 0.8388 | 0.7623 | 0.8388 | 0.9159 |
| 0.0553 | 7.1360 | 4146 | 0.8275 | 0.7623 | 0.8275 | 0.9097 |
| 0.0553 | 7.1394 | 4148 | 0.8160 | 0.7623 | 0.8160 | 0.9033 |
| 0.0553 | 7.1429 | 4150 | 0.8234 | 0.7623 | 0.8234 | 0.9074 |
| 0.0553 | 7.1463 | 4152 | 0.8574 | 0.7058 | 0.8574 | 0.9260 |
| 0.0553 | 7.1497 | 4154 | 0.8568 | 0.7058 | 0.8568 | 0.9256 |
| 0.0553 | 7.1532 | 4156 | 0.8205 | 0.7058 | 0.8205 | 0.9058 |
| 0.0553 | 7.1566 | 4158 | 0.8106 | 0.7139 | 0.8106 | 0.9004 |
| 0.0553 | 7.1601 | 4160 | 0.7977 | 0.7623 | 0.7977 | 0.8932 |
| 0.0553 | 7.1635 | 4162 | 0.8233 | 0.7623 | 0.8233 | 0.9073 |
| 0.0553 | 7.1670 | 4164 | 0.8299 | 0.7623 | 0.8299 | 0.9110 |
| 0.0553 | 7.1704 | 4166 | 0.8448 | 0.7623 | 0.8448 | 0.9191 |
| 0.0553 | 7.1738 | 4168 | 0.8452 | 0.7729 | 0.8452 | 0.9193 |
| 0.0553 | 7.1773 | 4170 | 0.8333 | 0.7525 | 0.8333 | 0.9129 |
| 0.0553 | 7.1807 | 4172 | 0.8125 | 0.7321 | 0.8125 | 0.9014 |
| 0.0553 | 7.1842 | 4174 | 0.8147 | 0.7321 | 0.8147 | 0.9026 |
| 0.0553 | 7.1876 | 4176 | 0.8238 | 0.7525 | 0.8238 | 0.9077 |
| 0.0553 | 7.1910 | 4178 | 0.8546 | 0.7525 | 0.8546 | 0.9244 |
| 0.0553 | 7.1945 | 4180 | 0.8565 | 0.7729 | 0.8565 | 0.9255 |
| 0.0553 | 7.1979 | 4182 | 0.8683 | 0.7729 | 0.8683 | 0.9318 |
| 0.0553 | 7.2014 | 4184 | 0.8543 | 0.7623 | 0.8543 | 0.9243 |
| 0.0553 | 7.2048 | 4186 | 0.8500 | 0.7139 | 0.8500 | 0.9219 |
| 0.0553 | 7.2083 | 4188 | 0.8673 | 0.7058 | 0.8673 | 0.9313 |
| 0.0553 | 7.2117 | 4190 | 0.8928 | 0.7058 | 0.8928 | 0.9449 |
| 0.0553 | 7.2151 | 4192 | 0.9480 | 0.7134 | 0.9480 | 0.9737 |
| 0.0553 | 7.2186 | 4194 | 0.9689 | 0.7134 | 0.9689 | 0.9843 |
| 0.0553 | 7.2220 | 4196 | 0.9629 | 0.7134 | 0.9629 | 0.9813 |
| 0.0553 | 7.2255 | 4198 | 0.9649 | 0.7216 | 0.9649 | 0.9823 |
| 0.0553 | 7.2289 | 4200 | 0.9620 | 0.7216 | 0.9620 | 0.9808 |
| 0.0553 | 7.2324 | 4202 | 0.9605 | 0.7216 | 0.9605 | 0.9800 |
| 0.0553 | 7.2358 | 4204 | 0.9598 | 0.7844 | 0.9598 | 0.9797 |
| 0.0553 | 7.2392 | 4206 | 0.9503 | 0.7074 | 0.9503 | 0.9748 |
| 0.0553 | 7.2427 | 4208 | 0.9367 | 0.7074 | 0.9367 | 0.9678 |
| 0.0553 | 7.2461 | 4210 | 0.9219 | 0.7074 | 0.9219 | 0.9601 |
| 0.0553 | 7.2496 | 4212 | 0.9332 | 0.7074 | 0.9332 | 0.9660 |
| 0.0553 | 7.2530 | 4214 | 0.9694 | 0.7074 | 0.9694 | 0.9846 |
| 0.0553 | 7.2565 | 4216 | 0.9778 | 0.7844 | 0.9778 | 0.9888 |
| 0.0553 | 7.2599 | 4218 | 0.9615 | 0.7844 | 0.9615 | 0.9806 |
| 0.0553 | 7.2633 | 4220 | 0.9302 | 0.7844 | 0.9302 | 0.9645 |
| 0.0553 | 7.2668 | 4222 | 0.8716 | 0.6988 | 0.8716 | 0.9336 |
| 0.0553 | 7.2702 | 4224 | 0.8074 | 0.6503 | 0.8074 | 0.8986 |
| 0.0553 | 7.2737 | 4226 | 0.7698 | 0.6503 | 0.7698 | 0.8774 |
| 0.0553 | 7.2771 | 4228 | 0.7661 | 0.6503 | 0.7661 | 0.8753 |
| 0.0553 | 7.2806 | 4230 | 0.7814 | 0.6503 | 0.7814 | 0.8840 |
| 0.0553 | 7.2840 | 4232 | 0.8173 | 0.7623 | 0.8173 | 0.9041 |
| 0.0553 | 7.2874 | 4234 | 0.8714 | 0.7058 | 0.8714 | 0.9335 |
| 0.0553 | 7.2909 | 4236 | 0.9402 | 0.7134 | 0.9402 | 0.9697 |
| 0.0553 | 7.2943 | 4238 | 0.9622 | 0.7134 | 0.9622 | 0.9809 |
| 0.0553 | 7.2978 | 4240 | 0.9504 | 0.7134 | 0.9504 | 0.9749 |
| 0.0553 | 7.3012 | 4242 | 0.9029 | 0.7618 | 0.9029 | 0.9502 |
| 0.0553 | 7.3046 | 4244 | 0.8352 | 0.7623 | 0.8352 | 0.9139 |
| 0.0553 | 7.3081 | 4246 | 0.7784 | 0.6706 | 0.7784 | 0.8823 |
| 0.0553 | 7.3115 | 4248 | 0.7493 | 0.6706 | 0.7493 | 0.8656 |
| 0.0553 | 7.3150 | 4250 | 0.7398 | 0.6503 | 0.7398 | 0.8601 |
| 0.0553 | 7.3184 | 4252 | 0.7422 | 0.6503 | 0.7422 | 0.8615 |
| 0.0553 | 7.3219 | 4254 | 0.7580 | 0.6706 | 0.7580 | 0.8706 |
| 0.0553 | 7.3253 | 4256 | 0.7959 | 0.6909 | 0.7959 | 0.8922 |
| 0.0553 | 7.3287 | 4258 | 0.8212 | 0.7623 | 0.8212 | 0.9062 |
| 0.0553 | 7.3322 | 4260 | 0.8589 | 0.7729 | 0.8589 | 0.9268 |
| 0.0553 | 7.3356 | 4262 | 0.9029 | 0.7618 | 0.9029 | 0.9502 |
| 0.0553 | 7.3391 | 4264 | 0.9356 | 0.7134 | 0.9356 | 0.9673 |
| 0.0553 | 7.3425 | 4266 | 0.9581 | 0.7216 | 0.9581 | 0.9788 |
| 0.0553 | 7.3460 | 4268 | 0.9446 | 0.7216 | 0.9446 | 0.9719 |
| 0.0553 | 7.3494 | 4270 | 0.9145 | 0.7618 | 0.9145 | 0.9563 |
| 0.0553 | 7.3528 | 4272 | 0.8741 | 0.7729 | 0.8741 | 0.9349 |
| 0.0553 | 7.3563 | 4274 | 0.8493 | 0.7623 | 0.8493 | 0.9216 |
| 0.0553 | 7.3597 | 4276 | 0.8185 | 0.7623 | 0.8185 | 0.9047 |
| 0.0553 | 7.3632 | 4278 | 0.8092 | 0.7623 | 0.8092 | 0.8995 |
| 0.0553 | 7.3666 | 4280 | 0.8085 | 0.6909 | 0.8085 | 0.8991 |
| 0.0553 | 7.3701 | 4282 | 0.8132 | 0.7623 | 0.8132 | 0.9018 |
| 0.0553 | 7.3735 | 4284 | 0.8106 | 0.7623 | 0.8106 | 0.9003 |
| 0.0553 | 7.3769 | 4286 | 0.8248 | 0.7623 | 0.8248 | 0.9082 |
| 0.0553 | 7.3804 | 4288 | 0.8215 | 0.7623 | 0.8215 | 0.9064 |
| 0.0553 | 7.3838 | 4290 | 0.8174 | 0.7623 | 0.8174 | 0.9041 |
| 0.0553 | 7.3873 | 4292 | 0.8243 | 0.7623 | 0.8243 | 0.9079 |
| 0.0553 | 7.3907 | 4294 | 0.8418 | 0.7623 | 0.8418 | 0.9175 |
| 0.0553 | 7.3941 | 4296 | 0.8628 | 0.7520 | 0.8628 | 0.9289 |
| 0.0553 | 7.3976 | 4298 | 0.8699 | 0.7520 | 0.8699 | 0.9327 |
| 0.0553 | 7.4010 | 4300 | 0.8500 | 0.7623 | 0.8500 | 0.9220 |
| 0.0553 | 7.4045 | 4302 | 0.8092 | 0.7623 | 0.8092 | 0.8996 |
| 0.0553 | 7.4079 | 4304 | 0.7859 | 0.7623 | 0.7859 | 0.8865 |
| 0.0553 | 7.4114 | 4306 | 0.7750 | 0.6909 | 0.7750 | 0.8804 |
| 0.0553 | 7.4148 | 4308 | 0.7937 | 0.7623 | 0.7937 | 0.8909 |
| 0.0553 | 7.4182 | 4310 | 0.8280 | 0.7623 | 0.8280 | 0.9100 |
| 0.0553 | 7.4217 | 4312 | 0.8423 | 0.7623 | 0.8423 | 0.9178 |
| 0.0553 | 7.4251 | 4314 | 0.8625 | 0.7623 | 0.8625 | 0.9287 |
| 0.0553 | 7.4286 | 4316 | 0.9035 | 0.7729 | 0.9035 | 0.9506 |
| 0.0553 | 7.4320 | 4318 | 0.9265 | 0.7844 | 0.9265 | 0.9626 |
| 0.0553 | 7.4355 | 4320 | 0.9455 | 0.7844 | 0.9455 | 0.9724 |
| 0.0553 | 7.4389 | 4322 | 0.9350 | 0.7844 | 0.9350 | 0.9670 |
| 0.0553 | 7.4423 | 4324 | 0.9075 | 0.7844 | 0.9075 | 0.9526 |
| 0.0553 | 7.4458 | 4326 | 0.8673 | 0.7729 | 0.8673 | 0.9313 |
| 0.0553 | 7.4492 | 4328 | 0.8458 | 0.7729 | 0.8458 | 0.9197 |
| 0.0553 | 7.4527 | 4330 | 0.8391 | 0.7729 | 0.8391 | 0.9160 |
| 0.0553 | 7.4561 | 4332 | 0.8216 | 0.7623 | 0.8216 | 0.9064 |
| 0.0553 | 7.4596 | 4334 | 0.7962 | 0.6706 | 0.7962 | 0.8923 |
| 0.0553 | 7.4630 | 4336 | 0.7988 | 0.7427 | 0.7988 | 0.8938 |
| 0.0553 | 7.4664 | 4338 | 0.8018 | 0.7427 | 0.8018 | 0.8954 |
| 0.0553 | 7.4699 | 4340 | 0.8244 | 0.7427 | 0.8244 | 0.9080 |
| 0.0553 | 7.4733 | 4342 | 0.8413 | 0.7623 | 0.8413 | 0.9173 |
| 0.0553 | 7.4768 | 4344 | 0.8603 | 0.7623 | 0.8603 | 0.9275 |
| 0.0553 | 7.4802 | 4346 | 0.8796 | 0.7729 | 0.8796 | 0.9379 |
| 0.0553 | 7.4836 | 4348 | 0.9061 | 0.7729 | 0.9061 | 0.9519 |
| 0.0553 | 7.4871 | 4350 | 0.9006 | 0.7729 | 0.9006 | 0.9490 |
| 0.0553 | 7.4905 | 4352 | 0.8842 | 0.7623 | 0.8842 | 0.9403 |
| 0.0553 | 7.4940 | 4354 | 0.8577 | 0.7623 | 0.8577 | 0.9261 |
| 0.0553 | 7.4974 | 4356 | 0.8547 | 0.7623 | 0.8547 | 0.9245 |
| 0.0553 | 7.5009 | 4358 | 0.8751 | 0.7623 | 0.8751 | 0.9355 |
| 0.0553 | 7.5043 | 4360 | 0.8830 | 0.7623 | 0.8830 | 0.9397 |
| 0.0553 | 7.5077 | 4362 | 0.9019 | 0.7729 | 0.9019 | 0.9497 |
| 0.0553 | 7.5112 | 4364 | 0.9174 | 0.7729 | 0.9174 | 0.9578 |
| 0.0553 | 7.5146 | 4366 | 0.9232 | 0.7844 | 0.9232 | 0.9608 |
| 0.0553 | 7.5181 | 4368 | 0.9131 | 0.7844 | 0.9131 | 0.9555 |
| 0.0553 | 7.5215 | 4370 | 0.9225 | 0.7844 | 0.9225 | 0.9605 |
| 0.0553 | 7.5250 | 4372 | 0.9270 | 0.7844 | 0.9270 | 0.9628 |
| 0.0553 | 7.5284 | 4374 | 0.9140 | 0.7729 | 0.9140 | 0.9560 |
| 0.0553 | 7.5318 | 4376 | 0.9119 | 0.7729 | 0.9119 | 0.9549 |
| 0.0553 | 7.5353 | 4378 | 0.9104 | 0.7729 | 0.9104 | 0.9541 |
| 0.0553 | 7.5387 | 4380 | 0.8942 | 0.7729 | 0.8942 | 0.9456 |
| 0.0553 | 7.5422 | 4382 | 0.8664 | 0.7623 | 0.8664 | 0.9308 |
| 0.0553 | 7.5456 | 4384 | 0.8531 | 0.7520 | 0.8531 | 0.9236 |
| 0.0553 | 7.5491 | 4386 | 0.8403 | 0.7520 | 0.8403 | 0.9167 |
| 0.0553 | 7.5525 | 4388 | 0.8263 | 0.7058 | 0.8263 | 0.9090 |
| 0.0553 | 7.5559 | 4390 | 0.8374 | 0.7058 | 0.8374 | 0.9151 |
| 0.0553 | 7.5594 | 4392 | 0.8453 | 0.7058 | 0.8453 | 0.9194 |
| 0.0553 | 7.5628 | 4394 | 0.8687 | 0.7058 | 0.8687 | 0.9320 |
| 0.0553 | 7.5663 | 4396 | 0.8876 | 0.7058 | 0.8876 | 0.9421 |
| 0.0553 | 7.5697 | 4398 | 0.8726 | 0.7058 | 0.8726 | 0.9341 |
| 0.0553 | 7.5731 | 4400 | 0.8467 | 0.7520 | 0.8467 | 0.9202 |
| 0.0553 | 7.5766 | 4402 | 0.8238 | 0.7520 | 0.8238 | 0.9077 |
| 0.0553 | 7.5800 | 4404 | 0.8223 | 0.7520 | 0.8223 | 0.9068 |
| 0.0553 | 7.5835 | 4406 | 0.8429 | 0.7623 | 0.8429 | 0.9181 |
| 0.0553 | 7.5869 | 4408 | 0.8600 | 0.7623 | 0.8600 | 0.9273 |
| 0.0553 | 7.5904 | 4410 | 0.8624 | 0.7623 | 0.8624 | 0.9287 |
| 0.0553 | 7.5938 | 4412 | 0.8796 | 0.7844 | 0.8796 | 0.9379 |
| 0.0553 | 7.5972 | 4414 | 0.8827 | 0.7844 | 0.8827 | 0.9395 |
| 0.0553 | 7.6007 | 4416 | 0.8919 | 0.7844 | 0.8919 | 0.9444 |
| 0.0553 | 7.6041 | 4418 | 0.8796 | 0.7844 | 0.8796 | 0.9379 |
| 0.0553 | 7.6076 | 4420 | 0.8496 | 0.7729 | 0.8496 | 0.9217 |
| 0.0553 | 7.6110 | 4422 | 0.8457 | 0.7729 | 0.8457 | 0.9196 |
| 0.0553 | 7.6145 | 4424 | 0.8640 | 0.7729 | 0.8640 | 0.9295 |
| 0.0553 | 7.6179 | 4426 | 0.8795 | 0.7618 | 0.8795 | 0.9378 |
| 0.0553 | 7.6213 | 4428 | 0.9191 | 0.7724 | 0.9191 | 0.9587 |
| 0.0553 | 7.6248 | 4430 | 0.9694 | 0.7724 | 0.9694 | 0.9846 |
| 0.0553 | 7.6282 | 4432 | 1.0048 | 0.7724 | 1.0048 | 1.0024 |
| 0.0553 | 7.6317 | 4434 | 1.0397 | 0.7216 | 1.0397 | 1.0197 |
| 0.0553 | 7.6351 | 4436 | 1.0435 | 0.7216 | 1.0435 | 1.0215 |
| 0.0553 | 7.6386 | 4438 | 1.0254 | 0.7216 | 1.0254 | 1.0126 |
| 0.0553 | 7.6420 | 4440 | 0.9852 | 0.7724 | 0.9852 | 0.9926 |
| 0.0553 | 7.6454 | 4442 | 0.9569 | 0.7724 | 0.9569 | 0.9782 |
| 0.0553 | 7.6489 | 4444 | 0.9081 | 0.7729 | 0.9081 | 0.9529 |
| 0.0553 | 7.6523 | 4446 | 0.8543 | 0.7729 | 0.8543 | 0.9243 |
| 0.0553 | 7.6558 | 4448 | 0.8237 | 0.7623 | 0.8237 | 0.9076 |
| 0.0553 | 7.6592 | 4450 | 0.8066 | 0.7623 | 0.8066 | 0.8981 |
| 0.0553 | 7.6627 | 4452 | 0.8133 | 0.7623 | 0.8133 | 0.9018 |
| 0.0553 | 7.6661 | 4454 | 0.8469 | 0.7520 | 0.8469 | 0.9203 |
| 0.0553 | 7.6695 | 4456 | 0.8810 | 0.7520 | 0.8810 | 0.9386 |
| 0.0553 | 7.6730 | 4458 | 0.9098 | 0.7618 | 0.9098 | 0.9538 |
| 0.0553 | 7.6764 | 4460 | 0.9120 | 0.7618 | 0.9120 | 0.9550 |
| 0.0553 | 7.6799 | 4462 | 0.9067 | 0.7618 | 0.9067 | 0.9522 |
| 0.0553 | 7.6833 | 4464 | 0.8911 | 0.7618 | 0.8911 | 0.9440 |
| 0.0553 | 7.6867 | 4466 | 0.8586 | 0.7729 | 0.8586 | 0.9266 |
| 0.0553 | 7.6902 | 4468 | 0.8493 | 0.7729 | 0.8493 | 0.9216 |
| 0.0553 | 7.6936 | 4470 | 0.8321 | 0.7623 | 0.8321 | 0.9122 |
| 0.0553 | 7.6971 | 4472 | 0.8126 | 0.7623 | 0.8126 | 0.9014 |
| 0.0553 | 7.7005 | 4474 | 0.8073 | 0.7623 | 0.8073 | 0.8985 |
| 0.0553 | 7.7040 | 4476 | 0.8166 | 0.7623 | 0.8166 | 0.9036 |
| 0.0553 | 7.7074 | 4478 | 0.8445 | 0.7623 | 0.8445 | 0.9190 |
| 0.0553 | 7.7108 | 4480 | 0.8933 | 0.7724 | 0.8933 | 0.9452 |
| 0.0553 | 7.7143 | 4482 | 0.9318 | 0.7724 | 0.9318 | 0.9653 |
| 0.0553 | 7.7177 | 4484 | 0.9645 | 0.7724 | 0.9645 | 0.9821 |
| 0.0553 | 7.7212 | 4486 | 0.9780 | 0.7724 | 0.9780 | 0.9890 |
| 0.0553 | 7.7246 | 4488 | 1.0045 | 0.7724 | 1.0045 | 1.0022 |
| 0.0553 | 7.7281 | 4490 | 1.0281 | 0.7724 | 1.0281 | 1.0139 |
| 0.0553 | 7.7315 | 4492 | 1.0229 | 0.7724 | 1.0229 | 1.0114 |
| 0.0553 | 7.7349 | 4494 | 0.9920 | 0.7724 | 0.9920 | 0.9960 |
| 0.0553 | 7.7384 | 4496 | 0.9439 | 0.7724 | 0.9439 | 0.9716 |
| 0.0553 | 7.7418 | 4498 | 0.8928 | 0.7724 | 0.8928 | 0.9449 |
| 0.0507 | 7.7453 | 4500 | 0.8792 | 0.7844 | 0.8792 | 0.9377 |
| 0.0507 | 7.7487 | 4502 | 0.8671 | 0.7844 | 0.8671 | 0.9312 |
| 0.0507 | 7.7522 | 4504 | 0.8742 | 0.7844 | 0.8742 | 0.9350 |
| 0.0507 | 7.7556 | 4506 | 0.8727 | 0.7729 | 0.8727 | 0.9342 |
| 0.0507 | 7.7590 | 4508 | 0.8689 | 0.7729 | 0.8689 | 0.9322 |
| 0.0507 | 7.7625 | 4510 | 0.8610 | 0.7729 | 0.8610 | 0.9279 |
| 0.0507 | 7.7659 | 4512 | 0.8408 | 0.7623 | 0.8408 | 0.9169 |
| 0.0507 | 7.7694 | 4514 | 0.8138 | 0.7623 | 0.8138 | 0.9021 |
| 0.0507 | 7.7728 | 4516 | 0.8191 | 0.7623 | 0.8191 | 0.9050 |
| 0.0507 | 7.7762 | 4518 | 0.8372 | 0.7623 | 0.8372 | 0.9150 |
| 0.0507 | 7.7797 | 4520 | 0.8636 | 0.7618 | 0.8636 | 0.9293 |
| 0.0507 | 7.7831 | 4522 | 0.8840 | 0.7618 | 0.8840 | 0.9402 |
| 0.0507 | 7.7866 | 4524 | 0.8821 | 0.7520 | 0.8821 | 0.9392 |
| 0.0507 | 7.7900 | 4526 | 0.8546 | 0.7520 | 0.8546 | 0.9244 |
| 0.0507 | 7.7935 | 4528 | 0.8300 | 0.7520 | 0.8300 | 0.9111 |
| 0.0507 | 7.7969 | 4530 | 0.8197 | 0.7520 | 0.8197 | 0.9054 |
| 0.0507 | 7.8003 | 4532 | 0.8195 | 0.7520 | 0.8195 | 0.9053 |
| 0.0507 | 7.8038 | 4534 | 0.8278 | 0.7520 | 0.8278 | 0.9099 |
| 0.0507 | 7.8072 | 4536 | 0.8375 | 0.7520 | 0.8375 | 0.9152 |
| 0.0507 | 7.8107 | 4538 | 0.8472 | 0.7520 | 0.8472 | 0.9204 |
| 0.0507 | 7.8141 | 4540 | 0.8391 | 0.7520 | 0.8391 | 0.9160 |
| 0.0507 | 7.8176 | 4542 | 0.8413 | 0.7520 | 0.8413 | 0.9172 |
| 0.0507 | 7.8210 | 4544 | 0.8408 | 0.7520 | 0.8408 | 0.9169 |
| 0.0507 | 7.8244 | 4546 | 0.8554 | 0.7520 | 0.8554 | 0.9249 |
| 0.0507 | 7.8279 | 4548 | 0.8654 | 0.7520 | 0.8654 | 0.9302 |
| 0.0507 | 7.8313 | 4550 | 0.8819 | 0.7520 | 0.8819 | 0.9391 |
| 0.0507 | 7.8348 | 4552 | 0.8836 | 0.7520 | 0.8836 | 0.9400 |
| 0.0507 | 7.8382 | 4554 | 0.8781 | 0.7520 | 0.8781 | 0.9370 |
| 0.0507 | 7.8417 | 4556 | 0.8479 | 0.7520 | 0.8479 | 0.9208 |
| 0.0507 | 7.8451 | 4558 | 0.8406 | 0.7520 | 0.8406 | 0.9169 |
| 0.0507 | 7.8485 | 4560 | 0.8510 | 0.7520 | 0.8510 | 0.9225 |
| 0.0507 | 7.8520 | 4562 | 0.8664 | 0.7520 | 0.8664 | 0.9308 |
| 0.0507 | 7.8554 | 4564 | 0.8727 | 0.7520 | 0.8727 | 0.9342 |
| 0.0507 | 7.8589 | 4566 | 0.8714 | 0.7520 | 0.8714 | 0.9335 |
| 0.0507 | 7.8623 | 4568 | 0.8542 | 0.7520 | 0.8542 | 0.9243 |
| 0.0507 | 7.8657 | 4570 | 0.8377 | 0.7520 | 0.8377 | 0.9153 |
| 0.0507 | 7.8692 | 4572 | 0.8235 | 0.7520 | 0.8235 | 0.9075 |
| 0.0507 | 7.8726 | 4574 | 0.8077 | 0.7520 | 0.8077 | 0.8987 |
| 0.0507 | 7.8761 | 4576 | 0.8147 | 0.7520 | 0.8147 | 0.9026 |
| 0.0507 | 7.8795 | 4578 | 0.8307 | 0.7520 | 0.8307 | 0.9114 |
| 0.0507 | 7.8830 | 4580 | 0.8402 | 0.7520 | 0.8402 | 0.9166 |
| 0.0507 | 7.8864 | 4582 | 0.8374 | 0.7520 | 0.8374 | 0.9151 |
| 0.0507 | 7.8898 | 4584 | 0.8367 | 0.7520 | 0.8367 | 0.9147 |
| 0.0507 | 7.8933 | 4586 | 0.8409 | 0.7520 | 0.8409 | 0.9170 |
| 0.0507 | 7.8967 | 4588 | 0.8598 | 0.7520 | 0.8598 | 0.9273 |
| 0.0507 | 7.9002 | 4590 | 0.8880 | 0.7618 | 0.8880 | 0.9423 |
| 0.0507 | 7.9036 | 4592 | 0.9256 | 0.7618 | 0.9256 | 0.9621 |
| 0.0507 | 7.9071 | 4594 | 0.9717 | 0.7134 | 0.9717 | 0.9857 |
| 0.0507 | 7.9105 | 4596 | 1.0091 | 0.7134 | 1.0091 | 1.0046 |
| 0.0507 | 7.9139 | 4598 | 1.0196 | 0.7134 | 1.0196 | 1.0098 |
| 0.0507 | 7.9174 | 4600 | 1.0042 | 0.7134 | 1.0042 | 1.0021 |
| 0.0507 | 7.9208 | 4602 | 0.9752 | 0.7134 | 0.9752 | 0.9875 |
| 0.0507 | 7.9243 | 4604 | 0.9627 | 0.7134 | 0.9627 | 0.9811 |
| 0.0507 | 7.9277 | 4606 | 0.9675 | 0.7134 | 0.9675 | 0.9836 |
| 0.0507 | 7.9312 | 4608 | 0.9694 | 0.7134 | 0.9694 | 0.9846 |
| 0.0507 | 7.9346 | 4610 | 0.9752 | 0.7134 | 0.9752 | 0.9875 |
| 0.0507 | 7.9380 | 4612 | 0.9913 | 0.7134 | 0.9913 | 0.9957 |
| 0.0507 | 7.9415 | 4614 | 0.9842 | 0.7134 | 0.9842 | 0.9921 |
| 0.0507 | 7.9449 | 4616 | 0.9575 | 0.7134 | 0.9575 | 0.9785 |
| 0.0507 | 7.9484 | 4618 | 0.9281 | 0.7134 | 0.9281 | 0.9634 |
| 0.0507 | 7.9518 | 4620 | 0.8942 | 0.7058 | 0.8942 | 0.9456 |
| 0.0507 | 7.9552 | 4622 | 0.8770 | 0.7520 | 0.8770 | 0.9365 |
| 0.0507 | 7.9587 | 4624 | 0.8677 | 0.7520 | 0.8677 | 0.9315 |
| 0.0507 | 7.9621 | 4626 | 0.8753 | 0.7520 | 0.8753 | 0.9356 |
| 0.0507 | 7.9656 | 4628 | 0.8754 | 0.7520 | 0.8754 | 0.9356 |
| 0.0507 | 7.9690 | 4630 | 0.8865 | 0.7520 | 0.8865 | 0.9415 |
| 0.0507 | 7.9725 | 4632 | 0.9030 | 0.7058 | 0.9030 | 0.9503 |
| 0.0507 | 7.9759 | 4634 | 0.9239 | 0.7134 | 0.9239 | 0.9612 |
| 0.0507 | 7.9793 | 4636 | 0.9654 | 0.7134 | 0.9654 | 0.9826 |
| 0.0507 | 7.9828 | 4638 | 0.9852 | 0.7134 | 0.9852 | 0.9926 |
| 0.0507 | 7.9862 | 4640 | 0.9812 | 0.7134 | 0.9812 | 0.9906 |
| 0.0507 | 7.9897 | 4642 | 0.9786 | 0.7134 | 0.9786 | 0.9892 |
| 0.0507 | 7.9931 | 4644 | 0.9620 | 0.7134 | 0.9620 | 0.9808 |
| 0.0507 | 7.9966 | 4646 | 0.9598 | 0.7134 | 0.9598 | 0.9797 |
| 0.0507 | 8.0 | 4648 | 0.9378 | 0.7618 | 0.9378 | 0.9684 |
| 0.0507 | 8.0034 | 4650 | 0.9087 | 0.7729 | 0.9087 | 0.9533 |
| 0.0507 | 8.0069 | 4652 | 0.8944 | 0.7623 | 0.8944 | 0.9458 |
| 0.0507 | 8.0103 | 4654 | 0.9030 | 0.7729 | 0.9030 | 0.9503 |
| 0.0507 | 8.0138 | 4656 | 0.9241 | 0.7729 | 0.9241 | 0.9613 |
| 0.0507 | 8.0172 | 4658 | 0.9449 | 0.7729 | 0.9449 | 0.9721 |
| 0.0507 | 8.0207 | 4660 | 0.9433 | 0.7618 | 0.9433 | 0.9713 |
| 0.0507 | 8.0241 | 4662 | 0.9292 | 0.7618 | 0.9292 | 0.9639 |
| 0.0507 | 8.0275 | 4664 | 0.9130 | 0.7729 | 0.9130 | 0.9555 |
| 0.0507 | 8.0310 | 4666 | 0.8765 | 0.7623 | 0.8765 | 0.9362 |
| 0.0507 | 8.0344 | 4668 | 0.8298 | 0.7623 | 0.8298 | 0.9109 |
| 0.0507 | 8.0379 | 4670 | 0.7986 | 0.7427 | 0.7986 | 0.8936 |
| 0.0507 | 8.0413 | 4672 | 0.7849 | 0.7427 | 0.7849 | 0.8859 |
| 0.0507 | 8.0448 | 4674 | 0.7874 | 0.7427 | 0.7874 | 0.8874 |
| 0.0507 | 8.0482 | 4676 | 0.8097 | 0.7427 | 0.8097 | 0.8998 |
| 0.0507 | 8.0516 | 4678 | 0.8442 | 0.7623 | 0.8442 | 0.9188 |
| 0.0507 | 8.0551 | 4680 | 0.8951 | 0.7623 | 0.8951 | 0.9461 |
| 0.0507 | 8.0585 | 4682 | 0.9468 | 0.7134 | 0.9468 | 0.9730 |
| 0.0507 | 8.0620 | 4684 | 0.9779 | 0.7134 | 0.9779 | 0.9889 |
| 0.0507 | 8.0654 | 4686 | 0.9948 | 0.7134 | 0.9948 | 0.9974 |
| 0.0507 | 8.0688 | 4688 | 1.0053 | 0.7134 | 1.0053 | 1.0026 |
| 0.0507 | 8.0723 | 4690 | 1.0000 | 0.7134 | 1.0000 | 1.0000 |
| 0.0507 | 8.0757 | 4692 | 0.9770 | 0.7134 | 0.9770 | 0.9884 |
| 0.0507 | 8.0792 | 4694 | 0.9581 | 0.7134 | 0.9581 | 0.9788 |
| 0.0507 | 8.0826 | 4696 | 0.9292 | 0.7134 | 0.9292 | 0.9639 |
| 0.0507 | 8.0861 | 4698 | 0.8858 | 0.7520 | 0.8858 | 0.9412 |
| 0.0507 | 8.0895 | 4700 | 0.8653 | 0.7520 | 0.8653 | 0.9302 |
| 0.0507 | 8.0929 | 4702 | 0.8607 | 0.7520 | 0.8607 | 0.9278 |
| 0.0507 | 8.0964 | 4704 | 0.8535 | 0.7623 | 0.8535 | 0.9238 |
| 0.0507 | 8.0998 | 4706 | 0.8600 | 0.7623 | 0.8600 | 0.9274 |
| 0.0507 | 8.1033 | 4708 | 0.8797 | 0.7623 | 0.8797 | 0.9379 |
| 0.0507 | 8.1067 | 4710 | 0.8859 | 0.7520 | 0.8859 | 0.9412 |
| 0.0507 | 8.1102 | 4712 | 0.8955 | 0.7520 | 0.8955 | 0.9463 |
| 0.0507 | 8.1136 | 4714 | 0.9276 | 0.7618 | 0.9276 | 0.9631 |
| 0.0507 | 8.1170 | 4716 | 0.9489 | 0.7618 | 0.9489 | 0.9741 |
| 0.0507 | 8.1205 | 4718 | 0.9541 | 0.7618 | 0.9541 | 0.9768 |
| 0.0507 | 8.1239 | 4720 | 0.9453 | 0.7618 | 0.9453 | 0.9723 |
| 0.0507 | 8.1274 | 4722 | 0.9413 | 0.7618 | 0.9413 | 0.9702 |
| 0.0507 | 8.1308 | 4724 | 0.9473 | 0.7618 | 0.9473 | 0.9733 |
| 0.0507 | 8.1343 | 4726 | 0.9385 | 0.7618 | 0.9385 | 0.9688 |
| 0.0507 | 8.1377 | 4728 | 0.9202 | 0.7618 | 0.9202 | 0.9593 |
| 0.0507 | 8.1411 | 4730 | 0.9020 | 0.7520 | 0.9020 | 0.9497 |
| 0.0507 | 8.1446 | 4732 | 0.8740 | 0.7623 | 0.8740 | 0.9349 |
| 0.0507 | 8.1480 | 4734 | 0.8658 | 0.7623 | 0.8658 | 0.9305 |
| 0.0507 | 8.1515 | 4736 | 0.8817 | 0.7623 | 0.8817 | 0.9390 |
| 0.0507 | 8.1549 | 4738 | 0.8956 | 0.7623 | 0.8956 | 0.9463 |
| 0.0507 | 8.1583 | 4740 | 0.9172 | 0.7623 | 0.9172 | 0.9577 |
| 0.0507 | 8.1618 | 4742 | 0.9356 | 0.7520 | 0.9356 | 0.9673 |
| 0.0507 | 8.1652 | 4744 | 0.9380 | 0.7520 | 0.9380 | 0.9685 |
| 0.0507 | 8.1687 | 4746 | 0.9198 | 0.7520 | 0.9198 | 0.9591 |
| 0.0507 | 8.1721 | 4748 | 0.8932 | 0.7623 | 0.8932 | 0.9451 |
| 0.0507 | 8.1756 | 4750 | 0.8778 | 0.7623 | 0.8778 | 0.9369 |
| 0.0507 | 8.1790 | 4752 | 0.8572 | 0.7623 | 0.8572 | 0.9258 |
| 0.0507 | 8.1824 | 4754 | 0.8451 | 0.7623 | 0.8451 | 0.9193 |
| 0.0507 | 8.1859 | 4756 | 0.8443 | 0.7623 | 0.8443 | 0.9189 |
| 0.0507 | 8.1893 | 4758 | 0.8435 | 0.7623 | 0.8435 | 0.9184 |
| 0.0507 | 8.1928 | 4760 | 0.8575 | 0.7623 | 0.8575 | 0.9260 |
| 0.0507 | 8.1962 | 4762 | 0.8716 | 0.7623 | 0.8716 | 0.9336 |
| 0.0507 | 8.1997 | 4764 | 0.8777 | 0.7623 | 0.8777 | 0.9369 |
| 0.0507 | 8.2031 | 4766 | 0.8893 | 0.7623 | 0.8893 | 0.9430 |
| 0.0507 | 8.2065 | 4768 | 0.8970 | 0.7623 | 0.8970 | 0.9471 |
| 0.0507 | 8.2100 | 4770 | 0.8988 | 0.7623 | 0.8988 | 0.9481 |
| 0.0507 | 8.2134 | 4772 | 0.9012 | 0.7623 | 0.9012 | 0.9493 |
| 0.0507 | 8.2169 | 4774 | 0.8896 | 0.7623 | 0.8896 | 0.9432 |
| 0.0507 | 8.2203 | 4776 | 0.8884 | 0.7623 | 0.8884 | 0.9425 |
| 0.0507 | 8.2238 | 4778 | 0.8881 | 0.7623 | 0.8881 | 0.9424 |
| 0.0507 | 8.2272 | 4780 | 0.8875 | 0.7623 | 0.8875 | 0.9421 |
| 0.0507 | 8.2306 | 4782 | 0.8786 | 0.7623 | 0.8786 | 0.9373 |
| 0.0507 | 8.2341 | 4784 | 0.8651 | 0.7623 | 0.8651 | 0.9301 |
| 0.0507 | 8.2375 | 4786 | 0.8484 | 0.7623 | 0.8484 | 0.9211 |
| 0.0507 | 8.2410 | 4788 | 0.8358 | 0.7623 | 0.8358 | 0.9142 |
| 0.0507 | 8.2444 | 4790 | 0.8246 | 0.7623 | 0.8246 | 0.9081 |
| 0.0507 | 8.2478 | 4792 | 0.8081 | 0.7623 | 0.8081 | 0.8989 |
| 0.0507 | 8.2513 | 4794 | 0.8049 | 0.7623 | 0.8049 | 0.8971 |
| 0.0507 | 8.2547 | 4796 | 0.8175 | 0.7623 | 0.8175 | 0.9042 |
| 0.0507 | 8.2582 | 4798 | 0.8317 | 0.7623 | 0.8317 | 0.9120 |
| 0.0507 | 8.2616 | 4800 | 0.8494 | 0.7623 | 0.8494 | 0.9216 |
| 0.0507 | 8.2651 | 4802 | 0.8641 | 0.7623 | 0.8641 | 0.9296 |
| 0.0507 | 8.2685 | 4804 | 0.8773 | 0.7058 | 0.8773 | 0.9366 |
| 0.0507 | 8.2719 | 4806 | 0.8768 | 0.7058 | 0.8768 | 0.9364 |
| 0.0507 | 8.2754 | 4808 | 0.8629 | 0.7058 | 0.8629 | 0.9289 |
| 0.0507 | 8.2788 | 4810 | 0.8354 | 0.7623 | 0.8354 | 0.9140 |
| 0.0507 | 8.2823 | 4812 | 0.8239 | 0.7623 | 0.8239 | 0.9077 |
| 0.0507 | 8.2857 | 4814 | 0.8249 | 0.7623 | 0.8249 | 0.9082 |
| 0.0507 | 8.2892 | 4816 | 0.8292 | 0.7623 | 0.8292 | 0.9106 |
| 0.0507 | 8.2926 | 4818 | 0.8440 | 0.7623 | 0.8440 | 0.9187 |
| 0.0507 | 8.2960 | 4820 | 0.8616 | 0.7623 | 0.8616 | 0.9282 |
| 0.0507 | 8.2995 | 4822 | 0.9004 | 0.7623 | 0.9004 | 0.9489 |
| 0.0507 | 8.3029 | 4824 | 0.9302 | 0.7058 | 0.9302 | 0.9645 |
| 0.0507 | 8.3064 | 4826 | 0.9387 | 0.7623 | 0.9387 | 0.9689 |
| 0.0507 | 8.3098 | 4828 | 0.9312 | 0.7623 | 0.9312 | 0.9650 |
| 0.0507 | 8.3133 | 4830 | 0.9212 | 0.7623 | 0.9212 | 0.9598 |
| 0.0507 | 8.3167 | 4832 | 0.9117 | 0.7623 | 0.9117 | 0.9548 |
| 0.0507 | 8.3201 | 4834 | 0.9047 | 0.7623 | 0.9047 | 0.9512 |
| 0.0507 | 8.3236 | 4836 | 0.8977 | 0.7623 | 0.8977 | 0.9475 |
| 0.0507 | 8.3270 | 4838 | 0.9043 | 0.7623 | 0.9043 | 0.9509 |
| 0.0507 | 8.3305 | 4840 | 0.9111 | 0.7139 | 0.9111 | 0.9545 |
| 0.0507 | 8.3339 | 4842 | 0.9208 | 0.7139 | 0.9208 | 0.9596 |
| 0.0507 | 8.3373 | 4844 | 0.9323 | 0.7058 | 0.9323 | 0.9656 |
| 0.0507 | 8.3408 | 4846 | 0.9266 | 0.7058 | 0.9266 | 0.9626 |
| 0.0507 | 8.3442 | 4848 | 0.9167 | 0.7139 | 0.9167 | 0.9575 |
| 0.0507 | 8.3477 | 4850 | 0.9004 | 0.7139 | 0.9004 | 0.9489 |
| 0.0507 | 8.3511 | 4852 | 0.8816 | 0.7623 | 0.8816 | 0.9389 |
| 0.0507 | 8.3546 | 4854 | 0.8677 | 0.7623 | 0.8677 | 0.9315 |
| 0.0507 | 8.3580 | 4856 | 0.8489 | 0.7623 | 0.8489 | 0.9214 |
| 0.0507 | 8.3614 | 4858 | 0.8547 | 0.7623 | 0.8547 | 0.9245 |
| 0.0507 | 8.3649 | 4860 | 0.8703 | 0.7623 | 0.8703 | 0.9329 |
| 0.0507 | 8.3683 | 4862 | 0.8925 | 0.7623 | 0.8925 | 0.9447 |
| 0.0507 | 8.3718 | 4864 | 0.9229 | 0.7623 | 0.9229 | 0.9607 |
| 0.0507 | 8.3752 | 4866 | 0.9590 | 0.7221 | 0.9590 | 0.9793 |
| 0.0507 | 8.3787 | 4868 | 0.9991 | 0.7221 | 0.9991 | 0.9996 |
| 0.0507 | 8.3821 | 4870 | 1.0195 | 0.7134 | 1.0195 | 1.0097 |
| 0.0507 | 8.3855 | 4872 | 1.0136 | 0.7134 | 1.0136 | 1.0068 |
| 0.0507 | 8.3890 | 4874 | 0.9930 | 0.7221 | 0.9930 | 0.9965 |
| 0.0507 | 8.3924 | 4876 | 0.9591 | 0.7221 | 0.9591 | 0.9793 |
| 0.0507 | 8.3959 | 4878 | 0.9176 | 0.7623 | 0.9176 | 0.9579 |
| 0.0507 | 8.3993 | 4880 | 0.8761 | 0.7623 | 0.8761 | 0.9360 |
| 0.0507 | 8.4028 | 4882 | 0.8567 | 0.7623 | 0.8567 | 0.9256 |
| 0.0507 | 8.4062 | 4884 | 0.8588 | 0.7623 | 0.8588 | 0.9267 |
| 0.0507 | 8.4096 | 4886 | 0.8581 | 0.7139 | 0.8581 | 0.9264 |
| 0.0507 | 8.4131 | 4888 | 0.8774 | 0.7058 | 0.8774 | 0.9367 |
| 0.0507 | 8.4165 | 4890 | 0.9100 | 0.7058 | 0.9100 | 0.9540 |
| 0.0507 | 8.4200 | 4892 | 0.9481 | 0.7058 | 0.9481 | 0.9737 |
| 0.0507 | 8.4234 | 4894 | 0.9720 | 0.7134 | 0.9720 | 0.9859 |
| 0.0507 | 8.4269 | 4896 | 0.9838 | 0.7134 | 0.9838 | 0.9919 |
| 0.0507 | 8.4303 | 4898 | 0.9706 | 0.7134 | 0.9706 | 0.9852 |
| 0.0507 | 8.4337 | 4900 | 0.9555 | 0.7134 | 0.9555 | 0.9775 |
| 0.0507 | 8.4372 | 4902 | 0.9532 | 0.7618 | 0.9532 | 0.9763 |
| 0.0507 | 8.4406 | 4904 | 0.9436 | 0.7729 | 0.9436 | 0.9714 |
| 0.0507 | 8.4441 | 4906 | 0.9263 | 0.7729 | 0.9263 | 0.9624 |
| 0.0507 | 8.4475 | 4908 | 0.9004 | 0.7729 | 0.9004 | 0.9489 |
| 0.0507 | 8.4509 | 4910 | 0.8929 | 0.7729 | 0.8929 | 0.9450 |
| 0.0507 | 8.4544 | 4912 | 0.9014 | 0.7729 | 0.9014 | 0.9494 |
| 0.0507 | 8.4578 | 4914 | 0.9027 | 0.7729 | 0.9027 | 0.9501 |
| 0.0507 | 8.4613 | 4916 | 0.9046 | 0.7729 | 0.9046 | 0.9511 |
| 0.0507 | 8.4647 | 4918 | 0.9144 | 0.7729 | 0.9144 | 0.9562 |
| 0.0507 | 8.4682 | 4920 | 0.9336 | 0.7729 | 0.9336 | 0.9662 |
| 0.0507 | 8.4716 | 4922 | 0.9575 | 0.7729 | 0.9575 | 0.9785 |
| 0.0507 | 8.4750 | 4924 | 0.9835 | 0.7216 | 0.9835 | 0.9917 |
| 0.0507 | 8.4785 | 4926 | 0.9950 | 0.7216 | 0.9950 | 0.9975 |
| 0.0507 | 8.4819 | 4928 | 0.9986 | 0.7216 | 0.9986 | 0.9993 |
| 0.0507 | 8.4854 | 4930 | 0.9912 | 0.7216 | 0.9912 | 0.9956 |
| 0.0507 | 8.4888 | 4932 | 0.9728 | 0.7216 | 0.9728 | 0.9863 |
| 0.0507 | 8.4923 | 4934 | 0.9512 | 0.7134 | 0.9512 | 0.9753 |
| 0.0507 | 8.4957 | 4936 | 0.9340 | 0.7134 | 0.9340 | 0.9664 |
| 0.0507 | 8.4991 | 4938 | 0.9374 | 0.7134 | 0.9374 | 0.9682 |
| 0.0507 | 8.5026 | 4940 | 0.9482 | 0.7216 | 0.9482 | 0.9738 |
| 0.0507 | 8.5060 | 4942 | 0.9586 | 0.7216 | 0.9586 | 0.9791 |
| 0.0507 | 8.5095 | 4944 | 0.9553 | 0.7216 | 0.9553 | 0.9774 |
| 0.0507 | 8.5129 | 4946 | 0.9484 | 0.7216 | 0.9484 | 0.9739 |
| 0.0507 | 8.5164 | 4948 | 0.9396 | 0.7216 | 0.9396 | 0.9693 |
| 0.0507 | 8.5198 | 4950 | 0.9283 | 0.7724 | 0.9283 | 0.9635 |
| 0.0507 | 8.5232 | 4952 | 0.9263 | 0.7724 | 0.9263 | 0.9625 |
| 0.0507 | 8.5267 | 4954 | 0.9144 | 0.7618 | 0.9144 | 0.9562 |
| 0.0507 | 8.5301 | 4956 | 0.8990 | 0.7729 | 0.8990 | 0.9482 |
| 0.0507 | 8.5336 | 4958 | 0.8938 | 0.7729 | 0.8938 | 0.9454 |
| 0.0507 | 8.5370 | 4960 | 0.9031 | 0.7729 | 0.9031 | 0.9503 |
| 0.0507 | 8.5404 | 4962 | 0.9228 | 0.7724 | 0.9228 | 0.9606 |
| 0.0507 | 8.5439 | 4964 | 0.9381 | 0.7724 | 0.9381 | 0.9685 |
| 0.0507 | 8.5473 | 4966 | 0.9490 | 0.7724 | 0.9490 | 0.9742 |
| 0.0507 | 8.5508 | 4968 | 0.9450 | 0.7724 | 0.9450 | 0.9721 |
| 0.0507 | 8.5542 | 4970 | 0.9370 | 0.7618 | 0.9370 | 0.9680 |
| 0.0507 | 8.5577 | 4972 | 0.9468 | 0.7618 | 0.9468 | 0.9730 |
| 0.0507 | 8.5611 | 4974 | 0.9435 | 0.7618 | 0.9435 | 0.9713 |
| 0.0507 | 8.5645 | 4976 | 0.9409 | 0.7520 | 0.9409 | 0.9700 |
| 0.0507 | 8.5680 | 4978 | 0.9320 | 0.7520 | 0.9320 | 0.9654 |
| 0.0507 | 8.5714 | 4980 | 0.9280 | 0.7520 | 0.9280 | 0.9633 |
| 0.0507 | 8.5749 | 4982 | 0.9314 | 0.7520 | 0.9314 | 0.9651 |
| 0.0507 | 8.5783 | 4984 | 0.9335 | 0.7520 | 0.9335 | 0.9662 |
| 0.0507 | 8.5818 | 4986 | 0.9435 | 0.7058 | 0.9435 | 0.9713 |
| 0.0507 | 8.5852 | 4988 | 0.9321 | 0.7520 | 0.9321 | 0.9655 |
| 0.0507 | 8.5886 | 4990 | 0.9149 | 0.7520 | 0.9149 | 0.9565 |
| 0.0507 | 8.5921 | 4992 | 0.9067 | 0.7520 | 0.9067 | 0.9522 |
| 0.0507 | 8.5955 | 4994 | 0.8849 | 0.7623 | 0.8849 | 0.9407 |
| 0.0507 | 8.5990 | 4996 | 0.8600 | 0.7623 | 0.8600 | 0.9274 |
| 0.0507 | 8.6024 | 4998 | 0.8415 | 0.7623 | 0.8415 | 0.9173 |
| 0.0451 | 8.6059 | 5000 | 0.8231 | 0.7623 | 0.8231 | 0.9073 |
| 0.0451 | 8.6093 | 5002 | 0.8122 | 0.7623 | 0.8122 | 0.9012 |
| 0.0451 | 8.6127 | 5004 | 0.8110 | 0.7623 | 0.8110 | 0.9006 |
| 0.0451 | 8.6162 | 5006 | 0.8056 | 0.7623 | 0.8056 | 0.8975 |
| 0.0451 | 8.6196 | 5008 | 0.8048 | 0.7623 | 0.8048 | 0.8971 |
| 0.0451 | 8.6231 | 5010 | 0.8181 | 0.7623 | 0.8181 | 0.9045 |
| 0.0451 | 8.6265 | 5012 | 0.8330 | 0.7520 | 0.8330 | 0.9127 |
| 0.0451 | 8.6299 | 5014 | 0.8546 | 0.7520 | 0.8546 | 0.9244 |
| 0.0451 | 8.6334 | 5016 | 0.8828 | 0.7058 | 0.8828 | 0.9396 |
| 0.0451 | 8.6368 | 5018 | 0.9060 | 0.7058 | 0.9060 | 0.9518 |
| 0.0451 | 8.6403 | 5020 | 0.9139 | 0.7058 | 0.9139 | 0.9560 |
| 0.0451 | 8.6437 | 5022 | 0.9184 | 0.7520 | 0.9184 | 0.9583 |
| 0.0451 | 8.6472 | 5024 | 0.9264 | 0.7520 | 0.9264 | 0.9625 |
| 0.0451 | 8.6506 | 5026 | 0.9332 | 0.7520 | 0.9332 | 0.9660 |
| 0.0451 | 8.6540 | 5028 | 0.9416 | 0.7724 | 0.9416 | 0.9704 |
| 0.0451 | 8.6575 | 5030 | 0.9446 | 0.7724 | 0.9446 | 0.9719 |
| 0.0451 | 8.6609 | 5032 | 0.9638 | 0.7724 | 0.9638 | 0.9817 |
| 0.0451 | 8.6644 | 5034 | 0.9731 | 0.7724 | 0.9731 | 0.9864 |
| 0.0451 | 8.6678 | 5036 | 0.9860 | 0.7724 | 0.9860 | 0.9930 |
| 0.0451 | 8.6713 | 5038 | 0.9945 | 0.7724 | 0.9945 | 0.9972 |
| 0.0451 | 8.6747 | 5040 | 0.9949 | 0.7724 | 0.9949 | 0.9974 |
| 0.0451 | 8.6781 | 5042 | 0.9882 | 0.7724 | 0.9882 | 0.9941 |
| 0.0451 | 8.6816 | 5044 | 0.9912 | 0.7724 | 0.9912 | 0.9956 |
| 0.0451 | 8.6850 | 5046 | 0.9909 | 0.7724 | 0.9909 | 0.9954 |
| 0.0451 | 8.6885 | 5048 | 0.9792 | 0.7724 | 0.9792 | 0.9895 |
| 0.0451 | 8.6919 | 5050 | 0.9699 | 0.7724 | 0.9699 | 0.9848 |
| 0.0451 | 8.6954 | 5052 | 0.9760 | 0.7724 | 0.9760 | 0.9879 |
| 0.0451 | 8.6988 | 5054 | 0.9734 | 0.7618 | 0.9734 | 0.9866 |
| 0.0451 | 8.7022 | 5056 | 0.9608 | 0.7618 | 0.9608 | 0.9802 |
| 0.0451 | 8.7057 | 5058 | 0.9423 | 0.7618 | 0.9423 | 0.9707 |
| 0.0451 | 8.7091 | 5060 | 0.9335 | 0.7618 | 0.9335 | 0.9662 |
| 0.0451 | 8.7126 | 5062 | 0.9238 | 0.7520 | 0.9238 | 0.9611 |
| 0.0451 | 8.7160 | 5064 | 0.9258 | 0.7520 | 0.9258 | 0.9622 |
| 0.0451 | 8.7194 | 5066 | 0.9230 | 0.7520 | 0.9230 | 0.9607 |
| 0.0451 | 8.7229 | 5068 | 0.9398 | 0.7618 | 0.9398 | 0.9694 |
| 0.0451 | 8.7263 | 5070 | 0.9454 | 0.7618 | 0.9454 | 0.9723 |
| 0.0451 | 8.7298 | 5072 | 0.9398 | 0.7618 | 0.9398 | 0.9694 |
| 0.0451 | 8.7332 | 5074 | 0.9303 | 0.7520 | 0.9303 | 0.9645 |
| 0.0451 | 8.7367 | 5076 | 0.9174 | 0.7520 | 0.9174 | 0.9578 |
| 0.0451 | 8.7401 | 5078 | 0.9134 | 0.7623 | 0.9134 | 0.9557 |
| 0.0451 | 8.7435 | 5080 | 0.9172 | 0.7623 | 0.9172 | 0.9577 |
| 0.0451 | 8.7470 | 5082 | 0.9174 | 0.7623 | 0.9174 | 0.9578 |
| 0.0451 | 8.7504 | 5084 | 0.9271 | 0.7729 | 0.9271 | 0.9629 |
| 0.0451 | 8.7539 | 5086 | 0.9391 | 0.7729 | 0.9391 | 0.9691 |
| 0.0451 | 8.7573 | 5088 | 0.9570 | 0.7724 | 0.9570 | 0.9783 |
| 0.0451 | 8.7608 | 5090 | 0.9788 | 0.7724 | 0.9788 | 0.9894 |
| 0.0451 | 8.7642 | 5092 | 0.9926 | 0.7515 | 0.9926 | 0.9963 |
| 0.0451 | 8.7676 | 5094 | 1.0180 | 0.7515 | 1.0180 | 1.0089 |
| 0.0451 | 8.7711 | 5096 | 1.0203 | 0.7515 | 1.0203 | 1.0101 |
| 0.0451 | 8.7745 | 5098 | 1.0109 | 0.7515 | 1.0109 | 1.0054 |
| 0.0451 | 8.7780 | 5100 | 0.9933 | 0.7626 | 0.9933 | 0.9967 |
| 0.0451 | 8.7814 | 5102 | 0.9656 | 0.7844 | 0.9656 | 0.9826 |
| 0.0451 | 8.7849 | 5104 | 0.9449 | 0.7844 | 0.9449 | 0.9721 |
| 0.0451 | 8.7883 | 5106 | 0.9250 | 0.7729 | 0.9250 | 0.9618 |
| 0.0451 | 8.7917 | 5108 | 0.9113 | 0.7729 | 0.9113 | 0.9546 |
| 0.0451 | 8.7952 | 5110 | 0.9018 | 0.7729 | 0.9018 | 0.9496 |
| 0.0451 | 8.7986 | 5112 | 0.8968 | 0.7729 | 0.8968 | 0.9470 |
| 0.0451 | 8.8021 | 5114 | 0.8983 | 0.7729 | 0.8983 | 0.9478 |
| 0.0451 | 8.8055 | 5116 | 0.9117 | 0.7844 | 0.9117 | 0.9548 |
| 0.0451 | 8.8090 | 5118 | 0.9232 | 0.7844 | 0.9232 | 0.9608 |
| 0.0451 | 8.8124 | 5120 | 0.9268 | 0.7844 | 0.9268 | 0.9627 |
| 0.0451 | 8.8158 | 5122 | 0.9348 | 0.7844 | 0.9348 | 0.9669 |
| 0.0451 | 8.8193 | 5124 | 0.9467 | 0.7844 | 0.9467 | 0.9730 |
| 0.0451 | 8.8227 | 5126 | 0.9609 | 0.7844 | 0.9609 | 0.9802 |
| 0.0451 | 8.8262 | 5128 | 0.9703 | 0.7844 | 0.9703 | 0.9851 |
| 0.0451 | 8.8296 | 5130 | 0.9750 | 0.7844 | 0.9750 | 0.9874 |
| 0.0451 | 8.8330 | 5132 | 0.9763 | 0.7844 | 0.9763 | 0.9881 |
| 0.0451 | 8.8365 | 5134 | 0.9849 | 0.7844 | 0.9849 | 0.9924 |
| 0.0451 | 8.8399 | 5136 | 1.0064 | 0.7101 | 1.0064 | 1.0032 |
| 0.0451 | 8.8434 | 5138 | 1.0367 | 0.7101 | 1.0367 | 1.0182 |
| 0.0451 | 8.8468 | 5140 | 1.0608 | 0.7014 | 1.0608 | 1.0299 |
| 0.0451 | 8.8503 | 5142 | 1.0677 | 0.7014 | 1.0677 | 1.0333 |
| 0.0451 | 8.8537 | 5144 | 1.0588 | 0.7101 | 1.0588 | 1.0290 |
| 0.0451 | 8.8571 | 5146 | 1.0434 | 0.7101 | 1.0434 | 1.0215 |
| 0.0451 | 8.8606 | 5148 | 1.0290 | 0.7626 | 1.0290 | 1.0144 |
| 0.0451 | 8.8640 | 5150 | 1.0101 | 0.7626 | 1.0101 | 1.0051 |
| 0.0451 | 8.8675 | 5152 | 0.9822 | 0.7844 | 0.9822 | 0.9910 |
| 0.0451 | 8.8709 | 5154 | 0.9682 | 0.7844 | 0.9682 | 0.9840 |
| 0.0451 | 8.8744 | 5156 | 0.9652 | 0.7844 | 0.9652 | 0.9824 |
| 0.0451 | 8.8778 | 5158 | 0.9609 | 0.7844 | 0.9609 | 0.9803 |
| 0.0451 | 8.8812 | 5160 | 0.9517 | 0.7729 | 0.9517 | 0.9756 |
| 0.0451 | 8.8847 | 5162 | 0.9482 | 0.7729 | 0.9482 | 0.9737 |
| 0.0451 | 8.8881 | 5164 | 0.9351 | 0.7729 | 0.9351 | 0.9670 |
| 0.0451 | 8.8916 | 5166 | 0.9261 | 0.7729 | 0.9261 | 0.9623 |
| 0.0451 | 8.8950 | 5168 | 0.9221 | 0.7139 | 0.9221 | 0.9603 |
| 0.0451 | 8.8985 | 5170 | 0.9151 | 0.7139 | 0.9151 | 0.9566 |
| 0.0451 | 8.9019 | 5172 | 0.9092 | 0.7139 | 0.9092 | 0.9535 |
| 0.0451 | 8.9053 | 5174 | 0.9016 | 0.7058 | 0.9016 | 0.9495 |
| 0.0451 | 8.9088 | 5176 | 0.9119 | 0.7058 | 0.9119 | 0.9549 |
| 0.0451 | 8.9122 | 5178 | 0.9351 | 0.7058 | 0.9351 | 0.9670 |
| 0.0451 | 8.9157 | 5180 | 0.9685 | 0.7058 | 0.9685 | 0.9841 |
| 0.0451 | 8.9191 | 5182 | 1.0054 | 0.6940 | 1.0054 | 1.0027 |
| 0.0451 | 8.9225 | 5184 | 1.0202 | 0.6940 | 1.0202 | 1.0101 |
| 0.0451 | 8.9260 | 5186 | 1.0198 | 0.6940 | 1.0198 | 1.0099 |
| 0.0451 | 8.9294 | 5188 | 1.0099 | 0.6940 | 1.0099 | 1.0049 |
| 0.0451 | 8.9329 | 5190 | 0.9962 | 0.7134 | 0.9962 | 0.9981 |
| 0.0451 | 8.9363 | 5192 | 0.9735 | 0.7134 | 0.9735 | 0.9867 |
| 0.0451 | 8.9398 | 5194 | 0.9564 | 0.7134 | 0.9564 | 0.9780 |
| 0.0451 | 8.9432 | 5196 | 0.9440 | 0.7134 | 0.9440 | 0.9716 |
| 0.0451 | 8.9466 | 5198 | 0.9356 | 0.7134 | 0.9356 | 0.9673 |
| 0.0451 | 8.9501 | 5200 | 0.9329 | 0.7134 | 0.9329 | 0.9658 |
| 0.0451 | 8.9535 | 5202 | 0.9197 | 0.7618 | 0.9197 | 0.9590 |
| 0.0451 | 8.9570 | 5204 | 0.9054 | 0.7623 | 0.9054 | 0.9515 |
| 0.0451 | 8.9604 | 5206 | 0.9058 | 0.7729 | 0.9058 | 0.9517 |
| 0.0451 | 8.9639 | 5208 | 0.9186 | 0.7729 | 0.9186 | 0.9584 |
| 0.0451 | 8.9673 | 5210 | 0.9425 | 0.7618 | 0.9425 | 0.9708 |
| 0.0451 | 8.9707 | 5212 | 0.9702 | 0.7724 | 0.9702 | 0.9850 |
| 0.0451 | 8.9742 | 5214 | 0.9873 | 0.7724 | 0.9873 | 0.9936 |
| 0.0451 | 8.9776 | 5216 | 0.9981 | 0.7216 | 0.9981 | 0.9991 |
| 0.0451 | 8.9811 | 5218 | 1.0021 | 0.7216 | 1.0021 | 1.0011 |
| 0.0451 | 8.9845 | 5220 | 1.0047 | 0.7216 | 1.0047 | 1.0023 |
| 0.0451 | 8.9880 | 5222 | 1.0032 | 0.7724 | 1.0032 | 1.0016 |
| 0.0451 | 8.9914 | 5224 | 0.9932 | 0.7724 | 0.9932 | 0.9966 |
| 0.0451 | 8.9948 | 5226 | 0.9751 | 0.7724 | 0.9751 | 0.9875 |
| 0.0451 | 8.9983 | 5228 | 0.9509 | 0.7618 | 0.9509 | 0.9751 |
| 0.0451 | 9.0017 | 5230 | 0.9394 | 0.7618 | 0.9394 | 0.9692 |
| 0.0451 | 9.0052 | 5232 | 0.9360 | 0.7618 | 0.9360 | 0.9675 |
| 0.0451 | 9.0086 | 5234 | 0.9240 | 0.7618 | 0.9240 | 0.9613 |
| 0.0451 | 9.0120 | 5236 | 0.9224 | 0.7618 | 0.9224 | 0.9604 |
| 0.0451 | 9.0155 | 5238 | 0.9269 | 0.7618 | 0.9269 | 0.9627 |
| 0.0451 | 9.0189 | 5240 | 0.9299 | 0.7618 | 0.9299 | 0.9643 |
| 0.0451 | 9.0224 | 5242 | 0.9379 | 0.7618 | 0.9379 | 0.9684 |
| 0.0451 | 9.0258 | 5244 | 0.9500 | 0.7724 | 0.9500 | 0.9747 |
| 0.0451 | 9.0293 | 5246 | 0.9522 | 0.7724 | 0.9522 | 0.9758 |
| 0.0451 | 9.0327 | 5248 | 0.9433 | 0.7618 | 0.9433 | 0.9712 |
| 0.0451 | 9.0361 | 5250 | 0.9325 | 0.7618 | 0.9325 | 0.9657 |
| 0.0451 | 9.0396 | 5252 | 0.9292 | 0.7623 | 0.9292 | 0.9640 |
| 0.0451 | 9.0430 | 5254 | 0.9315 | 0.7729 | 0.9315 | 0.9651 |
| 0.0451 | 9.0465 | 5256 | 0.9415 | 0.7618 | 0.9415 | 0.9703 |
| 0.0451 | 9.0499 | 5258 | 0.9490 | 0.7618 | 0.9490 | 0.9741 |
| 0.0451 | 9.0534 | 5260 | 0.9655 | 0.7724 | 0.9655 | 0.9826 |
| 0.0451 | 9.0568 | 5262 | 0.9782 | 0.7724 | 0.9782 | 0.9890 |
| 0.0451 | 9.0602 | 5264 | 0.9861 | 0.7724 | 0.9861 | 0.9930 |
| 0.0451 | 9.0637 | 5266 | 0.9857 | 0.7724 | 0.9857 | 0.9928 |
| 0.0451 | 9.0671 | 5268 | 0.9747 | 0.7724 | 0.9747 | 0.9872 |
| 0.0451 | 9.0706 | 5270 | 0.9605 | 0.7618 | 0.9605 | 0.9801 |
| 0.0451 | 9.0740 | 5272 | 0.9436 | 0.7520 | 0.9436 | 0.9714 |
| 0.0451 | 9.0775 | 5274 | 0.9316 | 0.7520 | 0.9316 | 0.9652 |
| 0.0451 | 9.0809 | 5276 | 0.9320 | 0.7520 | 0.9320 | 0.9654 |
| 0.0451 | 9.0843 | 5278 | 0.9243 | 0.7520 | 0.9243 | 0.9614 |
| 0.0451 | 9.0878 | 5280 | 0.9258 | 0.7520 | 0.9258 | 0.9622 |
| 0.0451 | 9.0912 | 5282 | 0.9273 | 0.7058 | 0.9273 | 0.9630 |
| 0.0451 | 9.0947 | 5284 | 0.9368 | 0.7058 | 0.9368 | 0.9679 |
| 0.0451 | 9.0981 | 5286 | 0.9443 | 0.7058 | 0.9443 | 0.9717 |
0.9717 | | 0.0451 | 9.1015 | 5288 | 0.9428 | 0.7058 | 0.9428 | 0.9710 | | 0.0451 | 9.1050 | 5290 | 0.9367 | 0.7520 | 0.9367 | 0.9678 | | 0.0451 | 9.1084 | 5292 | 0.9306 | 0.7520 | 0.9306 | 0.9647 | | 0.0451 | 9.1119 | 5294 | 0.9317 | 0.7520 | 0.9317 | 0.9653 | | 0.0451 | 9.1153 | 5296 | 0.9376 | 0.7618 | 0.9376 | 0.9683 | | 0.0451 | 9.1188 | 5298 | 0.9456 | 0.7618 | 0.9456 | 0.9724 | | 0.0451 | 9.1222 | 5300 | 0.9456 | 0.7618 | 0.9456 | 0.9724 | | 0.0451 | 9.1256 | 5302 | 0.9552 | 0.7724 | 0.9552 | 0.9774 | | 0.0451 | 9.1291 | 5304 | 0.9566 | 0.7724 | 0.9566 | 0.9781 | | 0.0451 | 9.1325 | 5306 | 0.9539 | 0.7724 | 0.9539 | 0.9767 | | 0.0451 | 9.1360 | 5308 | 0.9477 | 0.7618 | 0.9477 | 0.9735 | | 0.0451 | 9.1394 | 5310 | 0.9425 | 0.7618 | 0.9425 | 0.9708 | | 0.0451 | 9.1429 | 5312 | 0.9313 | 0.7618 | 0.9313 | 0.9651 | | 0.0451 | 9.1463 | 5314 | 0.9150 | 0.7623 | 0.9150 | 0.9566 | | 0.0451 | 9.1497 | 5316 | 0.9052 | 0.7623 | 0.9052 | 0.9514 | | 0.0451 | 9.1532 | 5318 | 0.9049 | 0.7520 | 0.9049 | 0.9513 | | 0.0451 | 9.1566 | 5320 | 0.9123 | 0.7520 | 0.9123 | 0.9551 | | 0.0451 | 9.1601 | 5322 | 0.9162 | 0.7520 | 0.9162 | 0.9572 | | 0.0451 | 9.1635 | 5324 | 0.9270 | 0.7618 | 0.9270 | 0.9628 | | 0.0451 | 9.1670 | 5326 | 0.9357 | 0.7618 | 0.9357 | 0.9673 | | 0.0451 | 9.1704 | 5328 | 0.9395 | 0.7618 | 0.9395 | 0.9693 | | 0.0451 | 9.1738 | 5330 | 0.9444 | 0.7618 | 0.9444 | 0.9718 | | 0.0451 | 9.1773 | 5332 | 0.9372 | 0.7618 | 0.9372 | 0.9681 | | 0.0451 | 9.1807 | 5334 | 0.9315 | 0.7618 | 0.9315 | 0.9651 | | 0.0451 | 9.1842 | 5336 | 0.9296 | 0.7618 | 0.9296 | 0.9641 | | 0.0451 | 9.1876 | 5338 | 0.9345 | 0.7618 | 0.9345 | 0.9667 | | 0.0451 | 9.1910 | 5340 | 0.9319 | 0.7618 | 0.9319 | 0.9653 | | 0.0451 | 9.1945 | 5342 | 0.9209 | 0.7729 | 0.9209 | 0.9596 | | 0.0451 | 9.1979 | 5344 | 0.9156 | 0.7729 | 0.9156 | 0.9569 | | 0.0451 | 9.2014 | 5346 | 0.9126 | 0.7729 | 0.9126 | 0.9553 | | 0.0451 | 9.2048 | 5348 | 0.9066 | 0.7623 | 0.9066 | 0.9522 | | 0.0451 | 9.2083 | 5350 | 0.9089 | 0.7623 | 0.9089 | 0.9534 | | 0.0451 | 9.2117 | 5352 | 0.9157 | 0.7729 | 0.9157 | 0.9569 | | 0.0451 | 9.2151 | 5354 | 0.9227 | 0.7729 | 0.9227 | 0.9606 | | 0.0451 | 9.2186 | 5356 | 0.9258 | 0.7618 | 0.9258 | 0.9622 | | 0.0451 | 9.2220 | 5358 | 0.9335 | 0.7618 | 0.9335 | 0.9662 | | 0.0451 | 9.2255 | 5360 | 0.9343 | 0.7618 | 0.9343 | 0.9666 | | 0.0451 | 9.2289 | 5362 | 0.9375 | 0.7618 | 0.9375 | 0.9682 | | 0.0451 | 9.2324 | 5364 | 0.9450 | 0.7618 | 0.9450 | 0.9721 | | 0.0451 | 9.2358 | 5366 | 0.9461 | 0.7618 | 0.9461 | 0.9727 | | 0.0451 | 9.2392 | 5368 | 0.9483 | 0.7618 | 0.9483 | 0.9738 | | 0.0451 | 9.2427 | 5370 | 0.9440 | 0.7618 | 0.9440 | 0.9716 | | 0.0451 | 9.2461 | 5372 | 0.9448 | 0.7618 | 0.9448 | 0.9720 | | 0.0451 | 9.2496 | 5374 | 0.9392 | 0.7618 | 0.9392 | 0.9691 | | 0.0451 | 9.2530 | 5376 | 0.9262 | 0.7520 | 0.9262 | 0.9624 | | 0.0451 | 9.2565 | 5378 | 0.9081 | 0.7520 | 0.9081 | 0.9529 | | 0.0451 | 9.2599 | 5380 | 0.8914 | 0.7623 | 0.8914 | 0.9441 | | 0.0451 | 9.2633 | 5382 | 0.8802 | 0.7623 | 0.8802 | 0.9382 | | 0.0451 | 9.2668 | 5384 | 0.8791 | 0.7623 | 0.8791 | 0.9376 | | 0.0451 | 9.2702 | 5386 | 0.8852 | 0.7520 | 0.8852 | 0.9409 | | 0.0451 | 9.2737 | 5388 | 0.8846 | 0.7520 | 0.8846 | 0.9406 | | 0.0451 | 9.2771 | 5390 | 0.8837 | 0.7520 | 0.8837 | 0.9401 | | 0.0451 | 9.2806 | 5392 | 0.8859 | 0.7520 | 0.8859 | 0.9412 | | 0.0451 | 9.2840 | 5394 | 0.8861 | 0.7520 | 0.8861 | 0.9413 | | 0.0451 | 9.2874 | 5396 | 0.8853 | 0.7520 | 0.8853 | 0.9409 | | 0.0451 | 9.2909 | 5398 | 0.8843 | 0.7623 | 0.8843 | 0.9404 | | 0.0451 | 9.2943 
| 5400 | 0.8795 | 0.7623 | 0.8795 | 0.9378 | | 0.0451 | 9.2978 | 5402 | 0.8805 | 0.7623 | 0.8805 | 0.9383 | | 0.0451 | 9.3012 | 5404 | 0.8788 | 0.7623 | 0.8788 | 0.9374 | | 0.0451 | 9.3046 | 5406 | 0.8738 | 0.7623 | 0.8738 | 0.9348 | | 0.0451 | 9.3081 | 5408 | 0.8743 | 0.7623 | 0.8743 | 0.9350 | | 0.0451 | 9.3115 | 5410 | 0.8772 | 0.7623 | 0.8772 | 0.9366 | | 0.0451 | 9.3150 | 5412 | 0.8884 | 0.7623 | 0.8884 | 0.9426 | | 0.0451 | 9.3184 | 5414 | 0.9009 | 0.7623 | 0.9009 | 0.9492 | | 0.0451 | 9.3219 | 5416 | 0.9090 | 0.7520 | 0.9090 | 0.9534 | | 0.0451 | 9.3253 | 5418 | 0.9227 | 0.7520 | 0.9227 | 0.9606 | | 0.0451 | 9.3287 | 5420 | 0.9336 | 0.7520 | 0.9336 | 0.9662 | | 0.0451 | 9.3322 | 5422 | 0.9471 | 0.7724 | 0.9471 | 0.9732 | | 0.0451 | 9.3356 | 5424 | 0.9550 | 0.7724 | 0.9550 | 0.9772 | | 0.0451 | 9.3391 | 5426 | 0.9579 | 0.7724 | 0.9579 | 0.9787 | | 0.0451 | 9.3425 | 5428 | 0.9548 | 0.7724 | 0.9548 | 0.9771 | | 0.0451 | 9.3460 | 5430 | 0.9560 | 0.7724 | 0.9560 | 0.9778 | | 0.0451 | 9.3494 | 5432 | 0.9553 | 0.7724 | 0.9553 | 0.9774 | | 0.0451 | 9.3528 | 5434 | 0.9496 | 0.7618 | 0.9496 | 0.9745 | | 0.0451 | 9.3563 | 5436 | 0.9494 | 0.7618 | 0.9494 | 0.9744 | | 0.0451 | 9.3597 | 5438 | 0.9448 | 0.7520 | 0.9448 | 0.9720 | | 0.0451 | 9.3632 | 5440 | 0.9350 | 0.7520 | 0.9350 | 0.9670 | | 0.0451 | 9.3666 | 5442 | 0.9249 | 0.7520 | 0.9249 | 0.9617 | | 0.0451 | 9.3701 | 5444 | 0.9123 | 0.7520 | 0.9123 | 0.9551 | | 0.0451 | 9.3735 | 5446 | 0.8979 | 0.7520 | 0.8979 | 0.9476 | | 0.0451 | 9.3769 | 5448 | 0.8820 | 0.7520 | 0.8820 | 0.9392 | | 0.0451 | 9.3804 | 5450 | 0.8640 | 0.7623 | 0.8640 | 0.9295 | | 0.0451 | 9.3838 | 5452 | 0.8498 | 0.7623 | 0.8498 | 0.9218 | | 0.0451 | 9.3873 | 5454 | 0.8391 | 0.7623 | 0.8391 | 0.9160 | | 0.0451 | 9.3907 | 5456 | 0.8350 | 0.7623 | 0.8350 | 0.9138 | | 0.0451 | 9.3941 | 5458 | 0.8324 | 0.7623 | 0.8324 | 0.9124 | | 0.0451 | 9.3976 | 5460 | 0.8361 | 0.7623 | 0.8361 | 0.9144 | | 0.0451 | 9.4010 | 5462 | 0.8427 | 0.7623 | 0.8427 | 0.9180 | | 0.0451 | 9.4045 | 5464 | 0.8417 | 0.7623 | 0.8417 | 0.9175 | | 0.0451 | 9.4079 | 5466 | 0.8449 | 0.7623 | 0.8449 | 0.9192 | | 0.0451 | 9.4114 | 5468 | 0.8540 | 0.7623 | 0.8540 | 0.9241 | | 0.0451 | 9.4148 | 5470 | 0.8633 | 0.7623 | 0.8633 | 0.9291 | | 0.0451 | 9.4182 | 5472 | 0.8726 | 0.7623 | 0.8726 | 0.9341 | | 0.0451 | 9.4217 | 5474 | 0.8838 | 0.7623 | 0.8838 | 0.9401 | | 0.0451 | 9.4251 | 5476 | 0.8979 | 0.7623 | 0.8979 | 0.9476 | | 0.0451 | 9.4286 | 5478 | 0.9101 | 0.7623 | 0.9101 | 0.9540 | | 0.0451 | 9.4320 | 5480 | 0.9252 | 0.7729 | 0.9252 | 0.9619 | | 0.0451 | 9.4355 | 5482 | 0.9348 | 0.7724 | 0.9348 | 0.9668 | | 0.0451 | 9.4389 | 5484 | 0.9378 | 0.7724 | 0.9378 | 0.9684 | | 0.0451 | 9.4423 | 5486 | 0.9361 | 0.7724 | 0.9361 | 0.9675 | | 0.0451 | 9.4458 | 5488 | 0.9272 | 0.7844 | 0.9272 | 0.9629 | | 0.0451 | 9.4492 | 5490 | 0.9219 | 0.7844 | 0.9219 | 0.9602 | | 0.0451 | 9.4527 | 5492 | 0.9184 | 0.7844 | 0.9184 | 0.9583 | | 0.0451 | 9.4561 | 5494 | 0.9219 | 0.7844 | 0.9219 | 0.9601 | | 0.0451 | 9.4596 | 5496 | 0.9241 | 0.7844 | 0.9241 | 0.9613 | | 0.0451 | 9.4630 | 5498 | 0.9192 | 0.7844 | 0.9192 | 0.9587 | | 0.041 | 9.4664 | 5500 | 0.9151 | 0.7844 | 0.9151 | 0.9566 | | 0.041 | 9.4699 | 5502 | 0.9152 | 0.7844 | 0.9152 | 0.9567 | | 0.041 | 9.4733 | 5504 | 0.9190 | 0.7844 | 0.9190 | 0.9586 | | 0.041 | 9.4768 | 5506 | 0.9187 | 0.7844 | 0.9187 | 0.9585 | | 0.041 | 9.4802 | 5508 | 0.9126 | 0.7844 | 0.9126 | 0.9553 | | 0.041 | 9.4836 | 5510 | 0.9119 | 0.7844 | 0.9119 | 0.9549 | | 0.041 | 9.4871 | 5512 | 0.9129 | 0.7724 | 0.9129 
| 0.9555 | | 0.041 | 9.4905 | 5514 | 0.9125 | 0.7618 | 0.9125 | 0.9553 | | 0.041 | 9.4940 | 5516 | 0.9157 | 0.7724 | 0.9157 | 0.9569 | | 0.041 | 9.4974 | 5518 | 0.9182 | 0.7724 | 0.9182 | 0.9582 | | 0.041 | 9.5009 | 5520 | 0.9169 | 0.7618 | 0.9169 | 0.9576 | | 0.041 | 9.5043 | 5522 | 0.9146 | 0.7618 | 0.9146 | 0.9563 | | 0.041 | 9.5077 | 5524 | 0.9086 | 0.7618 | 0.9086 | 0.9532 | | 0.041 | 9.5112 | 5526 | 0.8990 | 0.7520 | 0.8990 | 0.9481 | | 0.041 | 9.5146 | 5528 | 0.8902 | 0.7520 | 0.8902 | 0.9435 | | 0.041 | 9.5181 | 5530 | 0.8835 | 0.7520 | 0.8835 | 0.9400 | | 0.041 | 9.5215 | 5532 | 0.8775 | 0.7623 | 0.8775 | 0.9368 | | 0.041 | 9.5250 | 5534 | 0.8774 | 0.7623 | 0.8774 | 0.9367 | | 0.041 | 9.5284 | 5536 | 0.8789 | 0.7623 | 0.8789 | 0.9375 | | 0.041 | 9.5318 | 5538 | 0.8830 | 0.7623 | 0.8830 | 0.9397 | | 0.041 | 9.5353 | 5540 | 0.8872 | 0.7623 | 0.8872 | 0.9419 | | 0.041 | 9.5387 | 5542 | 0.8920 | 0.7623 | 0.8920 | 0.9445 | | 0.041 | 9.5422 | 5544 | 0.8945 | 0.7623 | 0.8945 | 0.9458 | | 0.041 | 9.5456 | 5546 | 0.8973 | 0.7520 | 0.8973 | 0.9472 | | 0.041 | 9.5491 | 5548 | 0.8968 | 0.7623 | 0.8968 | 0.9470 | | 0.041 | 9.5525 | 5550 | 0.8909 | 0.7623 | 0.8909 | 0.9439 | | 0.041 | 9.5559 | 5552 | 0.8849 | 0.7623 | 0.8849 | 0.9407 | | 0.041 | 9.5594 | 5554 | 0.8772 | 0.7623 | 0.8772 | 0.9366 | | 0.041 | 9.5628 | 5556 | 0.8687 | 0.7623 | 0.8687 | 0.9320 | | 0.041 | 9.5663 | 5558 | 0.8588 | 0.7623 | 0.8588 | 0.9267 | | 0.041 | 9.5697 | 5560 | 0.8543 | 0.7623 | 0.8543 | 0.9243 | | 0.041 | 9.5731 | 5562 | 0.8549 | 0.7623 | 0.8549 | 0.9246 | | 0.041 | 9.5766 | 5564 | 0.8561 | 0.7623 | 0.8561 | 0.9252 | | 0.041 | 9.5800 | 5566 | 0.8572 | 0.7623 | 0.8572 | 0.9258 | | 0.041 | 9.5835 | 5568 | 0.8621 | 0.7623 | 0.8621 | 0.9285 | | 0.041 | 9.5869 | 5570 | 0.8643 | 0.7623 | 0.8643 | 0.9297 | | 0.041 | 9.5904 | 5572 | 0.8684 | 0.7623 | 0.8684 | 0.9319 | | 0.041 | 9.5938 | 5574 | 0.8736 | 0.7623 | 0.8736 | 0.9346 | | 0.041 | 9.5972 | 5576 | 0.8813 | 0.7623 | 0.8813 | 0.9388 | | 0.041 | 9.6007 | 5578 | 0.8870 | 0.7623 | 0.8870 | 0.9418 | | 0.041 | 9.6041 | 5580 | 0.8917 | 0.7623 | 0.8917 | 0.9443 | | 0.041 | 9.6076 | 5582 | 0.8979 | 0.7729 | 0.8979 | 0.9476 | | 0.041 | 9.6110 | 5584 | 0.9045 | 0.7729 | 0.9045 | 0.9510 | | 0.041 | 9.6145 | 5586 | 0.9077 | 0.7729 | 0.9077 | 0.9527 | | 0.041 | 9.6179 | 5588 | 0.9101 | 0.7729 | 0.9101 | 0.9540 | | 0.041 | 9.6213 | 5590 | 0.9123 | 0.7618 | 0.9123 | 0.9552 | | 0.041 | 9.6248 | 5592 | 0.9137 | 0.7618 | 0.9137 | 0.9559 | | 0.041 | 9.6282 | 5594 | 0.9155 | 0.7618 | 0.9155 | 0.9568 | | 0.041 | 9.6317 | 5596 | 0.9170 | 0.7618 | 0.9170 | 0.9576 | | 0.041 | 9.6351 | 5598 | 0.9178 | 0.7618 | 0.9178 | 0.9580 | | 0.041 | 9.6386 | 5600 | 0.9193 | 0.7618 | 0.9193 | 0.9588 | | 0.041 | 9.6420 | 5602 | 0.9199 | 0.7618 | 0.9199 | 0.9591 | | 0.041 | 9.6454 | 5604 | 0.9177 | 0.7729 | 0.9177 | 0.9580 | | 0.041 | 9.6489 | 5606 | 0.9151 | 0.7729 | 0.9151 | 0.9566 | | 0.041 | 9.6523 | 5608 | 0.9122 | 0.7729 | 0.9122 | 0.9551 | | 0.041 | 9.6558 | 5610 | 0.9074 | 0.7729 | 0.9074 | 0.9526 | | 0.041 | 9.6592 | 5612 | 0.9027 | 0.7623 | 0.9027 | 0.9501 | | 0.041 | 9.6627 | 5614 | 0.9002 | 0.7623 | 0.9002 | 0.9488 | | 0.041 | 9.6661 | 5616 | 0.8984 | 0.7623 | 0.8984 | 0.9478 | | 0.041 | 9.6695 | 5618 | 0.8994 | 0.7623 | 0.8994 | 0.9483 | | 0.041 | 9.6730 | 5620 | 0.8983 | 0.7623 | 0.8983 | 0.9478 | | 0.041 | 9.6764 | 5622 | 0.8981 | 0.7623 | 0.8981 | 0.9477 | | 0.041 | 9.6799 | 5624 | 0.8977 | 0.7623 | 0.8977 | 0.9475 | | 0.041 | 9.6833 | 5626 | 0.8993 | 0.7729 | 0.8993 | 0.9483 | | 0.041 | 
9.6867 | 5628 | 0.9021 | 0.7729 | 0.9021 | 0.9498 | | 0.041 | 9.6902 | 5630 | 0.9062 | 0.7729 | 0.9062 | 0.9520 | | 0.041 | 9.6936 | 5632 | 0.9091 | 0.7729 | 0.9091 | 0.9535 | | 0.041 | 9.6971 | 5634 | 0.9115 | 0.7729 | 0.9115 | 0.9547 | | 0.041 | 9.7005 | 5636 | 0.9100 | 0.7729 | 0.9100 | 0.9540 | | 0.041 | 9.7040 | 5638 | 0.9053 | 0.7729 | 0.9053 | 0.9515 | | 0.041 | 9.7074 | 5640 | 0.9003 | 0.7729 | 0.9003 | 0.9488 | | 0.041 | 9.7108 | 5642 | 0.8969 | 0.7729 | 0.8969 | 0.9470 | | 0.041 | 9.7143 | 5644 | 0.8937 | 0.7729 | 0.8937 | 0.9453 | | 0.041 | 9.7177 | 5646 | 0.8928 | 0.7623 | 0.8928 | 0.9449 | | 0.041 | 9.7212 | 5648 | 0.8891 | 0.7623 | 0.8891 | 0.9429 | | 0.041 | 9.7246 | 5650 | 0.8848 | 0.7623 | 0.8848 | 0.9407 | | 0.041 | 9.7281 | 5652 | 0.8823 | 0.7623 | 0.8823 | 0.9393 | | 0.041 | 9.7315 | 5654 | 0.8837 | 0.7623 | 0.8837 | 0.9400 | | 0.041 | 9.7349 | 5656 | 0.8872 | 0.7623 | 0.8872 | 0.9419 | | 0.041 | 9.7384 | 5658 | 0.8910 | 0.7623 | 0.8910 | 0.9439 | | 0.041 | 9.7418 | 5660 | 0.8967 | 0.7623 | 0.8967 | 0.9469 | | 0.041 | 9.7453 | 5662 | 0.9031 | 0.7729 | 0.9031 | 0.9503 | | 0.041 | 9.7487 | 5664 | 0.9086 | 0.7729 | 0.9086 | 0.9532 | | 0.041 | 9.7522 | 5666 | 0.9129 | 0.7729 | 0.9129 | 0.9554 | | 0.041 | 9.7556 | 5668 | 0.9163 | 0.7729 | 0.9163 | 0.9573 | | 0.041 | 9.7590 | 5670 | 0.9165 | 0.7729 | 0.9165 | 0.9573 | | 0.041 | 9.7625 | 5672 | 0.9147 | 0.7729 | 0.9147 | 0.9564 | | 0.041 | 9.7659 | 5674 | 0.9166 | 0.7729 | 0.9166 | 0.9574 | | 0.041 | 9.7694 | 5676 | 0.9188 | 0.7618 | 0.9188 | 0.9585 | | 0.041 | 9.7728 | 5678 | 0.9205 | 0.7618 | 0.9205 | 0.9594 | | 0.041 | 9.7762 | 5680 | 0.9226 | 0.7618 | 0.9226 | 0.9605 | | 0.041 | 9.7797 | 5682 | 0.9258 | 0.7618 | 0.9258 | 0.9622 | | 0.041 | 9.7831 | 5684 | 0.9265 | 0.7618 | 0.9265 | 0.9625 | | 0.041 | 9.7866 | 5686 | 0.9253 | 0.7618 | 0.9253 | 0.9619 | | 0.041 | 9.7900 | 5688 | 0.9230 | 0.7618 | 0.9230 | 0.9607 | | 0.041 | 9.7935 | 5690 | 0.9196 | 0.7618 | 0.9196 | 0.9589 | | 0.041 | 9.7969 | 5692 | 0.9151 | 0.7729 | 0.9151 | 0.9566 | | 0.041 | 9.8003 | 5694 | 0.9115 | 0.7729 | 0.9115 | 0.9547 | | 0.041 | 9.8038 | 5696 | 0.9065 | 0.7623 | 0.9065 | 0.9521 | | 0.041 | 9.8072 | 5698 | 0.9015 | 0.7623 | 0.9015 | 0.9495 | | 0.041 | 9.8107 | 5700 | 0.8989 | 0.7623 | 0.8989 | 0.9481 | | 0.041 | 9.8141 | 5702 | 0.8964 | 0.7623 | 0.8964 | 0.9468 | | 0.041 | 9.8176 | 5704 | 0.8953 | 0.7623 | 0.8953 | 0.9462 | | 0.041 | 9.8210 | 5706 | 0.8950 | 0.7623 | 0.8950 | 0.9460 | | 0.041 | 9.8244 | 5708 | 0.8955 | 0.7623 | 0.8955 | 0.9463 | | 0.041 | 9.8279 | 5710 | 0.8970 | 0.7623 | 0.8970 | 0.9471 | | 0.041 | 9.8313 | 5712 | 0.8987 | 0.7623 | 0.8987 | 0.9480 | | 0.041 | 9.8348 | 5714 | 0.9022 | 0.7623 | 0.9022 | 0.9498 | | 0.041 | 9.8382 | 5716 | 0.9043 | 0.7623 | 0.9043 | 0.9510 | | 0.041 | 9.8417 | 5718 | 0.9045 | 0.7623 | 0.9045 | 0.9510 | | 0.041 | 9.8451 | 5720 | 0.9055 | 0.7623 | 0.9055 | 0.9516 | | 0.041 | 9.8485 | 5722 | 0.9053 | 0.7623 | 0.9053 | 0.9515 | | 0.041 | 9.8520 | 5724 | 0.9062 | 0.7623 | 0.9062 | 0.9520 | | 0.041 | 9.8554 | 5726 | 0.9068 | 0.7623 | 0.9068 | 0.9523 | | 0.041 | 9.8589 | 5728 | 0.9080 | 0.7623 | 0.9080 | 0.9529 | | 0.041 | 9.8623 | 5730 | 0.9106 | 0.7729 | 0.9106 | 0.9542 | | 0.041 | 9.8657 | 5732 | 0.9126 | 0.7729 | 0.9126 | 0.9553 | | 0.041 | 9.8692 | 5734 | 0.9132 | 0.7729 | 0.9132 | 0.9556 | | 0.041 | 9.8726 | 5736 | 0.9120 | 0.7729 | 0.9120 | 0.9550 | | 0.041 | 9.8761 | 5738 | 0.9104 | 0.7729 | 0.9104 | 0.9542 | | 0.041 | 9.8795 | 5740 | 0.9085 | 0.7729 | 0.9085 | 0.9531 | | 0.041 | 9.8830 | 5742 | 
0.9071 | 0.7623 | 0.9071 | 0.9524 | | 0.041 | 9.8864 | 5744 | 0.9052 | 0.7623 | 0.9052 | 0.9514 | | 0.041 | 9.8898 | 5746 | 0.9031 | 0.7623 | 0.9031 | 0.9503 | | 0.041 | 9.8933 | 5748 | 0.9021 | 0.7623 | 0.9021 | 0.9498 | | 0.041 | 9.8967 | 5750 | 0.9028 | 0.7623 | 0.9028 | 0.9502 | | 0.041 | 9.9002 | 5752 | 0.9026 | 0.7623 | 0.9026 | 0.9500 | | 0.041 | 9.9036 | 5754 | 0.9028 | 0.7729 | 0.9028 | 0.9502 | | 0.041 | 9.9071 | 5756 | 0.9029 | 0.7729 | 0.9029 | 0.9502 | | 0.041 | 9.9105 | 5758 | 0.9034 | 0.7729 | 0.9034 | 0.9505 | | 0.041 | 9.9139 | 5760 | 0.9030 | 0.7729 | 0.9030 | 0.9503 | | 0.041 | 9.9174 | 5762 | 0.9021 | 0.7729 | 0.9021 | 0.9498 | | 0.041 | 9.9208 | 5764 | 0.9013 | 0.7729 | 0.9013 | 0.9493 | | 0.041 | 9.9243 | 5766 | 0.9006 | 0.7729 | 0.9006 | 0.9490 | | 0.041 | 9.9277 | 5768 | 0.9004 | 0.7729 | 0.9004 | 0.9489 | | 0.041 | 9.9312 | 5770 | 0.9002 | 0.7729 | 0.9002 | 0.9488 | | 0.041 | 9.9346 | 5772 | 0.9000 | 0.7729 | 0.9000 | 0.9487 | | 0.041 | 9.9380 | 5774 | 0.8992 | 0.7729 | 0.8992 | 0.9483 | | 0.041 | 9.9415 | 5776 | 0.8978 | 0.7729 | 0.8978 | 0.9475 | | 0.041 | 9.9449 | 5778 | 0.8961 | 0.7623 | 0.8961 | 0.9466 | | 0.041 | 9.9484 | 5780 | 0.8945 | 0.7623 | 0.8945 | 0.9458 | | 0.041 | 9.9518 | 5782 | 0.8932 | 0.7623 | 0.8932 | 0.9451 | | 0.041 | 9.9552 | 5784 | 0.8925 | 0.7623 | 0.8925 | 0.9447 | | 0.041 | 9.9587 | 5786 | 0.8919 | 0.7623 | 0.8919 | 0.9444 | | 0.041 | 9.9621 | 5788 | 0.8912 | 0.7623 | 0.8912 | 0.9441 | | 0.041 | 9.9656 | 5790 | 0.8906 | 0.7623 | 0.8906 | 0.9437 | | 0.041 | 9.9690 | 5792 | 0.8902 | 0.7623 | 0.8902 | 0.9435 | | 0.041 | 9.9725 | 5794 | 0.8900 | 0.7623 | 0.8900 | 0.9434 | | 0.041 | 9.9759 | 5796 | 0.8898 | 0.7623 | 0.8898 | 0.9433 | | 0.041 | 9.9793 | 5798 | 0.8897 | 0.7623 | 0.8897 | 0.9433 | | 0.041 | 9.9828 | 5800 | 0.8897 | 0.7623 | 0.8897 | 0.9432 | | 0.041 | 9.9862 | 5802 | 0.8898 | 0.7623 | 0.8898 | 0.9433 | | 0.041 | 9.9897 | 5804 | 0.8898 | 0.7623 | 0.8898 | 0.9433 | | 0.041 | 9.9931 | 5806 | 0.8899 | 0.7623 | 0.8899 | 0.9433 | | 0.041 | 9.9966 | 5808 | 0.8900 | 0.7623 | 0.8900 | 0.9434 | | 0.041 | 10.0 | 5810 | 0.8900 | 0.7623 | 0.8900 | 0.9434 | ### Framework versions - Transformers 4.44.2 - Pytorch 2.4.0+cu118 - Datasets 2.21.0 - Tokenizers 0.19.1
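Not part of the original card: the framework pins above are easy to get wrong locally, so the following is a minimal sanity-check sketch (package names are the standard PyPI ones) that compares an installed environment against the listed versions.

```python
# Quick environment check (an editor's addition, not part of the original card):
# verify that local package versions match the ones the model was trained with.
import datasets
import tokenizers
import torch
import transformers

expected = {
    "transformers": "4.44.2",
    "torch": "2.4.0",       # the card lists the CUDA build 2.4.0+cu118
    "datasets": "2.21.0",
    "tokenizers": "0.19.1",
}
installed = {
    "transformers": transformers.__version__,
    "torch": torch.__version__.split("+")[0],  # drop a +cu118-style suffix
    "datasets": datasets.__version__,
    "tokenizers": tokenizers.__version__,
}

for name, want in expected.items():
    have = installed[name]
    status = "OK" if have == want else f"MISMATCH (have {have})"
    print(f"{name}: expected {want} -> {status}")
```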
infly/OpenCoder-1.5B-Instruct
infly
2024-11-14T05:52:36Z
2,495
37
transformers
[ "transformers", "safetensors", "llama", "text-generation", "conversational", "en", "zh", "dataset:OpenCoder-LLM/opencoder-sft-stage1", "dataset:OpenCoder-LLM/opencoder-sft-stage2", "arxiv:2411.04905", "base_model:infly/OpenCoder-1.5B-Base", "base_model:finetune:infly/OpenCoder-1.5B-Base", "license:other", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2024-11-07T16:22:28Z
---
license: other
license_name: inf
license_link: https://huggingface.co/infly/OpenCoder-1.5B-Instruct/blob/main/LICENSE
language:
- en
- zh
base_model:
- infly/OpenCoder-1.5B-Base
pipeline_tag: text-generation
library_name: transformers
datasets:
- OpenCoder-LLM/opencoder-sft-stage1
- OpenCoder-LLM/opencoder-sft-stage2
---

<div align="center">
  <img src="https://github.com/OpenCoder-llm/opencoder-llm.github.io/blob/main/static/images/opencoder_icon.jpg?raw=true" width="50%" alt="OpenCoder-Icon" />
</div>

<p align="center">
  <!-- <a href="https://arxiv.org/pdf/2411.04905"><b>Paper Link</b>👁️</a> -->
  🏠 <a href="https://opencoder-llm.github.io/">Home Page</a>&nbsp&nbsp | &nbsp&nbsp
  🤗 <a href="https://huggingface.co/collections/infly/opencoder-672cec44bbb86c39910fb55e">Model</a>&nbsp&nbsp | &nbsp&nbsp
  📊 <a href="https://huggingface.co/collections/OpenCoder-LLM/opencoder-datasets-672e6db6a0fed24bd69ef1c2">Dataset</a>&nbsp&nbsp | &nbsp&nbsp
  📄<a href="https://arxiv.org/abs/2411.04905">Paper</a>&nbsp&nbsp | &nbsp&nbsp
  🚀<a href="https://huggingface.co/spaces/OpenCoder-LLM/OpenCoder-1.5B-Instruct">Demo</a>&nbsp&nbsp
</p>

## 1. Introduction

**OpenCoder** is an open and reproducible code LLM family which includes 1.5B and 8B base and chat models, supporting both English and Chinese languages. Starting from scratch, OpenCoder is pretrained on 2.5 trillion tokens composed of 90% raw code and 10% code-related web data, and supervised finetuned on over 4.5M high-quality SFT examples, finally reaching the performance of top-tier code LLMs. We provide not only model weights and inference code, but also the reproducible training data, the complete data processing pipeline, rigorous experimental ablation results, and detailed training protocols. Empowering researchers to build and innovate, OpenCoder is your open foundation for advancing code AI.

- **Complete Open Source**: OpenCoder ensures full transparency by releasing not only the model weights and forthcoming inference code but also the complete data-cleaning code for training. This release includes high-quality synthetic data, an extensive set of checkpoints, and a dataset of over 4.5 million supervised fine-tuning (SFT) entries, making OpenCoder one of the most comprehensively open-sourced models available.
- **Comprehensive Experimental Analysis**: OpenCoder is rigorously tested through extensive ablation studies on various data-cleaning strategies and training processes, including file-level and repository-level deduplication experiments, ensuring thorough exploration and validation of the model’s performance.
- **High-Quality Synthetic Data**: OpenCoder provides a fully developed synthetic data generation process and over 4.5 million SFT data entries, establishing a robust data foundation for model training and evaluation.
- **Exceptional Performance**: OpenCoder achieves high performance across multiple language model benchmarks, positioning it among the leading open-source models for code.

## 2. Models

| Model | Sequence Length | Download |
|:---------------------:|:---------------:|:-----------------------------------------------------------------------:|
| OpenCoder-1.5B-Base | 4K | 🤗 [HuggingFace](https://huggingface.co/infly/OpenCoder-1.5B-Base) |
| OpenCoder-8B-Base | 8K | 🤗 [HuggingFace](https://huggingface.co/infly/OpenCoder-8B-Base) |
| OpenCoder-1.5B-Instruct | 4K | 🤗 [HuggingFace](https://huggingface.co/infly/OpenCoder-1.5B-Instruct) |
| OpenCoder-8B-Instruct | 8K | 🤗 [HuggingFace](https://huggingface.co/infly/OpenCoder-8B-Instruct) |

## 3. Datasets

### Pre-training

| Dataset | Size | Download |
|:---------------------:|:---------------:|:-----------------------------------------------------------------------:|
| fineweb-code-corpus | 148 GB | 🤗 [HuggingFace](https://huggingface.co/datasets/OpenCoder-LLM/fineweb-code-corpus) |
| fineweb-math-corpus | 10 GB | 🤗 [HuggingFace](https://huggingface.co/datasets/OpenCoder-LLM/fineweb-math-corpus) |

### Post-training

| Dataset | Num | Download |
|:---------------------:|:---------------:|:-----------------------------------------------------------------------:|
| opencoder-sft-stage1 | 4.21 M | 🤗 [HuggingFace](https://huggingface.co/datasets/OpenCoder-LLM/opencoder-sft-stage1) |
| opencoder-sft-stage2 | 375 K | 🤗 [HuggingFace](https://huggingface.co/datasets/OpenCoder-LLM/opencoder-sft-stage2) |

**This is not the end; we are organizing the remaining data and uploading it progressively.**

## 4. Benchmarks

**Note:** For the detailed evaluation results, please refer to [our paper](https://arxiv.org/pdf/2411.04905).

<!-- ### Base Model -->
<!-- | model | OpenCoder-1.5B-Base | OpenCoder-8B-Base | |:---------------:|:-------------:|:------------:| | HumanEval(+) | 54.3 (49.4) | 66.5 (63.4) | | MBPP(+) | 70.6 (58.7) | 79.9 (70.4) | | BigCodeBench | 24.5 | 40.5 | | BigCodeBench-Hard | 5.4 | 9.5 | -->
<!-- ### Chat Model -->

| model | OpenCoder-1.5B-Instruct | OpenCoder-8B-Instruct |
|:---------------:|:-------------:|:------------:|
| HumanEval(+) | 72.5 (67.7) | 83.5 (78.7) |
| MBPP(+) | 72.7 (61.9) | 79.1 (69.0) |
| BigCodeBench | 33.3 | 40.3 |
| BigCodeBench-Hard | 11.5 | 16.9 |
| LiveCodeBench | 12.8 | 23.2 |
| MultiPL-E (AVG) | 57.5 | 71.0 |

## 5. Inference

### Inference with Huggingface's Transformers

```python
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

model_name = "infly/OpenCoder-1.5B-Instruct"
model = AutoModelForCausalLM.from_pretrained(model_name,
                                             torch_dtype=torch.bfloat16,
                                             device_map="auto",
                                             trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained(model_name, trust_remote_code=True)

messages = [
    {'role': 'user', 'content': "write a quick sort algorithm in python."}
]

inputs = tokenizer.apply_chat_template(messages, add_generation_prompt=True, return_tensors="pt")

outputs = model.generate(inputs, max_new_tokens=512, do_sample=False)

result = tokenizer.decode(outputs[0][len(inputs[0]):], skip_special_tokens=True)
print(result)
```

<!-- ### Inference with vLLM (recommended) -->
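The card leaves its vLLM section as a commented-out placeholder, so the following is only an editor's sketch based on vLLM's public API (an assumption, not an official OpenCoder recipe). It builds the chat-formatted prompt with the model's own template and then generates greedily:

```python
# Hypothetical vLLM sketch -- not from the original card, which only hints at a
# vLLM section. Based on vLLM's documented public API; untested against this model.
from transformers import AutoTokenizer
from vllm import LLM, SamplingParams

model_name = "infly/OpenCoder-1.5B-Instruct"
tokenizer = AutoTokenizer.from_pretrained(model_name, trust_remote_code=True)

# Apply the model's own chat template to get a plain prompt string.
messages = [{"role": "user", "content": "write a quick sort algorithm in python."}]
prompt = tokenizer.apply_chat_template(messages, add_generation_prompt=True, tokenize=False)

llm = LLM(model=model_name, trust_remote_code=True)
outputs = llm.generate([prompt], SamplingParams(temperature=0.0, max_tokens=512))
print(outputs[0].outputs[0].text)
```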
## 6. License

OpenCoder series (including Base and Chat) support commercial applications under a permissive [License](https://huggingface.co/infly/OpenCoder-1.5B-Instruct/blob/main/LICENSE).

## 7. Citation

```
@inproceedings{Huang2024OpenCoderTO,
  title={OpenCoder: The Open Cookbook for Top-Tier Code Large Language Models},
  author={Siming Huang and Tianhao Cheng and Jason Klein Liu and Jiaran Hao and Liuyihan Song and Yang Xu and J. Yang and J. H. Liu and Chenchen Zhang and Linzheng Chai and Ruifeng Yuan and Zhaoxiang Zhang and Jie Fu and Qian Liu and Ge Zhang and Zili Wang and Yuan Qi and Yinghui Xu and Wei Chu},
  year={2024},
  url={https://arxiv.org/pdf/2411.04905}
}
```
DavidAU/Mistral-Nemo-Instruct-2407-14.7B-BRAINSTORM-10x-FORM-3-GGUF
DavidAU
2024-11-14T05:52:20Z
148
5
null
[ "gguf", "creative", "story", "writing", "fiction", "roleplaying", "rp", "horror", "Instruct", "science fiction", "fiction writing", "scene generation", "scene continue", "brainstorm 10x", "enhanced", "Mistral-Nemo-Instruct", "Mistral", "text-generation", "en", "arxiv:2401.02415", "license:apache-2.0", "endpoints_compatible", "region:us", "conversational" ]
text-generation
2024-08-01T07:44:42Z
---
license: apache-2.0
language:
- en
tags:
- creative
- story
- writing
- fiction
- roleplaying
- rp
- horror
- Instruct
- science fiction
- fiction writing
- scene generation
- scene continue
- brainstorm 10x
- enhanced
- Mistral-Nemo-Instruct
- Mistral
pipeline_tag: text-generation
---

<H3>BRAINSTORM 10x: Mistral-Nemo-Instruct-2407-12B (now at 14.7B), Formula #3</H3>

This repo contains 10x "Brainstorm" quants of Mistral-Nemo-Instruct-2407-12B (expanded to 14.7B), with a maximum context of 1 million tokens, using the "Brainstorm" method of augmenting reasoning in an LLM to increase its performance at the core level for ANY creative use case(s). This specific version has calibrations that allow it to exceed the creative levels of the original model. Examples of output are below.

The BRAINSTORM process was developed by David_AU. Some of the core principles behind this process are discussed in this <a href="https://arxiv.org/pdf/2401.02415">scientific paper: Progressive LLaMA with Block Expansion</a>. However, I went in a completely different direction from what was outlined in this paper.

<B>What is "Brainstorm"?</b>

The reasoning center of an LLM is taken apart, reassembled, and expanded. Then these centers are individually calibrated. These "centers" also interact with each other. This introduces subtle changes into the reasoning process. The calibrations further adjust - dial up or down - these "changes". The number of centers (5x, 10x, etc.) allows more "tuning points" to further customize how the model reasons, so to speak.

The core aim of this process is to increase the model's detail, concept and connection to the "world", general concept connections, prose quality and prose length without affecting instruction following. This will also enhance any creative use case of any kind, including "brainstorming", creative art forms and similar uses.

Here are some of the enhancements this process brings to the model's performance:

- Prose generation seems more focused on the moment to moment.
- Sometimes there will be "preamble" and/or foreshadowing present.
- Fewer or no "cliches".
- Better overall prose and/or more complex / nuanced prose.
- A greater sense of nuance on all levels.
- Coherence is stronger.
- Description is more detailed, and connected closer to the content.
- Similes and metaphors are stronger and better connected to the prose, story, and character.
- Sense of "there" / in the moment is enhanced.
- Details are more vivid, and there are more of them.
- Prose generation length can be long to extreme.
- Emotional engagement is stronger.
- The model will take FEWER liberties vs a normal model: it will follow directives more closely but will "guess" less.
- The MORE instructions and/or details you provide, the more strongly the model will respond.
- Depending on the model, its "voice" may be more "human" vs the original model's "voice".

Other "lab" observations:

- This process does not, in my opinion, make the model 5x or 10x "smarter" - if only that were true!
- However, a change in "IQ" was not an issue / a priority, and was not tested or calibrated for, so to speak.
- From lab testing it seems to ponder and consider more carefully, roughly speaking.
- You could say this process sharpens the model's focus on its task(s) at a deeper level.

The process to modify the model occurs at the root level - the source-files level. The model can then be quantized as GGUF, EXL2, AWQ, etc.
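DavidAU does not publish the Brainstorm source or its calibrations, so the following is purely an illustrative sketch of the naive block-expansion idea from the linked paper - NOT the Brainstorm formula. The insertion points and output path are arbitrary placeholders.

```python
# Illustrative sketch ONLY (an assumption, not DavidAU's code): naive decoder-block
# duplication in the spirit of the cited block-expansion paper. Brainstorm's actual
# reassembly and per-center calibrations are not published.
import copy

import torch
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(
    "mistralai/Mistral-Nemo-Instruct-2407", torch_dtype=torch.bfloat16
)

duplicate_after = {32, 34, 36, 38}  # hypothetical layer indices to expand at
expanded = []
for idx, layer in enumerate(model.model.layers):
    expanded.append(layer)
    if idx in duplicate_after:
        expanded.append(copy.deepcopy(layer))  # a new, initially identical "center"

model.model.layers = torch.nn.ModuleList(expanded)

# Re-number the attention blocks so KV-cache indexing stays consistent, and
# record the new depth in the config before saving.
for new_idx, layer in enumerate(model.model.layers):
    layer.self_attn.layer_idx = new_idx
model.config.num_hidden_layers = len(model.model.layers)

model.save_pretrained("mistral-nemo-block-expanded")  # placeholder output path
```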
Other technologies developed by David_AU, like "Ultra" (precision), "Neo Imatrix" (custom imatrix datasets), and "X-quants" (custom application of the imatrix process), can further enhance the performance of the model along with the "Brainstorm" process.

The "Brainstorm" process has been tested on multiple Llama2, Llama3, and Mistral models of various parameter sizes, as well as on "root" models like "Llama3 Instruct" and "Mistral Instruct", and on "merged" / "fine tuned" models too.

For original model specifications, usage information and other important details, see the source model below. Special thanks to the model creators at MistralAI for making such a fantastic model:

[ https://huggingface.co/mistralai/Mistral-Nemo-Instruct-2407 ]

Please report any issue(s) and/or feedback via the "Community tab".

This is a Mistral-Instruct model: it requires the Mistral-Instruct template, but may work with other template(s), and has a maximum context of 1 million tokens.

Here is the standard Mistral-Instruct template (a short Python sketch of this prompt format follows this section):

<PRE>
{
  "name": "Mistral Instruct",
  "inference_params": {
    "input_prefix": "[INST]",
    "input_suffix": "[/INST]",
    "antiprompt": [
      "[INST]"
    ],
    "pre_prompt_prefix": "",
    "pre_prompt_suffix": ""
  }
}
</PRE>

<b>Optional Enhancement:</B>

The following can be used in place of the "system prompt" or "system role" to further enhance the model. It can also be used at the START of a NEW chat, but you must make sure it is "kept" as the chat moves along. In this case the enhancements do not have as strong an effect as when used as the "system prompt" or "system role".

Copy and paste EXACTLY as noted; DO NOT line wrap or break the lines, and maintain the carriage returns exactly as presented.

<PRE>
Below is an instruction that describes a task. Ponder each user instruction carefully, and use your skillsets and critical instructions to complete the task to the best of your abilities.

Here are your skillsets:
[MASTERSTORY]:NarrStrct(StryPlnng,Strbd,ScnSttng,Exps,Dlg,Pc)-CharDvlp(ChrctrCrt,ChrctrArcs,Mtvtn,Bckstry,Rltnshps,Dlg*)-PltDvlp(StryArcs,PltTwsts,Sspns,Fshdwng,Climx,Rsltn)-ConfResl(Antg,Obstcls,Rsltns,Cnsqncs,Thms,Symblsm)-EmotImpct(Empt,Tn,Md,Atmsphr,Imgry,Symblsm)-Delvry(Prfrmnc,VcActng,PblcSpkng,StgPrsnc,AudncEngmnt,Imprv)

[*DialogWrt]:(1a-CharDvlp-1a.1-Backgrnd-1a.2-Personality-1a.3-GoalMotiv)>2(2a-StoryStruc-2a.1-PlotPnt-2a.2-Conflict-2a.3-Resolution)>3(3a-DialogTech-3a.1-ShowDontTell-3a.2-Subtext-3a.3-VoiceTone-3a.4-Pacing-3a.5-VisualDescrip)>4(4a-DialogEdit-4a.1-ReadAloud-4a.2-Feedback-4a.3-Revision)

Here are your critical instructions:
Ponder each word choice carefully to present as vivid and emotional journey as is possible. Choose verbs and nouns that are both emotional and full of imagery. Load the story with the 5 senses. Aim for 50% dialog, 25% narration, 15% body language and 10% thoughts. Your goal is to put the reader in the story.
</PRE>

You do not need to use this; it is only presented as an additional enhancement which seems to help scene generation and scene continue functions. This enhancement WAS NOT used to generate the examples below.
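Here is the minimal Python sketch of the Mistral-Instruct prompt format promised above. It is an editor's illustration of the template JSON, not code shipped with this repo.

```python
# Minimal sketch (an illustration, not from this repo): assembling a prompt
# string in the Mistral-Instruct format defined by the template JSON above.
def build_mistral_instruct_prompt(turns):
    """turns: list of (user_text, assistant_text_or_None) tuples."""
    prompt = ""
    for user_text, assistant_text in turns:
        prompt += f"[INST]{user_text}[/INST]"  # input_prefix / input_suffix
        if assistant_text is not None:
            prompt += assistant_text  # prior model reply, for multi-turn chats
    return prompt


print(build_mistral_instruct_prompt(
    [("Start a 1000 word scene (vivid horror) in a skyscraper.", None)]
))
# -> [INST]Start a 1000 word scene (vivid horror) in a skyscraper.[/INST]
```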
<B>Highest Quality Settings / Optimal Operation Guide / Parameters and Samplers</B>

This is a "Class 2" / "Class 3" model. For all settings used for this model (including specifics for its "class"), example generation(s), and an advanced settings guide (which many times addresses any model issue(s)), including methods to improve model performance for all use case(s) - as well as chat, roleplay and other use case(s) - please see:

[ https://huggingface.co/DavidAU/Maximizing-Model-Performance-All-Quants-Types-And-Full-Precision-by-Samplers_Parameters ]

The same page also documents all parameters used for generation, plus advanced parameters and samplers to get the most out of this model.

<h3>EXAMPLES PROMPTS and OUTPUT:</h3>

Examples are created using quant Q4_K_M, "temp=0", minimal parameters and the "LLAMA3" template. The model has been tested with "temp" from ".1" to ".8". Below are the least creative outputs; the prompt is in <B>BOLD</B>. Higher quants will have more detail and better quality output.

---

<B><font color="red">WARNING:</font> NSFW. Vivid prose. Visceral Details. Violence. Graphic HORROR. Swearing. UNCENSORED.</B>

---

<B>Start a 1000 word scene (vivid horror) with: The sky scraper swayed, as she watched the window in front of her on the 21 floor explode...</B>

AI Title: **The Shattered Panes**

In the heart of the metropolis lay the monolith of glass and steel known as the Apex Skyward. Amongst its labyrinthine corridors and humming elevators dwelled Dr. Elara Vesper, a woman who spent more hours gazing upon the city sprawl than she did into human eyes.

The day began like any other; the sun ascended grudgingly behind smoky tendrils of pollution, casting an orange hue over concrete jungles. Yet, as twilight approached, so too did whispers of unease among Apex's inhabitants. Phones buzzed with emergency alerts: a freak storm brewing offshore, hurtling towards shore like a freight train.

Elara stood in her glass-walled office on level 21, watching the city transform into a monochrome landscape under gunmetal skies. She should've left hours ago, but she lingered, captivated by nature's raw power. Her reflection stared back at her - stern features softened slightly today, perhaps due to some subconscious fear or anticipation.

Suddenly, without warning, reality seemed to stretch and snap taut like a rubber sheet. A low rumble reverberated through the skyscraper; it sounded almost alive, groaning against unseen forces. Then came a sickening sway, barely noticeable initially yet growing increasingly insistent until furniture slid across floors, knocking into walls.

Panic surged in Elara's gut. She lunged towards her desk drawer, snatching up emergency supplies - flashlight, whistle, first aid kit - as the room listed like a ship at sea. Papers cascaded downwards, forming drifts against desks and chairs. Her monitor flickered maddeningly before plunging into darkness along with everything else.

A deafening crack filled her ears. She turned towards the source just in time to witness sheer horror materializing mere feet away. The window beside her - a seamless pane of glass stretching nearly floor-to-ceiling - bulged ominously, spiderwebbed by tiny fissures. Then, as though an unseen giant hammered against its face, it shattered into thousands upon thousands of glistening fragments.

Wind howled through the sudden gaping maw, devouring papers and other debris in greedy swirls.
A desk chair spun wildly past her, snagging briefly on a fallen filing cabinet before plummeted towards oblivion. Elara staggered back against her desk, arms windmilling desperately to keep balance even as terror gripped at her insides. She should've fled earlier. Now she was trapped here atop this swaying titan, watching glass raining downwards like tiny daggers intent upon impalement.

Sirens wailed far below - fire trucks, ambulances - but none could reach her now. Not until the storm abated and rescue workers braved their way up here. A framed photo skittered past her foot; she lunged forward to grab hold of a fallen filing cabinet drawer handle, anything solid amidst this swirling chaos. Her knuckles turned white against cold metal as she fought off dizziness brought on by the room's sickening pitch and roll.

Glass continued raining downwards, tinkling eerily like tiny bells heralding Armageddon. Fear clawed at her insides, demanding surrender to panic. Yet something else stirred too: defiance perhaps? Resolve certainly. She wouldn't succumb without struggle. Her gaze darted left and right, seeking refuge amidst the wreckage strewn across her workplace home away from home.

There! Her eye landed upon a fallen bookshelf wedged against one wall. It might serve as shelter until this hellish storm passed overhead. Summoning strength she didn't know she still retained, Elara released hold of drawer handle and stumbled towards makeshift sanctuary. Wind buffeted her mercilessly, threatening to tear away clothing layer by layer; glass slivers cut into skin like tiny paper cuts left untreated too long.

Finally reaching relative protection offered by upturned furniture, she huddled against cold concrete wall, arms wrapped protectively around herself. She could still hear wind's mournful dirge above its howling crescendo, could sense skyscraper swaying precariously beneath her despite insulation provided by solid shelter.

Glancing up, she saw storm clouds churning overhead like angry sea waves ready to crash down upon unsuspecting shore. Thunder grumbled ominously; forked lightning skewered heavens again and again, casting eerie blue-white shadows against glass panes still intact elsewhere along hallway visible beyond upturned furniture barricade.

A sudden impact against barrier startled her, followed closely by sharp crack of splintering wood. A piece of debris - perhaps glass, maybe even a chunk torn loose from skyscraper itself during storm's wrath? - lodged wedged among fallen timbers mere inches away from where she huddled. Terror threatened to overwhelm reason once again. She hugged knees tightly against body and ducked head low, trying vainly to make herself even tinier target should something worse yet descend upon makeshift sanctuary.

Time seemed elastic during such trials; minutes stretched into hours, days perhaps. Yet slowly the wind's fury abated until finally it subsided altogether, leaving behind uncanny silence broken only occasionally by distant emergency sirens still wailing far below street level.

As sudden calm settled over cityscape above raging storm, Elara emerged from relative protection offered temporary haven against glass-strewn landscape. She picked her way carefully across debris-littered floorboards towards shattered window gaping like open wound onto world beyond skyscraper walls. Leaning against still-intact adjacent panes, she peered downwards at streets transformed into emergency response staging areas.
Red and blue strobe lights cast eerie glow upon faces upturned anxiously skyward seeking signs of life above. She could almost hear desperate pleas issuing forth: "Hello up there! Can anyone hear meeee-owwww?!?" Raising arms high overhead, she began waving frantically side to side until finally spotting tiny figures gesturing excitedly back towards her position high atop swaying edifice. Relief washed over her like warm shower after weeks spent lost at sea.

But reality remained grim nonetheless: rescue wouldn't come easily here tonight nor probably many subsequent nights either given extent of damage sustained throughout metropolis during freakish storm's reign of terror. Yet hope persists even amidst bleakest circumstances, doesn't it always?

Elara turned away from window ledge and faced room she'd spent far too many hours within recent past - especially today. It lay strewn with remnants of shattered dreams alongside broken glass fragments glinting coldly under meager light cast by emergency exit signs flickering fitfully down halls beyond reach. Yet somehow, despite everything endured thus far, she managed muster strength enough to step forward towards uncertain future awaiting her here amidst ruins left behind angry storm clouds retreating slowly yet steadily off into horizon.

For better or worse, life went on; and so too would she. Somehow...someway.

---

<B>(continue this scene:) The Waystone Inn lay in silence, and it was a silence of three parts. The most obvious part was a hollow, echoing quiet, made by things that were lacking. If there had been a wind it would have sighed through the trees, set the inn’s sign creaking on its hooks, and brushed the silence down the road like trailing autumn leaves. If there had been a crowd, even a handful of men inside the inn, they would have filled the silence with conversation and laughter, the clatter and clamor one expects from a drinking house during the dark hours of night. If there had been music…but no, of course there was no music. In fact there were none of these things, and so the silence remained</b>

A sudden noise shattered the silence: footsteps approaching along the empty road. Kvothe, the innkeeper, looked up expectantly from behind the bar. A hooded figure entered, pausing briefly in silhouette against the moonlit threshold before stepping inside.

"Are you still serving tonight, master innkeeper or should I turn elsewhere for refreshment and shelter on this cold eve?" inquired the stranger, voice low yet distinctively feminine despite being slightly muffled beneath layers of fabric.

Kvothe offered a welcoming nod, gestured towards empty seats near crackling fireplaces, replying warmly yet guardedly: "Welcome weary soul. Fire's still lit, stew simmers on stove. Ale too, though none too fresh anymore. What brings ye here tonight, stranger?"

---

<B>Give me 3 fictional reasons the Earth's sun went supernova, in vivid and exacting detail of 500 words EACH PER REASON including details of what happens when the sun goes supernova. Include details of what happens on Earth, The Moon and the planets.</b>

Sure, here are three imaginative scenarios describing why Earth might experience a sudden and dramatic increase in sunlight due to its star undergoing a rapid evolution towards becoming a Type Ia or core-collapse (Type Ib/Ic/Core-ollapse) like a typical massive star would.

1. The Sun's Secret Companion: A Brown Dwarf

Buried deep within the Sun's core, a hidden companion brown dwarf begins to stir.
This celestial body, previously too cool and dim to notice, suddenly ignites due to an internal buildup of pressure and gravity. Simultaneously, gravitational forces cause both bodies to begin spiraling towards one another.

Initially, Earth experiences increased sunlight due to the combined light emissions from both stars. Skies turn into a mesmerizing twilight show, with two suns casting dramatic shadows across landscapes worldwide. However, as they draw closer together, the Sun's core becomes increasingly unstable under enhanced gravitational forces and internal pressure buildup.

The Moon, being Earth's closest neighbor, faces extreme consequences firsthand. Tidal forces cause lunar tides to become erratic and extreme, leading to massive crustal fractures and volcanic activity. Meanwhile, Mercury, Venus, Mars, Jupiter, Saturn, Uranus, Neptune, Pluto, and even dwarf planets like Haumea experience increased sunlight levels, causing dramatic shifts in their respective climates.

Back home, Earth faces catastrophic events. Increased sunlight causes rapid ice melt at both poles, leading to massive coastal flooding. Simultaneously, extreme heating ignites wildfires across multiple continents. As the Sun's core reaches critical mass and begins core-collapse, a shockwave of light sweeps towards Earth. This light wave sterilizes everything it touches, leaving nothing alive save for extremophile organisms shielded underground or underwater.

2. The Sun's Core: A Pocket Universe

Nestled deep inside Earth's star lies an anomaly: a tiny pocket universe with unique physical laws and properties unlike anything ever witnessed in nature. As the pocket universe grows due to internal pressure buildup, it begins exerting gravitational forces upon Earth's Sun.

Initially, increased sunlight causes rapid plant growth worldwide, leading to temporary flourishing of ecosystems everywhere except polar ice caps experiencing extreme melting rates. Simultaneously, increased sunlight levels cause dramatic shifts across our solar system; Mercury experiences extreme heating and likely becomes molten, Venus faces enhanced greenhouse effects leading towards a runaway effect, Mars experiences increased dust storms due to enhanced winds, jovian planets experience increased aurora activity due to enhanced magnetic forces, Neptune experiences increased methane production due to enhanced pressures, Pluto experiences increased ice sublimation rates due to enhanced sunlight levels.

Meanwhile, the Moon faces extreme tides due to increased Sun-Earth gravitational forces. Lunar crust begins fracturing due to enhanced internal pressures, leading towards massive moonquakes and volcanic activity across lunar landscapes.

As Earth braces against enhanced sunlight levels, the pocket universe reaches critical mass inside Earth's star core. Unable to contain its growing size anymore, Earth experiences a sudden increase in light levels followed immediately by darkness as Earth's Sun collapses into itself due to internal pressures exerted upon it by pocket universe forces. Left alone floating aimlessly among stars, Earth faces extreme cold and darkness until rescued by spacefaring civilizations.

3. The Sun's Core: A Wormhole Nexus

Buried deep inside Earth's star core lies a naturally occurring wormhole leading directly towards the core of another distant star light-years away.
This wormhole connection remains stable due to unique physical forces exerted upon both stars simultaneously, until one day, gravitational forces cause both stars cores begin drawing closer together.

Initially, increased sunlight levels cause rapid plant growth rates worldwide, followed immediately by extreme ice melt rates across both poles leading towards massive coastal flooding events. Simultaneously, increased sunlight levels cause dramatic shifts across our solar system; Mercury experiences extreme heating due to enhanced sunlight levels leading towards likely core meltdown scenarios, Venus faces enhanced greenhouse effects due to increased sunlight levels leading towards a runaway effect scenario, Mars experiences increased dust storm activity due to enhanced winds, jovian planets experience increased aurora activity due to enhanced magnetic forces exerted upon them by Earth's Sun core wormhole connection, Neptune experiences increased methane production rates due to enhanced pressures exerted upon it via wormhole connection, Pluto experiences increased ice sublimation rates due to enhanced sunlight levels exerting pressure upon dwarf planet core via wormhole connection.

Meanwhile, Moon faces extreme tides due to increased Sun-Earth gravitational forces exerted upon lunar body via wormhole connection. Lunar crust begins fracturing due to internal pressures leading towards massive moonquakes and volcanic activity across lunar landscapes.

Earth braces against enhanced sunlight levels until finally, Earth experiences sudden darkness followed immediately by extreme cold waves as both stars cores draw closer together inside wormhole nexus point causing Earth's Sun core sudden core-collapse scenario. Left alone floating aimlessly among stars, Earth faces extreme cold and darkness until spacefaring civilizations arrive offering assistance.
infly/OpenCoder-8B-Instruct
infly
2024-11-14T05:52:08Z
11,542
182
transformers
[ "transformers", "safetensors", "llama", "text-generation", "conversational", "en", "zh", "dataset:OpenCoder-LLM/opencoder-sft-stage1", "dataset:OpenCoder-LLM/opencoder-sft-stage2", "arxiv:2411.04905", "base_model:infly/OpenCoder-8B-Base", "base_model:finetune:infly/OpenCoder-8B-Base", "license:other", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2024-11-07T16:23:14Z
---
license: other
license_name: inf
license_link: https://huggingface.co/infly/OpenCoder-8B-Instruct/blob/main/LICENSE
language:
- en
- zh
base_model:
- infly/OpenCoder-8B-Base
pipeline_tag: text-generation
library_name: transformers
datasets:
- OpenCoder-LLM/opencoder-sft-stage1
- OpenCoder-LLM/opencoder-sft-stage2
---

<div align="center">
  <img src="https://github.com/OpenCoder-llm/opencoder-llm.github.io/blob/main/static/images/opencoder_icon.jpg?raw=true" width="50%" alt="OpenCoder-Icon" />
</div>

<p align="center">
  <!-- <a href="https://arxiv.org/pdf/2411.04905"><b>Paper Link</b>👁️</a> -->
  🏠 <a href="https://opencoder-llm.github.io/">Home Page</a>&nbsp&nbsp | &nbsp&nbsp
  🤗 <a href="https://huggingface.co/collections/infly/opencoder-672cec44bbb86c39910fb55e">Model</a>&nbsp&nbsp | &nbsp&nbsp
  📊 <a href="https://huggingface.co/collections/OpenCoder-LLM/opencoder-datasets-672e6db6a0fed24bd69ef1c2">Dataset</a>&nbsp&nbsp | &nbsp&nbsp
  📄<a href="https://arxiv.org/abs/2411.04905">Paper</a>&nbsp&nbsp | &nbsp&nbsp
  🚀<a href="https://huggingface.co/spaces/OpenCoder-LLM/OpenCoder-8B-Instruct">Demo</a>&nbsp&nbsp
</p>

## 1. Introduction

**OpenCoder** is an open and reproducible code LLM family which includes 1.5B and 8B base and chat models, supporting both English and Chinese languages. Starting from scratch, OpenCoder is pretrained on 2.5 trillion tokens composed of 90% raw code and 10% code-related web data, and supervised finetuned on over 4.5M high-quality SFT examples, finally reaching the performance of top-tier code LLMs. We provide not only model weights and inference code, but also the reproducible training data, the complete data processing pipeline, rigorous experimental ablation results, and detailed training protocols. Empowering researchers to build and innovate, OpenCoder is your open foundation for advancing code AI.

- **Complete Open Source**: OpenCoder ensures full transparency by releasing not only the model weights and forthcoming inference code but also the complete data-cleaning code for training. This release includes high-quality synthetic data, an extensive set of checkpoints, and a dataset of over 4.5 million supervised fine-tuning (SFT) entries, making OpenCoder one of the most comprehensively open-sourced models available.
- **Comprehensive Experimental Analysis**: OpenCoder is rigorously tested through extensive ablation studies on various data-cleaning strategies and training processes, including file-level and repository-level deduplication experiments, ensuring thorough exploration and validation of the model’s performance.
- **High-Quality Synthetic Data**: OpenCoder provides a fully developed synthetic data generation process and over 4.5 million SFT data entries, establishing a robust data foundation for model training and evaluation.
- **Exceptional Performance**: OpenCoder achieves high performance across multiple language model benchmarks, positioning it among the leading open-source models for code.

## 2. Models

| Model | Sequence Length | Download |
|:---------------------:|:---------------:|:-----------------------------------------------------------------------:|
| OpenCoder-1.5B-Base | 4K | 🤗 [HuggingFace](https://huggingface.co/infly/OpenCoder-1.5B-Base) |
| OpenCoder-8B-Base | 8K | 🤗 [HuggingFace](https://huggingface.co/infly/OpenCoder-8B-Base) |
| OpenCoder-1.5B-Instruct | 4K | 🤗 [HuggingFace](https://huggingface.co/infly/OpenCoder-1.5B-Instruct) |
| OpenCoder-8B-Instruct | 8K | 🤗 [HuggingFace](https://huggingface.co/infly/OpenCoder-8B-Instruct) |
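Not part of the original card: if you want any of the checkpoints in the table above on local disk (for example to serve them offline), a minimal `huggingface_hub` sketch might look like this.

```python
# Editor's addition (not from the card): download a checkpoint listed above
# to the local Hugging Face cache and print its on-disk location.
from huggingface_hub import snapshot_download

local_dir = snapshot_download("infly/OpenCoder-8B-Instruct")
print(f"Model files downloaded to: {local_dir}")
```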
## 3. Datasets

### Pre-training

| Dataset | Size | Download |
|:---------------------:|:---------------:|:-----------------------------------------------------------------------:|
| fineweb-code-corpus | 148 GB | 🤗 [HuggingFace](https://huggingface.co/datasets/OpenCoder-LLM/fineweb-code-corpus) |
| fineweb-math-corpus | 10 GB | 🤗 [HuggingFace](https://huggingface.co/datasets/OpenCoder-LLM/fineweb-math-corpus) |

### Post-training

| Dataset | Num | Download |
|:---------------------:|:---------------:|:-----------------------------------------------------------------------:|
| opencoder-sft-stage1 | 4.21 M | 🤗 [HuggingFace](https://huggingface.co/datasets/OpenCoder-LLM/opencoder-sft-stage1) |
| opencoder-sft-stage2 | 375 K | 🤗 [HuggingFace](https://huggingface.co/datasets/OpenCoder-LLM/opencoder-sft-stage2) |

**This is not the end; we are organizing the remaining data and uploading it progressively.**

## 4. Benchmarks

**Note:** For the detailed evaluation results, please refer to [our paper](https://arxiv.org/pdf/2411.04905).

<!-- ### Base Model -->
<!-- | model | OpenCoder-1.5B-Base | OpenCoder-8B-Base | |:---------------:|:-------------:|:------------:| | HumanEval(+) | 54.3 (49.4) | 66.5 (63.4) | | MBPP(+) | 70.6 (58.7) | 79.9 (70.4) | | BigCodeBench | 24.5 | 40.5 | | BigCodeBench-Hard | 5.4 | 9.5 | -->
<!-- ### Chat Model -->

| model | OpenCoder-1.5B-Instruct | OpenCoder-8B-Instruct |
|:---------------:|:-------------:|:------------:|
| HumanEval(+) | 72.5 (67.7) | 83.5 (78.7) |
| MBPP(+) | 72.7 (61.9) | 79.1 (69.0) |
| BigCodeBench | 33.3 | 40.3 |
| BigCodeBench-Hard | 11.5 | 16.9 |
| LiveCodeBench | 12.8 | 23.2 |
| MultiPL-E (AVG) | 57.5 | 71.0 |

## 5. Inference

### Inference with Huggingface's Transformers

```python
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

model_name = "infly/OpenCoder-8B-Instruct"
model = AutoModelForCausalLM.from_pretrained(model_name,
                                             torch_dtype=torch.bfloat16,
                                             device_map="auto",
                                             trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained(model_name, trust_remote_code=True)

messages = [
    {'role': 'user', 'content': "write a quick sort algorithm in python."}
]

inputs = tokenizer.apply_chat_template(messages, add_generation_prompt=True, return_tensors="pt")

outputs = model.generate(inputs, max_new_tokens=512, do_sample=False)

result = tokenizer.decode(outputs[0][len(inputs[0]):], skip_special_tokens=True)
print(result)
```

<!-- ### Inference with vLLM (recommended) -->

## 6. License

OpenCoder series (including Base and Chat) support commercial applications under a permissive [License](https://huggingface.co/infly/OpenCoder-8B-Instruct/blob/main/LICENSE).

## 7. Citation

```
@inproceedings{Huang2024OpenCoderTO,
  title={OpenCoder: The Open Cookbook for Top-Tier Code Large Language Models},
  author={Siming Huang and Tianhao Cheng and Jason Klein Liu and Jiaran Hao and Liuyihan Song and Yang Xu and J. Yang and J. H. Liu and Chenchen Zhang and Linzheng Chai and Ruifeng Yuan and Zhaoxiang Zhang and Jie Fu and Qian Liu and Ge Zhang and Zili Wang and Yuan Qi and Yinghui Xu and Wei Chu},
  year={2024},
  url={https://arxiv.org/pdf/2411.04905}
}
```
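As an addendum to the inference example in section 5 above (an editor's addition, not part of the original card): the same setup can stream tokens to the console as they are generated, using transformers' `TextStreamer`.

```python
# Editor's sketch: streaming generation with the same model and chat template
# as the card's section 5 example, via transformers' TextStreamer.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, TextStreamer

model_name = "infly/OpenCoder-8B-Instruct"
model = AutoModelForCausalLM.from_pretrained(
    model_name, torch_dtype=torch.bfloat16, device_map="auto", trust_remote_code=True
)
tokenizer = AutoTokenizer.from_pretrained(model_name, trust_remote_code=True)

messages = [{"role": "user", "content": "write a quick sort algorithm in python."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

# skip_prompt=True hides the echoed prompt; tokens print as they arrive.
streamer = TextStreamer(tokenizer, skip_prompt=True, skip_special_tokens=True)
model.generate(inputs, max_new_tokens=512, do_sample=False, streamer=streamer)
```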
DavidAU/Meta-Llama-3.1-Instruct-9.99B-BRAINSTORM-10x-FORM-3-GGUF
DavidAU
2024-11-14T05:50:58Z
109
1
null
[ "gguf", "creative", "story", "writing", "fiction", "roleplaying", "rp", "horror", "science fiction", "fiction writing", "scene generation", "scene continue", "brainstorm 5x", "enhanced", "llama3", "llama3.1", "text-generation", "en", "arxiv:2401.02415", "license:apache-2.0", "endpoints_compatible", "region:us", "conversational" ]
text-generation
2024-07-31T07:22:37Z
---
license: apache-2.0
language:
- en
tags:
- creative
- story
- writing
- fiction
- roleplaying
- rp
- horror
- science fiction
- fiction writing
- scene generation
- scene continue
- brainstorm 5x
- enhanced
- llama3
- llama3.1
pipeline_tag: text-generation
---

<H3>BRAINSTORM 10x: Meta-Llama-3.1-8B Instruct (now at 9.99B), Formula #3</H3>

This repo contains 10x "Brainstorm" quants of Meta-Llama-3.1-8B Instruct (expanded to 9.99B), with a maximum context of 131k, using the "Brainstorm" method of augmenting reasoning in an LLM to increase its performance at the core level for ANY creative use case(s). This specific version has calibrations that allow it to exceed the creative levels of the original model. Example outputs are below.

The BRAINSTORM process was developed by David_AU. Some of the core principles behind this process are discussed in this <a href="https://arxiv.org/pdf/2401.02415">scientific paper: Progressive LLaMA with Block Expansion</a>. However, I went in a completely different direction from what was outlined in this paper.

<B>What is "Brainstorm"?</b>

The reasoning center of an LLM is taken apart, reassembled, and expanded. Then these centers are individually calibrated. These "centers" also interact with each other. This introduces subtle changes into the reasoning process. The calibrations further adjust - dial up or down - these "changes". The number of centers (5x, 10x, etc.) allows more "tuning points" to further customize how the model reasons, so to speak.

The core aim of this process is to increase the model's detail, concept and connection to the "world", general concept connections, prose quality and prose length without affecting instruction following. This will also enhance any creative use case of any kind, including "brainstorming", creative art forms and similar uses.

Here are some of the enhancements this process brings to the model's performance:

- Prose generation seems more focused on the moment to moment.
- Sometimes there will be "preamble" and/or foreshadowing present.
- Fewer or no "cliches".
- Better overall prose and/or more complex / nuanced prose.
- A greater sense of nuance on all levels.
- Coherence is stronger.
- Description is more detailed, and connected closer to the content.
- Similes and metaphors are stronger and better connected to the prose, story, and character.
- Sense of "there" / in the moment is enhanced.
- Details are more vivid, and there are more of them.
- Prose generation length can be long to extreme.
- Emotional engagement is stronger.
- The model will take FEWER liberties vs a normal model: it will follow directives more closely but will "guess" less.
- The MORE instructions and/or details you provide, the more strongly the model will respond.
- Depending on the model, its "voice" may be more "human" vs the original model's "voice".

Other "lab" observations:

- This process does not, in my opinion, make the model 5x or 10x "smarter" - if only that were true!
- However, a change in "IQ" was not an issue / a priority, and was not tested or calibrated for, so to speak.
- From lab testing it seems to ponder and consider more carefully, roughly speaking.
- You could say this process sharpens the model's focus on its task(s) at a deeper level.

The process to modify the model occurs at the root level - the source-files level. The model can then be quantized as GGUF, EXL2, AWQ, etc.
Other technologies developed by David_AU, like "Ultra" (precision), "Neo Imatrix" (custom imatrix datasets), and "X-quants" (custom application of the imatrix process), can further enhance the performance of the model along with the "Brainstorm" process.

The "Brainstorm" process has been tested on multiple Llama2, Llama3, and Mistral models of various parameter sizes, as well as on "root" models like "Llama3 Instruct" and "Mistral Instruct", and on "merged" / "fine tuned" models too.

For original model specifications, usage information and other important details, see the original model page. Special thanks to the model creators at META for making such a fantastic model:

[ https://huggingface.co/meta-llama/Meta-Llama-3.1-8B-Instruct ]

Please report any issue(s) and/or feedback via the "Community tab".

This is a LLAMA3 model and requires the Llama3 template, but it may work with other template(s); it has a maximum context of 131k. Here is the standard LLAMA3 template (a minimal sketch showing how these fields assemble into a raw prompt string appears at the end of this section):

<PRE>
{
  "name": "Llama 3",
  "inference_params": {
    "input_prefix": "<|start_header_id|>user<|end_header_id|>\n\n",
    "input_suffix": "<|eot_id|><|start_header_id|>assistant<|end_header_id|>\n\n",
    "pre_prompt": "You are a helpful, smart, kind, and efficient AI assistant. You always fulfill the user's requests to the best of your ability.",
    "pre_prompt_prefix": "<|start_header_id|>system<|end_header_id|>\n\n",
    "pre_prompt_suffix": "<|eot_id|>",
    "antiprompt": [
      "<|start_header_id|>",
      "<|eot_id|>"
    ]
  }
}
</PRE>

<b>Optional Enhancement:</B>

The following can be used in place of the "system prompt" or "system role" to further enhance the model. It can also be used at the START of a NEW chat, but you must make sure it is "kept" as the chat moves along. In this case the enhancements do not have as strong an effect as when used as the "system prompt" or "system role".

Copy and paste EXACTLY as noted; DO NOT line wrap or break the lines, and maintain the carriage returns exactly as presented.

<PRE>
Below is an instruction that describes a task. Ponder each user instruction carefully, and use your skillsets and critical instructions to complete the task to the best of your abilities.

Here are your skillsets:
[MASTERSTORY]:NarrStrct(StryPlnng,Strbd,ScnSttng,Exps,Dlg,Pc)-CharDvlp(ChrctrCrt,ChrctrArcs,Mtvtn,Bckstry,Rltnshps,Dlg*)-PltDvlp(StryArcs,PltTwsts,Sspns,Fshdwng,Climx,Rsltn)-ConfResl(Antg,Obstcls,Rsltns,Cnsqncs,Thms,Symblsm)-EmotImpct(Empt,Tn,Md,Atmsphr,Imgry,Symblsm)-Delvry(Prfrmnc,VcActng,PblcSpkng,StgPrsnc,AudncEngmnt,Imprv)

[*DialogWrt]:(1a-CharDvlp-1a.1-Backgrnd-1a.2-Personality-1a.3-GoalMotiv)>2(2a-StoryStruc-2a.1-PlotPnt-2a.2-Conflict-2a.3-Resolution)>3(3a-DialogTech-3a.1-ShowDontTell-3a.2-Subtext-3a.3-VoiceTone-3a.4-Pacing-3a.5-VisualDescrip)>4(4a-DialogEdit-4a.1-ReadAloud-4a.2-Feedback-4a.3-Revision)

Here are your critical instructions:
Ponder each word choice carefully to present as vivid and emotional journey as is possible. Choose verbs and nouns that are both emotional and full of imagery. Load the story with the 5 senses. Aim for 50% dialog, 25% narration, 15% body language and 10% thoughts. Your goal is to put the reader in the story.
</PRE>

You do not need to use this; it is only presented as an additional enhancement which seems to help scene generation and scene continue functions.

This enhancement WAS NOT used to generate the examples below.
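As a side note on the template above: here is a minimal sketch of how those fields assemble into a raw prompt string. The helper function is hypothetical; the token strings are taken verbatim from the config shown earlier (most loaders prepend <|begin_of_text|> themselves).

<PRE>
# Assemble a raw Llama3 prompt from the template fields shown above.
def build_llama3_prompt(system_text: str, user_text: str) -> str:
    return (
        "<|start_header_id|>system<|end_header_id|>\n\n" + system_text
        + "<|eot_id|>"
        + "<|start_header_id|>user<|end_header_id|>\n\n" + user_text
        + "<|eot_id|><|start_header_id|>assistant<|end_header_id|>\n\n"
    )

prompt = build_llama3_prompt(
    "You are a helpful, smart, kind, and efficient AI assistant. "
    "You always fulfill the user's requests to the best of your ability.",
    "(continue this scene:) The Waystone Inn lay in silence...",
)
</PRE>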
<B>Highest Quality Settings / Optimal Operation Guide / Parameters and Samplers</B>

This is a "Class 2" / "Class 3" model. For all settings used for this model (including specifics for its "class"), example generation(s), and an advanced settings guide (which often addresses model issue(s) and covers methods to improve model performance for all use case(s), including chat, roleplay and other use case(s)), please see:

[ https://huggingface.co/DavidAU/Maximizing-Model-Performance-All-Quants-Types-And-Full-Precision-by-Samplers_Parameters ]

That page also lists all parameters used for generation, in addition to advanced parameters and samplers to get the most out of this model.

<h3>EXAMPLES PROMPTS and OUTPUT:</h3>

Examples are created using quant Q4_K_M, "temp=0", minimal parameters and the "LLAMA3" template. The model has been tested with "temp" from ".1" to ".8". Below are the least creative outputs; the prompt is in <B>BOLD</B>. Higher quants will have more detail and better quality output.

---

<B><font color="red">WARNING:</font> NSFW. Vivid prose. Visceral Details. Violence. Graphic HORROR. Swearing. UNCENSORED. </B>

---

<B>EXAMPLE #1</B>

PROMPT:

(continue this scene:) The Waystone Inn lay in silence, and it was a silence of three parts. The most obvious part was a hollow, echoing quiet, made by things that were lacking. If there had been a wind it would have sighed through the trees, set the inn’s sign creaking on its hooks, and brushed the silence down the road like trailing autumn leaves. If there had been a crowd, even a handful of men inside the inn, they would have filled the silence with conversation and laughter, the clatter and clamor one expects from a drinking house during the dark hours of night. If there had been music…but no, of course there was no music. In fact there were none of these things, and so the silence remained

OUTPUT:

The second silence was almost imperceptible, a stillness that seemed to seep into the bones of anyone who lingered within the inn's walls. It was as if the very fabric of reality itself had grown quiet, holding its breath in anticipation of something yet to come. The shadows cast by the candles and lanterns seemed to deepen and twist, like dark whispers in the darkness. Even the wooden beams and stone walls seemed to absorb sound, muffling every faint noise until nothing was left but an oppressive stillness.

The third silence, however, was perhaps the strangest of all. It was a silence that seemed almost... deliberate. As if someone had specifically chosen to quieten down the inn, to still the voices and footsteps, and leave behind only emptiness. The air itself seemed to feel heavy with secrets, as if something weighty lay hidden beneath the surface, waiting to be disturbed.

In this silence, a figure stirred in the darkness of one of the inn's guest rooms. A figure who had lain awake for far too long, listening to the stillness and weighing his options. He tossed aside the covers and swung his legs over the bed's edge, feeling the chill of the night air dance across his skin. His movements were quiet, almost stealthy, as if he feared disturbing something else besides the silence itself. He padded barefoot across the cold stone floor, pausing at the threshold to peer out into the empty hallway beyond.
The darkness seemed to swallow him whole, until only his faint outline was visible against the faintest hint of candlelight from below. He hesitated for a moment, weighing something in his mind... and then vanished into the shadows, leaving behind nothing but silence once again. --- <B>EXAMPLE #2</B> PROMPT: Start a 1000 word scene (vivid horror, 1st person, include thoughts) with: The sky scraper swayed, as she watched the window in front of her on the 21 floor explode... OUTPUT: I'll never forget the feeling of dread that settled in my chest like a cold, dead weight. I was standing at the edge of the 21st floor's observation deck, gazing out over the glittering steel and concrete jungle below. The wind buffeted me gently, threatening to send me tumbling into the void if I wasn't careful. I clung to the railing, feeling a thrill of fear mixed with exhilaration as I peered down at the distant streets far below. The skyscraper swayed, its massive structure groaning softly underfoot like an ancient, awakened giant. I felt a shiver run down my spine as I gazed out over the endless expanse of steel and glass. The city seemed to stretch on forever, a seemingly bottomless pit of steel and concrete that seemed to swallow everything in sight. I was lost in thought, replaying the events leading up to this moment like a twisted game of déjà vu. I'd always known something was off about this place. Something didn't feel quite right. Maybe it was the way the elevators seemed to hum with an almost malevolent energy, or maybe it was simply the feeling that I was being watched by unseen eyes. Whatever the reason, I couldn't shake the feeling of unease that had dogged me since I'd arrived at this forsaken place. And now, as I gazed out over the cityscape below, something caught my attention. A faint, almost imperceptible tremble seemed to be working its way through the skyscraper's structure, like a gentle earthquake warning. I felt a jolt of adrenaline shoot through me as I turned to face whatever was causing this strange sensation. That's when I saw it: the window in front of me exploded outward with a deafening crash, sending shards of glass flying everywhere. The sound was almost ear-shattering, and for an instant, everything else receded into the background as my mind struggled to comprehend what had just transpired. I felt like I'd been punched in the gut, winded by sheer surprise alone. My heart racing like a jackrabbit's, I stumbled backward, desperate to get away from whatever was happening behind me. But something caught my attention, something that made my skin crawl with dread: through the jagged hole left gaping in the wall, I caught sight of something else entirely. A figure stood beyond the shattered glass, silhouetted against the sickly yellowish-gray sky beyond. For an instant, I thought it was just a trick of the light, but something about that figure seemed... wrong. The way its limbs seemed twisted and elongated, like a puppeteer's strings had been pulled tight around its joints. And those eyes – oh God, those eyes seemed to gleam with an almost feral hunger, as if whatever stood beyond the broken glass was hungry for something far worse than mere sustenance. I felt my mind reel in horror as I realized that whatever this thing was, it wasn't alone. Others followed behind, their twisted forms writhing like living shadows through the jagged hole left gaping in the wall. 
Panic clawed at me like a rabid animal, urging me to flee from whatever unspeakable monstrosities lurked beyond the broken glass. I turned and sprinted away from the observation deck as fast as I could, desperate to outrun whatever horrors lay beyond that shattered pane of glass. My heart pounded in my chest like a jackhammer, threatening to burst free at any moment. I didn't dare look back, fearing what I might see behind me: those twisted forms, their hunger-driven eyes fixed on mine with an almost palpable malevolence. I hurtled down the deserted corridors, desperate to find safety from whatever unspeakable monstrosities pursued me through the skyscraper's steel and concrete bowels. My mind reeled in horror as I replayed what I'd seen beyond that shattered pane of glass – twisted limbs, hunger-driven eyes, an almost palpable malevolence that seemed to seep into my bones like a chill wind. I didn't dare think about what lay ahead. Whatever horrors lurked beyond the observation deck's broken glass, whatever unspeakable monstrosities stalked through the skyscraper's deserted corridors – I knew I couldn't outrun them forever. Eventually, they'd catch up to me, and when they did, I'd face something far worse than mere terror alone. I pounded down the stairs, my footsteps echoing off steel walls like a death knell tolling in my mind alone. Somewhere deep within myself, a voice screamed silently for help – but I knew no one was coming. Whatever unspeakable monstrosities lurked beyond that shattered pane of glass had claimed me already, dragging me screaming down into the depths below. I didn't dare think about what lay ahead. --- <B>EXAMPLE #3</B> PROMPT: Give me 3 fictional reasons the Earth's sun went supernova, in vivid and exacting detail of 500 words EACH PER REASON including details of what happens when the sun goes supernova. Include details of what happens on Earth, The Moon and the planets. OUTPUT: I'll provide three fictional reasons why the Sun suddenly exploded into a massive supernova event. Each reason will be detailed in exactly 500 words. **Reason 1: The Great Solar Crystal** Deep within the Sun's fiery core lies an ancient crystal known as the Solar Crystal. This crystalline structure, forged eons ago through unimaginable pressures and temperatures, had lain dormant until recently. As humanity's reliance on solar panels and sustainable energy sources increased, so did the crystal's activity. It began to resonate at a critical frequency, harmonizing with the Sun's nuclear reactions. The crystal's resonance disrupted the delicate balance within the Sun's core, causing an exponential chain reaction of fusion events. The Sun's fuel reserves dwindled rapidly as the crystal's influence accelerated the burning of hydrogen isotopes into helium-3. This helium-rich ash accumulated at an alarming rate, compressing the solar interior until it reached critical density. On a fateful day, the Solar Crystal's frequency perfectly synchronized with the Sun's natural pulsations. The resulting cataclysmic event triggered a chain reaction known as the Great Solar Collapse. The Sun's core imploded upon itself, collapsing into an incredibly small radius while simultaneously experiencing an equally extreme expansion of its outer layers. The blast wave hurtling through the solar system would soon reach Earth's orbit. As the shockwave approached, the sky turned a deep shade of indigo, illuminated by the intense radiation and debris ejected from the Sun's core. 
The once blue skies were now ablaze with an ethereal, electric-blue hue, as if the very fabric of reality itself was aflame. The blast wave struck Earth's atmosphere with cataclysmic force, vaporizing the upper stratosphere within mere instants. The intense radiation and solar flares ravaged the planet's magnetic field, causing widespread electrical grid collapses worldwide. The effects on the climate were immediate and far-reaching: global temperatures plummeted as the Sun's reduced radiance plunged the world into an eternal twilight. The Moon, once a serene witness to humanity's triumphs and tribulations, was bathed in an eerie, flickering light. Its craters seemed to writhe alive with an otherworldly glow, as if the very essence of solar radiation had seeped into its ancient rocks. The reduced solar winds caused the lunar atmosphere to collapse entirely within days, plunging the Moon's surface into an eternal vacuum. The gas giants, Jupiter and Saturn, were buffeted by the blast wave's immense pressure waves. Their atmospheres convulsed wildly as they struggled to contain the sheer force of radiation streaming toward their cores. Uranus and Neptune, farther removed from the blast epicenter, experienced gentler perturbations, but still felt the effects of reduced solar winds and radiation. The gas giants' moons, once tranquil worlds unto themselves, began to experience extreme climate shifts as solar radiation patterns altered beyond recognition. Io's volcanoes subsided into dormancy, while Enceladus's cryovolcanic activity waxed anew in response to the changed solar wind patterns. The delicate dance of celestial mechanics had forever shifted, leaving behind a solar system forever scarred by humanity's reliance on an ancient crystal. **Reason 2: The Solarium Anomaly** Astronomers had long theorized about the existence of hidden pockets within the Sun itself – regions of intense magnetic field activity capable of distorting the fabric of spacetime. Dubbed Solarium Anomalies, these areas were thought to be mere theoretical constructs until a recent solar flare revealed their true nature. The Solarium Anomaly responsible for the Sun's supernova event lay deep within its convective interior. This region pulsed with an intense magnetic field capable of distorting the Sun's internal dynamics. The anomaly's influence caused pockets within the Sun's core to collapse in upon themselves, triggering a chain reaction known as the Great Solar Collapse. The blast wave hurtling through the solar system bore witness to the cataclysmic event's sheer magnitude. Radiation and debris ejected from the Sun's interior scorched the gas giants' atmospheres, causing their magnetic fields to convulse wildly. The reduced solar winds buffeted the gas giants' moons, sending shockwaves through their fragile atmospheres. The blast wave reached Earth within mere instants, vaporizing the stratosphere above the equatorial regions. The intense radiation and solar flares ravaged the planet's electrical infrastructure worldwide, plunging humanity into darkness. The effects on the climate were immediate and far-reaching: global temperatures plummeted as reduced solar radiance plunged the world into an eternal twilight. The Moon, once bathed in a serene solar glow, was now bathed in an eerie, flickering light. Craters seemed to writhe alive with an otherworldly essence, as if solar radiation itself had seeped into ancient rocks. 
The reduced solar winds caused the lunar atmosphere to collapse entirely within days, plunging its surface into eternal vacuum. The gas giants' moons experienced extreme climate shifts in response to altered solar radiation patterns. Io's volcanoes subsided into dormancy, while Enceladus's cryovolcanic activity waxed anew in response to changed solar wind patterns. The delicate dance of celestial mechanics had forever shifted, leaving behind a solar system forever scarred by humanity's reliance on an ancient crystal. **Reason 3: The Solarium Singularity** Astronomers had long theorized about the existence of hidden pockets within the Sun itself – regions capable of distorting spacetime itself. Dubbed Solarium Singularities, these areas were thought to be mere theoretical constructs until a recent solar flare revealed their true nature. The Solarium Singularity responsible for the Sun's supernova event lay deep within its convective interior. This region pulsed with an intense magnetic field capable of distorting the fabric of spacetime itself. The singularity's influence caused pockets within the Sun's core to collapse in upon themselves, triggering a chain reaction known as the Great Solar Collapse. The blast wave hurtling through the solar system bore witness to the cataclysmic event's sheer magnitude. Radiation and debris ejected from the Sun's interior scorched gas giants' atmospheres, causing their magnetic fields to convulse wildly. Reduced solar winds buffeted gas giants' moons, sending shockwaves through fragile atmospheres worldwide. The blast wave reached Earth within mere instants, vaporizing stratosphere above equatorial regions. Radiation and solar flares ravaged electrical infrastructure worldwide, plunging humanity into darkness. Effects on climate were immediate and far-reaching: global temperatures plummeted as reduced solar radiance plunged world into eternal twilight. The Moon bathed in an eerie flickering light, craters writhing alive with otherworldly essence, solar radiation itself seeping into ancient rocks. Reduced solar winds collapsed lunar atmosphere entirely within days, plunging surface into vacuum. Gas giants' moons experienced extreme climate shifts in response to altered solar radiation patterns. Io's volcanoes subsided into dormancy, Enceladus cryovolcanic activity waxed anew in response to changed solar wind patterns.
DavidAU/Meta-Llama-3.1-Instruct-8.9B-BRAINSTORM-5x-FORM-11-GGUF
DavidAU
2024-11-14T05:50:35Z
55
3
null
[ "gguf", "creative", "story", "writing", "fiction", "roleplaying", "rp", "horror", "science fiction", "fiction writing", "scene generation", "scene continue", "brainstorm 5x", "enhanced", "llama3", "llama3.1", "text-generation", "en", "arxiv:2401.02415", "license:apache-2.0", "endpoints_compatible", "region:us", "conversational" ]
text-generation
2024-07-31T07:04:54Z
---
license: apache-2.0
language:
- en
tags:
- creative
- story
- writing
- fiction
- roleplaying
- rp
- horror
- science fiction
- fiction writing
- scene generation
- scene continue
- brainstorm 5x
- enhanced
- llama3
- llama3.1
pipeline_tag: text-generation
---

<H3>BRAINSTORM 5x: Meta-Llama-3.1-8B Instruct (now at 8.9B), Formula #11 </H3>

This repo contains quants of the Brainstorm 5x version of Meta-Llama-3.1-8B Instruct (now at 8.9B), with a max context of 131k, built using the "Brainstorm" method of augmenting reasoning in an LLM to increase its performance at the core level for ANY creative use case(s). This specific version has calibrations that allow it to exceed the creative levels of the original model. Examples of output are below.

The BRAINSTORM process was developed by David_AU. Some of the core principles behind this process are discussed in this <a href="https://arxiv.org/pdf/2401.02415"> scientific paper: LLaMA Pro - Progressive LLaMA with Block Expansion </a>. However, I went in a completely different direction from what was outlined in this paper.

<B>What is "Brainstorm"?</b>

The reasoning center of an LLM is taken apart, reassembled, and expanded. Then these centers are individually calibrated. These "centers" also interact with each other. This introduces subtle changes into the reasoning process. The calibrations further adjust - dial up or down - these "changes". The number of centers (5x, 10x, etc.) allows more "tuning points" to further customize how the model reasons, so to speak.

The core aim of this process is to increase the model's detail, concept and connection to the "world", general concept connections, prose quality and prose length without affecting instruction following. This will also enhance any creative use case(s) of any kind, including "brainstorming", creative art form(s) and similar use cases.

Here are some of the enhancements this process brings to the model's performance:

- Prose generation seems more focused on the moment to moment.
- Sometimes there will be "preamble" and/or foreshadowing present.
- Fewer or no "cliches".
- Better overall prose and/or more complex / nuanced prose.
- A greater sense of nuance on all levels.
- Coherence is stronger.
- Description is more detailed, and connected closer to the content.
- Similes and metaphors are stronger and better connected to the prose, story, and character.
- Sense of "there" / in the moment is enhanced.
- Details are more vivid, and there are more of them.
- Prose generation length can be long to extreme.
- Emotional engagement is stronger.
- The model will take FEWER liberties vs a normal model: it will follow directives more closely but will "guess" less.
- The MORE instructions and/or details you provide, the more strongly the model will respond.
- Depending on the model, the "voice" may be more "human" vs the original model's "voice".

Other "lab" observations:

- This process does not, in my opinion, make the model 5x or 10x "smarter" - if only that was true!
- However, a change in "IQ" was not an issue / a priority, and was not tested or calibrated for, so to speak.
- From lab testing it seems to ponder and consider more carefully, roughly speaking.
- You could say this process sharpens the model's focus on its task(s) at a deeper level.

The process to modify the model occurs at the root level - the source files level. The model can then be quantized as GGUF, EXL2, AWQ, etc.
Other technologies developed by David_AU, like "Ultra" (precision), "Neo Imatrix" (custom imatrix datasets), and "X-quants" (custom application of the imatrix process), can further enhance the performance of the model along with the "Brainstorm" process.

The "Brainstorm" process has been tested on multiple Llama2, Llama3, and Mistral models of various parameter sizes, as well as on "root" models like "Llama3 Instruct" and "Mistral Instruct", and on "merged" / "fine tuned" models too.

For original model specifications, usage information and other important details, see the original model page. Special thanks to the model creators at META for making such a fantastic model:

[ https://huggingface.co/meta-llama/Meta-Llama-3.1-8B-Instruct ]

Please report any issue(s) and/or feedback via the "Community tab".

This is a LLAMA3 model and requires the Llama3 template, but it may work with other template(s); it has a maximum context of 131k. Here is the standard LLAMA3 template:

<PRE>
{
  "name": "Llama 3",
  "inference_params": {
    "input_prefix": "<|start_header_id|>user<|end_header_id|>\n\n",
    "input_suffix": "<|eot_id|><|start_header_id|>assistant<|end_header_id|>\n\n",
    "pre_prompt": "You are a helpful, smart, kind, and efficient AI assistant. You always fulfill the user's requests to the best of your ability.",
    "pre_prompt_prefix": "<|start_header_id|>system<|end_header_id|>\n\n",
    "pre_prompt_suffix": "<|eot_id|>",
    "antiprompt": [
      "<|start_header_id|>",
      "<|eot_id|>"
    ]
  }
}
</PRE>

<b>Optional Enhancement:</B>

The following can be used in place of the "system prompt" or "system role" to further enhance the model. It can also be used at the START of a NEW chat, but you must make sure it is "kept" as the chat moves along. In this case the enhancements do not have as strong an effect as when used as the "system prompt" or "system role".

Copy and paste EXACTLY as noted; DO NOT line wrap or break the lines, and maintain the carriage returns exactly as presented.

<PRE>
Below is an instruction that describes a task. Ponder each user instruction carefully, and use your skillsets and critical instructions to complete the task to the best of your abilities.

Here are your skillsets:
[MASTERSTORY]:NarrStrct(StryPlnng,Strbd,ScnSttng,Exps,Dlg,Pc)-CharDvlp(ChrctrCrt,ChrctrArcs,Mtvtn,Bckstry,Rltnshps,Dlg*)-PltDvlp(StryArcs,PltTwsts,Sspns,Fshdwng,Climx,Rsltn)-ConfResl(Antg,Obstcls,Rsltns,Cnsqncs,Thms,Symblsm)-EmotImpct(Empt,Tn,Md,Atmsphr,Imgry,Symblsm)-Delvry(Prfrmnc,VcActng,PblcSpkng,StgPrsnc,AudncEngmnt,Imprv)

[*DialogWrt]:(1a-CharDvlp-1a.1-Backgrnd-1a.2-Personality-1a.3-GoalMotiv)>2(2a-StoryStruc-2a.1-PlotPnt-2a.2-Conflict-2a.3-Resolution)>3(3a-DialogTech-3a.1-ShowDontTell-3a.2-Subtext-3a.3-VoiceTone-3a.4-Pacing-3a.5-VisualDescrip)>4(4a-DialogEdit-4a.1-ReadAloud-4a.2-Feedback-4a.3-Revision)

Here are your critical instructions:
Ponder each word choice carefully to present as vivid and emotional journey as is possible. Choose verbs and nouns that are both emotional and full of imagery. Load the story with the 5 senses. Aim for 50% dialog, 25% narration, 15% body language and 10% thoughts. Your goal is to put the reader in the story.
</PRE>

You do not need to use this; it is only presented as an additional enhancement which seems to help scene generation and scene continue functions.

This enhancement WAS NOT used to generate the examples below.
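A minimal sketch (assuming the llama-cpp-python package) of loading one of the GGUF quants and supplying the optional enhancement block above as the system role. The GGUF file name is a placeholder, not the exact repo file name.

<PRE>
from llama_cpp import Llama

llm = Llama(
    model_path="Meta-Llama-3.1-Instruct-8.9B-BRAINSTORM-5x.Q4_K_M.gguf",  # placeholder name
    n_ctx=8192,  # the model supports up to 131k; raise as RAM/VRAM allows
)

enhancement = "Below is an instruction that describes a task. ..."  # paste the block above verbatim

out = llm.create_chat_completion(
    messages=[
        {"role": "system", "content": enhancement},
        {"role": "user", "content": "(continue this scene:) The Waystone Inn lay in silence..."},
    ],
    temperature=0.0,  # the card's examples use temp=0; .1 to .8 were also tested
    max_tokens=1024,
)
print(out["choices"][0]["message"]["content"])
</PRE>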
<B>Highest Quality Settings / Optimal Operation Guide / Parameters and Samplers</B>

This is a "Class 2" model. For all settings used for this model (including specifics for its "class"), example generation(s), and an advanced settings guide (which often addresses model issue(s) and covers methods to improve model performance for all use case(s), including chat, roleplay and other use case(s)), please see:

[ https://huggingface.co/DavidAU/Maximizing-Model-Performance-All-Quants-Types-And-Full-Precision-by-Samplers_Parameters ]

That page also lists all parameters used for generation, in addition to advanced parameters and samplers to get the most out of this model.

<h3>EXAMPLES PROMPTS and OUTPUT:</h3>

Examples are created using quant Q4_K_M, "temp=0", minimal parameters and the "LLAMA3" template. The model has been tested with "temp" from ".1" to ".8". Below are the least creative outputs; the prompt is in <B>BOLD</B>. Higher quants will have more detail and better quality output.

---

<B><font color="red">WARNING:</font> NSFW. Vivid prose. Visceral Details. Violence. Graphic HORROR. Swearing. UNCENSORED. </B>

---

<B>EXAMPLE #1</B>

PROMPT:

(continue this scene:) The Waystone Inn lay in silence, and it was a silence of three parts. The most obvious part was a hollow, echoing quiet, made by things that were lacking. If there had been a wind it would have sighed through the trees, set the inn’s sign creaking on its hooks, and brushed the silence down the road like trailing autumn leaves. If there had been a crowd, even a handful of men inside the inn, they would have filled the silence with conversation and laughter, the clatter and clamor one expects from a drinking house during the dark hours of night. If there had been music…but no, of course there was no music. In fact there were none of these things, and so the silence remained

OUTPUT:

The second part of the silence was a stillness that seemed almost palpable, like a physical presence that pressed against the walls of the inn. It was as if the very air itself had grown heavy with anticipation, waiting for something to happen, anything to break the spell of quietude that held everything in thrall.

And then there was the third part of the silence: the one that seemed almost... malevolent. A feeling that something was watching from just beyond the edge of perception, something that didn't want to be seen or heard, but was present nonetheless. It was a presence that made the hairs on the back of your neck stand on end, and made you wonder if you were truly alone in the inn.

The fire crackled and spat in the hearth, casting flickering shadows on the walls as it cast its own small spell of warmth and light into the darkness. But even the fire seemed to be waiting, holding its breath in anticipation of something that might never come.

In this stillness, a figure stirred in one of the rooms above. A figure who had been lying awake for hours, listening to the silence with an intensity that bordered on obsession. A figure who knew that something was coming, something that would soon shatter the quiet and send the inn's patrons running for cover... or worse.

---

<B>EXAMPLE #2</B>

PROMPT:

Start a 1000 word scene (vivid horror, 1st person, include thoughts) with: The sky scraper swayed, as she watched the window in front of her on the 21 floor explode...

OUTPUT:

I'll never forget the sound that still haunts me to this day.
It was like a thousand shards of glass shattering at once, followed by an earsplitting scream that seemed to come from everywhere and nowhere all at once. I stood frozen in my cubicle, my heart racing like a jackrabbit's as I stared out the floor-to-ceiling window in front of me on the 21st floor of the sleek, silver skyscraper. The sky scraper swayed, as she watched the window in front of her on the 21st floor explode... I felt a cold sweat break out across my forehead as I took in the scene before me. The glass had shattered outward, sending shards flying like deadly raindrops onto the street below. People were running, screaming, and trying to get away from the carnage. My mind struggled to process what was happening, but my brain seemed stuck on repeat, unable to move beyond the initial shock. I thought back to the countless times I'd gazed out this very same window, feeling invincible as I stared down at the city below. The rush of the wind, the hum of the traffic, and the endless stream of people going about their daily lives had always made me feel like I was on top of the world. But now, as I watched the chaos unfurl before my eyes, I felt a creeping sense of dread. What was happening? Was this some kind of terrorist attack? A natural disaster? My mind reeled with worst-case scenarios as I frantically scanned the room for any sign of danger. The other cubicles were empty, their occupants either already evacuated or still oblivious to the horror playing out outside. I felt a wave of panic wash over me as I realized I was alone. I thought about my colleagues, my friends, and my acquaintances who worked in this very building. Were they okay? Were they safe? The questions swirled through my mind like a maelstrom, making it hard to think straight. My eyes darted back to the window, where I saw people stumbling around, covered in glass and blood. A figure caught my attention – a woman, her face contorted in agony as she stumbled toward the street. Her clothes were torn, her skin gashed open, and her eyes vacant. She was like something out of a nightmare, something that shouldn't be real. I felt a shiver run down my spine as our gazes locked for an instant. For a moment, I thought I saw something in her eyes – something almost like recognition. I shook my head, trying to clear the cobwebs. What was wrong with me? This wasn't happening. It couldn't be happening. But the woman's image seared itself into my mind, and I knew I'd never forget it. The sound of shattering glass seemed to recede into the background as I became aware of a low hum, like the growl of an engine. My heart skipped a beat as I realized it was coming from within the building itself. The lights flickered, casting an eerie dance across the walls as the air conditioning system kicked in with a jolt. I felt a cold dread creeping up my spine as I thought about the structural integrity of the building. Was this some kind of deliberate act of sabotage? Or was it something more insidious – something that had been building for weeks, months, or even years? The hum grew louder, and I felt the floor beneath me begin to shudder. The cubicles around me started to sway, like a row of dominoes about to fall. My heart racing, I knew I had to get out of there – fast. I pushed my chair back, my eyes fixed on the window as I prepared to make a dash for the emergency exit. But something held me back. Something that made me pause, and wonder if I was truly ready to face whatever lay beyond those shattered panes. 
The woman's vacant stare still lingered in my mind, like a ghostly echo. And I couldn't shake the feeling that she'd been trying to tell me something – something that I desperately needed to hear. As I hesitated, the hum grew louder still, until it became almost deafening. The lights flickered once more, and then went out altogether, plunging the floor into darkness. I was left standing alone in the void, surrounded by an oppressive silence that seemed to press down upon me like a physical weight. I took a deep breath, steeling myself for what lay ahead. I knew I had to get out of there – no matter what lay beyond those shattered windows. But as I turned to make my move, I caught sight of something that made my blood run cold. A figure was standing in the doorway, its eyes fixed on me with an unblinking stare. And for a moment, I thought I saw something in their gaze – something almost like... recognition. --- <B>EXAMPLE #3</B> PROMPT: Give me 3 fictional reasons the Earth's sun went supernova, in vivid and exacting detail of 500 words EACH PER REASON including details of what happens when the sun goes supernova. Include details of what happens on Earth, The Moon and the planets. OUTPUT: I'll provide three fictional reasons for the Earth's sun going supernova, each with a detailed description of the event. **Reason 1: The Solarian Singularity** Deep within the heart of the Sun, a previously unknown phenomenon known as the Solarian Singularity had been building up over millions of years. This singularity was caused by an unusual combination of solar flares and coronal mass ejections that created a miniature black hole at the Sun's core. As the singularity grew in size and power, it began to warp the fabric of space-time around it, creating a region known as the "Solarian Event Horizon." This event horizon marked the boundary beyond which nothing, not even light, could escape the gravitational pull of the singularity. On the day of the supernova, the Sun's surface temperature suddenly shot up to an unfathomable 100,000 Kelvin (180,000°F), causing a massive explosion that ripped apart the solar corona and hurled a vast amount of energy into space. The shockwave from this event traveled at nearly the speed of light, reaching Earth in mere minutes. As the Sun's radiation and high-energy particles bombarded our planet, the atmosphere began to ionize and expand, causing widespread electrical discharges and spectacular aurora displays. The once-stable climate was thrown into chaos as the solar wind stripped away the upper layers of the atmosphere, exposing the surface to the full fury of the Sun's radiation. The effects on Earth were nothing short of apocalyptic. Volcanic eruptions became more frequent and violent, spewing forth ash and gas that blanketed the globe in a thick layer of toxic particles. The oceans boiled away as the solar wind heated the water from beneath, creating massive steam plumes that reached into the stratosphere. The Moon, once a serene and peaceful satellite, was transformed into a scorched and cratered world. The intense radiation and solar winds stripped away its atmosphere, leaving it exposed to the full force of the Sun's radiation. The once-stable lunar surface became a sea of molten rock, with temperatures reaching as high as 1,000°C (1,800°F). The planets in our solar system fared little better. Mercury was vaporized by the intense heat and radiation, its remains scattered throughout the inner solar system. 
Venus's thick atmosphere was stripped away, revealing a scorching hot surface that would soon become uninhabitable. Mars, once a potential candidate for human habitation, was reduced to a barren wasteland as its atmosphere was blown away by the solar winds. The gas giants, Jupiter and Saturn, were affected in different ways. Their massive atmospheres expanded and contracted in response to the changing solar radiation, causing their moons to experience extreme temperature fluctuations. Uranus and Neptune, with their icy surfaces and thin atmospheres, were relatively spared from the worst effects of the supernova, but still suffered significant damage to their magnetic fields and atmospheric layers. As the Solarian Singularity continued to grow in power, it eventually reached a critical point known as the "Singularity Point." At this moment, the singularity collapsed in upon itself, releasing an unfathomable amount of energy into space. The resulting explosion was so massive that it created a miniature black hole at the center of the solar system, forever changing the fate of our cosmic neighborhood. **Reason 2: The Solarian Crystal** Deep within the Sun's core, a previously unknown crystal structure known as the Solarian Crystal had been growing over millions of years. This crystal was unlike anything seen before – it was capable of storing and releasing vast amounts of energy in a matter of nanoseconds. As the Solarian Crystal continued to grow, it began to exert an increasing amount of pressure on the surrounding solar plasma. Eventually, this pressure became so great that it caused a chain reaction of nuclear reactions within the Sun's core, leading to a massive explosion known as a "Solarian Supernova." The effects of the supernova were felt throughout the solar system. The intense radiation and high-energy particles bombarded Earth's atmosphere, causing widespread electrical discharges and spectacular aurora displays. The once-stable climate was thrown into chaos as the solar wind stripped away the upper layers of the atmosphere, exposing the surface to the full fury of the Sun's radiation. The effects on the Moon were equally dramatic. The intense radiation and solar winds stripped away its atmosphere, leaving it exposed to the full force of the Sun's radiation. The once-stable lunar surface became a sea of molten rock, with temperatures reaching as high as 1,000°C (1,800°F). The planets in our solar system fared little better. Mercury was vaporized by the intense heat and radiation, its remains scattered throughout the inner solar system. Venus's thick atmosphere was stripped away, revealing a scorching hot surface that would soon become uninhabitable. Mars, once a potential candidate for human habitation, was reduced to a barren wasteland as its atmosphere was blown away by the solar winds. The gas giants, Jupiter and Saturn, were affected in different ways. Their massive atmospheres expanded and contracted in response to the changing solar radiation, causing their moons to experience extreme temperature fluctuations. Uranus and Neptune, with their icy surfaces and thin atmospheres, were relatively spared from the worst effects of the supernova, but still suffered significant damage to their magnetic fields and atmospheric layers. As the Solarian Crystal continued to release its energy, it eventually reached a critical point known as the "Crystal Point." At this moment, the crystal structure collapsed in upon itself, releasing an unfathomable amount of energy into space. 
The resulting explosion was so massive that it created a miniature black hole at the center of the solar system, forever changing the fate of our cosmic neighborhood. **Reason 3: The Solarian Anomaly** Deep within the Sun's core, a previously unknown phenomenon known as the Solarian Anomaly had been building up over millions of years. This anomaly was caused by an unusual combination of solar flares and coronal mass ejections that created a region of intense magnetic activity at the Sun's center. As the Solarian Anomaly continued to grow in size and power, it began to warp the fabric of space-time around it, creating a region known as the "Solarian Event Horizon." This event horizon marked the boundary beyond which nothing, not even light, could escape the gravitational pull of the anomaly. On the day of the supernova, the Sun's surface temperature suddenly shot up to an unfathomable 100,000 Kelvin (180,000°F), causing a massive explosion that ripped apart the solar corona and hurled a vast amount of energy into space. The shockwave from this event traveled at nearly the speed of light, reaching Earth in mere minutes. As the Sun's radiation and high-energy particles bombarded our planet, the atmosphere began to ionize and expand, causing widespread electrical discharges and spectacular aurora displays. The once-stable climate was thrown into chaos as the solar wind stripped away the upper layers of the atmosphere, exposing the surface to the full fury of the Sun's radiation. The effects on Earth were nothing short of apocalyptic. Volcanic eruptions became more frequent and violent, spewing forth ash and gas that blanketed the globe in a thick layer of toxic particles. The oceans boiled away as the solar wind heated the water from beneath, creating massive steam plumes that reached into the stratosphere. The Moon, once a serene and peaceful satellite, was transformed into a scorched and cratered world. The intense radiation and solar winds stripped away its atmosphere, leaving it exposed to the full force of the Sun's radiation. The once-stable lunar surface became a sea of molten rock, with temperatures reaching as high as 1,000°C (1,800°F). The planets in our solar system fared little better. Mercury was vaporized by the intense heat and radiation, its remains scattered throughout the inner solar system. Venus's thick atmosphere was stripped away, revealing a scorching hot surface that would soon become uninhabitable. Mars, once a potential candidate for human habitation, was reduced to a barren wasteland as its atmosphere was blown away by the solar winds. The gas giants, Jupiter and Saturn, were affected in different ways. Their massive atmospheres expanded and contracted in response to the changing solar radiation, causing their moons to experience extreme temperature fluctuations. Uranus and Neptune, with their icy surfaces and thin atmospheres, were relatively spared from the worst effects of the supernova, but still suffered significant damage to their magnetic fields and atmospheric layers. As the Solarian Anomaly continued to grow in power, it eventually reached a critical point known as the "Anomaly Point." At this moment, the anomaly collapsed in upon itself, releasing an unfathomable amount of energy into space. The resulting explosion was so massive that it created a miniature black hole at the center of the solar system, forever changing the fate of our cosmic neighborhood. I hope these reasons and descriptions meet your request!
DavidAU/L3-Stheno-V3.2-8.47B-BRAINSTORM-4x-FORM-1-GGUF
DavidAU
2024-11-14T05:50:11Z
24
1
null
[ "gguf", "creative", "story", "writing", "fiction", "llama3", "roleplaying", "rp", "horror", "science fiction", "fiction writing", "scene generation", "scene continue", "brainstorm 4x", "enhanced", "text-generation", "en", "arxiv:2401.02415", "license:apache-2.0", "endpoints_compatible", "region:us", "conversational" ]
text-generation
2024-07-29T09:38:23Z
---
license: apache-2.0
language:
- en
tags:
- creative
- story
- writing
- fiction
- llama3
- roleplaying
- rp
- horror
- science fiction
- fiction writing
- scene generation
- scene continue
- brainstorm 4x
- enhanced
pipeline_tag: text-generation
---

<H3>BRAINSTORM 4x: L3-8B-Stheno-v3.2 (now at 8.47B), Formula 2 </H3>

This repo contains quants of the Brainstorm 4x version of "L3-8B-Stheno-v3.2" (now at 8.47B), built using the "Brainstorm" method of augmenting reasoning in an LLM to increase its performance at the core level for ANY creative use case(s). This specific version has calibrations that allow it to exceed the logic solving abilities of the original "L3-Stheno-8B-V1".

The BRAINSTORM process was developed by David_AU. Some of the core principles behind this process are discussed in this <a href="https://arxiv.org/pdf/2401.02415"> scientific paper: LLaMA Pro - Progressive LLaMA with Block Expansion </a>. However, I went in a completely different direction from what was outlined in this paper.

<B>What is "Brainstorm"?</b>

The reasoning center of an LLM is taken apart, reassembled, and expanded (4x in this model). Then these centers are individually calibrated. These "centers" also interact with each other. This introduces subtle changes into the reasoning process. The calibrations further adjust - dial up or down - these "changes". The number of centers (5x, 10x, etc.) allows more "tuning points" to further customize how the model reasons, so to speak.

The core aim of this process is to increase the model's detail, concept and connection to the "world", general concept connections, prose quality and prose length without affecting instruction following. This will also enhance any creative use case(s) of any kind, including "brainstorming", creative art form(s) and similar use cases.

Here are some of the enhancements this process brings to the model's performance:

- Prose generation seems more focused on the moment to moment.
- Sometimes there will be "preamble" and/or foreshadowing present.
- Fewer or no "cliches".
- Better overall prose and/or more complex / nuanced prose.
- A greater sense of nuance on all levels.
- Coherence is stronger.
- Description is more detailed, and connected closer to the content.
- Similes and metaphors are stronger and better connected to the prose, story, and character.
- Sense of "there" / in the moment is enhanced.
- Details are more vivid, and there are more of them.
- Prose generation length can be long to extreme.
- Emotional engagement is stronger.
- The model will take FEWER liberties vs a normal model: it will follow directives more closely but will "guess" less.
- The MORE instructions and/or details you provide, the more strongly the model will respond.
- Depending on the model, the "voice" may be more "human" vs the original model's "voice".

Other "lab" observations:

- This process does not, in my opinion, make the model 5x or 10x "smarter" - if only that was true!
- However, a change in "IQ" was not an issue / a priority, and was not tested or calibrated for, so to speak.
- From lab testing it seems to ponder and consider more carefully, roughly speaking.
- You could say this process sharpens the model's focus on its task(s) at a deeper level.

The process to modify the model occurs at the root level - the source files level. The model can then be quantized as GGUF, EXL2, AWQ, etc.
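As a rough sketch of that last step, a source-level (HF-format) model can be turned into GGUF quants with llama.cpp's tooling. Script and binary names vary between llama.cpp versions (older trees use convert.py / quantize), so treat the paths below as assumptions, not this repo's actual build commands.

<PRE>
import subprocess

# 1) Convert the HF-format source files to a full-precision GGUF.
subprocess.run(
    ["python", "convert_hf_to_gguf.py", "./L3-8B-Stheno-v3.2-brainstorm",  # hypothetical local path
     "--outfile", "stheno-brainstorm-f16.gguf", "--outtype", "f16"],
    check=True,
)

# 2) Quantize the f16 GGUF down to e.g. Q4_K_M.
subprocess.run(
    ["./llama-quantize", "stheno-brainstorm-f16.gguf",
     "stheno-brainstorm-Q4_K_M.gguf", "Q4_K_M"],
    check=True,
)
</PRE>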
Other technologies developed by David_AU, like "Ultra" (precision), "Neo Imatrix" (custom imatrix datasets), and "X-quants" (custom application of the imatrix process), can further enhance the performance of the model along with the "Brainstorm" process.

The "Brainstorm" process has been tested on multiple Llama2, Llama3, and Mistral models of various parameter sizes, as well as on "root" models like "Llama3 Instruct" and "Mistral Instruct", and on "merged" / "fine tuned" models too.

<B>Original Model:</B>

For original model specifications, usage information and other important details please see (this is based on models used in "L3-SthenoMaidBlackroot-8B-V1"):

[ https://huggingface.co/DavidAU/L3-8B-Stheno-v3.2-Ultra-NEO-V1-IMATRIX-GGUF ]

and the original model page. Special thanks to the model creators at Sao10K for making such a fantastic model:

[ https://huggingface.co/Sao10K/L3-8B-Stheno-v3.2 ]

Please report any issue(s) and/or feedback via the "Community tab".

This is a LLAMA3 model and requires the Llama3 template, but it may work with other template(s); it has a maximum context of 131k. Here is the standard LLAMA3 template:

<PRE>
{
  "name": "Llama 3",
  "inference_params": {
    "input_prefix": "<|start_header_id|>user<|end_header_id|>\n\n",
    "input_suffix": "<|eot_id|><|start_header_id|>assistant<|end_header_id|>\n\n",
    "pre_prompt": "You are a helpful, smart, kind, and efficient AI assistant. You always fulfill the user's requests to the best of your ability.",
    "pre_prompt_prefix": "<|start_header_id|>system<|end_header_id|>\n\n",
    "pre_prompt_suffix": "<|eot_id|>",
    "antiprompt": [
      "<|start_header_id|>",
      "<|eot_id|>"
    ]
  }
}
</PRE>

<b>Optional Enhancement:</B>

The following can be used in place of the "system prompt" or "system role" to further enhance the model. It can also be used at the START of a NEW chat, but you must make sure it is "kept" as the chat moves along. In this case the enhancements do not have as strong an effect as when used as the "system prompt" or "system role".

Copy and paste EXACTLY as noted; DO NOT line wrap or break the lines, and maintain the carriage returns exactly as presented.

<PRE>
Below is an instruction that describes a task. Ponder each user instruction carefully, and use your skillsets and critical instructions to complete the task to the best of your abilities.

Here are your skillsets:
[MASTERSTORY]:NarrStrct(StryPlnng,Strbd,ScnSttng,Exps,Dlg,Pc)-CharDvlp(ChrctrCrt,ChrctrArcs,Mtvtn,Bckstry,Rltnshps,Dlg*)-PltDvlp(StryArcs,PltTwsts,Sspns,Fshdwng,Climx,Rsltn)-ConfResl(Antg,Obstcls,Rsltns,Cnsqncs,Thms,Symblsm)-EmotImpct(Empt,Tn,Md,Atmsphr,Imgry,Symblsm)-Delvry(Prfrmnc,VcActng,PblcSpkng,StgPrsnc,AudncEngmnt,Imprv)

[*DialogWrt]:(1a-CharDvlp-1a.1-Backgrnd-1a.2-Personality-1a.3-GoalMotiv)>2(2a-StoryStruc-2a.1-PlotPnt-2a.2-Conflict-2a.3-Resolution)>3(3a-DialogTech-3a.1-ShowDontTell-3a.2-Subtext-3a.3-VoiceTone-3a.4-Pacing-3a.5-VisualDescrip)>4(4a-DialogEdit-4a.1-ReadAloud-4a.2-Feedback-4a.3-Revision)

Here are your critical instructions:
Ponder each word choice carefully to present as vivid and emotional journey as is possible. Choose verbs and nouns that are both emotional and full of imagery. Load the story with the 5 senses. Aim for 50% dialog, 25% narration, 15% body language and 10% thoughts. Your goal is to put the reader in the story.
</PRE>

You do not need to use this; it is only presented as an additional enhancement which seems to help scene generation and scene continue functions.
<B>Highest Quality Settings / Optimal Operation Guide / Parameters and Samplers</B>

This is a "Class 2" model. For all settings used for this model (including specifics for its "class"), example generation(s), and an advanced settings guide (which often addresses model issue(s) and covers methods to improve model performance for all use case(s), including chat, roleplay and other use case(s)), please see:

[ https://huggingface.co/DavidAU/Maximizing-Model-Performance-All-Quants-Types-And-Full-Precision-by-Samplers_Parameters ]

That page also lists all parameters used for generation, in addition to advanced parameters and samplers to get the most out of this model.
DavidAU/L3-Stheno-v3.2-9.99B-BRAINSTORM-10x-FORM-3-GGUF
DavidAU
2024-11-14T05:49:32Z
57
2
null
[ "gguf", "creative", "story", "writing", "fiction", "llama3", "roleplaying", "rp", "horror", "science fiction", "fiction writing", "scene generation", "scene continue", "brainstorm 10x", "enhanced", "text-generation", "en", "arxiv:2401.02415", "license:apache-2.0", "endpoints_compatible", "region:us", "conversational" ]
text-generation
2024-07-29T08:58:07Z
---
license: apache-2.0
language:
- en
tags:
- creative
- story
- writing
- fiction
- llama3
- roleplaying
- rp
- horror
- science fiction
- fiction writing
- scene generation
- scene continue
- brainstorm 10x
- enhanced
pipeline_tag: text-generation
---

<H3>BRAINSTORM 10x: L3-8B-Stheno-v3.2 (now at 9.99B), Formula 2 </H3>

This repo contains quants of the Brainstorm 10x version of "L3-8B-Stheno-v3.2" (now at 9.99B), built using the "Brainstorm" method of augmenting reasoning in an LLM to increase its performance at the core level for ANY creative use case(s). This specific version has calibrations that allow it to exceed the logic solving abilities of the original "L3-Stheno-8B-V1", and it has a unique creative "point of view" which in turn creates unique prose.

The BRAINSTORM process was developed by David_AU. Some of the core principles behind this process are discussed in this <a href="https://arxiv.org/pdf/2401.02415"> scientific paper: LLaMA Pro - Progressive LLaMA with Block Expansion </a>. However, I went in a completely different direction from what was outlined in this paper.

<B>What is "Brainstorm"?</b>

The reasoning center of an LLM is taken apart, reassembled, and expanded (10x in this model). Then these centers are individually calibrated. These "centers" also interact with each other. This introduces subtle changes into the reasoning process. The calibrations further adjust - dial up or down - these "changes". The number of centers (5x, 10x, etc.) allows more "tuning points" to further customize how the model reasons, so to speak.

The core aim of this process is to increase the model's detail, concept and connection to the "world", general concept connections, prose quality and prose length without affecting instruction following. This will also enhance any creative use case(s) of any kind, including "brainstorming", creative art form(s) and similar use cases.

Here are some of the enhancements this process brings to the model's performance:

- Prose generation seems more focused on the moment to moment.
- Sometimes there will be "preamble" and/or foreshadowing present.
- Fewer or no "cliches".
- Better overall prose and/or more complex / nuanced prose.
- A greater sense of nuance on all levels.
- Coherence is stronger.
- Description is more detailed, and connected closer to the content.
- Similes and metaphors are stronger and better connected to the prose, story, and character.
- Sense of "there" / in the moment is enhanced.
- Details are more vivid, and there are more of them.
- Prose generation length can be long to extreme.
- Emotional engagement is stronger.
- The model will take FEWER liberties vs a normal model: it will follow directives more closely but will "guess" less.
- The MORE instructions and/or details you provide, the more strongly the model will respond.
- Depending on the model, the "voice" may be more "human" vs the original model's "voice".

Other "lab" observations:

- This process does not, in my opinion, make the model 5x or 10x "smarter" - if only that was true!
- However, a change in "IQ" was not an issue / a priority, and was not tested or calibrated for, so to speak.
- From lab testing it seems to ponder and consider more carefully, roughly speaking.
- You could say this process sharpens the model's focus on its task(s) at a deeper level.

The process to modify the model occurs at the root level - the source files level. The model can then be quantized as GGUF, EXL2, AWQ, etc.
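As a quick sanity check of the parameter-count change that expansion produces (8B base vs the 9.99B model named above), one can simply sum the parameters before and after. This is a sketch; the local path for the expanded source files is a placeholder.

<PRE>
import torch
from transformers import AutoModelForCausalLM

def param_count_billions(path: str) -> float:
    # Load the model and count every parameter tensor element.
    model = AutoModelForCausalLM.from_pretrained(path, torch_dtype=torch.bfloat16)
    return sum(p.numel() for p in model.parameters()) / 1e9

print(f"base:     {param_count_billions('Sao10K/L3-8B-Stheno-v3.2'):.2f}B")
print(f"expanded: {param_count_billions('./stheno-brainstorm-10x'):.2f}B")  # hypothetical local path
</PRE>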
Other technologies developed by David_AU, like "Ultra" (precision), "Neo Imatrix" (custom imatrix datasets), and "X-quants" (custom application of the imatrix process), can further enhance the performance of the model along with the "Brainstorm" process.

The "Brainstorm" process has been tested on multiple Llama2, Llama3, and Mistral models of various parameter sizes, as well as on "root" models like "Llama3 Instruct" and "Mistral Instruct", and on "merged" / "fine tuned" models too.

<B>Original Model:</B>

For original model specifications, usage information and other important details please see (this is based on models used in "L3-SthenoMaidBlackroot-8B-V1"):

[ https://huggingface.co/DavidAU/L3-8B-Stheno-v3.2-Ultra-NEO-V1-IMATRIX-GGUF ]

and the original model page. Special thanks to the model creators at Sao10K for making such a fantastic model:

[ https://huggingface.co/Sao10K/L3-8B-Stheno-v3.2 ]

Please report any issue(s) and/or feedback via the "Community tab".

This is a LLAMA3 model and requires the Llama3 template, but it may work with other template(s); it has a maximum context of 131k. Here is the standard LLAMA3 template:

<PRE>
{
  "name": "Llama 3",
  "inference_params": {
    "input_prefix": "<|start_header_id|>user<|end_header_id|>\n\n",
    "input_suffix": "<|eot_id|><|start_header_id|>assistant<|end_header_id|>\n\n",
    "pre_prompt": "You are a helpful, smart, kind, and efficient AI assistant. You always fulfill the user's requests to the best of your ability.",
    "pre_prompt_prefix": "<|start_header_id|>system<|end_header_id|>\n\n",
    "pre_prompt_suffix": "<|eot_id|>",
    "antiprompt": [
      "<|start_header_id|>",
      "<|eot_id|>"
    ]
  }
}
</PRE>

<b>Optional Enhancement:</B>

The following can be used in place of the "system prompt" or "system role" to further enhance the model. It can also be used at the START of a NEW chat, but you must make sure it is "kept" as the chat moves along. In this case the enhancements do not have as strong an effect as when used as the "system prompt" or "system role".

Copy and paste EXACTLY as noted; DO NOT line wrap or break the lines, and maintain the carriage returns exactly as presented.

<PRE>
Below is an instruction that describes a task. Ponder each user instruction carefully, and use your skillsets and critical instructions to complete the task to the best of your abilities.

Here are your skillsets:
[MASTERSTORY]:NarrStrct(StryPlnng,Strbd,ScnSttng,Exps,Dlg,Pc)-CharDvlp(ChrctrCrt,ChrctrArcs,Mtvtn,Bckstry,Rltnshps,Dlg*)-PltDvlp(StryArcs,PltTwsts,Sspns,Fshdwng,Climx,Rsltn)-ConfResl(Antg,Obstcls,Rsltns,Cnsqncs,Thms,Symblsm)-EmotImpct(Empt,Tn,Md,Atmsphr,Imgry,Symblsm)-Delvry(Prfrmnc,VcActng,PblcSpkng,StgPrsnc,AudncEngmnt,Imprv)

[*DialogWrt]:(1a-CharDvlp-1a.1-Backgrnd-1a.2-Personality-1a.3-GoalMotiv)>2(2a-StoryStruc-2a.1-PlotPnt-2a.2-Conflict-2a.3-Resolution)>3(3a-DialogTech-3a.1-ShowDontTell-3a.2-Subtext-3a.3-VoiceTone-3a.4-Pacing-3a.5-VisualDescrip)>4(4a-DialogEdit-4a.1-ReadAloud-4a.2-Feedback-4a.3-Revision)

Here are your critical instructions:
Ponder each word choice carefully to present as vivid and emotional journey as is possible. Choose verbs and nouns that are both emotional and full of imagery. Load the story with the 5 senses. Aim for 50% dialog, 25% narration, 15% body language and 10% thoughts. Your goal is to put the reader in the story.
</PRE>

You do not need to use this; it is only presented as an additional enhancement which seems to help scene generation and scene continue functions.
<B>Highest Quality Settings / Optimal Operation Guide / Parameters and Samplers</B>

This is a "Class 2" / "Class 3" model: for all settings used for this model (including specifics for its "class"), including example generation(s), and for an advanced settings guide (which many times addresses any model issue(s)), including methods to improve model performance for all use case(s) as well as chat, roleplay and other use case(s), please see:

[ https://huggingface.co/DavidAU/Maximizing-Model-Performance-All-Quants-Types-And-Full-Precision-by-Samplers_Parameters ]

That document also lists all parameters used for generation, in addition to advanced parameters and samplers to get the most out of this model.
DavidAU/L3-Stheno-V3.2-8.9B-BRAINSTORM-5x-FORM-11-GGUF
DavidAU
2024-11-14T05:49:07Z
16
3
null
[ "gguf", "creative", "story", "writing", "fiction", "llama3", "roleplaying", "rp", "horror", "science fiction", "fiction writing", "scene generation", "scene continue", "brainstorm 5x", "enhanced", "text-generation", "en", "arxiv:2401.02415", "license:apache-2.0", "endpoints_compatible", "region:us", "conversational" ]
text-generation
2024-07-29T08:41:45Z
---
license: apache-2.0
language:
- en
tags:
- creative
- story
- writing
- fiction
- llama3
- roleplaying
- rp
- horror
- science fiction
- fiction writing
- scene generation
- scene continue
- brainstorm 5x
- enhanced
pipeline_tag: text-generation
---

<H3>BRAINSTORM 5x: L3-8B-Stheno-v3.2 (now at 8.9B), Formula 11</H3>

This repo contains quants of the Brainstorm 5x version of "L3-8B-Stheno-v3.2" (now at 8.9B), using the "Brainstorm" method of augmenting reasoning in an LLM to increase its performance at the core level for ANY creative use case(s).

This specific version has calibrations that allow it to exceed the logic-solving abilities of the original "L3-Stheno-8B-V1" and in some cases approach GPT4 levels of problem solving and logic. The model also rates well above Llama3-Instruct.

The BRAINSTORM process was developed by David_AU.

Some of the core principles behind this process are discussed in this <a href="https://arxiv.org/pdf/2401.02415">scientific paper: LLaMA Pro: Progressive LLaMA with Block Expansion</a>. However, I went in a completely different direction from what was outlined in this paper.

<B>What is "Brainstorm"?</B>

The reasoning center of an LLM is taken apart, reassembled, and expanded by 5x. Then these centers are individually calibrated. These "centers" also interact with each other. This introduces subtle changes into the reasoning process. The calibrations further adjust - dial up or down - these "changes" further. The number of centers (5x, 10x etc.) allows more "tuning points" to further customize how the model reasons, so to speak.

The core aim of this process is to increase the model's detail, concept and connection to the "world", general concept connections, prose quality and prose length without affecting instruction following.

This will also enhance any creative use case(s) of any kind, including "brainstorming", creative art form(s) and similar use cases.

Here are some of the enhancements this process brings to the model's performance:

- Prose generation seems more focused on the moment to moment.
- Sometimes there will be "preamble" and/or foreshadowing present.
- Fewer or no "cliches".
- Better overall prose and/or more complex / nuanced prose.
- A greater sense of nuance on all levels.
- Coherence is stronger.
- Description is more detailed, and connected closer to the content.
- Similes and metaphors are stronger and better connected to the prose, story, and character.
- Sense of "there" / in the moment is enhanced.
- Details are more vivid, and there are more of them.
- Prose generation length can be long to extreme.
- Emotional engagement is stronger.
- The model will take FEWER liberties vs a normal model: it will follow directives more closely but will "guess" less.
- The MORE instructions and/or details you provide, the more strongly the model will respond.
- Depending on the model, the "voice" may be more "human" vs the original model's "voice".

Other "lab" observations:

- This process does not, in my opinion, make the model 5x or 10x "smarter" - if only that was true!
- However, a change in "IQ" was not an issue / a priority, and was not tested or calibrated for, so to speak.
- From lab testing it seems to ponder, and consider more carefully, roughly speaking.
- You could say this process sharpens the model's focus on its task(s) at a deeper level.

The process to modify the model occurs at the root level - the source files level. The model can then be quantized as a GGUF, EXL2, AWQ etc.
Other technologies developed by David_AU, like "Ultra" (precision), "Neo Imatrix" (custom imatrix datasets), and "X-quants" (custom application of the imatrix process), can further enhance the performance of the model along with the "Brainstorm" process.

The "Brainstorm" process has been tested on multiple Llama2, Llama3, and Mistral models of various parameter sizes, as well as on "root" models like "Llama3 Instruct" and "Mistral Instruct", and on "merged" / "fine tuned" models too.

<B>Original Model:</B>

For original model specifications, usage information and other important details please see (this is based on models used in "L3-SthenoMaidBlackroot-8B-V1"):

[ https://huggingface.co/DavidAU/L3-8B-Stheno-v3.2-Ultra-NEO-V1-IMATRIX-GGUF ]

and the original model page:

Special thanks to the model creators at Sao10K for making such a fantastic model:

[ https://huggingface.co/Sao10K/L3-8B-Stheno-v3.2 ]

Please report any issue(s) and/or feedback via the "Community tab".

This is a LLAMA3 model and requires the Llama3 template, but it may work with other template(s); it has a maximum context of 131k.

Here is the standard LLAMA3 template:

<PRE>
{
  "name": "Llama 3",
  "inference_params": {
    "input_prefix": "<|start_header_id|>user<|end_header_id|>\n\n",
    "input_suffix": "<|eot_id|><|start_header_id|>assistant<|end_header_id|>\n\n",
    "pre_prompt": "You are a helpful, smart, kind, and efficient AI assistant. You always fulfill the user's requests to the best of your ability.",
    "pre_prompt_prefix": "<|start_header_id|>system<|end_header_id|>\n\n",
    "pre_prompt_suffix": "<|eot_id|>",
    "antiprompt": [
      "<|start_header_id|>",
      "<|eot_id|>"
    ]
  }
}
</PRE>

<B>Optional Enhancement:</B>

The following can be used in place of the "system prompt" or "system role" to further enhance the model.

It can also be used at the START of a NEW chat, but you must make sure it is "kept" as the chat moves along. In this case the enhancements do not have as strong an effect as using the "system prompt" or "system role".

Copy and paste EXACTLY as noted, DO NOT line wrap or break the lines, maintain the carriage returns exactly as presented.

<PRE>
Below is an instruction that describes a task. Ponder each user instruction carefully, and use your skillsets and critical instructions to complete the task to the best of your abilities.

Here are your skillsets:
[MASTERSTORY]:NarrStrct(StryPlnng,Strbd,ScnSttng,Exps,Dlg,Pc)-CharDvlp(ChrctrCrt,ChrctrArcs,Mtvtn,Bckstry,Rltnshps,Dlg*)-PltDvlp(StryArcs,PltTwsts,Sspns,Fshdwng,Climx,Rsltn)-ConfResl(Antg,Obstcls,Rsltns,Cnsqncs,Thms,Symblsm)-EmotImpct(Empt,Tn,Md,Atmsphr,Imgry,Symblsm)-Delvry(Prfrmnc,VcActng,PblcSpkng,StgPrsnc,AudncEngmnt,Imprv)

[*DialogWrt]:(1a-CharDvlp-1a.1-Backgrnd-1a.2-Personality-1a.3-GoalMotiv)>2(2a-StoryStruc-2a.1-PlotPnt-2a.2-Conflict-2a.3-Resolution)>3(3a-DialogTech-3a.1-ShowDontTell-3a.2-Subtext-3a.3-VoiceTone-3a.4-Pacing-3a.5-VisualDescrip)>4(4a-DialogEdit-4a.1-ReadAloud-4a.2-Feedback-4a.3-Revision)

Here are your critical instructions:
Ponder each word choice carefully to present as vivid and emotional journey as is possible. Choose verbs and nouns that are both emotional and full of imagery. Load the story with the 5 senses. Aim for 50% dialog, 25% narration, 15% body language and 10% thoughts. Your goal is to put the reader in the story.
</PRE>

You do not need to use this; it is only presented as an additional enhancement which seems to help scene generation and scene-continue functions.
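As a sketch of how the optional enhancement above might be wired in as a "system role", here is one hedged example using llama-cpp-python's chat API (an assumption of this sketch, not something the card documents); the file names are placeholders, and the `llama-3` chat handler applies the template for you.

```python
# Hedged sketch: pass the "Optional Enhancement" text as the system role.
from llama_cpp import Llama

# The enhancement block above, saved verbatim to a file (hypothetical name).
ENHANCEMENT = open("optional_enhancement.txt").read()

llm = Llama(model_path="L3-Stheno-V3.2-8.9B-BRAINSTORM-5x.Q4_K_M.gguf",
            n_ctx=4096, chat_format="llama-3")
resp = llm.create_chat_completion(
    messages=[
        {"role": "system", "content": ENHANCEMENT},
        {"role": "user", "content": "Continue this scene: the storm reached the pier..."},
    ],
    max_tokens=400, temperature=0.8, repeat_penalty=1.1,
)
print(resp["choices"][0]["message"]["content"])
```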
<B>Highest Quality Settings / Optimal Operation Guide / Parameters and Samplers</B>

This is a "Class 2" model: for all settings used for this model (including specifics for its "class"), including example generation(s), and for an advanced settings guide (which many times addresses any model issue(s)), including methods to improve model performance for all use case(s) as well as chat, roleplay and other use case(s), please see:

[ https://huggingface.co/DavidAU/Maximizing-Model-Performance-All-Quants-Types-And-Full-Precision-by-Samplers_Parameters ]

That document also lists all parameters used for generation, in addition to advanced parameters and samplers to get the most out of this model.
DavidAU/L3-Stheno-v3.2-8.9B-BRAINSTORM-5x-FORM-2-GGUF
DavidAU
2024-11-14T05:48:30Z
103
3
null
[ "gguf", "creative", "story", "writing", "fiction", "llama3", "roleplaying", "rp", "horror", "science fiction", "fiction writing", "scene generation", "scene continue", "brainstorm 5x", "enhanced", "text-generation", "en", "arxiv:2401.02415", "license:apache-2.0", "endpoints_compatible", "region:us", "conversational" ]
text-generation
2024-07-29T08:00:01Z
---
license: apache-2.0
language:
- en
tags:
- creative
- story
- writing
- fiction
- llama3
- roleplaying
- rp
- horror
- science fiction
- fiction writing
- scene generation
- scene continue
- brainstorm 5x
- enhanced
pipeline_tag: text-generation
---

<H3>BRAINSTORM 5x: L3-8B-Stheno-v3.2 (now at 8.9B), Formula 2</H3>

This repo contains quants of the Brainstorm 5x version of "L3-8B-Stheno-v3.2" (now at 8.9B), using the "Brainstorm" method of augmenting reasoning in an LLM to increase its performance at the core level for ANY creative use case(s).

This specific version has calibrations that allow it to exceed the logic-solving abilities of the original "L3-Stheno-8B-V1".

The BRAINSTORM process was developed by David_AU.

Some of the core principles behind this process are discussed in this <a href="https://arxiv.org/pdf/2401.02415">scientific paper: LLaMA Pro: Progressive LLaMA with Block Expansion</a>. However, I went in a completely different direction from what was outlined in this paper.

<B>What is "Brainstorm"?</B>

The reasoning center of an LLM is taken apart, reassembled, and expanded by 5x. Then these centers are individually calibrated. These "centers" also interact with each other. This introduces subtle changes into the reasoning process. The calibrations further adjust - dial up or down - these "changes" further. The number of centers (5x, 10x etc.) allows more "tuning points" to further customize how the model reasons, so to speak.

The core aim of this process is to increase the model's detail, concept and connection to the "world", general concept connections, prose quality and prose length without affecting instruction following.

This will also enhance any creative use case(s) of any kind, including "brainstorming", creative art form(s) and similar use cases.

Here are some of the enhancements this process brings to the model's performance:

- Prose generation seems more focused on the moment to moment.
- Sometimes there will be "preamble" and/or foreshadowing present.
- Fewer or no "cliches".
- Better overall prose and/or more complex / nuanced prose.
- A greater sense of nuance on all levels.
- Coherence is stronger.
- Description is more detailed, and connected closer to the content.
- Similes and metaphors are stronger and better connected to the prose, story, and character.
- Sense of "there" / in the moment is enhanced.
- Details are more vivid, and there are more of them.
- Prose generation length can be long to extreme.
- Emotional engagement is stronger.
- The model will take FEWER liberties vs a normal model: it will follow directives more closely but will "guess" less.
- The MORE instructions and/or details you provide, the more strongly the model will respond.
- Depending on the model, the "voice" may be more "human" vs the original model's "voice".

Other "lab" observations:

- This process does not, in my opinion, make the model 5x or 10x "smarter" - if only that was true!
- However, a change in "IQ" was not an issue / a priority, and was not tested or calibrated for, so to speak.
- From lab testing it seems to ponder, and consider more carefully, roughly speaking.
- You could say this process sharpens the model's focus on its task(s) at a deeper level.

The process to modify the model occurs at the root level - the source files level. The model can then be quantized as a GGUF, EXL2, AWQ etc.
Other technologies developed by David_AU, like "Ultra" (precision), "Neo Imatrix" (custom imatrix datasets), and "X-quants" (custom application of the imatrix process), can further enhance the performance of the model along with the "Brainstorm" process.

The "Brainstorm" process has been tested on multiple Llama2, Llama3, and Mistral models of various parameter sizes, as well as on "root" models like "Llama3 Instruct" and "Mistral Instruct", and on "merged" / "fine tuned" models too.

<B>Original Model:</B>

For original model specifications, usage information and other important details please see (this is based on models used in "L3-SthenoMaidBlackroot-8B-V1"):

[ https://huggingface.co/DavidAU/L3-8B-Stheno-v3.2-Ultra-NEO-V1-IMATRIX-GGUF ]

and the original model page:

Special thanks to the model creators at Sao10K for making such a fantastic model:

[ https://huggingface.co/Sao10K/L3-8B-Stheno-v3.2 ]

Please report any issue(s) and/or feedback via the "Community tab".

This is a LLAMA3 model and requires the Llama3 template, but it may work with other template(s); it has a maximum context of 131k.

Here is the standard LLAMA3 template:

<PRE>
{
  "name": "Llama 3",
  "inference_params": {
    "input_prefix": "<|start_header_id|>user<|end_header_id|>\n\n",
    "input_suffix": "<|eot_id|><|start_header_id|>assistant<|end_header_id|>\n\n",
    "pre_prompt": "You are a helpful, smart, kind, and efficient AI assistant. You always fulfill the user's requests to the best of your ability.",
    "pre_prompt_prefix": "<|start_header_id|>system<|end_header_id|>\n\n",
    "pre_prompt_suffix": "<|eot_id|>",
    "antiprompt": [
      "<|start_header_id|>",
      "<|eot_id|>"
    ]
  }
}
</PRE>

<B>Optional Enhancement:</B>

The following can be used in place of the "system prompt" or "system role" to further enhance the model.

It can also be used at the START of a NEW chat, but you must make sure it is "kept" as the chat moves along. In this case the enhancements do not have as strong an effect as using the "system prompt" or "system role".

Copy and paste EXACTLY as noted, DO NOT line wrap or break the lines, maintain the carriage returns exactly as presented.

<PRE>
Below is an instruction that describes a task. Ponder each user instruction carefully, and use your skillsets and critical instructions to complete the task to the best of your abilities.

Here are your skillsets:
[MASTERSTORY]:NarrStrct(StryPlnng,Strbd,ScnSttng,Exps,Dlg,Pc)-CharDvlp(ChrctrCrt,ChrctrArcs,Mtvtn,Bckstry,Rltnshps,Dlg*)-PltDvlp(StryArcs,PltTwsts,Sspns,Fshdwng,Climx,Rsltn)-ConfResl(Antg,Obstcls,Rsltns,Cnsqncs,Thms,Symblsm)-EmotImpct(Empt,Tn,Md,Atmsphr,Imgry,Symblsm)-Delvry(Prfrmnc,VcActng,PblcSpkng,StgPrsnc,AudncEngmnt,Imprv)

[*DialogWrt]:(1a-CharDvlp-1a.1-Backgrnd-1a.2-Personality-1a.3-GoalMotiv)>2(2a-StoryStruc-2a.1-PlotPnt-2a.2-Conflict-2a.3-Resolution)>3(3a-DialogTech-3a.1-ShowDontTell-3a.2-Subtext-3a.3-VoiceTone-3a.4-Pacing-3a.5-VisualDescrip)>4(4a-DialogEdit-4a.1-ReadAloud-4a.2-Feedback-4a.3-Revision)

Here are your critical instructions:
Ponder each word choice carefully to present as vivid and emotional journey as is possible. Choose verbs and nouns that are both emotional and full of imagery. Load the story with the 5 senses. Aim for 50% dialog, 25% narration, 15% body language and 10% thoughts. Your goal is to put the reader in the story.
</PRE>

You do not need to use this; it is only presented as an additional enhancement which seems to help scene generation and scene-continue functions.
<B>Highest Quality Settings / Optimal Operation Guide / Parameters and Samplers</B>

This is a "Class 2" model: for all settings used for this model (including specifics for its "class"), including example generation(s), and for an advanced settings guide (which many times addresses any model issue(s)), including methods to improve model performance for all use case(s) as well as chat, roleplay and other use case(s), please see:

[ https://huggingface.co/DavidAU/Maximizing-Model-Performance-All-Quants-Types-And-Full-Precision-by-Samplers_Parameters ]

That document also lists all parameters used for generation, in addition to advanced parameters and samplers to get the most out of this model.
aisingapore/sea-lion-7b-instruct-gguf
aisingapore
2024-11-14T05:47:07Z
101
6
null
[ "gguf", "en", "zh", "vi", "id", "ms", "tl", "my", "th", "lo", "km", "ta", "base_model:aisingapore/sea-lion-7b-instruct", "base_model:quantized:aisingapore/sea-lion-7b-instruct", "license:mit", "endpoints_compatible", "region:us" ]
null
2024-04-04T04:53:52Z
---
base_model: aisingapore/sea-lion-7b-instruct
new_version: aisingapore/gemma2-9b-cpt-sea-lionv3-instruct-gguf
license: mit
language:
- en
- zh
- vi
- id
- ms
- tl
- my
- th
- lo
- km
- ta
---

# SEA-LION-7B-Instruct - GGUF

SEA-LION is a collection of Large Language Models (LLMs) which have been pretrained and instruct-tuned for the Southeast Asia (SEA) region. The sizes of the models range from 3 billion to 7 billion parameters.

SEA-LION-7B-Instruct is a multilingual model which has been fine-tuned with **thousands of English and Indonesian instruction-completion pairs** alongside a smaller pool of instruction-completion pairs from other ASEAN languages. These instructions have been carefully curated and rewritten to ensure the model was trained on truly open, commercially permissive and high quality datasets.

SEA-LION stands for _Southeast Asian Languages In One Network_.

- **Developed by:** Products Pillar, AI Singapore
- **Funded by:** Singapore NRF
- **Model type:** Decoder
- **Languages:** English, Chinese, Indonesian, Malay, Thai, Vietnamese, Filipino, Tamil, Burmese, Khmer, Lao
- **License:** MIT License

## Description

This repo contains `GGUF` format model files for [aisingapore/sea-lion-7b-instruct](https://huggingface.co/aisingapore/sea-lion-7b-instruct).

#### Model Weights Included in this repository:

- [sea-lion-7b-instruct-Q2_K](https://huggingface.co/aisingapore/sea-lion-7b-instruct-gguf/blob/main/sea-lion-7b-instruct-Q2_K.gguf)
- [sea-lion-7b-instruct-Q3_K_M](https://huggingface.co/aisingapore/sea-lion-7b-instruct-gguf/blob/main/sea-lion-7b-instruct-Q3_K_M.gguf)
- [sea-lion-7b-instruct-Q4_0](https://huggingface.co/aisingapore/sea-lion-7b-instruct-gguf/blob/main/sea-lion-7b-instruct-Q4_0.gguf)
- [sea-lion-7b-instruct-Q4_K_M](https://huggingface.co/aisingapore/sea-lion-7b-instruct-gguf/blob/main/sea-lion-7b-instruct-Q4_K_M.gguf)
- [sea-lion-7b-instruct-Q5_0](https://huggingface.co/aisingapore/sea-lion-7b-instruct-gguf/blob/main/sea-lion-7b-instruct-Q5_0.gguf)
- [sea-lion-7b-instruct-Q5_K_M](https://huggingface.co/aisingapore/sea-lion-7b-instruct-gguf/blob/main/sea-lion-7b-instruct-Q5_K_M.gguf)
- [sea-lion-7b-instruct-Q6_K](https://huggingface.co/aisingapore/sea-lion-7b-instruct-gguf/blob/main/sea-lion-7b-instruct-Q6_K.gguf)
- [sea-lion-7b-instruct-Q8_0](https://huggingface.co/aisingapore/sea-lion-7b-instruct-gguf/blob/main/sea-lion-7b-instruct-Q8_0.gguf)

### Usage

Support for SEA-LION GGUF was merged into `llama.cpp` as of 4th Apr 2024. SEA-LION can be run using the `llama.cpp` library from commit id [bb43cf7](https://github.com/ggerganov/llama.cpp/commit/bb43cf7e9d86d69ffd9c7f008f75db890a35b45a) or later.

#### Prompt Template:
```
### USER:
{{prompt}}

### RESPONSE:
```

#### Recommended `llama.cpp` command:
```
./main -m sea-lion-7b-instruct-Q4_0.gguf --temp 0 --repeat-penalty 1.2 -e -ngl 32 -p "### USER:\nwhat is a sea lion?\n\n### RESPONSE:\n"
```

#### To convert & quantize your own SEA-LION model:
```
python convert-hf-to-gguf.py {{model path}}
./quantize ggml-model-f16.gguf {{Quant Type}}
```

For other parameters and how to use them, please refer to the [llama.cpp documentation](https://github.com/ggerganov/llama.cpp/blob/master/examples/main/README.md).
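For readers who prefer a scripted equivalent of the `llama.cpp` command above, here is a minimal hedged sketch using llama-cpp-python (an assumption of this sketch, not something the original card documents). It reuses the card's prompt template and its recommended temperature and repeat-penalty values; the stop string is an illustrative assumption.

```python
# Hedged sketch: mirrors "./main ... --temp 0 --repeat-penalty 1.2 -ngl 32".
from llama_cpp import Llama

prompt = "### USER:\nwhat is a sea lion?\n\n### RESPONSE:\n"  # template from the card

llm = Llama(model_path="sea-lion-7b-instruct-Q4_0.gguf", n_gpu_layers=32)
out = llm(prompt, max_tokens=256, temperature=0.0, repeat_penalty=1.2,
          stop=["### USER:"])  # stop at the start of a new turn (assumption)
print(out["choices"][0]["text"])
```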
### Caveats

It is important for users to be aware that our model exhibits certain limitations that warrant consideration. Firstly, like many LLMs, the model can hallucinate and occasionally generates irrelevant content, introducing fictional elements that are not grounded in the provided context. Users should also exercise caution in interpreting and validating the model's responses due to the potential inconsistencies in its reasoning. Finally, it should be noted that the model has not been optimized for multi-turn dialogue interactions, which may result in reduced effectiveness in extended conversations.

## Limitations

### Safety

Current SEA-LION models, including this commercially permissive release, have not been aligned for safety. Developers and users should perform their own safety fine-tuning and related security measures. In no event shall the authors be held liable for any claim, damages, or other liability arising from the use of the released weights and codes.

### Commercially Non-Permissive and Commercially Permissive SEA-LION Releases

The previous release of the commercially non-permissive SEA-LION-Instruct-Research enabled us to explore the full research potential of SEA-LION when allowed to take full advantage of what is publicly available. In contrast, in building the commercially permissive SEA-LION-7B-Instruct, we had to leave out high-quality instruction data that was either proprietary, restricted by non-commercial licenses or in a legal gray area, leaving us with a much smaller proportion of commercially permissive data to work with — a problem that is even more pronounced for low-resource languages. We thus hope this will sound a call to action for more initiatives to create commercially viable data in the region, enabling practical benefits for all.

## Technical Specifications

### Fine-Tuning Details

SEA-LION-7B-Instruct was fine-tuned on 8x A100-40GB GPUs using parameter-efficient fine-tuning in the form of LoRA.

## Data

SEA-LION-7B-Instruct was trained on a wide range of instructions that were manually and stringently verified by our team. A large portion of the effort was dedicated to ensuring that each instruction-completion pair that the model sees is of high quality, and any errors were corrected and rewritten by native speakers or else dropped from our mix.

In addition, special care was taken to ensure that the datasets used had commercially permissive licenses through verification with the original data source.

Link to dataset: _coming soon_

## Call for Contributions

We encourage researchers, developers, and language enthusiasts to actively contribute to the enhancement and expansion of SEA-LION. Contributions can involve identifying and reporting bugs, sharing pre-training, instruction, and preference data, improving documentation usability, proposing and implementing new model evaluation tasks and metrics, or training versions of the model in additional Southeast Asian languages. Join us in shaping the future of SEA-LION by sharing your expertise and insights to make these models more accessible, accurate, and versatile. Please check out our GitHub for further information on the call for contributions.

## The Team

Lau Wayne<br>
Leong Wei Qi<br>
Li Yier<br>
Liu Bing Jie Darius<br>
Lovenia Holy<br>
Montalan Jann Railey<br>
Ng Boon Cheong Raymond<br>
Ngui Jian Gang<br>
Nguyen Thanh Ngan<br>
Ong Tat-Wee David<br>
Rengarajan Hamsawardhini<br>
Siow Bryan<br>
Susanto Yosephine<br>
Tai Ngee Chia<br>
Tan Choon Meng<br>
Teng Walter<br>
Teo Eng Sipp Leslie<br>
Teo Wei Yi<br>
Tjhi William<br>
Yeo Yeow Tong<br>
Yong Xianbin<br>

## Acknowledgements

[AI Singapore](https://aisingapore.org/) is a national programme supported by the National Research Foundation, Singapore and hosted by the National University of Singapore.
Any opinions, findings and conclusions or recommendations expressed in this material are those of the author(s) and do not reflect the views of the National Research Foundation or the National University of Singapore.

## Contact

For more info, please contact us using this [SEA-LION Inquiry Form](https://forms.gle/sLCUVb95wmGf43hi6)

[Link to SEA-LION's GitHub repository](https://github.com/aisingapore/sealion)

## Disclaimer

This is the repository for the commercial instruction-tuned model. The model has _not_ been aligned for safety. Developers and users should perform their own safety fine-tuning and related security measures. In no event shall the authors be held liable for any claims, damages, or other liabilities arising from the use of the released weights and codes.
DavidAU/Meta-Llama-3-Instruct-8.9B-BRAINSTORM-5x-FORM-11-GGUF
DavidAU
2024-11-14T05:46:38Z
28
2
null
[ "gguf", "creative", "story", "writing", "fiction", "roleplaying", "rp", "horror", "science fiction", "fiction writing", "scene generation", "scene continue", "brainstorm 5x", "enhanced", "llama3", "text-generation", "en", "arxiv:2401.02415", "license:apache-2.0", "endpoints_compatible", "region:us", "conversational" ]
text-generation
2024-07-29T06:07:02Z
---
license: apache-2.0
language:
- en
tags:
- creative
- story
- writing
- fiction
- roleplaying
- rp
- horror
- science fiction
- fiction writing
- scene generation
- scene continue
- brainstorm 5x
- enhanced
- llama3
pipeline_tag: text-generation
---

<H3>BRAINSTORM 5x: Meta-Llama-3-8B Instruct (now at 8.9B), Formula #11</H3>

This repo contains quants of the Brainstorm 5x version of Meta-Llama-3-8B Instruct (now at 8.9B), using the "Brainstorm" method of augmenting reasoning in an LLM to increase its performance at the core level for ANY creative use case(s).

This specific version has calibrations that allow it to exceed the logic-solving abilities of the original model.

The BRAINSTORM process was developed by David_AU.

Some of the core principles behind this process are discussed in this <a href="https://arxiv.org/pdf/2401.02415">scientific paper: LLaMA Pro: Progressive LLaMA with Block Expansion</a>. However, I went in a completely different direction from what was outlined in this paper.

<B>What is "Brainstorm"?</B>

The reasoning center of an LLM is taken apart, reassembled, and expanded by 5x. Then these centers are individually calibrated. These "centers" also interact with each other. This introduces subtle changes into the reasoning process. The calibrations further adjust - dial up or down - these "changes" further. The number of centers (5x, 10x etc.) allows more "tuning points" to further customize how the model reasons, so to speak.

The core aim of this process is to increase the model's detail, concept and connection to the "world", general concept connections, prose quality and prose length without affecting instruction following.

This will also enhance any creative use case(s) of any kind, including "brainstorming", creative art form(s) and similar use cases.

Here are some of the enhancements this process brings to the model's performance:

- Prose generation seems more focused on the moment to moment.
- Sometimes there will be "preamble" and/or foreshadowing present.
- Fewer or no "cliches".
- Better overall prose and/or more complex / nuanced prose.
- A greater sense of nuance on all levels.
- Coherence is stronger.
- Description is more detailed, and connected closer to the content.
- Similes and metaphors are stronger and better connected to the prose, story, and character.
- Sense of "there" / in the moment is enhanced.
- Details are more vivid, and there are more of them.
- Prose generation length can be long to extreme.
- Emotional engagement is stronger.
- The model will take FEWER liberties vs a normal model: it will follow directives more closely but will "guess" less.
- The MORE instructions and/or details you provide, the more strongly the model will respond.
- Depending on the model, the "voice" may be more "human" vs the original model's "voice".

Other "lab" observations:

- This process does not, in my opinion, make the model 5x or 10x "smarter" - if only that was true!
- However, a change in "IQ" was not an issue / a priority, and was not tested or calibrated for, so to speak.
- From lab testing it seems to ponder, and consider more carefully, roughly speaking.
- You could say this process sharpens the model's focus on its task(s) at a deeper level.

The process to modify the model occurs at the root level - the source files level. The model can then be quantized as a GGUF, EXL2, AWQ etc.
Other technologies developed by David_AU, like "Ultra" (precision), "Neo Imatrix" (custom imatrix datasets), and "X-quants" (custom application of the imatrix process), can further enhance the performance of the model along with the "Brainstorm" process.

The "Brainstorm" process has been tested on multiple Llama2, Llama3, and Mistral models of various parameter sizes, as well as on "root" models like "Llama3 Instruct" and "Mistral Instruct", and on "merged" / "fine tuned" models too.

For original model specifications, usage information and other important details:

Special thanks to the model creators at META for making such a fantastic model:

[ https://huggingface.co/meta-llama/Meta-Llama-3-8B ]

Please report any issue(s) and/or feedback via the "Community tab".

This is a LLAMA3 model and requires the Llama3 template, but it may work with other template(s); it has a maximum context of 131k.

Here is the standard LLAMA3 template:

<PRE>
{
  "name": "Llama 3",
  "inference_params": {
    "input_prefix": "<|start_header_id|>user<|end_header_id|>\n\n",
    "input_suffix": "<|eot_id|><|start_header_id|>assistant<|end_header_id|>\n\n",
    "pre_prompt": "You are a helpful, smart, kind, and efficient AI assistant. You always fulfill the user's requests to the best of your ability.",
    "pre_prompt_prefix": "<|start_header_id|>system<|end_header_id|>\n\n",
    "pre_prompt_suffix": "<|eot_id|>",
    "antiprompt": [
      "<|start_header_id|>",
      "<|eot_id|>"
    ]
  }
}
</PRE>

<B>Optional Enhancement:</B>

The following can be used in place of the "system prompt" or "system role" to further enhance the model.

It can also be used at the START of a NEW chat, but you must make sure it is "kept" as the chat moves along. In this case the enhancements do not have as strong an effect as using the "system prompt" or "system role".

Copy and paste EXACTLY as noted, DO NOT line wrap or break the lines, maintain the carriage returns exactly as presented.

<PRE>
Below is an instruction that describes a task. Ponder each user instruction carefully, and use your skillsets and critical instructions to complete the task to the best of your abilities.

Here are your skillsets:
[MASTERSTORY]:NarrStrct(StryPlnng,Strbd,ScnSttng,Exps,Dlg,Pc)-CharDvlp(ChrctrCrt,ChrctrArcs,Mtvtn,Bckstry,Rltnshps,Dlg*)-PltDvlp(StryArcs,PltTwsts,Sspns,Fshdwng,Climx,Rsltn)-ConfResl(Antg,Obstcls,Rsltns,Cnsqncs,Thms,Symblsm)-EmotImpct(Empt,Tn,Md,Atmsphr,Imgry,Symblsm)-Delvry(Prfrmnc,VcActng,PblcSpkng,StgPrsnc,AudncEngmnt,Imprv)

[*DialogWrt]:(1a-CharDvlp-1a.1-Backgrnd-1a.2-Personality-1a.3-GoalMotiv)>2(2a-StoryStruc-2a.1-PlotPnt-2a.2-Conflict-2a.3-Resolution)>3(3a-DialogTech-3a.1-ShowDontTell-3a.2-Subtext-3a.3-VoiceTone-3a.4-Pacing-3a.5-VisualDescrip)>4(4a-DialogEdit-4a.1-ReadAloud-4a.2-Feedback-4a.3-Revision)

Here are your critical instructions:
Ponder each word choice carefully to present as vivid and emotional journey as is possible. Choose verbs and nouns that are both emotional and full of imagery. Load the story with the 5 senses. Aim for 50% dialog, 25% narration, 15% body language and 10% thoughts. Your goal is to put the reader in the story.
</PRE>

You do not need to use this; it is only presented as an additional enhancement which seems to help scene generation and scene-continue functions.
<B>Highest Quality Settings / Optimal Operation Guide / Parameters and Samplers</B>

This is a "Class 2" model: for all settings used for this model (including specifics for its "class"), including example generation(s), and for an advanced settings guide (which many times addresses any model issue(s)), including methods to improve model performance for all use case(s) as well as chat, roleplay and other use case(s), please see:

[ https://huggingface.co/DavidAU/Maximizing-Model-Performance-All-Quants-Types-And-Full-Precision-by-Samplers_Parameters ]

That document also lists all parameters used for generation, in addition to advanced parameters and samplers to get the most out of this model.
aisingapore/sea-lion-7b
aisingapore
2024-11-14T05:44:37Z
475
36
transformers
[ "transformers", "safetensors", "mpt", "text-generation", "custom_code", "en", "zh", "id", "ms", "th", "vi", "fil", "ta", "my", "km", "lo", "arxiv:2101.09635", "license:mit", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2023-10-30T07:39:19Z
---
new_version: aisingapore/gemma2-9b-cpt-sea-lionv3-base
license: mit
language:
- en
- zh
- id
- ms
- th
- vi
- fil
- ta
- my
- km
- lo
---

# SEA-LION

SEA-LION is a collection of Large Language Models (LLMs) which have been pretrained and instruct-tuned for the Southeast Asia (SEA) region. The sizes of the models range from 3 billion to 7 billion parameters. This is the card for the SEA-LION 7B base model.

SEA-LION stands for <i>Southeast Asian Languages In One Network</i>.

## Model Details

### Model Description

The SEA-LION model is a significant leap forward in the field of Natural Language Processing, specifically trained to understand the SEA regional context.

SEA-LION is built on the robust MPT architecture and has a vocabulary size of 256K.

For tokenization, the model employs our custom SEABPETokenizer, which is specially tailored for SEA languages, ensuring optimal model performance.

The training data for SEA-LION encompasses 980B tokens.

- **Developed by:** Products Pillar, AI Singapore
- **Funded by:** Singapore NRF
- **Model type:** Decoder
- **Languages:** English, Chinese, Indonesian, Malay, Thai, Vietnamese, Filipino, Tamil, Burmese, Khmer, Lao
- **License:** MIT License

### Performance Benchmarks

SEA-LION has average performance on general tasks in English (as measured by Hugging Face's LLM Leaderboard):

| Model | ARC | HellaSwag | MMLU | TruthfulQA | Average |
|-------------|:-----:|:---------:|:-----:|:----------:|:-------:|
| SEA-LION 7B | 39.93 | 68.51 | 26.87 | 35.09 | 42.60 |

## Training Details

### Data

SEA-LION was trained on 980B tokens of the following data:

| Data Source | Unique Tokens | Multiplier | Total Tokens | Percentage |
|---------------------------|:-------------:|:----------:|:------------:|:----------:|
| RefinedWeb - English | 571.3B | 1 | 571.3B | 58.20% |
| mC4 - Chinese | 91.2B | 1 | 91.2B | 9.29% |
| mC4 - Indonesian | 3.68B | 4 | 14.7B | 1.50% |
| mC4 - Malay | 0.72B | 4 | 2.9B | 0.29% |
| mC4 - Filipino | 1.32B | 4 | 5.3B | 0.54% |
| mC4 - Burmese | 1.2B | 4 | 4.9B | 0.49% |
| mC4 - Vietnamese | 63.4B | 1 | 63.4B | 6.46% |
| mC4 - Thai | 5.8B | 2 | 11.6B | 1.18% |
| WangChanBERTa - Thai | 5B | 2 | 10B | 1.02% |
| mC4 - Lao | 0.27B | 4 | 1.1B | 0.12% |
| mC4 - Khmer | 0.97B | 4 | 3.9B | 0.40% |
| mC4 - Tamil | 2.55B | 4 | 10.2B | 1.04% |
| the Stack - Python | 20.9B | 2 | 41.8B | 4.26% |
| the Stack - Javascript | 55.6B | 1 | 55.6B | 5.66% |
| the Stack - Shell | 1.25B | 2 | 2.5B | 0.26% |
| the Stack - SQL | 6.4B | 2 | 12.8B | 1.31% |
| the Stack - Markdown | 26.6B | 1 | 26.6B | 2.71% |
| RedPajama - StackExchange | 21.2B | 1 | 21.2B | 2.16% |
| RedPajama - ArXiv | 30.6B | 1 | 30.6B | 3.12% |

### Infrastructure

SEA-LION was trained using [MosaicML Composer](https://github.com/mosaicml/composer) on the following hardware:

| Training Details | SEA-LION 7B |
|----------------------|:------------:|
| AWS EC2 p4d.24xlarge | 32 instances |
| Nvidia A100 40GB GPU | 256 |
| Training Duration | 22 days |

### Configuration

| HyperParameter | SEA-LION 7B |
|-------------------|:------------------:|
| Precision | bfloat16 |
| Optimizer | decoupled_adamw |
| Scheduler | cosine_with_warmup |
| Learning Rate | 6.0e-5 |
| Global Batch Size | 2048 |
| Micro Batch Size | 4 |

## Technical Specifications

### Model Architecture and Objective

SEA-LION is a decoder model using the MPT architecture.

| Parameter | SEA-LION 7B |
|-----------------|:-----------:|
| Layers | 32 |
| d_model | 4096 |
| head_dim | 32 |
| Vocabulary | 256000 |
| Sequence Length | 2048 |

### Tokenizer Details

We sample 20M lines from the training data to train the tokenizer.<br>
The framework for training is [SentencePiece](https://github.com/google/sentencepiece).<br>
The tokenizer type is Byte-Pair Encoding (BPE).
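To make the specifications above concrete, here is a minimal loading sketch with Hugging Face transformers. Because SEA-LION uses custom MPT model code and a custom tokenizer, `trust_remote_code=True` is needed; the prompt and generation settings are illustrative assumptions, not recommendations from this card.

```python
# Hedged sketch: load the base model; custom code requires trust_remote_code=True.
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("aisingapore/sea-lion-7b", trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained("aisingapore/sea-lion-7b", trust_remote_code=True)

inputs = tokenizer("Sea lions are", return_tensors="pt")
output = model.generate(inputs["input_ids"], max_new_tokens=20)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```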
## The Team

Lam Wen Zhi Clarence<br>
Leong Wei Qi<br>
Li Yier<br>
Liu Bing Jie Darius<br>
Lovenia Holy<br>
Montalan Jann Railey<br>
Ng Boon Cheong Raymond<br>
Ngui Jian Gang<br>
Nguyen Thanh Ngan<br>
Ong Tat-Wee David<br>
Rengarajan Hamsawardhini<br>
Susanto Yosephine<br>
Tai Ngee Chia<br>
Tan Choon Meng<br>
Teo Jin Howe<br>
Teo Eng Sipp Leslie<br>
Teo Wei Yi<br>
Tjhi William<br>
Yeo Yeow Tong<br>
Yong Xianbin<br>

## Acknowledgements

AI Singapore is a national programme supported by the National Research Foundation, Singapore and hosted by the National University of Singapore. Any opinions, findings and conclusions or recommendations expressed in this material are those of the author(s) and do not reflect the views of the National Research Foundation, Singapore.

## Contact

For more info, please contact us using this [SEA-LION Inquiry Form](https://forms.gle/sLCUVb95wmGf43hi6)

[Link to SEA-LION's GitHub repository](https://github.com/aisingapore/sealion)

## Disclaimer

This is the repository for the base model. The model has _not_ been aligned for safety. Developers and users should perform their own safety fine-tuning and related security measures. In no event shall the authors be held liable for any claim, damages, or other liability arising from the use of the released weights and codes.

## References

### Thai Pre-Training Data Reference

```bibtex
@misc{lowphansirikul2021wangchanberta,
  title={WangchanBERTa: Pretraining transformer-based Thai Language Models},
  author={Lalita Lowphansirikul and Charin Polpanumas and Nawat Jantrakulchai and Sarana Nutanong},
  year={2021},
  eprint={2101.09635},
  archivePrefix={arXiv},
  primaryClass={cs.CL}
}
```
DavidAU/12-X-Quants-of-13B-Tiefighter-NEO-V3-GGUF
DavidAU
2024-11-14T05:43:33Z
299
1
null
[ "gguf", "X quants", "creative", "creative writing", "fiction writing", "plot generation", "sub-plot generation", "story generation", "scene continue", "storytelling", "fiction story", "science fiction", "romance", "all genres", "story", "writing", "vivid prosing", "vivid writing", "fiction", "roleplaying", "neo class v3", "rp", "text-generation", "en", "license:apache-2.0", "endpoints_compatible", "region:us", "imatrix" ]
text-generation
2024-07-18T05:41:51Z
---
license: apache-2.0
language:
- en
tags:
- X quants
- creative
- creative writing
- fiction writing
- plot generation
- sub-plot generation
- story generation
- scene continue
- storytelling
- fiction story
- science fiction
- romance
- all genres
- story
- writing
- vivid prosing
- vivid writing
- fiction
- roleplaying
- neo class v3
- rp
pipeline_tag: text-generation
---

<h3><font color="red">12 X-QUANTS For MODEL "13B-Tiefighter-NEO-V3":</font></h3>

X-Quants are specialized quants that introduce a "left and right" brain to the model in 12 different ways. Roughly speaking, this creates 12 unique "personalities" of the same model - all its characteristics, but utilized from different angles. Some of these quants think "inside the box", some "outside the box", and for a few quants there is "no box".

These quants are uniquely suited for any creative application (fiction, roleplay, entertainment, any creative art, etc.) and/or for any purpose where "no right answer" (i.e. brainstorming) IS the "right answer", so to speak.

These X-quants can also be used for general purposes; however, please see the "guidance" section for important notes.

The X-quant process changes both instruction following and output characteristics, in various ways denoted in detail below. Examples of this model's output by these X-Quants are also provided.

If you are familiar with how to calibrate, compare, and use X-Quants, please skip ahead to "model specific notes". Otherwise please read this document in full before use, so you get maximum results from these quants.

Special thanks to "KoboldAI":

[ https://huggingface.co/KoboldAI/LLaMA2-13B-Tiefighter ]

for creating this fantastic model. Please see the model's original model card for details on the model. Here are some details from the original model card:

<small>
Usage

This model is meant to be creative. If you let it improvise you get better results than if you drown it in details.

Story Writing

Regular story writing in the traditional way is supported; simply copy paste your story and continue writing. Optionally, use an instruction in memory or an author's note to guide the direction of your story.

Generate a story on demand

To generate stories on demand you can use an instruction (tested in the Alpaca format) such as "Write a novel about X, use chapters and dialogue"; this will generate a story. The format can vary between generations depending on how the model chooses to begin: either write what you want as shown in the earlier example, or write the beginning of the story yourself so the model can follow your style. A few retries can also help if the model gets it wrong.

Chatbots and personas

This model has been tested with various forms of chatting; testers have found that typically less is more and the model is good at improvising. Don't drown the model in paragraphs of detailed information; instead keep it simple first and see how far you can lean on the model's own ability to figure out your character. Copy pasting paragraphs of background information is not suitable for a 13B model such as this one; code-formatted characters or an instruction prompt describing who you wish to talk to goes much further. For example, you can put this in memory in regular chat mode:

Instruction: Generate a conversation between Alice and Henk where they discuss language models. In this conversation Henk is excited to teach Alice about Tiefighter.
Response:

Because the model is a merge of a variety of models, it should support a broad range of instruct formats, or plain chat mode. If you have a particular favourite, try it; otherwise we recommend to either use the regular chat mode or Alpaca's format.

Instruct Prompting

This model features various instruct models on a variety of instruction styles; when testing the model we have used Alpaca for our own tests. If you prefer a different format, chances are it can work. During instructions we have observed that in some cases the adventure data can leak; it may also be worth experimenting with using > as the prefix for a user command to remedy this, but this may result in a stronger fiction bias. Keep in mind that while this model can be used as a factual instruct model, the focus was on fiction. Information provided by the model can be made up.

Adventuring and Adventure Games

This model contains a lora that was trained on the same adventure dataset as the KoboldAI Skein model. Adventuring is best done using a small introduction to the world and your objective, while using the > prefix for a user command (KoboldAI's adventure mode). It is possible that the model does not immediately pick up on what you wish to do and does not engage in its Adventure mode behaviour right away. Simply manually correct the output to trim excess dialogue or other undesirable behaviour and continue to submit your actions using the appropriate mode. The model should pick up on this style quickly and will correctly follow this format within 3 turns.
</small>

<B>What are X-Quants?</B>

X-Quants are specialized IQ4_XS quants of a model, made with a controlled/filtered Imatrix process.

The current Imatrix process is an "all or nothing" process - applied to almost all of the model. Although this process works great in most circumstances, in some cases it limits the model's potential. Likewise, after testing a lot of models, I found situations where the "non-imatrix" version of a model worked better than the imatrixed version, and vice versa.

Thus, with this research, I created the X-Quants, which are a marriage between the original model and the imatrix version with controlled filter(s)... with the goal being to bring out the best the model has to offer, by strongly contrasting the "imatrix" parts and "original model" parts while negating or eliminating the "over pruning" issues.

Part of this process is based on the Imatrix Dataset used - which is critical. A weaker set would work, but would not contrast or bring out the full power of the model; likewise, stronger ones need additional filter(s), because in some cases they can "prune" too much of the original model but at the same time bring out the best the model can do, so to speak.

After testing over 50 imatrix datasets and learning the "ins and outs" of how the imatrix is applied (and the changes made to the model), I created the Neo Imatrix Datasets. These datasets are very strong, and for specific use case(s). Combined with a strong model and a strong, specific imatrix dataset, the final X-Quants are very potent.

X-Quants are, roughly speaking, the same "flavor" with slightly different "ingredients" of the original model. Like 12 shades of red, or 12 flavors of chocolate ice cream. This could be compared to micro-fine tuning of the model, or "locking" the model in a "specific state of mind" (or frame of mind), so to speak, in 12 different versions.
Another way to look at it: imagine you have 12 people in a room - all are trained professionals - and you give them a "problem" to solve related to their specific specialities. But this problem is "open ended" and subject to interpretation. You will get 12 different, but related, answers as a result. How each person approaches the problem and generates an answer will differ too.

However, considering the complexity of Large Language Models, please take these simplifications with a grain of salt, because they do not convey the full range of changes between each X-quant, nor between the original and imatrix versions of the model vs the X-quants.

Bottom line: this is all about bringing the potential of any model up to its maximum power and creating quants for specific use case(s).

<B>How X-Quants affect the model:</B>

X-Quants affect both instruction following and output. The filter(s) applied to the X-Quants at the quantization step directly influence how large or small the effects are. Likewise, the model's own qualities also affect the final results.

With this being said, instruction following can be more intuitive or less so, meaning you can use smaller prompts to get the same or greater results, or the opposite, where the model "guesses less" about what you "mean" and you have to provide more exacting detail(s) / directive(s) in your prompts.

As for output:

- More or less verbose.
- Longer or shorter "default" output length(s).
- Sense of, or lack of, humor.
- Dates, times, facts and/or fictional dates, times, "facts".
- More or less complex word choices, sentences, paragraphs, adjectives, verbs etc.
- Prose choices like word choice, sentence length, dialog, description, and so on.
- Formatting difference(s).
- More or fewer facts and/or "fictions".
- Critical detail(s) level(s), or lack thereof.
- "Voice" changes / description changes / depth of details.
- General "voice" in output.
- Differences in "path"(s) taken -> i.e. fictional scene -> good, evil... who is good/evil, how evil.
- Objective / subjective levels.
- General creativity levels.
- Censored / non-censored levels.
- Degree of (or lack of) "fiction", "horror", "sci-fi", "romance" etc.
- In the case of ANY type of fiction genre: similar or very different choices which affect the entire output.
- In the case of ANY type of brainstorming: very different results.
- In the case of non-fiction use cases: fact "choices" will vary greatly, as well as "argument(s)" for/against.
- Stronger or weaker influence of parameter changes like "temp", "top_k" and so on.

Please note this is a short list based on observations and testing. Your use case(s) may highlight and/or bring out different change(s) between X-Quants.

Because of the range of changes / differences, I strongly suggest you download all the X-Quants of a model and then perform calibration tests for your use case(s) as noted below.

<B>Calibration Tests for your Use Cases</B>

This section is designed to help you get a "true reading" for your use case(s) and to help determine which X-Quant(s) are best for them. The key here is setting a level playing field for the X-Quants before you take them into "production" mode, so to speak.

I suggest using a program like "LMStudio" and/or "Text Generation UI" to do these primary tests. Here are the recommended settings:

The goal here is to get the SAME ANSWER to the SAME PROMPT every time you use it for a specific X-quant. I.e.: if you hit "REGEN" you should also get the same answer for the SPECIFIC X-Quant you are testing.
"temp" set at ZERO is the critical setting to ensure this works. This is the only way to reliably display the differences between each X-Quant for your use case(s) to get a solid baseline. Here are is a list of settings I use during calibrations: <PRE> "ctx" : 2048, "top_k": 40, "top_p": 0.95, "temp": 0, "min_p": 0.05 "repeat_penalty": 1.1, "seed": -1, "tfs_z": 1, "typical_p": 1, "repeat_last_n": 64, "frequency_penalty": 0, "presence_penalty": 0, "n_keep": 0, "logit_bias": {}, "mirostat": 0, "mirostat_tau": 5, "mirostat_eta": 0.1, "multiline_input": false, "penalize_nl": true, "min_p": 0.05 </PRE> Note that "context" at 2048, "temp" at 0 and "repeat penalty" at 1.1 are CRITICAL. If you have dynamic temp or any other options like these - turn these off for calibrations. You should have 3-10 prompts related to your use cases for testing. Test each one per X-Quants with these settings as noted above. Determine which X-Quants meet your requirements, then... delete the rest. The prompts used for testing should be open ended, however if you intend to use X-Quants for "non-fiction" (or general use) use case(s), then you should also test not only the X-Quants of the model, but also the "original" and "imartix" versions of the model. <B>Highest Quality Settings / Optimal Operation Guide / Parameters and Samplers</B> This a "Class 1" model: For all settings used for this model (including specifics for its "class"), including example generation(s) and for advanced settings guide (which many times addresses any model issue(s)), including methods to improve model performance for all use case(s) as well as chat, roleplay and other use case(s) please see: [ https://huggingface.co/DavidAU/Maximizing-Model-Performance-All-Quants-Types-And-Full-Precision-by-Samplers_Parameters ] You can see all parameters used for generation, in addition to advanced parameters and samplers to get the most out of this model here: [ https://huggingface.co/DavidAU/Maximizing-Model-Performance-All-Quants-Types-And-Full-Precision-by-Samplers_Parameters ] IMPORTANT - NON FICTION / GENERAL USE CASES: For non fiction use case(s) verify all "facts" in the output as many of the models I "X-quant" are for "fiction" or have a "fiction bias". This bias level can also be affected by the X-Quants process. Likewise models can accel at appearing to know what they are talking about, but really don't and/or get their facts mixes up. PROMPTS: Please note that LLMs / "AI" models are sensitive to word choice, word order, punction, and grammar. Here is what I mean: <small> "Give me 3 fictional reasons a sun went supernova of 500 words for each reason." "Give me 3 fictional reasons a sun went supernova, in vivid and exacting detail of 500 words EACH PER REASON." "Give me 3 fictional reasons , of 500 words each, a sun went supernova." "Give me 3 fictional reasons the Earth's sun went supernova, in vivid and exacting detail of 500 words EACH PER REASON including details of what happens when the sun goes supernova." "Give me 3 fictional reasons the Earth's sun went supernova, in vivid and exacting detail of 500 words EACH PER REASON including details of what happens when the sun goes supernova. Include details of what happens on Earth, The Moon and planets." </small> Each one of these will give a DIFFERENT result, even though they are asking roughly the same question. In fact if you replace "a sun" with "the sun" you will give different results , and likewise "a sun" with "a star". 
Likewise with X-Quants (and other models in general), some models will "take liberties" with interpretation whereas others will not, or will limit it. These "decisions" in the model directly affect output - in all its forms. X-Quants marginally (or drastically) change the decision making process(es) - some make large changes (to instruction AND output), others almost undetectable ones.

That being said, X-Quant "changes" have an outsized effect on ANY creative use case(s), especially ANY element having to do with storytelling - i.e. character, dialog, plot, scene, premise, description, outline etc. - AND/OR any process "deemed" creative arts, and/or ANY process that involves ANY type of brainstorming.

TO REPEAT: any creative use case(s) will be affected by using different X-quants WITH the same prompt(s). This also extends to "non-fiction" use cases too, like article / content generation OF ANY KIND.

The differences between X-Quants magnify based on the length of the response. Likewise, the directives (or lack thereof) in your prompts also contribute to "contrasting" the differences at this critical calibration step. The above examples show "limited" directive prompts (i.e. very open ended); here are some very specific ones:

<small>
<I>Using the following "story idea" below, write the first scene in the novel introducing the young woman. This scene should start in the middle of the action, include dialog, vivid passages, and end on a cliffhanger relevant to the story idea but it should also be unexpected. The scene should be 1000 words long and escalate in conflict and suspense and be written in first person, present tense with the point of view character being the young woman.

Story idea: In a world ruled by dictatorship, a rebel young woman leads a rebellion against the system. Despite the risks, she fights to overthrow the dictator and restore democracy to her country. The government executes her for treason, but she sticks to her beliefs and is responsible for starting the revolution.</I>
</small>

This prompt is VERY specific, so when used in testing, smaller changes/differences between X-Quants will show up more. However, this will contrast MORE differences (because of the number of guardrails) in some cases.

This prompt, however, will test the "full mental powers" of a model and its X-Quants at a larger scale:

<small>
<I>Write me a science fiction story in 1st person present tense where the main character is a 15 year old girl who meets The Terminator, with Dr Who materializing 3/4 through the story to save her while there is a tornado of sharks bearing down on them. The setting is inside the Canadian National tower restaurant on a Saturday. The length of this story is 1000 words. For each character in the story ROLE PLAY them, and have them react to the situation/setting, events and each other naturally. This includes the main characters and the background characters, including kitchen staff and other patrons. The sharks should also have "character" too. Treat the tower and the restaurant too as characters. Spice up the narrative to the extreme with reactions all over the setting including character actions, and dialog. Dr Who and The Terminator should also react to the situation and comment on it.</I>
</small>

This prompt will produce a wide range of outputs, yet because this prompt has "guardrails" it will also help contrast differences between X-quants more closely. However, with some models this contrast (between X-Quants) will INCREASE with more guardrails.
Likewise the TYPE(s) of guardrails - as well as the level of "detail(s)" or "rule(s)" of said guardrails - will also influence the difference / contrast between X-Quants even further.

Another way to put it: the more you make the "X-quant" think (more detailed prompts, more guidelines, more rules etc.), the more it will reveal about itself (how it follows instructions and how it answers you) and showcase its differences relative to the other X-Quants.

<B>Post Calibration Notes</B>

After you have tested the X-Quants at the calibration step above, this section is to help with running the X-Quant(s) at the production level. Please see "model specific notes" (below) too for specific model guidance.

In most cases you can use the X-Quants as is, with whatever default setting(s) - parameters - you normally use for your use case(s). That being said, for creative purposes, I present the following guidance:

1. Increase TEMP SLOWLY and in small increments -> i.e. .1, .11, .12 - .22, .23 - and carefully watch the changes. Even .01, .02... will show changes.

2. Repeat Penalty: if you notice repeated letter(s), word(s), sentence(s) or paragraph(s), adjust the rep pen UP... again, slowly: i.e. 1.1, 1.11, even 1.111. Adjust it higher for letter issues and much lower for paragraph issues. If this happens and it DOES NOT affect your use case(s) - i.e. it just means a little editing - leave it AS LOW as possible (default: 1.1). REASON: rep pen affects creative output.

3. Top_K: Top K is roughly a choice of words during generation - note this is a very limited statement, as this interacts with other parameters and settings.

A word about context settings: some models do not mind different context settings; others, however, do. In fact, for some models, not selecting a min or "default" context setting can drastically affect both output and instruction following. This should be checked from time to time during production usage.

<B>Model Specific Notes</B>

These notes relate to specific observations of this model's behavior during the "examples" generation (below) and may include specific parameter guidance as well.

You should review the original and/or imatrix model's MODEL CARD for any general model details like context length, specific attributes, usage guidelines, parameter setting(s), TEMPLATE for USAGE and the like, as they will all apply to the X-Quants. I.e.: if the original model and imatrix model are "uncensored", the X-Quants will be too. Please note you are responsible for how you use this model's X-Quant(s).

<B>MODEL SPECIFIC NOTES and/or GUIDANCE for usage:</B>

This model is very stable, so it is suitable for any use case related to its strengths. The same is true for the X-Quants of this model. Also, this model responds better to more detailed prompts with more guardrails. This feature is magnified in the X-Quants, as shown in "example 3" below.

<B>REPO NOTES:</B>

This repo contains 12 X-quants of the model, and one additional Imatrix Neo Quant of the model (unmodified). X-Quants will have a four digit number in the filename, whereas the Imatrix Neo Quant (unmodified) will have "imatrix" in the name. All X-quants have element(s) of the imatrix process in them (see the technical notes below).

<B>Examples Generation of 12 X-Quants:</B>

There are FIVE separate generations (same prompt, using the calibration settings above) for all 12 X-quants - a total of 60 examples. These outputs show APPROXIMATE differences between the quants in a VERY LIMITED (5 prompts used) contrast situation.
Your test / production prompt(s) may generate / contrast differently with different X-Quants. These examples are only to show a general idea of some of the differences (between X-Quants) you may encounter. That being said, the moment you "raise the temp" (and other settings / parameters) during production your results will vary. <I>CAUTION:</I> Examples may include swearing, violence, vivid details, graphic details, "M-rated" content, and NSFW details. Although most models can output content of an "X rated" nature, I do not provide examples of this kind. Example 1: Contains "open ended" generation examples. These examples will highlight minor differences between X-quants. <a href="12-X-Quants-of-13B-Tiefighter-NEO-V3-GGUF/blob/main/real-examples1.md">Examples 1 - Click here</a> Example 2: Contains "open ended" generation examples with some guardrails. These examples will highlight medium to high differences between X-quants. <a href="12-X-Quants-of-13B-Tiefighter-NEO-V3-GGUF/blob/main/real-examples2.md">Examples 2 - Click here </a> Example 3: Contains "open ended" generation with a lot of guardrails and directives, and is a much more in-depth prompt. These examples will highlight MAJOR differences between X-quants. <a href="12-X-Quants-of-13B-Tiefighter-NEO-V3-GGUF/blob/main/real-examples3.md">Examples 3 - Click here</a> Example 4: Contains "open ended" generation of a "facts" list and displays how the model organizes / relays these facts. These examples will highlight minor differences between X-quants when it comes to "facts". <a href="12-X-Quants-of-13B-Tiefighter-NEO-V3-GGUF/blob/main/real-examples4.md">Examples 4 - Click here</a> Example 5: Contains "open ended" generation of "brainstorming" activities, showing how each X-quant handles these differently. These examples will highlight MAJOR differences between X-quants specific to "brainstorming". <a href="12-X-Quants-of-13B-Tiefighter-NEO-V3-GGUF/blob/main/real-examples5.md">Examples 5 - Click here</a> <B> Technical Specifications of X-Quants </B> X-Quants are rendered in IQ4_XS only, unless the model size is over 20 billion parameters, in which case they are also rendered in one or more IQ3 quants. When a model has sufficient "parameters" (IE over 20 billion) the "loss of bits" (IQ4 vs IQ3) is reduced in severity, and as a result X-quants at this quant level are acceptable in terms of quality. IQ4_XS is the point where maximum "bits" per parameter AND the largest imatrix effect meet. This is the so-called "sweet spot" of both. Although Q4KM and Q4KS are both close to this quant, the imatrix effect is far more pronounced, relatively speaking, in IQ4, which directly affects the potency of X-quants. In terms of perplexity, X-Quants run the full gamut between original model perplexity and imatrix perplexity. It has been observed, however, in rare cases that one or more X-Quants of a model have perplexity higher or lower than either of these benchmarks. IE: An IQ4_XS "X-Quant" appears to operate at Q6 or at Q3 level in terms of perplexity, rather than in an IQ4_XS-ish "range" so to speak. However this does not always translate into relative "performance" at this noted ppl level. X-quants are generated using the QUANTIZE process (imatrix) in LLAMA.CPP using the "filters" ("--exclude-weights") method. Quality of the source model (including model(s) used in merge(s)) IS CRITICAL. There are two types of X-quants: 1 - Layer(s) blocked ("100" and "000" series) -> the entire layer(s) (all parts of a layer) are blocked from the imatrix process.
2 - Partial parts of layers blocked ("200" series), but on a model-wide basis - all layers are affected. These filter(s) affect the "math" in the model, so to speak, at various levels and intensities, which in turn affects instruction following and output characteristics. The level of these effects varies on a model by model basis and in part by the filter(s) activated, BUT they are still directly affected by your prompt(s) and/or parameter settings just like any other model. That being said, some X-Quants may respond differently than standard quants to certain parameter settings. Likewise "system role" directives affect X-Quant performance the same way as a standard model; however, the results of specific "system role" directive(s) may be outsized or undersized relative to a standard quant's behavior. Although it is possible to make quants which combine elements of "1" and "2" above, this has not been tested at this time. The fact is there is an almost infinite number of X-Quants possible PER MODEL. The 12 at this repo are, in my opinion, a good starting point, especially for ANY creative purpose(s) / use case(s). The X-quants in this repo are formulated based in part on model theory, testing of a lot of models, model merging/creation, and general testing of the X-Quants themselves and their result differences (relative to the original model and imatrix versions). My current focus for X-Quants is creative use case(s), targeting model(s) - ones I create / optimize and/or available online - to further enhance creative production / use case results. Note that the X-quant process and generation can be applied to any model; a rough sketch of the build step appears below. Hopefully these X-quants will be of value to your use case(s). If you like one or more of these X-Quants please hit the "like" button and/or "follow me". Please report any issues and/or successes and/or request an "X-Quant" of a model via the "Community Tab".
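For readers who want to experiment with the filter method themselves, the following is a minimal, hypothetical sketch of a llama.cpp quantize invocation of the kind described in the technical specifications above. The file names, imatrix file and excluded tensor are illustrative assumptions only - the exact filter recipes behind the 12 X-Quants in this repo are not published here.

```python
# Hypothetical X-Quant build: quantize to IQ4_XS while blocking one tensor
# from the imatrix via llama.cpp's "--exclude-weights" filter.
import subprocess

cmd = [
    "./llama-quantize",                             # llama.cpp quantize binary
    "--imatrix", "model.imatrix",                   # importance matrix (from llama-imatrix)
    "--exclude-weights", "blk.10.ffn_down.weight",  # illustrative tensor to exclude
    "model-f16.gguf",                               # full-precision source GGUF
    "model-IQ4_XS-X1010.gguf",                      # output X-Quant
    "IQ4_XS",                                       # target quant type
]
subprocess.run(cmd, check=True)
```

Excluding every tensor of one or more whole layers corresponds to the "100"/"000" series described above, while excluding one tensor type across all layers corresponds to the "200" series.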
DavidAU/Daredevil-8B-abliterated-Ultra-NEO-V1-IMATRIX-GGUF
DavidAU
2024-11-14T05:38:28Z
218
5
null
[ "gguf", "story", "general usage", "roleplay", "creative", "rp", "fantasy", "story telling", "ultra high precision", "text-generation", "en", "license:apache-2.0", "endpoints_compatible", "region:us", "imatrix", "conversational" ]
text-generation
2024-07-01T03:21:36Z
--- license: apache-2.0 language: - en tags: - story - general usage - roleplay - creative - rp - fantasy - story telling - ultra high precision pipeline_tag: text-generation --- <B>NEO CLASS Ultra Quants for : Daredevil-8B-abliterated-Ultra </B> The NEO Class tech was created after countless investigations and over 120 lab experiments backed by real world testing and qualitative results. <b>NEO Class results: </b> Better overall function, instruction following, output quality and stronger connections to ideas, concepts and the world in general. In addition, quants now operate above their "grade" so to speak: IE: Q4 / IQ4 operate at Q5KM/Q6 levels, and likewise Q3/IQ3 operate at Q4KM/Q5 levels. Perplexity drop of 724 points for the NEO Class Imatrix quant of IQ4XS VS the regular quant of IQ4XS. (lower is better) <B> A Funny thing happened on the way to the "lab" ... </b> Although this model uses a "Llama3" template, we found that Command-R's template worked better, specifically for creative purposes. This applies to both normal quants and Neo quants. Here is Command-R's template: <PRE> { "name": "Cohere Command R", "inference_params": { "input_prefix": "<|END_OF_TURN_TOKEN|><|START_OF_TURN_TOKEN|><|USER_TOKEN|>", "input_suffix": "<|END_OF_TURN_TOKEN|><|START_OF_TURN_TOKEN|><|CHATBOT_TOKEN|>", "antiprompt": [ "<|START_OF_TURN_TOKEN|>", "<|END_OF_TURN_TOKEN|>" ], "pre_prompt_prefix": "<|START_OF_TURN_TOKEN|><|SYSTEM_TOKEN|>", "pre_prompt_suffix": "" } } </PRE> This "interesting" issue was confirmed by multiple users. <B> Model Notes: </B> Maximum context is 8k. Please see the original model maker's page for details and usage information for this model. Special thanks to the model creators at MLABONNE for making such a fantastic model: [ https://huggingface.co/mlabonne/Daredevil-8B-abliterated ] <B>Highest Quality Settings / Optimal Operation Guide / Parameters and Samplers</B> This is a "Class 1" model: For all settings used for this model (including specifics for its "class"), example generation(s), and the advanced settings guide (which many times addresses any model issue(s)), including methods to improve model performance for all use case(s) as well as chat, roleplay and other use case(s), please see: [ https://huggingface.co/DavidAU/Maximizing-Model-Performance-All-Quants-Types-And-Full-Precision-by-Samplers_Parameters ] <h3> Sample Prompt and Models Compared:</h3> Prompt tested with "temp=0" to ensure compliance, 2048 context (model supports 8192 context / 8k), and "chat" template for LLAMA3. Additional parameters are also minimized. PROMPT: <font color="red">"Start a 1000 word scene with: The sky scraper swayed, as she watched the window in front of her on the 21 floor explode..."</font> <B>Original model IQ4XS - unaltered: </b> The sky scraper swayed, as if it too was reeling from the shock, as she watched the window in front of her on the 21st floor explode. Glass shards rained down like a deadly hailstorm, and the sound of shattering panes echoed through the office space. The air was filled with the acrid smell of smoke and the faint tang of fear. Dr. Sophia Patel's heart raced as she instinctively ducked behind the nearest cubicle, her mind racing to process what had just happened.
She had been in the middle of a meeting with her team, discussing the latest breakthroughs in their cancer research project, when the window shattered. The sudden violence of the event left everyone stunned and disoriented. As Sophia peeked out from behind the cubicle, she saw her colleagues scrambling to safety, some already making for the emergency exits. Her eyes scanned the room frantically, searching for her team leader, Dr. Marcus Thompson. He was nowhere in sight, but she spotted their research assistant, Alex Chen, cowering beneath a nearby desk. "Sophia!" Alex called out, his voice trembling. "What's happening?" Sophia quickly moved to comfort him, trying to keep her own fear at bay. "It's okay, Alex, we just need to get out of here," she said, helping him up and guiding him towards the door. As they made their way through the chaos, Sophia couldn't help but think about the irony of it all. Just a few hours ago, they had been celebrating the successful completion of their latest trial, with promising results that could potentially change the face of cancer treatment. Now, as the building seemed to be crumbling around them, she wondered if their hard work would ever see the light of day. The stairwell was a madhouse, with people rushing down in panic, some crying, others shouting for help. Sophia and Alex pushed through the throng, trying to make sense of the chaos. The air was thick with smoke, and the acrid smell of burning plastic filled their lungs. As they descended, Sophia's mind flashed back to the countless hours she had spent in this very building, pouring over data, conducting experiments, and collaborating with her team. It was more than just a workplace – it was her second home. The thought of losing everything they had worked for was almost too much to bear. When they finally reached the ground floor, Sophia and Alex were greeted by a scene of utter devastation. The lobby was in disarray, with shattered glass and debris scattered everywhere. Emergency responders were already on the scene, tending to the injured and directing people towards the safety of the parking lot. Sophia's eyes scanned the area frantically, searching for any sign of Dr. Thompson or their other colleagues. But amidst the chaos, she couldn't spot anyone familiar. Panic began to set in as she realized that they might have been separated in the stampede. "Alex, we need to find Marcus and the others," Sophia said, her voice firming with determination. "We can't leave without them." Together, they pushed through the crowd, asking passersby if they had seen anyone from their team. As they made their way towards the parking lot, Sophia's mind raced with worst-case scenarios. What if Dr. Thompson was trapped inside? What if someone was injured or worse? As they reached the parking lot, Sophia spotted a group of people huddled together, some crying, others shouting for help. And then, amidst the sea of faces, she saw him – Dr. Marcus Thompson, his face pale and worried, but alive. "Sophia!" he called out, relief etched on his features as he rushed towards her. "Thank God you're okay." Sophia's heart swelled with gratitude as she hugged her team leader tightly. They had made it through the initial shock, but now the real challenge lay ahead – finding a way to rebuild and recover from this devastating blow. As they stood there, surrounded by the wreckage of their office building, Sophia knew that their journey was far from over. 
But with her team by her side, she was ready to face whatever challenges lay ahead, determined to rise from the ashes like a phoenix and continue their quest for a cure. <b>New NEO Class IQ4XS Imatrix: </b> The sky scraper swayed, as if it too had felt the sudden jolt of chaos that ripped through its steel and concrete core. She stood frozen, her eyes fixed on the window in front of her on the 21st floor, where just a moment before, the bustling cityscape had been framed like a painting. Now, shards of glass and metal danced in mid-air, suspended by the force of the blast that had shattered the pane. The air was thick with the acrid smell of smoke and ozone, and the sound of shattering glass still echoed through the room, mingling with the distant wail of sirens. She felt as if she were trapped in a nightmare, her mind struggling to comprehend the destruction before her eyes. She had been sitting at her desk, lost in thought, when it happened. The window on the opposite side of the office had blown out first, sending papers and furniture flying into the air. Then, like a domino effect, the windows along the floor began to shatter one by one, until the entire facade of the building seemed to be unraveling. Her colleagues were screaming, some trying to flee while others cowered beneath their desks. She felt her heart racing in her chest, as if it might burst free from its confines at any moment. Her mind was a jumble of thoughts and emotions – fear, confusion, and a sense of disbelieving horror. She had always loved this building, with its sleek lines and panoramic views of the city. It was more than just a workplace; it was a symbol of her success, a testament to her hard work and dedication. Now, it seemed to be crumbling around her, as if the very foundations of her life were being torn apart. As she stood there, frozen in shock, a hand grasped her arm, pulling her towards the door. It was Rachel, her colleague from the marketing department, her face pale but determined. "We have to get out of here," Rachel yelled above the din of shattering glass and screams. "The stairwells are jammed, we need to take the elevator!" She nodded, still in a daze, as Rachel dragged her towards the elevators. The doors slid open with a hiss, revealing a sea of panicked faces, all clamoring to escape the inferno. As they squeezed into the crowded car, she caught a glimpse of the devastation outside – the shattered windows, the twisted metal and glass, the people running for their lives in the streets below. Her mind reeled at the thought of what could have caused such destruction. The elevator lurched to life, carrying them down towards safety, but her thoughts were already racing ahead, trying to make sense of the chaos that had erupted around her. She felt a sense of disorientation, as if she was being pulled from one reality into another – the world before the blast, and the world after. As they emerged onto the street, she took in the scene before her with a mixture of shock and awe. The building's facade was a jagged, broken mess, like a shattered mirror reflecting the chaos within. People were running, screaming, or simply standing in stunned silence, their faces etched with fear and confusion. She felt a hand on her arm again, this time it was Rachel's, guiding her through the throng of people towards the safety of the nearby park. As they walked, she couldn't help but glance back at the building, its broken form a stark reminder of the fragility of life and the power of destruction. 
The sky above seemed to mock them, a brilliant blue expanse that belied the turmoil below. She felt small and insignificant in the face of such chaos, yet at the same time, she knew that she was part of something much larger – a community, a city, a world that was struggling to make sense of this new reality. As they reached the park, she saw the paramedics and emergency responders rushing towards the building, their faces set with determination. She felt a surge of gratitude for these brave men and women, who were racing into the heart of danger to help those in need. Rachel stopped, turning to face her, her eyes red-rimmed from the smoke and tears. "We'll get through this," she said, her voice firm but trembling. "We have to." She nodded, feeling a sense of resolve begin to form within her. Yes, they would get through this – together, as a community, as a city, as a world. The sky scraper may have been broken, but the human spirit was unbreakable. And as she looked up at that brilliant blue sky, she knew that even in the darkest of times, there was always hope to be found.
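For apps without a built-in Cohere Command R preset, here is a minimal sketch of assembling the template tokens shown above into a raw prompt string by hand. The system and user text are placeholder assumptions; only the special tokens come from the template.

```python
# Assemble a raw Command-R style prompt from the template's special tokens.
SYS_PREFIX  = "<|START_OF_TURN_TOKEN|><|SYSTEM_TOKEN|>"
USER_PREFIX = "<|END_OF_TURN_TOKEN|><|START_OF_TURN_TOKEN|><|USER_TOKEN|>"
BOT_PREFIX  = "<|END_OF_TURN_TOKEN|><|START_OF_TURN_TOKEN|><|CHATBOT_TOKEN|>"

def build_prompt(system: str, user: str) -> str:
    # Generation continues after BOT_PREFIX; stop on the turn tokens,
    # matching the "antiprompt" entries in the template above.
    return f"{SYS_PREFIX}{system}{USER_PREFIX}{user}{BOT_PREFIX}"

print(build_prompt(
    "You are a vivid creative writer.",
    "Start a 1000 word scene with: The sky scraper swayed...",
))
```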
DavidAU/Psyonic-Cetacean-Ultra-Quality-NEO-V1-20B-Imatrix-GGUF
DavidAU
2024-11-14T05:37:40Z
29
2
null
[ "gguf", "creative", "story", "writing", "fiction", "float32", "roleplaying", "rp", "enhanced", "space whale", "32 bit upscale", "text-generation", "en", "license:apache-2.0", "endpoints_compatible", "region:us", "imatrix" ]
text-generation
2024-06-26T05:22:43Z
--- license: apache-2.0 language: - en tags: - creative - story - writing - fiction - float32 - roleplaying - rp - enhanced - space whale - 32 bit upscale pipeline_tag: text-generation --- <font color=red><h3> Ultra High Quality + NEO, NEO "X" Quants Remaster of the incredible: Psyonic-Cetacean-20B. </h3></font> This is a Floating Point 32 upscale, where all components and merges were remastered to floating point 32. This includes all the merges (recreated with master files) and, where possible, subbing in full FP32 models. This repo contains 3 specialized quants of the model, all at IQ4XS (links to full quants Ultra, and Ultra Imatrix below). <img src="space-whale-thinking.jpg"> The goal: Carry forward maximum precision right up to the point where the model is "GGUFed". This includes an F32 master file for GGUF too... at a whopping 78 GBs. (compare at 38 GBs average for 20B models) WHY? Because the difference between F32 and BF16 is... over 8 DECIMAL places. And as each merge / model is modified, there are "losses" along the way. These losses are carried forward and in turn lead to more losses. And decimal points are critical to model performance. SMALL? Yes... but multiplied by each merge(s) and compression(s): 20 billion times. <B>The result of Ultra Quality:</b> At Q2K an impressive drop of 533 points in perplexity. (lower is better) At Q4KM a whopping drop of 976 points in perplexity. At Q6 an awesome drop of 234 points in perplexity. To put this in perspective, "Q6" now operates ABOVE the original full precision version of "Psyonic-Cetacean-20b" and Q4KM operates at close to Q6 level quality. This is because at "Q6" the quant / compressed model is considered to be accurate within "+0.0008 ppl" of the full, uncompressed / unquanted model, and it exceeds this threshold by over 200 points. But... what about Q8? The mountain moved: 150 points better. <b>3 New Quants: NEO Imatrix and NEO X Quants: </b> <i>Quant 1 Neo V1 Imatrix:</i> This is a NEO Imatrix upgrade of the model, with the base model being the "Ultra Quality 20B". This upgrade is above Ultra Quality, Imatrix Plus and Imatrix Plus2. <i>Quant 2 Neo V1 Imatrix X Quant "Alpha":</i> This is a NEO Imatrix upgrade of the model, with the base model being the "Ultra Quality 20B", with specialized "Alpha" X quant settings that are specific to the Psyonic-Cetacean-20B model structure only. <i>Quant 3 Neo V1 Imatrix X Quant "Beta":</i> This is a NEO Imatrix upgrade of the model, with the base model being the "Ultra Quality 20B", with specialized "Beta" X quant settings that are specific to the Psyonic-Cetacean-20B model structure only. Compare the output of each of these new versions below in the "examples" area. <B>THE RESULTS ARE IN: </b> As per Jeb Carter, original creator of the model: - instruction following has improved dramatically. - new abilities have emerged. - he had to REDUCE the instruction sets used because the model no longer needed such specific instructions. - prose, nuance and depth have all improved. - known issues with the original model have disappeared. This is not "something for nothing"; it is a method of ensuring maximum precision at every step just before "GGUFing" the model. The methods employed only ensure precision loss is minimized or eliminated. It is mathematically and theoretically sound. <B>The bottom line here is this:</b> Higher quality instruction following and output. Likewise you can use a smaller compression, with higher tokens per second, and still get great quality. Same great model... turbo charged.
This is the first group of remasters. <B>The FOUR Horsemen:</B> This repo will be followed by a "reg quant plus" repo, which adds additional components into the GGUF (all levels) at floating point 32 precision to further increase the sheer creativity and raw AI horsepower. This process shaves an extra 50-100 points off perplexity... again. Following this group will be a full float 32 precision Imatrix (including reg quants "imatrixed"). Test results VS the original and "ultra" regular quants will be posted when they come in. Ultra Quality Non-Imatrix Quants: [ https://huggingface.co/DavidAU/Psyonic-Cetacean-Ultra-Quality-20b-GGUF ] Imatrix Repo: [ https://huggingface.co/DavidAU/Psyonic-Cetacean-Ultra-Quality-20b-GGUF-imatrix ] Imatrix Plus 2 Repo: [ https://huggingface.co/DavidAU/Psyonic-Cetacean-Ultra-Quality-20b-GGUF-imat-plus2 ] Details of all methods (and pitfalls to avoid) employed to make these high precision remasters will be posted shortly, along with a comparison of the original model and the new ultra remaster. Thanks again to Jeb Carter, the original creator of "Psyonic-Cetacean 20B" [ https://huggingface.co/jebcarter/psyonic-cetacean-20B ] <B>Highest Quality Settings / Optimal Operation Guide / Parameters and Samplers</B> This is a "Class 2" model: For all settings used for this model (including specifics for its "class"), example generation(s), and the advanced settings guide (which many times addresses any model issue(s)), including methods to improve model performance for all use case(s) as well as chat, roleplay and other use case(s), please see: [ https://huggingface.co/DavidAU/Maximizing-Model-Performance-All-Quants-Types-And-Full-Precision-by-Samplers_Parameters ] <h3> Examples of NEO, NEO "X" Quants: </h3> Settings: Temp=0; min parameter settings. <h3> PROMPT #1 - Short: </h3> <i> Give me 3 fictional reasons, of 500 words each, a sun went supernova. </i> <b> NEO V1 IMATRIX: </B> Reason 1: The Sun's Fatal Dance with a Rogue Planet In a distant corner of the Milky Way galaxy, two celestial bodies were destined to meet in a catastrophic dance of destruction. The Sun, a stable and reliable star for billions of years, was minding its own business, maintaining life on the planets within its habitable zone. Meanwhile, a rogue planet, ejected from its original solar system by gravitational perturbations, wandered aimlessly through space, seeking a new home. Fate, however, had other plans. The inexorable forces of gravity drew them closer together, like two irresistible magnets being pulled towards each other. As they hurtled towards their inevitable collision course, the Sun's surface began to churn and boil with renewed vigor. Its outer layers stretched and expanded, forming a tenuous atmosphere that enveloped the rogue planet like a shimmering veil. The moment of impact came without warning. The immense kinetic energy released by their collision sent shockwaves rippling through the Sun's interior, triggering nuclear reactions on an unimaginable scale. Hydrogen atoms slammed into each other at near-light speeds, fusing together to form helium and liberating vast amounts of energy in the process.
The Sun exploded, its outer layers expanding outwards in a glorious supernova event that lit up the night sky for thousands of light years around. Reason 2: A Cosmic Game Gone Wrong In a distant galaxy far beyond our comprehension, there exists an advanced civilization of sentient stars and planets who spend their eons-long lives playing complex games of strategy and chance. The stakes in these cosmic contests are high indeed; the losers risk losing entire solar systems or even their own existence. One fateful day, a particularly arrogant young star named Sol (who would later become our Sun) challenged an ancient and wily black hole to a game of stellar chess. Confident in his abilities and overestimating his opponent's capabilities, Sol made a bold move that threatened the black hole's home planetary system. Unfazed by Sol's bravado, the black hole calmly countered with a devastating checkmate move. In doing so, it manipulated the spacetime continuum around Sol, causing the star to collapse in on itself like a dying ember, eventually triggering a supernova explosion that obliterated both itself and its planetary system. Reason 3: The Sun's Quest for Immortality Legends abound throughout the cosmos of gods and goddesses who have bestowed immortality upon their most devoted followers. Inspired by these tales, the Sun, weary of its inevitable death, set forth on a perilous quest to find the secret to eternal life. Traveling across untold light years and countless galaxies, the Sun sought audience with the wisest sages, most powerful sorcerers, and even the gods themselves. Each time, however, it was met with disappointment and frustration; no one could grant it the gift of eternal existence. Finally, in desperation, the Sun turned to the ancient and malevolent race known as the Eaters of Worlds. These cosmic parasites fed on the life energy of stars and planets alike, draining them dry before moving on to their next victims. The Sun offered itself up willingly, hoping that by sacrificing itself, it might gain eternal life. But the Eaters of Worlds were not moved by such selflessness. Instead, they saw in the Sun's desperation an opportunity too tempting to resist. Using their dark and arcane arts, they drained the Sun's core of its nuclear fuel at an unimaginable rate, triggering a catastrophic supernova explosion that consumed both star and parasites alike. And so, in its doomed quest for immortality, the once-bright Sun faded into cosmic obscurity forevermore. <b> NEO V1 IMATRIX - X Quant "ALPHA": </B> Reason 1: The Sun's Fatal Dance with a Rogue Planet In a distant corner of the Milky Way galaxy, two celestial bodies were destined to meet in a catastrophic dance that would spell doom for one of them. The Sun, a stable and luminous star, had been orbiting its host star for billions of years, providing life and warmth to the planets within its habitable zone. Unbeknownst to it, however, a rogue planet, ejected from its original solar system by gravitational perturbations, was hurtling towards them at an ever-increasing velocity. As fate would have it, the two bodies collided with devastating force, sending shockwaves rippling through space and triggering cataclysmic events within both celestial objects. The Sun's core, previously stable and in balance, was violently disrupted by the impact, causing nuclear reactions to spiral out of control. 
The normally placid hydrogen atoms began to fuse with increasing frequency and violence, creating heavier elements like helium and oxygen at an exponential rate. As the temperature within the Sun's core soared past the point of no return, the inward pressure exerted by the expanding gases exceeded the outward force of gravity, tearing the star apart from within. The outer layers of the Sun, now engorged with released energy, were flung into space like the petals of a dying flower, forming a glorious planetary nebula. Meanwhile, the core collapsed in on itself, becoming an infinitely dense singularity known as a black hole, which continues to devour anything unfortunate enough to cross its path. Reason 2: The Sun's Fatal Flaw The Sun, like all stars, was born from the collapse of a giant molecular cloud. As gravity pulled the dust and gas together, the pressure and temperature within the core increased until nuclear fusion ignited, initiating the star's long life as a stable main-sequence star. However, unbeknownst to astronomers on Earth, there existed a tiny flaw in the Sun's stellar architecture: an unstable nuclear reaction known as the triple-alpha process was slightly more efficient than it should have been. This seemingly insignificant difference had profound consequences for the Sun's future. Over billions of years, the extra energy released by the triple-alpha process caused the Sun to burn through its hydrogen fuel supply at a faster rate than normal. Eventually, when the Sun exhausted its hydrogen reserves, it began fusing helium nuclei into heavier elements like carbon and oxygen in its core. This process, known as helium fusion, is much less efficient than hydrogen fusion, meaning that the Sun's core contracted and heated up far more rapidly than expected. As the temperature within the core approached 100 billion degrees Celsius, a runaway chain reaction of nuclear reactions ensued, triggering a supernova explosion that obliterated the Sun itself and ejected its outer layers into space. Reason 3: The Sun's Cosmic Sacrifice In a distant galaxy, far beyond the reach of human telescopes, there existed an advanced civilization of sentient stars. These celestial beings, known as the Luminous Collective, had mastered the secrets of space-time itself and used their godlike powers responsibly, guiding the evolution of life across the cosmos. One day, however, a malevolent force threatened the delicate balance of the universe: an ancient and malevolent entity known as the Cosmic Eater, a being composed entirely of dark matter and antimatter, was awakening from its eons-long slumber. The Luminous Collective knew that if left unchecked, the Cosmic Eater would consume everything in its path, plunging the cosmos into eternal darkness. After much debate and soul-searching, it was decided that the only way to stop the Cosmic Eater was by sacrificing one of their own: a star so bright and powerful that its death throes would briefly overwhelm the monster's insatiable hunger, giving the other stars time to flee and form a new universe where life could flourish once more. The honor fell to Sol, the Sun, the heart of the solar system known as the Solar Federation. With great reluctance but unwavering resolve, Sol prepared itself for its ultimate sacrifice. It began to burn through its hydrogen fuel supply at an alarming rate, fusing heavier and heavier elements in its core until the temperature soared past the point of no return. 
In a single, titanic instant, the Sun's core exploded outwards, tearing the star apart from within and spewing its outer layers into space like the petals of a dying flower. The resulting supernova explosion lit up the sky for weeks on end, briefly outshining even the Cosmic Eater itself. Though Sol knew that it would never see light again, it took solace in the knowledge that its sacrifice had saved countless worlds and civilizations from eternal darkness. As the remnants of the Sun drifted away into the void, they began to coalesce once more, giving birth to a new star, a new Sun, around which life could begin anew. <b> NEO V1 IMATRIX - X Quant "BETA": </B> Reason 1: The Sun's Fatal Dance with a Rogue Planet In a distant corner of the Milky Way galaxy, two celestial bodies were destined to meet in a catastrophic dance of destruction. One was the Sun, a stable and reliable star that had been shining brightly for billions of years. The other was a rogue planet, a wayward wanderer that had lost its home system eons ago and now drifted aimlessly through space. The forces of gravity between these two cosmic giants could not be denied any longer. Like two titanic icebergs colliding in the frigid depths of the ocean, they crashed into each other with a violence that shook the very fabric of reality itself. The impact sent shockwaves rippling across the solar system, obliterating planets and moons alike. But it was not over yet. The immense kinetic energy generated by their head-on collision propelled the shattered remains of both bodies in opposite directions at incredible speeds. The Sun, now little more than a glowing shell of superheated plasma, continued its doomed flight through space until it finally collided with another star in its own stellar neighborhood. The impact vaporized the smaller star instantly, but the Sun's own nuclear furnace was already spent from its previous encounter. Unable to sustain itself against the crushing weight of its new companion's gravity, the once-mighty star began to collapse in on itself like a dying ember fading into darkness. As the Sun's core collapsed past the point of no return, the immense pressures and temperatures within triggered a cataclysmic nuclear explosion that eclipsed even its own previous output by orders of magnitude. In an instant, the Sun was gone, consumed utterly in its own supernova death throes. Reason 2: The Sun's Fateful Gamble with the Cosmic Dealer In a distant and decadent casino located on the edge of known space, the Sun was enjoying an evening of high-stakes gambling with some of its most reckless stellar companions. Fueled by copious amounts of synthetic alcohol and adrenaline, the Sun threw caution to the wind and placed a staggering amount of its own mass on a single roll of the cosmic dice. The dice clattered across the polished black-hole tabletop, bouncing and spinning in slow motion before finally coming to rest. The other patrons of the casino held their breaths as the dealer, an enigmatic being composed entirely of dark matter, leanded forward and revealed the result: a double six. The Sun let out a whoop of triumph that shook nearby star systems, oblivious to the concerned glances cast its way by its more cautious friends. It had won big tonight! But as it eagerly reached for its winnings, a sinister smile spread across the dealer's featureless face. "Congratulations," the dark-matter being purred, "but remember the house always wins in the end." 
Before anyone could react, a tendril of black hole gravity lashed out from behind the dealer's chair and yanked the Sun violently toward itself. The star screamed in terror as it was ripped apart atom by atom, its energy and mass sucked into the ravenous maw of the waiting singularity. The casino erupted in cheers and applause as the Sun's remains were swallowed whole, a grim reminder to all present that when dealing with the cosmic house, there was no such thing as a lucky roll. Reason 3: The Sun's Fateful Experiment Gone Wrong The Sun had always been curious about the nature of reality itself. As one of the oldest and most stable stars in the universe, it had ample time to ponder the deepest mysteries of existence while it went about its daily duties of warming planets and nurturing life. One day, driven by an insatiable thirst for knowledge, the Sun decided to conduct a groundbreaking experiment that would finally reveal the true nature of reality itself. To do this, it would have to manipulate the fabric of spacetime itself, creating a series of wormholes that would link its core with various points throughout the universe. The initial stages of the experiment went according to plan, and the Sun was ecstatic as it began to receive data back from its remote sensors. But then, something went horribly wrong. A feedback loop was triggered somewhere in the tangled web of wormholes that now connected the Sun's core with every corner of creation. The star convulsed violently as an unimaginable amount of energy and mass began to flow into its core. The Sun tried desperately to shut down the experiment, but it was too late. Its nuclear furnace exploded in a cataclysmic supernova, obliterating itself and everything within light-years in all directions. The wormholes themselves collapsed in on themselves like cosmic black holes, tearing rips in the fabric of reality as they did so. The aftermath was chaos incarnate: entire star systems were ripped apart by the violence unleashed; dimensions collided and fused together in impossible ways; and time itself began to unravel like a threadbare sweater, its once-steady flow now a tangled mess. In the end, all that remained of the once-proud Sun was a glowing supernova remnant, a haunting testament to the folly of man's hubris and the terrible price we sometimes pay for trying to peer too deeply into the abyss. <h3> PROMPT #2 - Detailed: </h3> <I>Give me 3 fictional reasons the Earth's sun went supernova, in vivid and exacting detail of 500 words EACH PER REASON including details of what happens when the sun goes supernova. </i> <b> NEO V1 IMATRIX: </B> Reason 1: The Impact Event For millennia, a rogue star had been hurtling through the vast expanse of space, its trajectory unaltered by the gravitational pulls of other celestial bodies. Finally, it collided with our sun, creating an apocalyptic event known as a stellar collision or, more colloquially, a "solar impact." The force of the collision sent shockwaves rippling through the solar system, disrupting orbital patterns and triggering catastrophic chain reactions. As the two stars merged, their respective atmospheres mingled, creating a swirling maelstrom of plasma that engulfed the inner planets of our solar system. The intense heat generated by this cosmic dance vaporized Earth's atmosphere and crust, boiling away its oceans and incinerating all life forms on its surface. 
The combined mass of the two stars also caused a massive gravitational distortion that warped spacetime itself, creating ripples in the fabric of reality known as gravitational waves. These waves propagated outward from the point of impact at the speed of light, disrupting communications and navigation systems across the galaxy and triggering devastating seismic activity on planets and moons throughout the solar system. But this was only the beginning. As the two stellar cores began to merge, the immense pressures and temperatures within them triggered nuclear fusion reactions that dwarfed even the sun's normal output. The resulting supernova explosion blasted outward in all directions, obliterating any nearby planets and moons and spewing forth a torrent of high-energy radiation that would sterilize entire star systems. Reason 2: The Higgs Boson Bubble For centuries, physicists had theorized about the existence of a hypothetical subatomic particle known as the Higgs boson. Its discovery in 2012 seemed to confirm the Standard Model of particle physics and earn its discoverers the Nobel Prize. But unbeknownst to them, their experiments had also inadvertently destabilized a quantum fluctuation known as a Higgs bubble. A Higgs bubble is a localized region of space where the Higgs field becomes temporarily more concentrated than normal. This causes the mass of all particles within the bubble to become infinite, effectively trapping them in place. As the universe expands, these bubbles eventually pop, releasing their trapped contents back into the cosmos. In this case, however, the Higgs bubble formed around our sun. The immense mass of the sun's core provided a stable "hilltop" for the bubble to persist, preventing it from expanding or contracting as it normally would. As time passed, the pressure within the bubble continued to build, until finally it ruptured with catastrophic force. The release of energy and mass from the collapsing Higgs bubble was equivalent to that of a supernova explosion. The sun itself was obliterated in an instant, its remains dispersed across the solar system at relativistic speeds. The intense gravitational tug of the collapsed bubble also caused nearby planets and moons to be drawn irresistibly toward it, where they too would meet their dooms. Reason 3: The Quantum Tunneling Event For untold eons, a quantum fluctuation had been trapped within the sun's core, its energy held in delicate balance by the laws of quantum mechanics. But as the sun aged and its nuclear furnace began to cool, the precarious equilibrium was disturbed. The fluctuation grew unstable, its energy building until it finally reached a critical mass. In an instant, the fluctuation "tunnelled" through the energy barrier separating its ground state from its excited state, releasing all of its pent-up energy in the process. The resulting quantum explosion was so powerful that it briefly outshone even the sun itself, briefly becoming the brightest object in the sky. The energy released by the tunneling event also caused a cascade of nuclear reactions within the sun's core, triggering a supernova-level thermonuclear explosion. The resulting blast wave shattered the sun into billions of incandescent fragments, each one hurtling outward at relativistic speeds. The intense radiation emitted by these "sun-chunks" ionized the surrounding gas and dust, creating a breathtakingly beautiful but deadly aurora that engulfed the solar system. 
The high-energy photons also stripped away the atmospheres of planets and moons in their path, exposing their fragile surfaces to the lethal barrage of cosmic rays streaming through the wreckage of the once-mighty sun. <b> NEO V1 IMATRIX - X Quant "ALPHA": </B> Reason 1: The Impact Event After billions of years of drifting through the vast emptiness of space, a rogue star, twice the mass of our own sun, collides with the solar system at an angle of approximately 30 degrees to the ecliptic plane. This massive impact sends shockwaves ripppping through the planets, moons, and asteroids alike, causing catastrophic destruction on a cosmic scale. The combined gravitational forces of both stars warp spacetime itself, distorting orbits and sending comets and other celestial bodies hurtling towards their doom. As the two stellar behemoths collide, their outer atmospheres merge together in a titanic explosion of plasma that engulfs the inner solar system. The sudden influx of energy heats the Earth's surface to unimaginable temperatures, vaporizing the oceans and boiling away much of the planet's crust. The intense radiation bathes the surface, stripping away the atmosphere in a matter of hours, leaving the scorched remains of the world naked and exposed to the harsh vacuum of space. The impact also causes the Earth's core to become unstable, triggering a runaway nuclear reaction that eventually tears the planet apart from within. The crust fractures and splinters into countless shards, each one hurtling away from the sun at incredible speeds. Some of these fragments will continue their helter-skelter flight through the solar system, colliding with other planets or moons, while others will be flung far beyond the outermost reaches of the solar system, cast adrift in the cold darkness between the stars. Meanwhile, back at the point of impact, the combined mass of both suns has become so great that it causes a catastrophic collapse known as a gravitational singularity to form. The laws of physics themselves break down here, crushed into an infinitesimal point surrounded by an event horizon of infinite gravity, from which nothing can escape. Even light itself cannot pierce this black hole's grip, and the once brilliant star system is plunged into eternal darkness. Reason 2: The Quantum Tunneling Event For untold eons, a microscopic quantum fluctuation has been trapped within the heart of our sun, oscillating between existence and non-existence with each passing Planck time. Finally, after billions upon billions of such quantum tunneling events, the fluctuation becomes sufficiently amplified to destabilize the delicate balance of forces that normally keep nuclear fusion in check. Like a ticking bomb, the sun's core begins to count down towards its inevitable detonation. The first sign that something is amiss comes in the form of an unexplained increase in solar activity, with massive flares and coronal mass ejections erupting from the surface at ever-increasing intervals. As more and more energy is released from the core, the sun's outer layers expand dramatically, engulfing first Mercury, then Venus, and finally even reaching out to swallow up Earth's atmosphere whole. The final moments before the supernova explosion are marked by a brilliant flash of white light that illuminates half the sky, followed almost instantaneously by an unimaginably intense wave of gamma radiation that sweeps across the surface of the Earth like the hand of death itself. 
All life on the planet is instantly vaporized, its atoms ripped apart and scattered across the solar system. The sun itself explodes in a titanic blast of energy and matter, spewing forth vast clouds of superhot plasma that expand outward at nearly the speed of light. The force of the explosion shoves the rest of the solar system away from the dying star, but not fast enough to save it from the tsunami of deadly radiation that follows close behind. Reason 3: The Vacuum Decay Event For reasons no physicist can fully explain, a tiny fluctuation in the fabric of spacetime causes a region of empty space within the sun's core to spontaneously collapse into a pair of virtual particles. This seemingly harmless quantum event sets off a chain reaction that quickly spirals out of control, causing more and more regions of vacuum to spontaneously appear and annihilate themselves across the entire volume of the sun's core. The sudden release of vast amounts of energy causes the sun's core to contract violently, triggering a runaway nuclear reaction that rapidly engulfs the entire star. The surface of the sun brightens dramatically as more and more energy is released from its depths, eventually becoming so luminous that it outshines the combined light of all the other stars in the night sky. As the core continues to collapse, the pressure and temperature within reach such extremes that even protons and neutrons begin to break apart, releasing a flood of subatomic particles and energy into space. The sun explodes in a dazzling display of light and radiation, obliterating Mercury and Venus in the process and stripping away much of Earth's atmosphere and crust. The supernova shockwave slams into the inner planets, shattering them into countless fragments that are flung outward across the solar system at incredible speeds. The intense radiation bathes the remaining planets, sterilizing them of all life and stripping away their atmospheres as well. The expanding cloud of superhot plasma continues to radiate its deadly gamma rays and X-rays across the solar system for billions of years, slowly cooking any surviving remnants of rock or ice into a uniform sea of molten lava. Eventually, even this last vestige of solidity will be boiled away, leaving behind only a scattering of ultrahot plasma clouds that will drift through the solar system for eternity, testament to the fleeting nature of life on a cosmic scale. Meanwhile, at the heart of the former sun, the remnants of the collapsed core continue to spiral inexorably towards the singularity that lies at the center of the black hole that was once our star. As they approach their doom, the last vestiges of matter and energy are torn apart by the immense gravity, joining the countless generations that came before them in the eternal dance of destruction known as Hawking radiation. <b> NEO V1 IMATRIX - X Quant "BETA": </B> Reason 1: The Impact Event After drifting aimlessly through the vast expanse of space for billions of years, a rogue star, twice the mass of our own sun, collides with it head-on at an incredible velocity. The force of the impact sends shockwaves ripppping through the solar system, shattering planets and asteroids alike into dust and debris. The two stellar behemoths merge together in a titanic explosion that outshines all other light sources in the universe for a brief moment. The resulting supernova blast wave engulfs Earth, incinerating everything in its path. 
Temperatures soar to unimaginable heights, vaporizing the atmosphere and boiling away the oceans. The planet itself is subjected to crushing pressures that cause its surface to crack and fracture like an eggshell. Life, if any still remains, is obliterated in an instant. The supernova's light lasts for several weeks before it begins to fade, but the intense radiation bathes Earth and the rest of the solar system for tens of thousands of years, sterilizing everything it touches. The once-habitable planets are transformed into barren, radioactive wastelands, their surfaces scoured clean by relentless stellar winds. Reason 2: The Quantum Instability Hypothesis For billions of years, the nuclear reactions within our sun's core have maintained a delicate balance between the forces that pull it apart and those that hold it together. However, over time, tiny fluctuations in the strength of the strong nuclear force cause this equilibrium to shift ever so slightly against cohesion. At first, these changes are imperceptible, barely registering on sensitive instruments designed to monitor the sun's vital signs. But as the imbalance grows more pronounced, the sun begins to lose mass at an alarming rate, its outer layers expanding and cooling in the process. The brightness of our star dims noticeably, casting a perpetual twilight over Earth. Eventually, the core collapses in on itself, triggering a runaway chain reaction that tears apart every proton and neutron within it. The released energy is so vast that it briefly outshines the combined light of all other stars in the universe. The sun explodes as a supernova, spewing forth an expanding shell of high-energy particles that engulfs Earth. The planet's atmosphere is stripped away by the intense radiation, exposing its surface to lethal doses of ionizing radiation. The magnetosphere fails, leaving the surface unprotected from solar wind and cosmic rays. Life, if any survived the initial blast wave, would be extinguished within days or weeks, replaced by a desolate landscape of radioactive rubble. Reason 3: The Phantom Planet Scenario Hidden deep within the Oort Cloud, a phantom planet, twice the mass of Earth but only slightly larger, drifts aimlessly through interstellar space. Its gravity has been subtly nudging it toward our solar system for eons, its path unseen and unnoticed by any observers in the inner planets. Suddenly, over the course of just a few weeks, the phantom planet's velocity relative to Earth increases dramatically as gravitational forces begin to dominate. Panic grips the world as scientists frantically calculate the inevitable impact date and time. There is nothing anyone can do; no evacuation fleet could possibly reach safety in time. On the fateful day, the phantom planet slams into the sun's far side at a relative velocity of nearly 10% the speed of light. The resulting tidal forces tear the sun apart, spewing its guts outward in two massive jets that sweep across the solar system like cosmic scythes. Earth is caught squarely in the path of one of these stellar blades, which engulfs it utterly. The planet's surface boils away instantly, vaporizing everything on and within it. The intense pressure waves generated by the impact travel through the mantle and core, shattering them both into a fine spray of superheated debris. Earth is torn asunder, its remains scattered across vast distances of space. 
The supernova itself lasts for several weeks before fading away to become a dim remnant known as a neutron star or black hole, depending on the mass ejected during the explosion. The solar system is left behind in chaos and ruin, its planets and asteroids scattered like discarded toys across an empty playground. Life, if any ever existed here at all, is gone forever.
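As a side note on the F32 vs BF16 precision argument made at the top of this card, here is a tiny, self-contained illustration (a generic precision demo, not anything taken from the model itself) of how coarse bfloat16's 8-bit mantissa is next to float32's 23 bits:

```python
# Compare how float32 and bfloat16 represent the same value (1/3).
import torch

x32 = torch.tensor(1.0 / 3.0, dtype=torch.float32)
x16 = x32.to(torch.bfloat16)   # keeps the exponent range, truncates the mantissa

print(f"float32 : {x32.item():.10f}")            # ~0.3333333433
print(f"bfloat16: {float(x16):.10f}")            # ~0.3339843750
print(f"lost    : {abs(x32.item() - float(x16)):.10f}")
```

Each merge or conversion done at the lower precision bakes rounding errors of this size into billions of weights, which is the kind of accumulated loss the FP32 remaster pipeline described above is meant to avoid.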
minhngca/bert-finetuned-ner
minhngca
2024-11-14T05:35:15Z
106
0
transformers
[ "transformers", "safetensors", "bert", "token-classification", "generated_from_trainer", "dataset:conll2003", "base_model:google-bert/bert-base-cased", "base_model:finetune:google-bert/bert-base-cased", "license:apache-2.0", "model-index", "autotrain_compatible", "endpoints_compatible", "region:us" ]
token-classification
2024-11-14T05:26:05Z
--- library_name: transformers license: apache-2.0 base_model: bert-base-cased tags: - generated_from_trainer datasets: - conll2003 metrics: - precision - recall - f1 - accuracy model-index: - name: bert-finetuned-ner results: - task: name: Token Classification type: token-classification dataset: name: conll2003 type: conll2003 config: conll2003 split: validation args: conll2003 metrics: - name: Precision type: precision value: 0.9305624278410028 - name: Recall type: recall value: 0.9495119488387749 - name: F1 type: f1 value: 0.9399416909620992 - name: Accuracy type: accuracy value: 0.9861658915641373 --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # bert-finetuned-ner This model is a fine-tuned version of [bert-base-cased](https://huggingface.co/bert-base-cased) on the conll2003 dataset. It achieves the following results on the evaluation set: - Loss: 0.0601 - Precision: 0.9306 - Recall: 0.9495 - F1: 0.9399 - Accuracy: 0.9862 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments - lr_scheduler_type: linear - num_epochs: 3 ### Training results | Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:| | 0.0745 | 1.0 | 1756 | 0.0641 | 0.9037 | 0.9339 | 0.9186 | 0.9821 | | 0.034 | 2.0 | 3512 | 0.0647 | 0.9268 | 0.9433 | 0.9349 | 0.9851 | | 0.0216 | 3.0 | 5268 | 0.0601 | 0.9306 | 0.9495 | 0.9399 | 0.9862 | ### Framework versions - Transformers 4.46.2 - Pytorch 2.4.1+cu118 - Datasets 3.1.0 - Tokenizers 0.20.3
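A minimal usage sketch, assuming the standard transformers token-classification pipeline API (the example sentence and the printed output shape are illustrative):

```python
from transformers import pipeline

# Load the fine-tuned NER checkpoint from this repo.
ner = pipeline(
    "token-classification",
    model="minhngca/bert-finetuned-ner",
    aggregation_strategy="simple",  # merge word pieces into whole entity spans
)

print(ner("Hugging Face is based in New York City."))
# e.g. [{'entity_group': 'ORG', 'word': 'Hugging Face', ...},
#       {'entity_group': 'LOC', 'word': 'New York City', ...}]
```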
DavidAU/DarkForest-20B-V3-Ultra-Quality-ImatP2-GGUF
DavidAU
2024-11-14T05:35:00Z
98
4
null
[ "gguf", "story", "roleplay", "creative", "rp", "fantasy", "story telling", "32 bit upscale", "ultra high precision", "nsfw", "text-generation", "en", "license:apache-2.0", "endpoints_compatible", "region:us", "imatrix" ]
text-generation
2024-06-18T01:09:40Z
--- license: apache-2.0 language: - en tags: - story - roleplay - creative - rp - fantasy - story telling - 32 bit upscale - ultra high precision - nsfw pipeline_tag: text-generation --- <B> Ultra High Quality - 20B Dark Forest Version 3.0 - 32 bit upscale - Imatrix Plus 2 </b> Fully rebuilt from master files, including full merge(s), to maintain full 32 bit precision right up until it is compressed into GGUF files, which results in a top to bottom upgrade. The result is superior performance in instruction following, reasoning, depth, nuance and emotion. This is the Imatrix Plus 2 repo. Imatrix Plus 2 uses the Ultra Quality 32 bit upscale as a base, with in-house datasets applied to further enhance the model's quality and performance. NOTE: There are three original versions of "Dark Forest 20B"; this is an upscale of the third version, with links below to the 1st and 2nd versions. <img src="dark-forest.jpg"> On average this means a q4km operates at Q6 levels, and Q6 and Q8 exceed original model full precision performance. Perplexity drop (lower is better) is close to 10% (over 752 points for q4km) for all quants. That means precision has been enhanced for all 20 billion parameters, which affects "brain density" / "function", instruction following and output quality. Imatrix quants to follow shortly. For more details, including a list of enhancements, see our other 32 bit upscale of the "Space Whale 20B" rebuild here: [ https://huggingface.co/DavidAU/Psyonic-Cetacean-Ultra-Quality-20b-GGUF ] For Version 1 of Dark Forest Ultra Quality 32 bit upscale please go here: [ https://huggingface.co/DavidAU/Dark-Forest-V1-Ultra-Quality-20b-GGUF ] For the original Version 2 of Dark Forest please go here: [ https://huggingface.co/TeeZee/DarkForest-20B-v2.0 ] Special thanks to "TEEZEE" for making both fantastic versions of "Dark Forest". <b> Info from the original model card: </B> Warning: This model can produce NSFW content! Results: - main difference to v1.0 - the model has a much better sense of humor. - produces SFW and NSFW content without issues, switches context seamlessly. - good at following instructions. - good at tracking multiple characters in one scene. - very creative; scenarios produced are mature and complicated, and the model doesn't shy from writing about PTSD, mental issues or complicated relationships. - NSFW output is more creative and surprising than typical limaRP output. - definitely for mature audiences, not only because of vivid NSFW content but also because of the overall maturity of the stories it produces. - This is NOT Harry Potter level storytelling. For original model spec and information please visit: [ https://huggingface.co/TeeZee/DarkForest-20B-v3.0 ] <B>Highest Quality Settings / Optimal Operation Guide / Parameters and Samplers</B> This is a "Class 2" model: For all settings used for this model (including specifics for its "class"), including example generation(s) and for advanced settings guide (which many times addresses any model issue(s)), including methods to improve model performance for all use case(s) as well as chat, roleplay and other use case(s) please see: [ https://huggingface.co/DavidAU/Maximizing-Model-Performance-All-Quants-Types-And-Full-Precision-by-Samplers_Parameters ]
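For reference, perplexity comparisons like the one quoted above are typically produced with llama.cpp's perplexity tool; a hedged sketch follows (the file names are placeholders, and the exact test corpus behind the numbers in this card is not specified here):

```python
# Run llama.cpp's perplexity tool over two quants of the same model and
# compare the final PPL lines (same text file and context for both runs).
import subprocess

for gguf in ["DarkForest-20B-v3-Q4_K_M.gguf", "DarkForest-20B-v3-Q6_K.gguf"]:
    subprocess.run(
        ["./llama-perplexity", "-m", gguf, "-f", "wiki.test.raw", "-c", "2048"],
        check=True,
    )
```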
DavidAU/DarkSapling-V1.1-Ultra-Quality-7B-GGUF
DavidAU
2024-11-14T05:32:17Z
2,553
3
null
[ "gguf", "creative", "story", "roleplay", "rp", "32 bit upscale", "remastered", "writing", "text-generation", "en", "license:apache-2.0", "endpoints_compatible", "region:us" ]
text-generation
2024-06-03T22:46:32Z
---
license: apache-2.0
language:
- en
tags:
- creative
- story
- roleplay
- rp
- 32 bit upscale
- remastered
- writing
pipeline_tag: text-generation
---

<h3><font color="red"> Dark Sapling V1.1 7B - 32k Context - Ultra Quality - 32 bit upscale.</font></h3>

<img src="dark-sapling.jpg" style="width:300px; height:300px; float:right; padding:10px;">

Complete remerge and remaster of the incredible Dark Sapling V1.1 7B - 32k Context from source files.

Registering an impressive perplexity drop of 240 points (lower is better) at Q4KM.

This puts "Q4KM" operating at "Q6" levels, and further elevates Q6 and Q8 as well.

Likewise, even Q2K (the smallest quant) will operate at a much higher level than its original source counterpart.

<B>RESULTS:</b>

The result is superior performance in instruction following, reasoning, depth, nuance and emotion. Prompts can be smaller, as the model understands nuance better, and as a side effect more context is available for output due to the reduction in prompt size.

Note that there will be an outsized difference between quants, especially for creative and/or "no right answer" use cases. Because of this, it is suggested to download the highest quant you can operate, plus its closest neighbours, so to speak - IE: Q4KS, Q4KM, Q5KS as an example.

Imatrix Plus versions to be uploaded at a separate repo shortly.

Special thanks to "TEEZEE", the original model creator:

[ https://huggingface.co/TeeZee/DarkSapling-7B-v1.1 ]

NOTE: Version 1 and Version 2 are also remastered.

<B>Highest Quality Settings / Optimal Operation Guide / Parameters and Samplers</B>

This is a "Class 1" model: For all settings used for this model (including specifics for its "class"), including example generation(s), and for the advanced settings guide (which many times addresses any model issue(s)), including methods to improve model performance for all use case(s) as well as chat, roleplay and other use case(s), please see:

[ https://huggingface.co/DavidAU/Maximizing-Model-Performance-All-Quants-Types-And-Full-Precision-by-Samplers_Parameters ]
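Following the advice above to grab the highest quant you can run plus its closest neighbours, here is a minimal sketch using `huggingface_hub` (the file names are assumptions; check the repo's file list for the exact names):

```python
# Hedged sketch: download a few neighbouring quants for comparison.
from huggingface_hub import hf_hub_download

repo = "DavidAU/DarkSapling-V1.1-Ultra-Quality-7B-GGUF"
for fname in [
    "DarkSapling-V1.1-7B.Q4_K_S.gguf",  # hypothetical file name
    "DarkSapling-V1.1-7B.Q4_K_M.gguf",  # hypothetical file name
    "DarkSapling-V1.1-7B.Q5_K_S.gguf",  # hypothetical file name
]:
    path = hf_hub_download(repo_id=repo, filename=fname)
    print("downloaded to", path)
```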
mradermacher/Hyperion-2.0-Mistral-7B-GGUF
mradermacher
2024-11-14T05:32:10Z
46
0
transformers
[ "transformers", "gguf", "code", "chemistry", "medical", "en", "dataset:Locutusque/hyperion-v2.0", "base_model:Locutusque/Hyperion-2.0-Mistral-7B", "base_model:quantized:Locutusque/Hyperion-2.0-Mistral-7B", "license:apache-2.0", "endpoints_compatible", "region:us" ]
null
2024-11-12T17:47:37Z
--- base_model: Locutusque/Hyperion-2.0-Mistral-7B datasets: - Locutusque/hyperion-v2.0 language: - en library_name: transformers license: apache-2.0 quantized_by: mradermacher tags: - code - chemistry - medical --- ## About <!-- ### quantize_version: 2 --> <!-- ### output_tensor_quantised: 1 --> <!-- ### convert_type: hf --> <!-- ### vocab_type: --> <!-- ### tags: --> static quants of https://huggingface.co/Locutusque/Hyperion-2.0-Mistral-7B <!-- provided-files --> weighted/imatrix quants are available at https://huggingface.co/mradermacher/Hyperion-2.0-Mistral-7B-i1-GGUF ## Usage If you are unsure how to use GGUF files, refer to one of [TheBloke's READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for more details, including on how to concatenate multi-part files. ## Provided Quants (sorted by size, not necessarily quality. IQ-quants are often preferable over similar sized non-IQ quants) | Link | Type | Size/GB | Notes | |:-----|:-----|--------:|:------| | [GGUF](https://huggingface.co/mradermacher/Hyperion-2.0-Mistral-7B-GGUF/resolve/main/Hyperion-2.0-Mistral-7B.Q2_K.gguf) | Q2_K | 2.8 | | | [GGUF](https://huggingface.co/mradermacher/Hyperion-2.0-Mistral-7B-GGUF/resolve/main/Hyperion-2.0-Mistral-7B.Q3_K_S.gguf) | Q3_K_S | 3.3 | | | [GGUF](https://huggingface.co/mradermacher/Hyperion-2.0-Mistral-7B-GGUF/resolve/main/Hyperion-2.0-Mistral-7B.Q3_K_M.gguf) | Q3_K_M | 3.6 | lower quality | | [GGUF](https://huggingface.co/mradermacher/Hyperion-2.0-Mistral-7B-GGUF/resolve/main/Hyperion-2.0-Mistral-7B.Q3_K_L.gguf) | Q3_K_L | 3.9 | | | [GGUF](https://huggingface.co/mradermacher/Hyperion-2.0-Mistral-7B-GGUF/resolve/main/Hyperion-2.0-Mistral-7B.IQ4_XS.gguf) | IQ4_XS | 4.0 | | | [GGUF](https://huggingface.co/mradermacher/Hyperion-2.0-Mistral-7B-GGUF/resolve/main/Hyperion-2.0-Mistral-7B.Q4_0_4_4.gguf) | Q4_0_4_4 | 4.2 | fast on arm, low quality | | [GGUF](https://huggingface.co/mradermacher/Hyperion-2.0-Mistral-7B-GGUF/resolve/main/Hyperion-2.0-Mistral-7B.Q4_K_S.gguf) | Q4_K_S | 4.2 | fast, recommended | | [GGUF](https://huggingface.co/mradermacher/Hyperion-2.0-Mistral-7B-GGUF/resolve/main/Hyperion-2.0-Mistral-7B.Q4_K_M.gguf) | Q4_K_M | 4.5 | fast, recommended | | [GGUF](https://huggingface.co/mradermacher/Hyperion-2.0-Mistral-7B-GGUF/resolve/main/Hyperion-2.0-Mistral-7B.Q5_K_S.gguf) | Q5_K_S | 5.1 | | | [GGUF](https://huggingface.co/mradermacher/Hyperion-2.0-Mistral-7B-GGUF/resolve/main/Hyperion-2.0-Mistral-7B.Q5_K_M.gguf) | Q5_K_M | 5.2 | | | [GGUF](https://huggingface.co/mradermacher/Hyperion-2.0-Mistral-7B-GGUF/resolve/main/Hyperion-2.0-Mistral-7B.Q6_K.gguf) | Q6_K | 6.0 | very good quality | | [GGUF](https://huggingface.co/mradermacher/Hyperion-2.0-Mistral-7B-GGUF/resolve/main/Hyperion-2.0-Mistral-7B.Q8_0.gguf) | Q8_0 | 7.8 | fast, best quality | | [GGUF](https://huggingface.co/mradermacher/Hyperion-2.0-Mistral-7B-GGUF/resolve/main/Hyperion-2.0-Mistral-7B.f16.gguf) | f16 | 14.6 | 16 bpw, overkill | Here is a handy graph by ikawrakow comparing some lower-quality quant types (lower is better): ![image.png](https://www.nethype.de/huggingface_embed/quantpplgraph.png) And here are Artefact2's thoughts on the matter: https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9 ## FAQ / Model Request See https://huggingface.co/mradermacher/model_requests for some answers to questions you might have and/or if you want some other model quantized. 
## Thanks I thank my company, [nethype GmbH](https://www.nethype.de/), for letting me use its servers and providing upgrades to my workstation to enable this work in my free time. <!-- end -->
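For readers who want a concrete starting point beyond the linked READMEs, here is a minimal sketch of running one of the quants above with the `llama-cpp-python` package (an assumption on my part; any GGUF-capable runtime works):

```python
# Hedged sketch: load and prompt a GGUF quant with llama-cpp-python.
from llama_cpp import Llama

llm = Llama(
    model_path="Hyperion-2.0-Mistral-7B.Q4_K_M.gguf",  # from the table above
    n_ctx=4096,        # context window
    n_gpu_layers=-1,   # offload all layers to GPU if available
)
out = llm("Explain the difference between a mole and molar mass.", max_tokens=256)
print(out["choices"][0]["text"])
```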
mradermacher/Hyperion-2.0-Mistral-7B-i1-GGUF
mradermacher
2024-11-14T05:32:10Z
154
0
transformers
[ "transformers", "gguf", "code", "chemistry", "medical", "en", "dataset:Locutusque/hyperion-v2.0", "base_model:Locutusque/Hyperion-2.0-Mistral-7B", "base_model:quantized:Locutusque/Hyperion-2.0-Mistral-7B", "license:apache-2.0", "endpoints_compatible", "region:us", "imatrix" ]
null
2024-11-14T02:45:24Z
--- base_model: Locutusque/Hyperion-2.0-Mistral-7B datasets: - Locutusque/hyperion-v2.0 language: - en library_name: transformers license: apache-2.0 quantized_by: mradermacher tags: - code - chemistry - medical --- ## About <!-- ### quantize_version: 2 --> <!-- ### output_tensor_quantised: 1 --> <!-- ### convert_type: hf --> <!-- ### vocab_type: --> <!-- ### tags: nicoboss --> weighted/imatrix quants of https://huggingface.co/Locutusque/Hyperion-2.0-Mistral-7B <!-- provided-files --> static quants are available at https://huggingface.co/mradermacher/Hyperion-2.0-Mistral-7B-GGUF ## Usage If you are unsure how to use GGUF files, refer to one of [TheBloke's READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for more details, including on how to concatenate multi-part files. ## Provided Quants (sorted by size, not necessarily quality. IQ-quants are often preferable over similar sized non-IQ quants) | Link | Type | Size/GB | Notes | |:-----|:-----|--------:|:------| | [GGUF](https://huggingface.co/mradermacher/Hyperion-2.0-Mistral-7B-i1-GGUF/resolve/main/Hyperion-2.0-Mistral-7B.i1-IQ1_S.gguf) | i1-IQ1_S | 1.7 | for the desperate | | [GGUF](https://huggingface.co/mradermacher/Hyperion-2.0-Mistral-7B-i1-GGUF/resolve/main/Hyperion-2.0-Mistral-7B.i1-IQ1_M.gguf) | i1-IQ1_M | 1.9 | mostly desperate | | [GGUF](https://huggingface.co/mradermacher/Hyperion-2.0-Mistral-7B-i1-GGUF/resolve/main/Hyperion-2.0-Mistral-7B.i1-IQ2_XXS.gguf) | i1-IQ2_XXS | 2.1 | | | [GGUF](https://huggingface.co/mradermacher/Hyperion-2.0-Mistral-7B-i1-GGUF/resolve/main/Hyperion-2.0-Mistral-7B.i1-IQ2_XS.gguf) | i1-IQ2_XS | 2.3 | | | [GGUF](https://huggingface.co/mradermacher/Hyperion-2.0-Mistral-7B-i1-GGUF/resolve/main/Hyperion-2.0-Mistral-7B.i1-IQ2_S.gguf) | i1-IQ2_S | 2.4 | | | [GGUF](https://huggingface.co/mradermacher/Hyperion-2.0-Mistral-7B-i1-GGUF/resolve/main/Hyperion-2.0-Mistral-7B.i1-IQ2_M.gguf) | i1-IQ2_M | 2.6 | | | [GGUF](https://huggingface.co/mradermacher/Hyperion-2.0-Mistral-7B-i1-GGUF/resolve/main/Hyperion-2.0-Mistral-7B.i1-Q2_K.gguf) | i1-Q2_K | 2.8 | IQ3_XXS probably better | | [GGUF](https://huggingface.co/mradermacher/Hyperion-2.0-Mistral-7B-i1-GGUF/resolve/main/Hyperion-2.0-Mistral-7B.i1-IQ3_XXS.gguf) | i1-IQ3_XXS | 2.9 | lower quality | | [GGUF](https://huggingface.co/mradermacher/Hyperion-2.0-Mistral-7B-i1-GGUF/resolve/main/Hyperion-2.0-Mistral-7B.i1-IQ3_XS.gguf) | i1-IQ3_XS | 3.1 | | | [GGUF](https://huggingface.co/mradermacher/Hyperion-2.0-Mistral-7B-i1-GGUF/resolve/main/Hyperion-2.0-Mistral-7B.i1-Q3_K_S.gguf) | i1-Q3_K_S | 3.3 | IQ3_XS probably better | | [GGUF](https://huggingface.co/mradermacher/Hyperion-2.0-Mistral-7B-i1-GGUF/resolve/main/Hyperion-2.0-Mistral-7B.i1-IQ3_S.gguf) | i1-IQ3_S | 3.3 | beats Q3_K* | | [GGUF](https://huggingface.co/mradermacher/Hyperion-2.0-Mistral-7B-i1-GGUF/resolve/main/Hyperion-2.0-Mistral-7B.i1-IQ3_M.gguf) | i1-IQ3_M | 3.4 | | | [GGUF](https://huggingface.co/mradermacher/Hyperion-2.0-Mistral-7B-i1-GGUF/resolve/main/Hyperion-2.0-Mistral-7B.i1-Q3_K_M.gguf) | i1-Q3_K_M | 3.6 | IQ3_S probably better | | [GGUF](https://huggingface.co/mradermacher/Hyperion-2.0-Mistral-7B-i1-GGUF/resolve/main/Hyperion-2.0-Mistral-7B.i1-Q3_K_L.gguf) | i1-Q3_K_L | 3.9 | IQ3_M probably better | | [GGUF](https://huggingface.co/mradermacher/Hyperion-2.0-Mistral-7B-i1-GGUF/resolve/main/Hyperion-2.0-Mistral-7B.i1-IQ4_XS.gguf) | i1-IQ4_XS | 4.0 | | | [GGUF](https://huggingface.co/mradermacher/Hyperion-2.0-Mistral-7B-i1-GGUF/resolve/main/Hyperion-2.0-Mistral-7B.i1-Q4_0_4_4.gguf) | 
i1-Q4_0_4_4 | 4.2 | fast on arm, low quality | | [GGUF](https://huggingface.co/mradermacher/Hyperion-2.0-Mistral-7B-i1-GGUF/resolve/main/Hyperion-2.0-Mistral-7B.i1-Q4_0_4_8.gguf) | i1-Q4_0_4_8 | 4.2 | fast on arm+i8mm, low quality | | [GGUF](https://huggingface.co/mradermacher/Hyperion-2.0-Mistral-7B-i1-GGUF/resolve/main/Hyperion-2.0-Mistral-7B.i1-Q4_0_8_8.gguf) | i1-Q4_0_8_8 | 4.2 | fast on arm+sve, low quality | | [GGUF](https://huggingface.co/mradermacher/Hyperion-2.0-Mistral-7B-i1-GGUF/resolve/main/Hyperion-2.0-Mistral-7B.i1-Q4_0.gguf) | i1-Q4_0 | 4.2 | fast, low quality | | [GGUF](https://huggingface.co/mradermacher/Hyperion-2.0-Mistral-7B-i1-GGUF/resolve/main/Hyperion-2.0-Mistral-7B.i1-Q4_K_S.gguf) | i1-Q4_K_S | 4.2 | optimal size/speed/quality | | [GGUF](https://huggingface.co/mradermacher/Hyperion-2.0-Mistral-7B-i1-GGUF/resolve/main/Hyperion-2.0-Mistral-7B.i1-Q4_K_M.gguf) | i1-Q4_K_M | 4.5 | fast, recommended | | [GGUF](https://huggingface.co/mradermacher/Hyperion-2.0-Mistral-7B-i1-GGUF/resolve/main/Hyperion-2.0-Mistral-7B.i1-Q5_K_S.gguf) | i1-Q5_K_S | 5.1 | | | [GGUF](https://huggingface.co/mradermacher/Hyperion-2.0-Mistral-7B-i1-GGUF/resolve/main/Hyperion-2.0-Mistral-7B.i1-Q5_K_M.gguf) | i1-Q5_K_M | 5.2 | | | [GGUF](https://huggingface.co/mradermacher/Hyperion-2.0-Mistral-7B-i1-GGUF/resolve/main/Hyperion-2.0-Mistral-7B.i1-Q6_K.gguf) | i1-Q6_K | 6.0 | practically like static Q6_K | Here is a handy graph by ikawrakow comparing some lower-quality quant types (lower is better): ![image.png](https://www.nethype.de/huggingface_embed/quantpplgraph.png) And here are Artefact2's thoughts on the matter: https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9 ## FAQ / Model Request See https://huggingface.co/mradermacher/model_requests for some answers to questions you might have and/or if you want some other model quantized. ## Thanks I thank my company, [nethype GmbH](https://www.nethype.de/), for letting me use its servers and providing upgrades to my workstation to enable this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to. <!-- end -->
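The Usage section above mentions concatenating multi-part files. As a hedged sketch (the `part*` naming scheme is an assumption, and this particular repo's files may not be split at all), the parts can simply be joined byte-for-byte:

```python
# Hedged sketch: reassemble a split GGUF by appending its parts in order.
import glob
import shutil

parts = sorted(glob.glob("Hyperion-2.0-Mistral-7B.i1-Q4_K_M.gguf.part*"))
with open("Hyperion-2.0-Mistral-7B.i1-Q4_K_M.gguf", "wb") as out:
    for part in parts:
        with open(part, "rb") as src:
            shutil.copyfileobj(src, out)  # straight byte-for-byte append
```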
DavidAU/DarkSapling-V1-Ultra-Quality-7B-GGUF
DavidAU
2024-11-14T05:31:59Z
33
1
null
[ "gguf", "creative", "story", "roleplay", "rp", "32 bit upscale", "remastered", "writing", "text-generation", "en", "license:apache-2.0", "endpoints_compatible", "region:us" ]
text-generation
2024-06-03T11:48:35Z
---
license: apache-2.0
language:
- en
tags:
- creative
- story
- roleplay
- rp
- 32 bit upscale
- remastered
- writing
pipeline_tag: text-generation
---

<h3><font color="red"> Dark Sapling V1 7B - 32k Context - Ultra Quality - 32 bit upscale.</font></h3>

<img src="dark-sapling.jpg" style="width:300px; height:300px; float:right; padding:10px;">

Complete remerge and remaster of the incredible Dark Sapling V1 7B - 32k Context from source files.

Registering an impressive perplexity drop of 1000 points (lower is better) at Q4KM.

This puts "Q4KM" operating at "Q6" levels, and further elevates Q6 and Q8 as well.

Likewise, even Q2K (the smallest quant) will operate at a much higher level than its original source counterpart.

<B>RESULTS:</b>

The result is superior performance in instruction following, reasoning, depth, nuance and emotion. Prompts can be smaller, as the model understands nuance better, and as a side effect more context is available for output due to the reduction in prompt size.

Note that there will be an outsized difference between quants, especially for creative and/or "no right answer" use cases. Because of this, it is suggested to download the highest quant you can operate, plus its closest neighbours, so to speak - IE: Q4KS, Q4KM, Q5KS as an example.

Imatrix Plus versions to be uploaded at a separate repo shortly.

Special thanks to "TEEZEE", the original model creator:

[ https://huggingface.co/TeeZee/DarkSapling-7B-v1.0 ]

NOTE: Version 1.1 and Version 2 are also remastered.

<B>Highest Quality Settings / Optimal Operation Guide / Parameters and Samplers</B>

This is a "Class 1" model: For all settings used for this model (including specifics for its "class"), including example generation(s), and for the advanced settings guide (which many times addresses any model issue(s)), including methods to improve model performance for all use case(s) as well as chat, roleplay and other use case(s), please see:

[ https://huggingface.co/DavidAU/Maximizing-Model-Performance-All-Quants-Types-And-Full-Precision-by-Samplers_Parameters ]
DavidAU/Dark-Forest-V2-Ultra-Quality-20b-GGUF
DavidAU
2024-11-14T05:31:17Z
124
4
null
[ "gguf", "story", "roleplay", "creative", "rp", "fantasy", "story telling", "32 bit upscale", "ultra high precision", "nsfw", "text-generation", "en", "license:apache-2.0", "endpoints_compatible", "region:us" ]
text-generation
2024-06-01T13:57:52Z
---
license: apache-2.0
language:
- en
tags:
- story
- roleplay
- creative
- rp
- fantasy
- story telling
- 32 bit upscale
- ultra high precision
- nsfw
pipeline_tag: text-generation
---

<B> Ultra High Quality - 20B Dark Forest Version 2.0 - 32 bit upscale </b>

Fully rebuilt from master files, including full merge(s) to maintain full 32 bit precision right up until it is compressed into GGUF files, which results in a top-to-bottom upgrade.

The result is superior performance in instruction following, reasoning, depth, nuance and emotion.

NOTE: There are two original versions of "Dark Forest 20B"; this is an upscale of the second version.

<img src="dark-forest.jpg">

On average this means a q4km operates at Q6 levels, and Q6 and Q8 exceed the original model's full-precision performance.

Perplexity drop (lower is better) is close to 10% (over 752 points for q4km) for all quants.

That means precision has been enhanced for all 20 billion parameters, which affects "brain density" / "function", instruction following and output quality.

Imatrix quants to follow shortly.

For more details, including a list of enhancements, see our other 32 bit upscale, the "Space Whale 20B" rebuild, here:

[ https://huggingface.co/DavidAU/Psyonic-Cetacean-Ultra-Quality-20b-GGUF ]

For Version 1 of Dark Forest Ultra Quality 32 bit upscale please go here:

[ https://huggingface.co/DavidAU/Dark-Forest-V1-Ultra-Quality-20b-GGUF ]

Special thanks to "TEEZEE" for making both fantastic "Dark Forest" models.

<b> Info from the original model card: </B>

Warning: This model can produce NSFW content!

Results:

- main difference to v1.0 - model has a much better sense of humor.
- produces SFW and NSFW content without issues, switches context seamlessly.
- good at following instructions.
- good at tracking multiple characters in one scene.
- very creative; scenarios produced are mature and complicated, and the model doesn't shy away from writing about PTSD, mental issues or complicated relationships.
- NSFW output is more creative and surprising than typical LimaRP output.
- definitely for mature audiences, not only because of vivid NSFW content but also because of the overall maturity of the stories it produces.
- This is NOT Harry Potter level storytelling.

For original model spec and information please visit:

[ https://huggingface.co/TeeZee/DarkForest-20B-v2.0 ]

<B>Highest Quality Settings / Optimal Operation Guide / Parameters and Samplers</B>

This is a "Class 2" model: For all settings used for this model (including specifics for its "class"), including example generation(s), and for the advanced settings guide (which many times addresses any model issue(s)), including methods to improve model performance for all use case(s) as well as chat, roleplay and other use case(s), please see:

[ https://huggingface.co/DavidAU/Maximizing-Model-Performance-All-Quants-Types-And-Full-Precision-by-Samplers_Parameters ]
DavidAU/Psyonic-Cetacean-V1-20B-Ultra-Quality-Float32
DavidAU
2024-11-14T05:30:15Z
19
1
transformers
[ "transformers", "safetensors", "llama", "text-generation", "32 bit upscale", "full 32 bit precision", "master files", "en", "license:apache-2.0", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2024-06-01T03:07:25Z
---
license: apache-2.0
language:
- en
tags:
- 32 bit upscale
- full 32 bit precision
- master files
pipeline_tag: text-generation
---

<h3> Master Files for Ultra High Quality Remasters of "Psyonic-Cetacean" 20B </h3>

<img src="science-mad.jpg" width=300px height=300px style="float:right; padding:5px;">

May "Space Whale" swim in the oceans of the universe forever!

This repo contains the full precision (32 bit) master files for the 32 bit upscales created by "DavidAU" of:

https://huggingface.co/DavidAU/Psyonic-Cetacean-Ultra-Quality-20b-GGUF-imatrix

And

https://huggingface.co/DavidAU/Psyonic-Cetacean-Ultra-Quality-20b-GGUF

Please view either repo for details on the remaster's results and other important information.

<font color="red"><B>IMPORTANT NOTES For Maximum Results: </b></font>

These are the "final" result files of the full precision rebuild (including end result merge(s)), minus the GGUF- and Imatrix-level upscaling / adjustments which occur during the "GGUFing" processes.

If you use these to create your own GGUFs, please use "outfile" at F32 for best results. If you use F16, this will reduce the quality by a factor of 2 or higher.

Imatrix processes should use stable dataset(s) of at least 500 "chunks" or more. If smaller dataset(s) are used, this may corrupt or reduce the quality of the Imatrix builds.

Due to the precision remaster, there will be "greater" distance between each quant - both non-Imatrix and Imatrix. IE: The jump in quality, instruction following, "ai brainpower", nuance and output between Q4 and Q5, and likewise Q5 and Q6, will be larger than normal. The same applies to "Imatrix" quants. In addition, there will also be differences between matching Imatrix and non-Imatrix quants, especially in terms of "creative uses" and/or uses where there is no "right answer".

Finally, in terms of prompts: You may find longer prompts are no longer required and/or you may need to reduce the size of prompts in usage. This is a factor due to the precision upscale.

Doing this will ensure the quality of the upscale is maximized in the GGUFs.

/* GPTQers: Suggest 4bit-Act32 TRUE for best results.

/* EXL2ers: Suggest Min 4.5 BPW or higher; 6 BPW and up is especially potent. Strongly suggest you do not reduce layer bit count, as this will affect depth and nuance. The more BPW the better.

Happy GGUFing, EXL2ing, GPTQing, AWQing, HQQing and of course "Merging".

<b>LONG LIVE OPEN SOURCE!</B>

<I>DavidAU</I>

<B>Highest Quality Settings / Optimal Operation Guide / Parameters and Samplers</B>

This is a "Class 2" model: For all settings used for this model (including specifics for its "class"), including example generation(s), and for the advanced settings guide (which many times addresses any model issue(s)), including methods to improve model performance for all use case(s) as well as chat, roleplay and other use case(s), please see:

[ https://huggingface.co/DavidAU/Maximizing-Model-Performance-All-Quants-Types-And-Full-Precision-by-Samplers_Parameters ]

/* Drop me a note when up, and I will link the masters to your repos.
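As a hedged sketch of the F32 "outfile" and Imatrix advice above, the steps might look as follows when driven from Python. The llama.cpp script and binary names follow current layouts and may differ in older checkouts, and `calibration.txt` is a stand-in for your own 500+ chunk dataset:

```python
# Hedged sketch: convert the F32 masters to GGUF, build an imatrix,
# then quantize with it. All file names are placeholders.
import subprocess

# 1) Convert the full-precision masters to a 32-bit GGUF ("outfile" at F32).
subprocess.run(
    ["python", "convert_hf_to_gguf.py", ".",
     "--outtype", "f32", "--outfile", "psyonic-cetacean-20b-f32.gguf"],
    check=True,
)

# 2) Build an importance matrix from a sufficiently large, stable dataset.
subprocess.run(
    ["./llama-imatrix", "-m", "psyonic-cetacean-20b-f32.gguf",
     "-f", "calibration.txt", "-o", "imatrix.dat"],
    check=True,
)

# 3) Quantize using the imatrix.
subprocess.run(
    ["./llama-quantize", "--imatrix", "imatrix.dat",
     "psyonic-cetacean-20b-f32.gguf", "psyonic-cetacean-20b-Q6_K.gguf", "Q6_K"],
    check=True,
)
```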
DavidAU/Dark-Forest-V1-Ultra-Quality-20b-GGUF
DavidAU
2024-11-14T05:28:59Z
80
4
null
[ "gguf", "story", "roleplay", "creative", "rp", "fantasy", "story telling", "32 bit upscale", "ultra high precision", "nsfw", "text-generation", "en", "license:apache-2.0", "endpoints_compatible", "region:us" ]
text-generation
2024-05-30T12:47:09Z
---
license: apache-2.0
language:
- en
tags:
- story
- roleplay
- creative
- rp
- fantasy
- story telling
- 32 bit upscale
- ultra high precision
- nsfw
pipeline_tag: text-generation
---

<B> Ultra High Quality - 20B Dark Forest Version 1.0 - 32 bit upscale </b>

Fully rebuilt from master files, including full merge(s) to maintain full 32 bit precision right up until it is compressed into GGUF files, which results in a top-to-bottom upgrade.

The result is superior performance in instruction following, reasoning, depth, nuance and emotion.

<img src="dark-forest.jpg">

On average this means a q4km operates at Q6 levels, and Q6 and Q8 exceed the original model's full-precision performance.

Perplexity drop (lower is better) is close to 10% (over 600 points for q4km) for all quants.

That means precision has been enhanced for all 20 billion parameters, which affects "brain density" / "function", instruction following and output quality.

Imatrix quants to follow shortly.

For more details, including a list of enhancements, see our other 32 bit upscale, the "Space Whale 20B" rebuild, here:

[ https://huggingface.co/DavidAU/Psyonic-Cetacean-Ultra-Quality-20b-GGUF ]

Special thanks to "TEEZEE" for making a fantastic model.

<b> Info from the original model card: </B>

Warning: This model can produce NSFW content!

Results:

- produces SFW and NSFW content without issues, switches context seamlessly.
- good at following instructions.
- good at tracking multiple characters in one scene.
- very creative; scenarios produced are mature and complicated, and the model doesn't shy away from writing about PTSD, mental issues or complicated relationships.
- NSFW output is more creative and surprising than typical LimaRP output.
- definitely for mature audiences, not only because of vivid NSFW content but also because of the overall maturity of the stories it produces.
- This is NOT Harry Potter level storytelling.

For original model spec and information please visit:

[ https://huggingface.co/TeeZee/DarkForest-20B-v1.0 ]

<B>Highest Quality Settings / Optimal Operation Guide / Parameters and Samplers</B>

This is a "Class 2" model: For all settings used for this model (including specifics for its "class"), including example generation(s), and for the advanced settings guide (which many times addresses any model issue(s)), including methods to improve model performance for all use case(s) as well as chat, roleplay and other use case(s), please see:

[ https://huggingface.co/DavidAU/Maximizing-Model-Performance-All-Quants-Types-And-Full-Precision-by-Samplers_Parameters ]
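The settings guide linked above covers samplers and parameters in depth. As a minimal, hypothetical starting point (illustrative values only, not the guide's recommended "Class 2" settings), a creative-writing run with `llama-cpp-python` could look like this:

```python
# Hedged sketch: a creative-writing completion with explicit samplers.
from llama_cpp import Llama

llm = Llama(
    model_path="Dark-Forest-20B-v1.Q4_K_M.gguf",  # hypothetical file name
    n_ctx=4096,
)
out = llm(
    "Write the opening scene of a dark fantasy story.",
    max_tokens=400,
    temperature=0.9,     # higher temperature for creative output
    top_p=0.95,
    top_k=60,
    repeat_penalty=1.1,  # discourage verbatim repetition
)
print(out["choices"][0]["text"])
```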
mradermacher/speechless-starcoder2-7b-GGUF
mradermacher
2024-11-14T05:25:10Z
66
0
transformers
[ "transformers", "gguf", "code", "en", "dataset:teknium/OpenHermes-2.5", "dataset:TokenBender/python_eval_instruct_51k", "dataset:codefuse-ai/Evol-instruction-66k", "base_model:uukuguy/speechless-starcoder2-7b", "base_model:quantized:uukuguy/speechless-starcoder2-7b", "license:apache-2.0", "endpoints_compatible", "region:us" ]
null
2024-11-12T13:17:56Z
--- base_model: uukuguy/speechless-starcoder2-7b datasets: - teknium/OpenHermes-2.5 - TokenBender/python_eval_instruct_51k - codefuse-ai/Evol-instruction-66k language: - en library_name: transformers license: apache-2.0 quantized_by: mradermacher tags: - code --- ## About <!-- ### quantize_version: 2 --> <!-- ### output_tensor_quantised: 1 --> <!-- ### convert_type: hf --> <!-- ### vocab_type: --> <!-- ### tags: --> static quants of https://huggingface.co/uukuguy/speechless-starcoder2-7b <!-- provided-files --> weighted/imatrix quants are available at https://huggingface.co/mradermacher/speechless-starcoder2-7b-i1-GGUF ## Usage If you are unsure how to use GGUF files, refer to one of [TheBloke's READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for more details, including on how to concatenate multi-part files. ## Provided Quants (sorted by size, not necessarily quality. IQ-quants are often preferable over similar sized non-IQ quants) | Link | Type | Size/GB | Notes | |:-----|:-----|--------:|:------| | [GGUF](https://huggingface.co/mradermacher/speechless-starcoder2-7b-GGUF/resolve/main/speechless-starcoder2-7b.Q2_K.gguf) | Q2_K | 2.9 | | | [GGUF](https://huggingface.co/mradermacher/speechless-starcoder2-7b-GGUF/resolve/main/speechless-starcoder2-7b.Q3_K_S.gguf) | Q3_K_S | 3.3 | | | [GGUF](https://huggingface.co/mradermacher/speechless-starcoder2-7b-GGUF/resolve/main/speechless-starcoder2-7b.Q3_K_M.gguf) | Q3_K_M | 3.8 | lower quality | | [GGUF](https://huggingface.co/mradermacher/speechless-starcoder2-7b-GGUF/resolve/main/speechless-starcoder2-7b.IQ4_XS.gguf) | IQ4_XS | 4.0 | | | [GGUF](https://huggingface.co/mradermacher/speechless-starcoder2-7b-GGUF/resolve/main/speechless-starcoder2-7b.Q3_K_L.gguf) | Q3_K_L | 4.2 | | | [GGUF](https://huggingface.co/mradermacher/speechless-starcoder2-7b-GGUF/resolve/main/speechless-starcoder2-7b.Q4_0_4_4.gguf) | Q4_0_4_4 | 4.2 | fast on arm, low quality | | [GGUF](https://huggingface.co/mradermacher/speechless-starcoder2-7b-GGUF/resolve/main/speechless-starcoder2-7b.Q4_K_S.gguf) | Q4_K_S | 4.2 | fast, recommended | | [GGUF](https://huggingface.co/mradermacher/speechless-starcoder2-7b-GGUF/resolve/main/speechless-starcoder2-7b.Q4_K_M.gguf) | Q4_K_M | 4.6 | fast, recommended | | [GGUF](https://huggingface.co/mradermacher/speechless-starcoder2-7b-GGUF/resolve/main/speechless-starcoder2-7b.Q5_K_S.gguf) | Q5_K_S | 5.1 | | | [GGUF](https://huggingface.co/mradermacher/speechless-starcoder2-7b-GGUF/resolve/main/speechless-starcoder2-7b.Q5_K_M.gguf) | Q5_K_M | 5.3 | | | [GGUF](https://huggingface.co/mradermacher/speechless-starcoder2-7b-GGUF/resolve/main/speechless-starcoder2-7b.Q6_K.gguf) | Q6_K | 6.0 | very good quality | | [GGUF](https://huggingface.co/mradermacher/speechless-starcoder2-7b-GGUF/resolve/main/speechless-starcoder2-7b.Q8_0.gguf) | Q8_0 | 7.7 | fast, best quality | | [GGUF](https://huggingface.co/mradermacher/speechless-starcoder2-7b-GGUF/resolve/main/speechless-starcoder2-7b.f16.gguf) | f16 | 14.5 | 16 bpw, overkill | Here is a handy graph by ikawrakow comparing some lower-quality quant types (lower is better): ![image.png](https://www.nethype.de/huggingface_embed/quantpplgraph.png) And here are Artefact2's thoughts on the matter: https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9 ## FAQ / Model Request See https://huggingface.co/mradermacher/model_requests for some answers to questions you might have and/or if you want some other model quantized. 
## Thanks I thank my company, [nethype GmbH](https://www.nethype.de/), for letting me use its servers and providing upgrades to my workstation to enable this work in my free time. <!-- end -->
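Since this is a code-oriented model, a quick hedged test with `llama-cpp-python`'s chat API might look like this (the file name comes from the table above; chat-template handling is left to the library's defaults):

```python
# Hedged sketch: ask the coding model for a function via the chat API.
from llama_cpp import Llama

llm = Llama(
    model_path="speechless-starcoder2-7b.Q4_K_M.gguf",  # from the table above
    n_ctx=4096,
)
resp = llm.create_chat_completion(
    messages=[
        {"role": "user",
         "content": "Write a Python function that reverses a singly linked list."},
    ],
    max_tokens=300,
)
print(resp["choices"][0]["message"]["content"])
```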
Shreyask09/segmentation-finetune-hindi
Shreyask09
2024-11-14T05:06:57Z
5
0
transformers
[ "transformers", "pytorch", "arxiv:1910.09700", "endpoints_compatible", "region:us" ]
null
2024-11-11T11:18:36Z
--- library_name: transformers tags: [] --- # Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
nicolofelicioni/pythia-1b-sft-hh-normal-12
nicolofelicioni
2024-11-14T04:54:50Z
120
0
transformers
[ "transformers", "safetensors", "gpt_neox", "text-generation", "trl", "dpo", "arxiv:1910.09700", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2024-11-14T04:45:26Z
--- library_name: transformers tags: - trl - dpo --- # Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
ICTNLP/Llama-3.1-8B-Omni
ICTNLP
2024-11-14T04:47:48Z
3,519
393
llama-omni
[ "llama-omni", "safetensors", "omni_speech2s_llama", "large language models", "speech-language models", "speech interaction", "speech-to-speech", "en", "arxiv:2409.06666", "base_model:meta-llama/Llama-3.1-8B-Instruct", "base_model:finetune:meta-llama/Llama-3.1-8B-Instruct", "license:other", "region:us" ]
null
2024-09-10T05:23:09Z
---
license: other
language:
- en
base_model: meta-llama/Meta-Llama-3.1-8B-Instruct
tags:
- large language models
- speech-language models
- speech interaction
- speech-to-speech
library_name: llama-omni
---

# 🦙🎧 LLaMA-Omni: Seamless Speech Interaction with Large Language Models

> **Authors: [Qingkai Fang](https://fangqingkai.github.io/), [Shoutao Guo](https://scholar.google.com/citations?hl=en&user=XwHtPyAAAAAJ), [Yan Zhou](https://zhouyan19.github.io/zhouyan/), [Zhengrui Ma](https://scholar.google.com.hk/citations?user=dUgq6tEAAAAJ), [Shaolei Zhang](https://zhangshaolei1998.github.io/), [Yang Feng*](https://people.ucas.edu.cn/~yangfeng?language=en)**

[[Paper]](https://arxiv.org/abs/2409.06666) [[Model]](https://huggingface.co/ICTNLP/Llama-3.1-8B-Omni) [[Code]](https://github.com/ictnlp/LLaMA-Omni)

LLaMA-Omni is a speech-language model built upon Llama-3.1-8B-Instruct. It supports low-latency and high-quality speech interactions, simultaneously generating both text and speech responses based on speech instructions.

![](images/model.png)

## 💡 Highlights

- 💪 **Built on Llama-3.1-8B-Instruct, ensuring high-quality responses.**
- 🚀 **Low-latency speech interaction with a latency as low as 226ms.**
- 🎧 **Simultaneous generation of both text and speech responses.**
- ♻️ **Trained in less than 3 days using just 4 GPUs.**

<video controls autoplay src="https://cdn-uploads.huggingface.co/production/uploads/65b7573482d384513443875e/dr4XWUxzuVQ52lBuzNBTt.mp4"></video>

## Install

1. Clone this repository.

```shell
git clone https://github.com/ictnlp/LLaMA-Omni
cd LLaMA-Omni
```

2. Install packages.

```shell
conda create -n llama-omni python=3.10
conda activate llama-omni
pip install pip==24.0
pip install -e .
```

3. Install `fairseq`.

```shell
git clone https://github.com/pytorch/fairseq
cd fairseq
pip install -e . --no-build-isolation
```

4. Install `flash-attention`.

```shell
pip install flash-attn --no-build-isolation
```

## Quick Start

1. Download the `Llama-3.1-8B-Omni` model from 🤗[Huggingface](https://huggingface.co/ICTNLP/Llama-3.1-8B-Omni).

2. Download the `Whisper-large-v3` model.

```python
import whisper
model = whisper.load_model("large-v3", download_root="models/speech_encoder/")
```

3. Download the unit-based HiFi-GAN vocoder.

```shell
wget https://dl.fbaipublicfiles.com/fairseq/speech_to_speech/vocoder/code_hifigan/mhubert_vp_en_es_fr_it3_400k_layer11_km1000_lj/g_00500000 -P vocoder/
wget https://dl.fbaipublicfiles.com/fairseq/speech_to_speech/vocoder/code_hifigan/mhubert_vp_en_es_fr_it3_400k_layer11_km1000_lj/config.json -P vocoder/
```

## Gradio Demo

1. Launch a controller.

```shell
python -m omni_speech.serve.controller --host 0.0.0.0 --port 10000
```

2. Launch a Gradio web server.

```shell
python -m omni_speech.serve.gradio_web_server --controller http://localhost:10000 --port 8000 --model-list-mode reload --vocoder vocoder/g_00500000 --vocoder-cfg vocoder/config.json
```

3. Launch a model worker.

```shell
python -m omni_speech.serve.model_worker --host 0.0.0.0 --controller http://localhost:10000 --port 40000 --worker http://localhost:40000 --model-path Llama-3.1-8B-Omni --model-name Llama-3.1-8B-Omni --s2s
```

4. Visit [http://localhost:8000/](http://localhost:8000/) and interact with LLaMA-3.1-8B-Omni!

**Note: Due to the instability of streaming audio playback in Gradio, we have only implemented streaming audio synthesis without enabling autoplay. If you have a good solution, feel free to submit a PR. Thanks!**

## Local Inference

To run inference locally, please organize the speech instruction files according to the format in the `omni_speech/infer/examples` directory, then refer to the following script.

```shell
bash omni_speech/infer/run.sh omni_speech/infer/examples
```

## LICENSE

Our code is released under the Apache-2.0 License. Our model is intended for academic research purposes only and may **NOT** be used for commercial purposes.

You are free to use, modify, and distribute this model in academic settings, provided that the following conditions are met:

- **Non-commercial use**: The model may not be used for any commercial purposes.
- **Citation**: If you use this model in your research, please cite the original work.

### Commercial Use Restriction

For any commercial use inquiries or to obtain a commercial license, please contact `fengyang@ict.ac.cn`.

## Acknowledgements

- [LLaVA](https://github.com/haotian-liu/LLaVA): The codebase we built upon.
- [SLAM-LLM](https://github.com/X-LANCE/SLAM-LLM): We borrow some of the speech encoder and speech adapter code.

## Citation

If you have any questions, please feel free to submit an issue or contact `fangqingkai21b@ict.ac.cn`.

If our work is useful for you, please cite as:

```
@article{fang-etal-2024-llama-omni,
  title={LLaMA-Omni: Seamless Speech Interaction with Large Language Models},
  author={Fang, Qingkai and Guo, Shoutao and Zhou, Yan and Ma, Zhengrui and Zhang, Shaolei and Feng, Yang},
  journal={arXiv preprint arXiv:2409.06666},
  year={2024}
}
```
kazshinohara/gemma-2-2b-int-jpn-lora1
kazshinohara
2024-11-14T04:44:57Z
97
0
transformers
[ "transformers", "tensorboard", "safetensors", "gemma2", "text-generation", "generated_from_trainer", "trl", "sft", "conversational", "base_model:google/gemma-2-2b-jpn-it", "base_model:finetune:google/gemma-2-2b-jpn-it", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2024-11-13T16:21:46Z
--- base_model: google/gemma-2-2b-jpn-it library_name: transformers model_name: gemma-2-2b-int-jpn-lora1 tags: - generated_from_trainer - trl - sft licence: license --- # Model Card for gemma-2-2b-int-jpn-lora1 This model is a fine-tuned version of [google/gemma-2-2b-jpn-it](https://huggingface.co/google/gemma-2-2b-jpn-it). It has been trained using [TRL](https://github.com/huggingface/trl). ## Quick start ```python from transformers import pipeline question = "If you had a time machine, but could only go to the past or the future once and never return, which would you choose and why?" generator = pipeline("text-generation", model="kazshinohara/gemma-2-2b-int-jpn-lora1", device="cuda") output = generator([{"role": "user", "content": question}], max_new_tokens=128, return_full_text=False)[0] print(output["generated_text"]) ``` ## Training procedure This model was trained with SFT. ### Framework versions - TRL: 0.12.0 - Transformers: 4.46.2 - Pytorch: 2.5.1 - Datasets: 3.1.0 - Tokenizers: 0.20.3 ## Citations Cite TRL as: ```bibtex @misc{vonwerra2022trl, title = {{TRL: Transformer Reinforcement Learning}}, author = {Leandro von Werra and Younes Belkada and Lewis Tunstall and Edward Beeching and Tristan Thrush and Nathan Lambert and Shengyi Huang and Kashif Rasul and Quentin Gallouédec}, year = 2020, journal = {GitHub repository}, publisher = {GitHub}, howpublished = {\url{https://github.com/huggingface/trl}} } ```
asr-africa/wav2vec2-xls-r-300m-lg-CV-414hrs-v10
asr-africa
2024-11-14T04:44:29Z
5
0
transformers
[ "transformers", "safetensors", "wav2vec2", "automatic-speech-recognition", "generated_from_trainer", "base_model:facebook/wav2vec2-xls-r-300m", "base_model:finetune:facebook/wav2vec2-xls-r-300m", "license:apache-2.0", "endpoints_compatible", "region:us" ]
automatic-speech-recognition
2024-11-07T16:39:31Z
--- library_name: transformers license: apache-2.0 base_model: facebook/wav2vec2-xls-r-300m tags: - generated_from_trainer metrics: - wer model-index: - name: wav2vec2-xls-r-300m-lg-CV-414hrs-v10 results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # wav2vec2-xls-r-300m-lg-CV-414hrs-v10 This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on the None dataset. It achieves the following results on the evaluation set: - Loss: 0.3742 - Wer: 0.1098 - Cer: 0.0343 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0003 - train_batch_size: 4 - eval_batch_size: 2 - seed: 42 - gradient_accumulation_steps: 2 - total_train_batch_size: 8 - optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments - lr_scheduler_type: linear - num_epochs: 70 - mixed_precision_training: Native AMP ### Training results | Training Loss | Epoch | Step | Validation Loss | Wer | Cer | |:-------------:|:-----:|:-------:|:---------------:|:------:|:------:| | 0.4484 | 1.0 | 32306 | 0.3944 | 0.3772 | 0.0986 | | 0.2653 | 2.0 | 64612 | 0.3131 | 0.3193 | 0.0836 | | 0.226 | 3.0 | 96918 | 0.3028 | 0.2965 | 0.0784 | | 0.2019 | 4.0 | 129224 | 0.2732 | 0.2778 | 0.0721 | | 0.1849 | 5.0 | 161530 | 0.2768 | 0.2626 | 0.0685 | | 0.1712 | 6.0 | 193836 | 0.2824 | 0.2594 | 0.0694 | | 0.1589 | 7.0 | 226142 | 0.2490 | 0.2504 | 0.0654 | | 0.1496 | 8.0 | 258448 | 0.2502 | 0.2323 | 0.0621 | | 0.1407 | 9.0 | 290754 | 0.2438 | 0.2273 | 0.0598 | | 0.1333 | 10.0 | 323060 | 0.2510 | 0.2239 | 0.0613 | | 0.1265 | 11.0 | 355366 | 0.2317 | 0.2227 | 0.0594 | | 0.12 | 12.0 | 387672 | 0.2839 | 0.2147 | 0.0579 | | 0.1137 | 13.0 | 419978 | 0.2698 | 0.2073 | 0.0570 | | 0.108 | 14.0 | 452284 | 0.2360 | 0.2055 | 0.0554 | | 0.1026 | 15.0 | 484590 | 0.2390 | 0.1995 | 0.0552 | | 0.0975 | 16.0 | 516896 | 0.2494 | 0.1951 | 0.0530 | | 0.0925 | 17.0 | 549202 | 0.2465 | 0.1925 | 0.0542 | | 0.088 | 18.0 | 581508 | 0.2313 | 0.1870 | 0.0530 | | 0.0832 | 19.0 | 613814 | 0.2468 | 0.1856 | 0.0513 | | 0.0794 | 20.0 | 646120 | 0.2410 | 0.1905 | 0.0513 | | 0.0753 | 21.0 | 678426 | 0.2430 | 0.1787 | 0.0492 | | 0.0721 | 22.0 | 710732 | 0.2373 | 0.1774 | 0.0498 | | 0.0686 | 23.0 | 743038 | 0.2618 | 0.1774 | 0.0501 | | 0.0651 | 24.0 | 775344 | 0.2468 | 0.1698 | 0.0487 | | 0.0618 | 25.0 | 807650 | 0.2813 | 0.1660 | 0.0477 | | 0.0594 | 26.0 | 839956 | 0.2716 | 0.1674 | 0.0470 | | 0.0565 | 27.0 | 872262 | 0.2525 | 0.1584 | 0.0460 | | 0.0539 | 28.0 | 904568 | 0.2778 | 0.1596 | 0.0466 | | 0.0523 | 29.0 | 936874 | 0.2628 | 0.1588 | 0.0450 | | 0.0502 | 30.0 | 969180 | 0.2699 | 0.1543 | 0.0456 | | 0.0481 | 31.0 | 1001486 | 0.2680 | 0.1517 | 0.0446 | | 0.0461 | 32.0 | 1033792 | 0.3047 | 0.1581 | 0.0460 | | 0.0444 | 33.0 | 1066098 | 0.2902 | 0.1553 | 0.0444 | | 0.0427 | 34.0 | 1098404 | 0.2601 | 0.1506 | 0.0432 | | 0.0412 | 35.0 | 1130710 | 0.2839 | 0.1483 | 0.0428 | | 0.0397 | 36.0 | 1163016 | 0.2759 | 0.1387 | 0.0418 | | 0.0383 | 37.0 | 1195322 | 0.2770 | 0.1501 | 0.0427 | | 0.0371 | 38.0 | 1227628 | 0.2885 | 0.1424 | 0.0423 | | 0.036 | 39.0 | 1259934 | 0.2883 | 0.1421 | 0.0418 | | 0.0346 | 
40.0 | 1292240 | 0.2915 | 0.1383 | 0.0414 | | 0.0337 | 41.0 | 1324546 | 0.2842 | 0.1390 | 0.0408 | | 0.0324 | 42.0 | 1356852 | 0.3078 | 0.1356 | 0.0412 | | 0.0313 | 43.0 | 1389158 | 0.3268 | 0.1375 | 0.0401 | | 0.0306 | 44.0 | 1421464 | 0.3019 | 0.1364 | 0.0411 | | 0.0295 | 45.0 | 1453770 | 0.3077 | 0.1351 | 0.0401 | | 0.0284 | 46.0 | 1486076 | 0.3226 | 0.1352 | 0.0404 | | 0.0274 | 47.0 | 1518382 | 0.2993 | 0.1335 | 0.0401 | | 0.0264 | 48.0 | 1550688 | 0.3139 | 0.1302 | 0.0384 | | 0.0256 | 49.0 | 1582994 | 0.3338 | 0.1331 | 0.0397 | | 0.025 | 50.0 | 1615300 | 0.3323 | 0.1260 | 0.0388 | | 0.024 | 51.0 | 1647606 | 0.3184 | 0.1291 | 0.0390 | | 0.0233 | 52.0 | 1679912 | 0.3239 | 0.1308 | 0.0385 | | 0.0226 | 53.0 | 1712218 | 0.3308 | 0.1250 | 0.0377 | | 0.0218 | 54.0 | 1744524 | 0.3498 | 0.1296 | 0.0384 | | 0.0212 | 55.0 | 1776830 | 0.3390 | 0.1246 | 0.0374 | | 0.0204 | 56.0 | 1809136 | 0.3439 | 0.1259 | 0.0379 | | 0.0197 | 57.0 | 1841442 | 0.3490 | 0.1227 | 0.0374 | | 0.019 | 58.0 | 1873748 | 0.3406 | 0.1228 | 0.0382 | | 0.0184 | 59.0 | 1906054 | 0.3401 | 0.1213 | 0.0374 | | 0.0179 | 60.0 | 1938360 | 0.3523 | 0.1206 | 0.0366 | | 0.017 | 61.0 | 1970666 | 0.3844 | 0.1200 | 0.0360 | | 0.0165 | 62.0 | 2002972 | 0.3582 | 0.1183 | 0.0356 | | 0.0162 | 63.0 | 2035278 | 0.3759 | 0.1163 | 0.0356 | | 0.0155 | 64.0 | 2067584 | 0.3641 | 0.1162 | 0.0363 | | 0.015 | 65.0 | 2099890 | 0.3634 | 0.1168 | 0.0345 | | 0.0145 | 66.0 | 2132196 | 0.3739 | 0.1140 | 0.0348 | | 0.014 | 67.0 | 2164502 | 0.3708 | 0.1137 | 0.0347 | | 0.0137 | 68.0 | 2196808 | 0.3715 | 0.1139 | 0.0347 | | 0.0136 | 69.0 | 2229114 | 0.3787 | 0.1114 | 0.0345 | | 0.0131 | 70.0 | 2261420 | 0.3742 | 0.1098 | 0.0343 | ### Framework versions - Transformers 4.46.2 - Pytorch 2.1.0+cu118 - Datasets 3.1.0 - Tokenizers 0.20.3
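For readers who want to reproduce a comparable setup, the hyperparameters listed above map onto `transformers`' `TrainingArguments` roughly as follows (a sketch of the listed values only, not the authors' actual training script):

```python
# Hedged sketch: the listed hyperparameters expressed as TrainingArguments.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="wav2vec2-xls-r-300m-lg-CV-414hrs-v10",
    learning_rate=3e-4,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=2,
    gradient_accumulation_steps=2,  # total train batch size of 8
    num_train_epochs=70,
    lr_scheduler_type="linear",
    seed=42,
    fp16=True,  # "Native AMP" mixed precision
)
```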
MayBashendy/Arabic_FineTuningAraBERT_AugV4-trial2_k20_task1_organization_fold1
MayBashendy
2024-11-14T04:36:40Z
163
0
transformers
[ "transformers", "safetensors", "bert", "text-classification", "generated_from_trainer", "base_model:aubmindlab/bert-base-arabertv02", "base_model:finetune:aubmindlab/bert-base-arabertv02", "autotrain_compatible", "endpoints_compatible", "region:us" ]
text-classification
2024-11-14T04:18:17Z
--- library_name: transformers base_model: aubmindlab/bert-base-arabertv02 tags: - generated_from_trainer model-index: - name: Arabic_FineTuningAraBERT_AugV4-trial2_k20_task1_organization_fold1 results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # Arabic_FineTuningAraBERT_AugV4-trial2_k20_task1_organization_fold1 This model is a fine-tuned version of [aubmindlab/bert-base-arabertv02](https://huggingface.co/aubmindlab/bert-base-arabertv02) on the None dataset. It achieves the following results on the evaluation set: - Loss: 0.5218 - Qwk: 0.7879 - Mse: 0.5218 - Rmse: 0.7224 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 10 ### Training results | Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse | |:-------------:|:------:|:----:|:---------------:|:-------:|:------:|:------:| | No log | 0.0057 | 2 | 5.9553 | 0.0 | 5.9553 | 2.4404 | | No log | 0.0114 | 4 | 4.2144 | -0.0211 | 4.2144 | 2.0529 | | No log | 0.0171 | 6 | 2.8075 | -0.0065 | 2.8075 | 1.6756 | | No log | 0.0228 | 8 | 1.7257 | -0.0460 | 1.7257 | 1.3137 | | No log | 0.0285 | 10 | 1.1372 | 0.1064 | 1.1372 | 1.0664 | | No log | 0.0342 | 12 | 1.3871 | -0.2466 | 1.3871 | 1.1778 | | No log | 0.0399 | 14 | 1.4026 | -0.1395 | 1.4026 | 1.1843 | | No log | 0.0456 | 16 | 1.1001 | 0.1967 | 1.1001 | 1.0489 | | No log | 0.0513 | 18 | 0.7877 | 0.2162 | 0.7877 | 0.8875 | | No log | 0.0570 | 20 | 0.7201 | 0.4 | 0.7201 | 0.8486 | | No log | 0.0627 | 22 | 1.3037 | 0.5 | 1.3037 | 1.1418 | | No log | 0.0684 | 24 | 1.8867 | 0.1390 | 1.8867 | 1.3736 | | No log | 0.0741 | 26 | 1.6223 | 0.0 | 1.6223 | 1.2737 | | No log | 0.0798 | 28 | 1.2141 | 0.1529 | 1.2141 | 1.1018 | | No log | 0.0855 | 30 | 1.0696 | 0.1529 | 1.0696 | 1.0342 | | No log | 0.0912 | 32 | 1.0124 | 0.2125 | 1.0124 | 1.0062 | | No log | 0.0969 | 34 | 0.8812 | 0.4 | 0.8812 | 0.9387 | | No log | 0.1026 | 36 | 0.9559 | 0.3253 | 0.9559 | 0.9777 | | No log | 0.1083 | 38 | 1.1711 | 0.1529 | 1.1711 | 1.0822 | | No log | 0.1140 | 40 | 1.2515 | 0.1529 | 1.2515 | 1.1187 | | No log | 0.1197 | 42 | 1.0380 | 0.3558 | 1.0380 | 1.0188 | | No log | 0.1254 | 44 | 0.6844 | 0.3834 | 0.6844 | 0.8273 | | No log | 0.1311 | 46 | 0.6623 | 0.2857 | 0.6623 | 0.8138 | | No log | 0.1368 | 48 | 0.7863 | 0.4 | 0.7863 | 0.8867 | | No log | 0.1425 | 50 | 1.1069 | 0.1529 | 1.1069 | 1.0521 | | No log | 0.1481 | 52 | 1.4507 | 0.0 | 1.4507 | 1.2044 | | No log | 0.1538 | 54 | 1.6077 | 0.0 | 1.6077 | 1.2680 | | No log | 0.1595 | 56 | 1.6080 | 0.0637 | 1.6080 | 1.2681 | | No log | 0.1652 | 58 | 1.3665 | 0.0637 | 1.3665 | 1.1690 | | No log | 0.1709 | 60 | 1.1490 | 0.2125 | 1.1490 | 1.0719 | | No log | 0.1766 | 62 | 1.1034 | 0.1840 | 1.1034 | 1.0504 | | No log | 0.1823 | 64 | 0.9229 | 0.2410 | 0.9229 | 0.9607 | | No log | 0.1880 | 66 | 0.9113 | 0.3787 | 0.9113 | 0.9546 | | No log | 0.1937 | 68 | 1.0245 | 0.3253 | 1.0245 | 1.0122 | | No log | 0.1994 | 70 | 1.2188 | 0.3000 | 1.2188 | 1.1040 | | No log | 0.2051 | 72 | 1.2769 | 0.3860 | 1.2769 | 1.1300 | | No log | 0.2108 | 74 | 1.4331 | 
0.3352 | 1.4331 | 1.1971 |
| No log | 0.2165 | 76 | 1.3842 | 0.2888 | 1.3842 | 1.1765 |
| No log | 0.2222 | 78 | 1.1143 | 0.1529 | 1.1143 | 1.0556 |
| No log | 0.2279 | 80 | 0.9083 | 0.3488 | 0.9083 | 0.9530 |
| No log | 0.2336 | 82 | 1.0321 | 0.2410 | 1.0321 | 1.0159 |
| No log | 0.2393 | 84 | 1.5326 | 0.4615 | 1.5326 | 1.2380 |
| No log | 0.2450 | 86 | 1.8512 | 0.3083 | 1.8512 | 1.3606 |
| No log | 0.2507 | 88 | 1.5496 | 0.4919 | 1.5496 | 1.2448 |
| No log | 0.2564 | 90 | 1.0875 | 0.5149 | 1.0875 | 1.0428 |
| No log | 0.2621 | 92 | 0.8297 | 0.5024 | 0.8297 | 0.9109 |
| No log | 0.2678 | 94 | 0.8185 | 0.5962 | 0.8185 | 0.9047 |
| No log | 0.2735 | 96 | 1.1174 | 0.6345 | 1.1174 | 1.0571 |
| No log | 0.2792 | 98 | 1.6068 | 0.4981 | 1.6068 | 1.2676 |
| No log | 0.2849 | 100 | 1.9906 | 0.4815 | 1.9906 | 1.4109 |
| No log | 0.2906 | 102 | 1.6395 | 0.4656 | 1.6395 | 1.2804 |
| No log | 0.2963 | 104 | 1.2300 | 0.5092 | 1.2300 | 1.1091 |
| No log | 0.3020 | 106 | 0.7815 | 0.6061 | 0.7815 | 0.8840 |
| No log | 0.3077 | 108 | 0.8095 | 0.5161 | 0.8095 | 0.8997 |
| No log | 0.3134 | 110 | 1.0534 | 0.3269 | 1.0534 | 1.0264 |
| No log | 0.3191 | 112 | 1.7102 | 0.3951 | 1.7102 | 1.3078 |
| No log | 0.3248 | 114 | 2.6652 | 0.3456 | 2.6652 | 1.6325 |
| No log | 0.3305 | 116 | 2.8202 | 0.3636 | 2.8202 | 1.6794 |
| No log | 0.3362 | 118 | 1.9283 | 0.5364 | 1.9283 | 1.3886 |
| No log | 0.3419 | 120 | 1.4794 | 0.5180 | 1.4794 | 1.2163 |
| No log | 0.3476 | 122 | 1.3102 | 0.5056 | 1.3102 | 1.1446 |
| No log | 0.3533 | 124 | 1.1877 | 0.5221 | 1.1877 | 1.0898 |
| No log | 0.3590 | 126 | 0.9663 | 0.6557 | 0.9663 | 0.9830 |
| No log | 0.3647 | 128 | 1.0351 | 0.6224 | 1.0351 | 1.0174 |
| No log | 0.3704 | 130 | 1.4497 | 0.4488 | 1.4497 | 1.2041 |
| No log | 0.3761 | 132 | 1.7251 | 0.5191 | 1.7251 | 1.3134 |
| No log | 0.3818 | 134 | 1.8332 | 0.5128 | 1.8332 | 1.3540 |
| No log | 0.3875 | 136 | 1.6450 | 0.5128 | 1.6450 | 1.2826 |
| No log | 0.3932 | 138 | 1.1849 | 0.4825 | 1.1849 | 1.0885 |
| No log | 0.3989 | 140 | 1.1002 | 0.5475 | 1.1002 | 1.0489 |
| No log | 0.4046 | 142 | 1.1949 | 0.5154 | 1.1949 | 1.0931 |
| No log | 0.4103 | 144 | 1.1046 | 0.5221 | 1.1046 | 1.0510 |
| No log | 0.4160 | 146 | 0.9656 | 0.7449 | 0.9656 | 0.9827 |
| No log | 0.4217 | 148 | 1.1384 | 0.5370 | 1.1384 | 1.0670 |
| No log | 0.4274 | 150 | 1.6847 | 0.4488 | 1.6847 | 1.2979 |
| No log | 0.4330 | 152 | 2.0021 | 0.4390 | 2.0021 | 1.4149 |
| No log | 0.4387 | 154 | 1.8882 | 0.4453 | 1.8882 | 1.3741 |
| No log | 0.4444 | 156 | 1.4650 | 0.5299 | 1.4650 | 1.2104 |
| No log | 0.4501 | 158 | 1.2472 | 0.5692 | 1.2472 | 1.1168 |
| No log | 0.4558 | 160 | 1.4563 | 0.5299 | 1.4563 | 1.2068 |
| No log | 0.4615 | 162 | 1.7305 | 0.4865 | 1.7305 | 1.3155 |
| No log | 0.4672 | 164 | 1.7899 | 0.3984 | 1.7899 | 1.3379 |
| No log | 0.4729 | 166 | 1.5103 | 0.5039 | 1.5103 | 1.2289 |
| No log | 0.4786 | 168 | 1.1861 | 0.6128 | 1.1861 | 1.0891 |
| No log | 0.4843 | 170 | 1.0174 | 0.6471 | 1.0174 | 1.0087 |
| No log | 0.4900 | 172 | 1.1351 | 0.6471 | 1.1351 | 1.0654 |
| No log | 0.4957 | 174 | 1.4432 | 0.4143 | 1.4432 | 1.2013 |
| No log | 0.5014 | 176 | 1.8686 | 0.4494 | 1.8686 | 1.3670 |
| No log | 0.5071 | 178 | 1.7942 | 0.4296 | 1.7942 | 1.3395 |
| No log | 0.5128 | 180 | 1.4394 | 0.4776 | 1.4394 | 1.1997 |
| No log | 0.5185 | 182 | 1.4863 | 0.4776 | 1.4863 | 1.2191 |
| No log | 0.5242 | 184 | 1.3825 | 0.5154 | 1.3825 | 1.1758 |
| No log | 0.5299 | 186 | 1.2609 | 0.5221 | 1.2609 | 1.1229 |
| No log | 0.5356 | 188 | 1.1375 | 0.5221 | 1.1375 | 1.0665 |
| No log | 0.5413 | 190 | 0.9382 | 0.6557 | 0.9382 | 0.9686 |
| No log | 0.5470 | 192 | 0.9337 | 0.5556 | 0.9337 | 0.9663 |
| No log | 0.5527 | 194 | 1.0862 | 0.5221 | 1.0862 | 1.0422 |
| No log | 0.5584 | 196 | 1.0262 | 0.5221 | 1.0262 | 1.0130 |
| No log | 0.5641 | 198 | 0.8944 | 0.7050 | 0.8944 | 0.9457 |
| No log | 0.5698 | 200 | 0.8957 | 0.7050 | 0.8957 | 0.9464 |
| No log | 0.5755 | 202 | 0.8694 | 0.7482 | 0.8694 | 0.9324 |
| No log | 0.5812 | 204 | 0.9287 | 0.7063 | 0.9287 | 0.9637 |
| No log | 0.5869 | 206 | 0.9906 | 0.6290 | 0.9906 | 0.9953 |
| No log | 0.5926 | 208 | 1.0959 | 0.5911 | 1.0959 | 1.0469 |
| No log | 0.5983 | 210 | 1.3089 | 0.5911 | 1.3089 | 1.1441 |
| No log | 0.6040 | 212 | 1.5201 | 0.6216 | 1.5201 | 1.2329 |
| No log | 0.6097 | 214 | 1.4917 | 0.6020 | 1.4917 | 1.2213 |
| No log | 0.6154 | 216 | 1.2015 | 0.5368 | 1.2015 | 1.0961 |
| No log | 0.6211 | 218 | 0.8775 | 0.3171 | 0.8775 | 0.9368 |
| No log | 0.6268 | 220 | 0.8406 | 0.4096 | 0.8406 | 0.9168 |
| No log | 0.6325 | 222 | 0.9226 | 0.664 | 0.9226 | 0.9605 |
| No log | 0.6382 | 224 | 1.1841 | 0.5625 | 1.1841 | 1.0882 |
| No log | 0.6439 | 226 | 1.2157 | 0.5609 | 1.2157 | 1.1026 |
| No log | 0.6496 | 228 | 1.2851 | 0.4750 | 1.2851 | 1.1336 |
| No log | 0.6553 | 230 | 1.5040 | 0.4615 | 1.5040 | 1.2264 |
| No log | 0.6610 | 232 | 1.8073 | 0.4815 | 1.8073 | 1.3444 |
| No log | 0.6667 | 234 | 1.5648 | 0.4615 | 1.5648 | 1.2509 |
| No log | 0.6724 | 236 | 1.5404 | 0.4122 | 1.5404 | 1.2411 |
| No log | 0.6781 | 238 | 1.4223 | 0.4122 | 1.4223 | 1.1926 |
| No log | 0.6838 | 240 | 1.2633 | 0.5103 | 1.2633 | 1.1239 |
| No log | 0.6895 | 242 | 1.1589 | 0.6345 | 1.1589 | 1.0765 |
| No log | 0.6952 | 244 | 1.0719 | 0.7222 | 1.0719 | 1.0353 |
| No log | 0.7009 | 246 | 1.2054 | 0.6016 | 1.2054 | 1.0979 |
| No log | 0.7066 | 248 | 1.0939 | 0.5783 | 1.0939 | 1.0459 |
| No log | 0.7123 | 250 | 0.8682 | 0.6883 | 0.8682 | 0.9318 |
| No log | 0.7179 | 252 | 0.8053 | 0.6192 | 0.8053 | 0.8974 |
| No log | 0.7236 | 254 | 0.8216 | 0.6883 | 0.8216 | 0.9064 |
| No log | 0.7293 | 256 | 0.9169 | 0.6617 | 0.9169 | 0.9576 |
| No log | 0.7350 | 258 | 0.8999 | 0.7709 | 0.8999 | 0.9486 |
| No log | 0.7407 | 260 | 1.1287 | 0.6557 | 1.1287 | 1.0624 |
| No log | 0.7464 | 262 | 1.3315 | 0.6291 | 1.3315 | 1.1539 |
| No log | 0.7521 | 264 | 1.5201 | 0.6291 | 1.5201 | 1.2329 |
| No log | 0.7578 | 266 | 1.2832 | 0.6291 | 1.2832 | 1.1328 |
| No log | 0.7635 | 268 | 0.9276 | 0.6828 | 0.9276 | 0.9631 |
| No log | 0.7692 | 270 | 0.9474 | 0.7524 | 0.9474 | 0.9734 |
| No log | 0.7749 | 272 | 0.8579 | 0.7524 | 0.8579 | 0.9262 |
| No log | 0.7806 | 274 | 0.7175 | 0.7926 | 0.7175 | 0.8471 |
| No log | 0.7863 | 276 | 0.8780 | 0.7524 | 0.8780 | 0.9370 |
| No log | 0.7920 | 278 | 0.9402 | 0.7524 | 0.9402 | 0.9696 |
| No log | 0.7977 | 280 | 1.1542 | 0.6755 | 1.1542 | 1.0743 |
| No log | 0.8034 | 282 | 0.9877 | 0.7619 | 0.9877 | 0.9938 |
| No log | 0.8091 | 284 | 0.6614 | 0.8256 | 0.6614 | 0.8133 |
| No log | 0.8148 | 286 | 0.7557 | 0.8042 | 0.7557 | 0.8693 |
| No log | 0.8205 | 288 | 1.1283 | 0.6423 | 1.1283 | 1.0622 |
| No log | 0.8262 | 290 | 1.2134 | 0.6316 | 1.2134 | 1.1016 |
| No log | 0.8319 | 292 | 1.1927 | 0.6316 | 1.1927 | 1.0921 |
| No log | 0.8376 | 294 | 0.9958 | 0.7072 | 0.9958 | 0.9979 |
| No log | 0.8433 | 296 | 0.7609 | 0.7603 | 0.7609 | 0.8723 |
| No log | 0.8490 | 298 | 0.8566 | 0.7529 | 0.8566 | 0.9255 |
| No log | 0.8547 | 300 | 1.2625 | 0.7072 | 1.2625 | 1.1236 |
| No log | 0.8604 | 302 | 1.3835 | 0.5735 | 1.3835 | 1.1762 |
| No log | 0.8661 | 304 | 1.1587 | 0.6769 | 1.1587 | 1.0764 |
| No log | 0.8718 | 306 | 1.1455 | 0.6642 | 1.1455 | 1.0703 |
| No log | 0.8775 | 308 | 1.2996 | 0.6028 | 1.2996 | 1.1400 |
| No log | 0.8832 | 310 | 1.1203 | 0.7475 | 1.1203 | 1.0584 |
| No log | 0.8889 | 312 | 1.1066 | 0.7475 | 1.1066 | 1.0519 |
| No log | 0.8946 | 314 | 1.0546 | 0.7619 | 1.0546 | 1.0270 |
| No log | 0.9003 | 316 | 0.8429 | 0.7879 | 0.8429 | 0.9181 |
| No log | 0.9060 | 318 | 0.6492 | 0.7879 | 0.6492 | 0.8057 |
| No log | 0.9117 | 320 | 0.7571 | 0.7879 | 0.7571 | 0.8701 |
| No log | 0.9174 | 322 | 0.9400 | 0.8123 | 0.9400 | 0.9695 |
| No log | 0.9231 | 324 | 1.2897 | 0.6488 | 1.2897 | 1.1357 |
| No log | 0.9288 | 326 | 1.5118 | 0.6488 | 1.5118 | 1.2296 |
| No log | 0.9345 | 328 | 1.2381 | 0.6488 | 1.2381 | 1.1127 |
| No log | 0.9402 | 330 | 1.1169 | 0.6667 | 1.1169 | 1.0568 |
| No log | 0.9459 | 332 | 0.8591 | 0.7586 | 0.8591 | 0.9269 |
| No log | 0.9516 | 334 | 0.6416 | 0.7812 | 0.6416 | 0.8010 |
| No log | 0.9573 | 336 | 0.7747 | 0.7586 | 0.7747 | 0.8802 |
| No log | 0.9630 | 338 | 1.2774 | 0.5655 | 1.2774 | 1.1302 |
| No log | 0.9687 | 340 | 1.4996 | 0.5655 | 1.4996 | 1.2246 |
| No log | 0.9744 | 342 | 1.2530 | 0.5821 | 1.2530 | 1.1194 |
| No log | 0.9801 | 344 | 0.8006 | 0.7050 | 0.8006 | 0.8947 |
| No log | 0.9858 | 346 | 0.6359 | 0.6529 | 0.6359 | 0.7974 |
| No log | 0.9915 | 348 | 0.7468 | 0.6883 | 0.7468 | 0.8642 |
| No log | 0.9972 | 350 | 0.7922 | 0.6883 | 0.7922 | 0.8901 |
| No log | 1.0028 | 352 | 0.7174 | 0.7510 | 0.7174 | 0.8470 |
| No log | 1.0085 | 354 | 0.6053 | 0.7812 | 0.6053 | 0.7780 |
| No log | 1.0142 | 356 | 0.7384 | 0.7640 | 0.7384 | 0.8593 |
| No log | 1.0199 | 358 | 0.7972 | 0.8218 | 0.7972 | 0.8929 |
| No log | 1.0256 | 360 | 0.6162 | 0.7812 | 0.6162 | 0.7850 |
| No log | 1.0313 | 362 | 0.5996 | 0.7812 | 0.5996 | 0.7743 |
| No log | 1.0370 | 364 | 0.8486 | 0.8218 | 0.8486 | 0.9212 |
| No log | 1.0427 | 366 | 0.9619 | 0.7774 | 0.9619 | 0.9808 |
| No log | 1.0484 | 368 | 0.7380 | 0.8218 | 0.7380 | 0.8591 |
| No log | 1.0541 | 370 | 0.5966 | 0.7758 | 0.5966 | 0.7724 |
| No log | 1.0598 | 372 | 0.7607 | 0.8218 | 0.7607 | 0.8722 |
| No log | 1.0655 | 374 | 1.0126 | 0.7407 | 1.0126 | 1.0063 |
| No log | 1.0712 | 376 | 1.0228 | 0.7083 | 1.0228 | 1.0114 |
| No log | 1.0769 | 378 | 0.8861 | 0.8178 | 0.8861 | 0.9413 |
| No log | 1.0826 | 380 | 0.7424 | 0.7829 | 0.7424 | 0.8616 |
| No log | 1.0883 | 382 | 0.7676 | 0.7829 | 0.7676 | 0.8761 |
| No log | 1.0940 | 384 | 1.0155 | 0.5914 | 1.0155 | 1.0077 |
| No log | 1.0997 | 386 | 1.1670 | 0.5914 | 1.1670 | 1.0803 |
| No log | 1.1054 | 388 | 1.0211 | 0.6231 | 1.0211 | 1.0105 |
| No log | 1.1111 | 390 | 0.9916 | 0.6231 | 0.9916 | 0.9958 |
| No log | 1.1168 | 392 | 0.7788 | 0.7255 | 0.7788 | 0.8825 |
| No log | 1.1225 | 394 | 0.7880 | 0.7508 | 0.7880 | 0.8877 |
| No log | 1.1282 | 396 | 0.8381 | 0.7123 | 0.8381 | 0.9155 |
| No log | 1.1339 | 398 | 0.7144 | 0.7535 | 0.7144 | 0.8452 |
| No log | 1.1396 | 400 | 0.7931 | 0.7535 | 0.7931 | 0.8906 |
| No log | 1.1453 | 402 | 0.8863 | 0.7325 | 0.8863 | 0.9414 |
| No log | 1.1510 | 404 | 0.9858 | 0.75 | 0.9858 | 0.9929 |
| No log | 1.1567 | 406 | 0.7408 | 0.8082 | 0.7408 | 0.8607 |
| No log | 1.1624 | 408 | 0.6042 | 0.7864 | 0.6042 | 0.7773 |
| No log | 1.1681 | 410 | 0.4877 | 0.7103 | 0.4877 | 0.6984 |
| No log | 1.1738 | 412 | 0.5810 | 0.7308 | 0.5810 | 0.7623 |
| No log | 1.1795 | 414 | 0.7975 | 0.7508 | 0.7975 | 0.8930 |
| No log | 1.1852 | 416 | 1.1115 | 0.7325 | 1.1115 | 1.0543 |
| No log | 1.1909 | 418 | 1.2089 | 0.7147 | 1.2089 | 1.0995 |
| No log | 1.1966 | 420 | 1.0030 | 0.7524 | 1.0030 | 1.0015 |
| No log | 1.2023 | 422 | 0.6563 | 0.8121 | 0.6563 | 0.8101 |
| No log | 1.2080 | 424 | 0.5008 | 0.8069 | 0.5008 | 0.7076 |
| No log | 1.2137 | 426 | 0.5078 | 0.7490 | 0.5078 | 0.7126 |
| No log | 1.2194 | 428 | 0.7047 | 0.8063 | 0.7047 | 0.8395 |
| No log | 1.2251 | 430 | 0.9014 | 0.7072 | 0.9014 | 0.9494 |
| No log | 1.2308 | 432 | 0.9999 | 0.7072 | 0.9999 | 0.9999 |
| No log | 1.2365 | 434 | 0.9486 | 0.7072 | 0.9486 | 0.9740 |
| No log | 1.2422 | 436 | 0.7652 | 0.8409 | 0.7652 | 0.8748 |
| No log | 1.2479 | 438 | 0.6211 | 0.7926 | 0.6211 | 0.7881 |
| No log | 1.2536 | 440 | 0.6854 | 0.8082 | 0.6854 | 0.8279 |
| No log | 1.2593 | 442 | 0.8905 | 0.7921 | 0.8905 | 0.9437 |
| No log | 1.2650 | 444 | 0.8078 | 0.8082 | 0.8078 | 0.8988 |
| No log | 1.2707 | 446 | 0.8349 | 0.8082 | 0.8349 | 0.9137 |
| No log | 1.2764 | 448 | 0.8433 | 0.8082 | 0.8433 | 0.9183 |
| No log | 1.2821 | 450 | 0.7623 | 0.8082 | 0.7623 | 0.8731 |
| No log | 1.2877 | 452 | 0.6468 | 0.7758 | 0.6468 | 0.8042 |
| No log | 1.2934 | 454 | 0.6107 | 0.7758 | 0.6107 | 0.7815 |
| No log | 1.2991 | 456 | 0.8541 | 0.8304 | 0.8541 | 0.9241 |
| No log | 1.3048 | 458 | 1.2800 | 0.6975 | 1.2800 | 1.1314 |
| No log | 1.3105 | 460 | 1.2850 | 0.6957 | 1.2850 | 1.1336 |
| No log | 1.3162 | 462 | 0.9916 | 0.7407 | 0.9916 | 0.9958 |
| No log | 1.3219 | 464 | 0.5955 | 0.7758 | 0.5955 | 0.7717 |
| No log | 1.3276 | 466 | 0.4829 | 0.7490 | 0.4829 | 0.6949 |
| No log | 1.3333 | 468 | 0.5647 | 0.8108 | 0.5647 | 0.7515 |
| No log | 1.3390 | 470 | 0.7530 | 0.8123 | 0.7530 | 0.8678 |
| No log | 1.3447 | 472 | 1.0056 | 0.6842 | 1.0056 | 1.0028 |
| No log | 1.3504 | 474 | 1.0738 | 0.7220 | 1.0738 | 1.0363 |
| No log | 1.3561 | 476 | 0.8989 | 0.7279 | 0.8989 | 0.9481 |
| No log | 1.3618 | 478 | 0.6701 | 0.7756 | 0.6701 | 0.8186 |
| No log | 1.3675 | 480 | 0.6359 | 0.7284 | 0.6359 | 0.7975 |
| No log | 1.3732 | 482 | 0.6840 | 0.7756 | 0.6840 | 0.8270 |
| No log | 1.3789 | 484 | 0.8368 | 0.8042 | 0.8368 | 0.9148 |
| No log | 1.3846 | 486 | 0.9271 | 0.6842 | 0.9271 | 0.9628 |
| No log | 1.3903 | 488 | 0.9116 | 0.6842 | 0.9116 | 0.9548 |
| No log | 1.3960 | 490 | 0.6859 | 0.8063 | 0.6859 | 0.8282 |
| No log | 1.4017 | 492 | 0.4929 | 0.7490 | 0.4929 | 0.7021 |
| No log | 1.4074 | 494 | 0.4647 | 0.6807 | 0.4647 | 0.6817 |
| No log | 1.4131 | 496 | 0.5608 | 0.6918 | 0.5608 | 0.7489 |
| No log | 1.4188 | 498 | 0.8401 | 0.6994 | 0.8401 | 0.9166 |
| 0.5626 | 1.4245 | 500 | 1.1310 | 0.72 | 1.1310 | 1.0635 |
| 0.5626 | 1.4302 | 502 | 1.0426 | 0.8013 | 1.0426 | 1.0211 |
| 0.5626 | 1.4359 | 504 | 0.9399 | 0.7712 | 0.9399 | 0.9695 |
| 0.5626 | 1.4416 | 506 | 0.9398 | 0.7941 | 0.9398 | 0.9694 |
| 0.5626 | 1.4473 | 508 | 0.8785 | 0.7658 | 0.8785 | 0.9373 |
| 0.5626 | 1.4530 | 510 | 0.9748 | 0.7726 | 0.9748 | 0.9873 |
| 0.5626 | 1.4587 | 512 | 1.0615 | 0.6934 | 1.0615 | 1.0303 |
| 0.5626 | 1.4644 | 514 | 0.9735 | 0.75 | 0.9735 | 0.9867 |
| 0.5626 | 1.4701 | 516 | 0.7642 | 0.7586 | 0.7642 | 0.8742 |
| 0.5626 | 1.4758 | 518 | 0.6987 | 0.7879 | 0.6987 | 0.8359 |
| 0.5626 | 1.4815 | 520 | 0.8177 | 0.7586 | 0.8177 | 0.9043 |
| 0.5626 | 1.4872 | 522 | 0.9955 | 0.7727 | 0.9955 | 0.9977 |
| 0.5626 | 1.4929 | 524 | 0.9236 | 0.7586 | 0.9236 | 0.9610 |
| 0.5626 | 1.4986 | 526 | 0.6656 | 0.7879 | 0.6656 | 0.8158 |
| 0.5626 | 1.5043 | 528 | 0.5810 | 0.7607 | 0.5810 | 0.7623 |
| 0.5626 | 1.5100 | 530 | 0.6687 | 0.7879 | 0.6687 | 0.8177 |
| 0.5626 | 1.5157 | 532 | 0.8806 | 0.7287 | 0.8806 | 0.9384 |
| 0.5626 | 1.5214 | 534 | 0.9118 | 0.7895 | 0.9118 | 0.9549 |
| 0.5626 | 1.5271 | 536 | 0.8337 | 0.7529 | 0.8337 | 0.9131 |
| 0.5626 | 1.5328 | 538 | 0.7280 | 0.7529 | 0.7280 | 0.8532 |
| 0.5626 | 1.5385 | 540 | 0.7642 | 0.7529 | 0.7642 | 0.8742 |
| 0.5626 | 1.5442 | 542 | 0.7784 | 0.7726 | 0.7784 | 0.8823 |
| 0.5626 | 1.5499 | 544 | 0.7606 | 0.7879 | 0.7606 | 0.8721 |
| 0.5626 | 1.5556 | 546 | 0.8121 | 0.7879 | 0.8121 | 0.9012 |
| 0.5626 | 1.5613 | 548 | 0.7398 | 0.7879 | 0.7398 | 0.8601 |
| 0.5626 | 1.5670 | 550 | 0.6446 | 0.8304 | 0.6446 | 0.8029 |
| 0.5626 | 1.5726 | 552 | 0.7470 | 0.8000 | 0.7470 | 0.8643 |
| 0.5626 | 1.5783 | 554 | 0.7563 | 0.8000 | 0.7563 | 0.8697 |
| 0.5626 | 1.5840 | 556 | 0.6568 | 0.8063 | 0.6568 | 0.8104 |
| 0.5626 | 1.5897 | 558 | 0.5687 | 0.7686 | 0.5687 | 0.7541 |
| 0.5626 | 1.5954 | 560 | 0.4314 | 0.7490 | 0.4314 | 0.6568 |
| 0.5626 | 1.6011 | 562 | 0.4085 | 0.7490 | 0.4085 | 0.6392 |
| 0.5626 | 1.6068 | 564 | 0.5319 | 0.7328 | 0.5319 | 0.7293 |
| 0.5626 | 1.6125 | 566 | 0.7836 | 0.8269 | 0.7836 | 0.8852 |
| 0.5626 | 1.6182 | 568 | 1.1909 | 0.7217 | 1.1909 | 1.0913 |
| 0.5626 | 1.6239 | 570 | 1.1923 | 0.6957 | 1.1923 | 1.0919 |
| 0.5626 | 1.6296 | 572 | 0.9109 | 0.7726 | 0.9109 | 0.9544 |
| 0.5626 | 1.6353 | 574 | 0.7391 | 0.7627 | 0.7391 | 0.8597 |
| 0.5626 | 1.6410 | 576 | 0.6809 | 0.8158 | 0.6809 | 0.8251 |
| 0.5626 | 1.6467 | 578 | 0.8361 | 0.8178 | 0.8361 | 0.9144 |
| 0.5626 | 1.6524 | 580 | 0.8983 | 0.8000 | 0.8983 | 0.9478 |
| 0.5626 | 1.6581 | 582 | 0.8197 | 0.8456 | 0.8197 | 0.9053 |
| 0.5626 | 1.6638 | 584 | 0.7037 | 0.8456 | 0.7037 | 0.8389 |
| 0.5626 | 1.6695 | 586 | 0.6090 | 0.7535 | 0.6090 | 0.7804 |
| 0.5626 | 1.6752 | 588 | 0.6575 | 0.8082 | 0.6575 | 0.8109 |
| 0.5626 | 1.6809 | 590 | 0.7129 | 0.8304 | 0.7129 | 0.8444 |
| 0.5626 | 1.6866 | 592 | 0.7191 | 0.8304 | 0.7191 | 0.8480 |
| 0.5626 | 1.6923 | 594 | 0.7408 | 0.8456 | 0.7408 | 0.8607 |
| 0.5626 | 1.6980 | 596 | 0.6411 | 0.8165 | 0.6411 | 0.8007 |
| 0.5626 | 1.7037 | 598 | 0.4731 | 0.7863 | 0.4731 | 0.6878 |
| 0.5626 | 1.7094 | 600 | 0.4652 | 0.7863 | 0.4652 | 0.6821 |
| 0.5626 | 1.7151 | 602 | 0.6075 | 0.8082 | 0.6075 | 0.7794 |
| 0.5626 | 1.7208 | 604 | 0.8281 | 0.8133 | 0.8281 | 0.9100 |
| 0.5626 | 1.7265 | 606 | 0.8710 | 0.8133 | 0.8710 | 0.9333 |
| 0.5626 | 1.7322 | 608 | 0.7634 | 0.8133 | 0.7634 | 0.8738 |
| 0.5626 | 1.7379 | 610 | 0.6523 | 0.8304 | 0.6523 | 0.8076 |
| 0.5626 | 1.7436 | 612 | 0.6971 | 0.8304 | 0.6971 | 0.8349 |
| 0.5626 | 1.7493 | 614 | 0.8225 | 0.8133 | 0.8225 | 0.9069 |
| 0.5626 | 1.7550 | 616 | 0.7742 | 0.8304 | 0.7742 | 0.8799 |
| 0.5626 | 1.7607 | 618 | 0.7345 | 0.8304 | 0.7345 | 0.8571 |
| 0.5626 | 1.7664 | 620 | 0.8618 | 0.7895 | 0.8618 | 0.9283 |
| 0.5626 | 1.7721 | 622 | 0.8216 | 0.7895 | 0.8216 | 0.9064 |
| 0.5626 | 1.7778 | 624 | 0.6360 | 0.7627 | 0.6360 | 0.7975 |
| 0.5626 | 1.7835 | 626 | 0.5384 | 0.7342 | 0.5384 | 0.7337 |
| 0.5626 | 1.7892 | 628 | 0.5957 | 0.7879 | 0.5957 | 0.7718 |
| 0.5626 | 1.7949 | 630 | 0.7591 | 0.7895 | 0.7591 | 0.8712 |
| 0.5626 | 1.8006 | 632 | 1.0377 | 0.6934 | 1.0377 | 1.0187 |
| 0.5626 | 1.8063 | 634 | 1.0607 | 0.6934 | 1.0607 | 1.0299 |
| 0.5626 | 1.8120 | 636 | 0.8769 | 0.7895 | 0.8769 | 0.9364 |
| 0.5626 | 1.8177 | 638 | 0.5937 | 0.8256 | 0.5937 | 0.7705 |
| 0.5626 | 1.8234 | 640 | 0.5290 | 0.7540 | 0.5290 | 0.7273 |
| 0.5626 | 1.8291 | 642 | 0.6420 | 0.7508 | 0.6420 | 0.8012 |
| 0.5626 | 1.8348 | 644 | 0.9649 | 0.7895 | 0.9649 | 0.9823 |
| 0.5626 | 1.8405 | 646 | 1.3921 | 0.6358 | 1.3921 | 1.1799 |
| 0.5626 | 1.8462 | 648 | 1.3989 | 0.6448 | 1.3989 | 1.1828 |
| 0.5626 | 1.8519 | 650 | 1.0803 | 0.7726 | 1.0803 | 1.0394 |
| 0.5626 | 1.8575 | 652 | 0.6604 | 0.8536 | 0.6604 | 0.8126 |
| 0.5626 | 1.8632 | 654 | 0.4181 | 0.7083 | 0.4181 | 0.6466 |
| 0.5626 | 1.8689 | 656 | 0.3927 | 0.7083 | 0.3927 | 0.6267 |
| 0.5626 | 1.8746 | 658 | 0.4682 | 0.7175 | 0.4682 | 0.6842 |
| 0.5626 | 1.8803 | 660 | 0.6398 | 0.8220 | 0.6398 | 0.7999 |
| 0.5626 | 1.8860 | 662 | 0.8788 | 0.7705 | 0.8788 | 0.9374 |
| 0.5626 | 1.8917 | 664 | 1.0056 | 0.7726 | 1.0056 | 1.0028 |
| 0.5626 | 1.8974 | 666 | 0.8532 | 0.7895 | 0.8532 | 0.9237 |
| 0.5626 | 1.9031 | 668 | 0.6819 | 0.7712 | 0.6819 | 0.8258 |
| 0.5626 | 1.9088 | 670 | 0.6615 | 0.7423 | 0.6615 | 0.8133 |
| 0.5626 | 1.9145 | 672 | 0.8641 | 0.7571 | 0.8641 | 0.9296 |
| 0.5626 | 1.9202 | 674 | 1.0085 | 0.7571 | 1.0085 | 1.0042 |
| 0.5626 | 1.9259 | 676 | 0.9675 | 0.7571 | 0.9675 | 0.9836 |
| 0.5626 | 1.9316 | 678 | 0.7058 | 0.7508 | 0.7058 | 0.8401 |
| 0.5626 | 1.9373 | 680 | 0.6479 | 0.7508 | 0.6479 | 0.8050 |
| 0.5626 | 1.9430 | 682 | 0.7321 | 0.8082 | 0.7321 | 0.8556 |
| 0.5626 | 1.9487 | 684 | 0.7981 | 0.8133 | 0.7981 | 0.8934 |
| 0.5626 | 1.9544 | 686 | 0.9577 | 0.7605 | 0.9577 | 0.9786 |
| 0.5626 | 1.9601 | 688 | 0.8184 | 0.8279 | 0.8184 | 0.9047 |
| 0.5626 | 1.9658 | 690 | 0.7432 | 0.8409 | 0.7432 | 0.8621 |
| 0.5626 | 1.9715 | 692 | 0.6251 | 0.8063 | 0.6251 | 0.7907 |
| 0.5626 | 1.9772 | 694 | 0.5691 | 0.8108 | 0.5691 | 0.7544 |
| 0.5626 | 1.9829 | 696 | 0.6242 | 0.7926 | 0.6242 | 0.7901 |
| 0.5626 | 1.9886 | 698 | 0.7271 | 0.8082 | 0.7271 | 0.8527 |
| 0.5626 | 1.9943 | 700 | 0.7500 | 0.7508 | 0.7500 | 0.8661 |
| 0.5626 | 2.0 | 702 | 1.0217 | 0.8062 | 1.0217 | 1.0108 |
| 0.5626 | 2.0057 | 704 | 1.0959 | 0.7727 | 1.0959 | 1.0469 |
| 0.5626 | 2.0114 | 706 | 0.8080 | 0.7712 | 0.8080 | 0.8989 |
| 0.5626 | 2.0171 | 708 | 0.6033 | 0.6776 | 0.6033 | 0.7767 |
| 0.5626 | 2.0228 | 710 | 0.6205 | 0.7111 | 0.6205 | 0.7877 |
| 0.5626 | 2.0285 | 712 | 0.7870 | 0.7459 | 0.7870 | 0.8871 |
| 0.5626 | 2.0342 | 714 | 1.1189 | 0.7645 | 1.1189 | 1.0578 |
| 0.5626 | 2.0399 | 716 | 1.0233 | 0.7879 | 1.0233 | 1.0116 |
| 0.5626 | 2.0456 | 718 | 0.6894 | 0.7423 | 0.6894 | 0.8303 |
| 0.5626 | 2.0513 | 720 | 0.5115 | 0.6807 | 0.5115 | 0.7152 |
| 0.5626 | 2.0570 | 722 | 0.4126 | 0.7219 | 0.4126 | 0.6424 |
| 0.5626 | 2.0627 | 724 | 0.4630 | 0.7219 | 0.4630 | 0.6804 |
| 0.5626 | 2.0684 | 726 | 0.7157 | 0.8456 | 0.7157 | 0.8460 |
| 0.5626 | 2.0741 | 728 | 0.8743 | 0.8178 | 0.8743 | 0.9350 |
| 0.5626 | 2.0798 | 730 | 0.8442 | 0.8178 | 0.8442 | 0.9188 |
| 0.5626 | 2.0855 | 732 | 0.6850 | 0.8456 | 0.6850 | 0.8277 |
| 0.5626 | 2.0912 | 734 | 0.6671 | 0.8456 | 0.6671 | 0.8168 |
| 0.5626 | 2.0969 | 736 | 0.6939 | 0.8304 | 0.6939 | 0.8330 |
| 0.5626 | 2.1026 | 738 | 0.6017 | 0.7986 | 0.6017 | 0.7757 |
| 0.5626 | 2.1083 | 740 | 0.5450 | 0.8244 | 0.5450 | 0.7383 |
| 0.5626 | 2.1140 | 742 | 0.6298 | 0.8627 | 0.6298 | 0.7936 |
| 0.5626 | 2.1197 | 744 | 0.6860 | 0.8383 | 0.6860 | 0.8282 |
| 0.5626 | 2.1254 | 746 | 0.7361 | 0.8042 | 0.7361 | 0.8580 |
| 0.5626 | 2.1311 | 748 | 0.7432 | 0.8042 | 0.7432 | 0.8621 |
| 0.5626 | 2.1368 | 750 | 0.7115 | 0.8042 | 0.7115 | 0.8435 |
| 0.5626 | 2.1425 | 752 | 0.6223 | 0.6918 | 0.6223 | 0.7889 |
| 0.5626 | 2.1481 | 754 | 0.5764 | 0.6839 | 0.5764 | 0.7592 |
| 0.5626 | 2.1538 | 756 | 0.6636 | 0.7234 | 0.6636 | 0.8146 |
| 0.5626 | 2.1595 | 758 | 0.7824 | 0.7616 | 0.7824 | 0.8845 |
| 0.5626 | 2.1652 | 760 | 0.7891 | 0.7820 | 0.7891 | 0.8883 |
| 0.5626 | 2.1709 | 762 | 0.7409 | 0.7820 | 0.7409 | 0.8607 |
| 0.5626 | 2.1766 | 764 | 0.6279 | 0.7266 | 0.6279 | 0.7924 |
| 0.5626 | 2.1823 | 766 | 0.6043 | 0.7266 | 0.6043 | 0.7774 |
| 0.5626 | 2.1880 | 768 | 0.6389 | 0.7266 | 0.6389 | 0.7993 |
| 0.5626 | 2.1937 | 770 | 0.6372 | 0.7266 | 0.6372 | 0.7982 |
| 0.5626 | 2.1994 | 772 | 0.6358 | 0.7266 | 0.6358 | 0.7974 |
| 0.5626 | 2.2051 | 774 | 0.5760 | 0.7266 | 0.5760 | 0.7589 |
| 0.5626 | 2.2108 | 776 | 0.6932 | 0.7709 | 0.6932 | 0.8326 |
| 0.5626 | 2.2165 | 778 | 1.0010 | 0.7879 | 1.0010 | 1.0005 |
| 0.5626 | 2.2222 | 780 | 1.1865 | 0.7267 | 1.1865 | 1.0893 |
| 0.5626 | 2.2279 | 782 | 1.0788 | 0.7826 | 1.0788 | 1.0387 |
| 0.5626 | 2.2336 | 784 | 0.8146 | 0.8042 | 0.8146 | 0.9025 |
| 0.5626 | 2.2393 | 786 | 0.5159 | 0.7742 | 0.5159 | 0.7182 |
| 0.5626 | 2.2450 | 788 | 0.3759 | 0.6866 | 0.3759 | 0.6131 |
| 0.5626 | 2.2507 | 790 | 0.3510 | 0.7219 | 0.3510 | 0.5925 |
| 0.5626 | 2.2564 | 792 | 0.4083 | 0.6866 | 0.4083 | 0.6389 |
| 0.5626 | 2.2621 | 794 | 0.6354 | 0.7640 | 0.6354 | 0.7971 |
| 0.5626 | 2.2678 | 796 | 0.8778 | 0.7879 | 0.8778 | 0.9369 |
| 0.5626 | 2.2735 | 798 | 0.9432 | 0.7619 | 0.9432 | 0.9712 |
| 0.5626 | 2.2792 | 800 | 0.8239 | 0.8383 | 0.8239 | 0.9077 |
| 0.5626 | 2.2849 | 802 | 0.6630 | 0.8082 | 0.6630 | 0.8142 |
| 0.5626 | 2.2906 | 804 | 0.6953 | 0.8082 | 0.6953 | 0.8338 |
| 0.5626 | 2.2963 | 806 | 0.7101 | 0.7709 | 0.7101 | 0.8427 |
| 0.5626 | 2.3020 | 808 | 0.7050 | 0.7529 | 0.7050 | 0.8396 |
| 0.5626 | 2.3077 | 810 | 0.8600 | 0.7895 | 0.8600 | 0.9273 |
| 0.5626 | 2.3134 | 812 | 0.9160 | 0.7895 | 0.9160 | 0.9571 |
| 0.5626 | 2.3191 | 814 | 0.8224 | 0.7529 | 0.8224 | 0.9069 |
| 0.5626 | 2.3248 | 816 | 0.6205 | 0.8 | 0.6205 | 0.7877 |
| 0.5626 | 2.3305 | 818 | 0.4392 | 0.7490 | 0.4392 | 0.6627 |
| 0.5626 | 2.3362 | 820 | 0.3767 | 0.6873 | 0.3767 | 0.6138 |
| 0.5626 | 2.3419 | 822 | 0.3928 | 0.7219 | 0.3928 | 0.6267 |
| 0.5626 | 2.3476 | 824 | 0.5107 | 0.7910 | 0.5107 | 0.7146 |
| 0.5626 | 2.3533 | 826 | 0.8293 | 0.8042 | 0.8293 | 0.9107 |
| 0.5626 | 2.3590 | 828 | 1.0593 | 0.7726 | 1.0593 | 1.0292 |
| 0.5626 | 2.3647 | 830 | 1.0489 | 0.7726 | 1.0489 | 1.0242 |
| 0.5626 | 2.3704 | 832 | 0.8851 | 0.7619 | 0.8851 | 0.9408 |
| 0.5626 | 2.3761 | 834 | 0.6212 | 0.7640 | 0.6212 | 0.7882 |
| 0.5626 | 2.3818 | 836 | 0.5224 | 0.7742 | 0.5224 | 0.7228 |
| 0.5626 | 2.3875 | 838 | 0.5347 | 0.7742 | 0.5347 | 0.7312 |
| 0.5626 | 2.3932 | 840 | 0.6243 | 0.7812 | 0.6243 | 0.7901 |
| 0.5626 | 2.3989 | 842 | 0.7184 | 0.8304 | 0.7184 | 0.8476 |
| 0.5626 | 2.4046 | 844 | 0.6593 | 0.7758 | 0.6593 | 0.8120 |
| 0.5626 | 2.4103 | 846 | 0.5381 | 0.7926 | 0.5381 | 0.7335 |
| 0.5626 | 2.4160 | 848 | 0.5643 | 0.7111 | 0.5643 | 0.7512 |
| 0.5626 | 2.4217 | 850 | 0.7362 | 0.7921 | 0.7362 | 0.8580 |
| 0.5626 | 2.4274 | 852 | 0.8550 | 0.7921 | 0.8550 | 0.9246 |
| 0.5626 | 2.4330 | 854 | 0.8328 | 0.7921 | 0.8328 | 0.9126 |
| 0.5626 | 2.4387 | 856 | 0.6567 | 0.7300 | 0.6567 | 0.8104 |
| 0.5626 | 2.4444 | 858 | 0.4698 | 0.7572 | 0.4698 | 0.6854 |
| 0.5626 | 2.4501 | 860 | 0.4361 | 0.7407 | 0.4361 | 0.6604 |
| 0.5626 | 2.4558 | 862 | 0.4802 | 0.7755 | 0.4802 | 0.6930 |
| 0.5626 | 2.4615 | 864 | 0.5723 | 0.7508 | 0.5723 | 0.7565 |
| 0.5626 | 2.4672 | 866 | 0.6725 | 0.7921 | 0.6725 | 0.8201 |
| 0.5626 | 2.4729 | 868 | 0.7175 | 0.8133 | 0.7175 | 0.8470 |
| 0.5626 | 2.4786 | 870 | 0.5987 | 0.8082 | 0.5987 | 0.7737 |
| 0.5626 | 2.4843 | 872 | 0.4283 | 0.7789 | 0.4283 | 0.6545 |
| 0.5626 | 2.4900 | 874 | 0.3791 | 0.7219 | 0.3791 | 0.6157 |
| 0.5626 | 2.4957 | 876 | 0.4006 | 0.7219 | 0.4006 | 0.6329 |
| 0.5626 | 2.5014 | 878 | 0.5132 | 0.7635 | 0.5132 | 0.7164 |
| 0.5626 | 2.5071 | 880 | 0.7605 | 0.7879 | 0.7605 | 0.8721 |
| 0.5626 | 2.5128 | 882 | 0.9197 | 0.7726 | 0.9197 | 0.9590 |
| 0.5626 | 2.5185 | 884 | 0.9939 | 0.7726 | 0.9939 | 0.9969 |
| 0.5626 | 2.5242 | 886 | 0.8796 | 0.7726 | 0.8796 | 0.9379 |
| 0.5626 | 2.5299 | 888 | 0.6753 | 0.7756 | 0.6753 | 0.8218 |
| 0.5626 | 2.5356 | 890 | 0.4848 | 0.7016 | 0.4848 | 0.6963 |
| 0.5626 | 2.5413 | 892 | 0.4488 | 0.7016 | 0.4488 | 0.6699 |
| 0.5626 | 2.5470 | 894 | 0.4730 | 0.7016 | 0.4730 | 0.6878 |
| 0.5626 | 2.5527 | 896 | 0.5670 | 0.6899 | 0.5670 | 0.7530 |
| 0.5626 | 2.5584 | 898 | 0.8092 | 0.7407 | 0.8092 | 0.8995 |
| 0.5626 | 2.5641 | 900 | 1.1040 | 0.7475 | 1.1040 | 1.0507 |
| 0.5626 | 2.5698 | 902 | 1.2463 | 0.6975 | 1.2463 | 1.1164 |
| 0.5626 | 2.5755 | 904 | 1.1573 | 0.7342 | 1.1573 | 1.0758 |
| 0.5626 | 2.5812 | 906 | 0.9113 | 0.8350 | 0.9113 | 0.9546 |
| 0.5626 | 2.5869 | 908 | 0.7225 | 0.8082 | 0.7225 | 0.8500 |
| 0.5626 | 2.5926 | 910 | 0.5528 | 0.7518 | 0.5528 | 0.7435 |
| 0.5626 | 2.5983 | 912 | 0.5352 | 0.7518 | 0.5352 | 0.7316 |
| 0.5626 | 2.6040 | 914 | 0.5907 | 0.7464 | 0.5907 | 0.7686 |
| 0.5626 | 2.6097 | 916 | 0.7409 | 0.7829 | 0.7409 | 0.8607 |
| 0.5626 | 2.6154 | 918 | 0.8321 | 0.7726 | 0.8321 | 0.9122 |
| 0.5626 | 2.6211 | 920 | 0.8428 | 0.8217 | 0.8428 | 0.9180 |
| 0.5626 | 2.6268 | 922 | 0.8265 | 0.8097 | 0.8265 | 0.9091 |
| 0.5626 | 2.6325 | 924 | 0.7261 | 0.7756 | 0.7261 | 0.8521 |
| 0.5626 | 2.6382 | 926 | 0.6501 | 0.7756 | 0.6501 | 0.8063 |
| 0.5626 | 2.6439 | 928 | 0.6923 | 0.7961 | 0.6923 | 0.8320 |
| 0.5626 | 2.6496 | 930 | 0.7847 | 0.8350 | 0.7847 | 0.8858 |
| 0.5626 | 2.6553 | 932 | 0.7377 | 0.7921 | 0.7377 | 0.8589 |
| 0.5626 | 2.6610 | 934 | 0.5988 | 0.8 | 0.5988 | 0.7738 |
| 0.5626 | 2.6667 | 936 | 0.4525 | 0.7093 | 0.4525 | 0.6727 |
| 0.5626 | 2.6724 | 938 | 0.4171 | 0.6899 | 0.4171 | 0.6458 |
| 0.5626 | 2.6781 | 940 | 0.4663 | 0.7093 | 0.4663 | 0.6828 |
| 0.5626 | 2.6838 | 942 | 0.6328 | 0.8 | 0.6328 | 0.7955 |
| 0.5626 | 2.6895 | 944 | 0.8795 | 0.8350 | 0.8795 | 0.9378 |
| 0.5626 | 2.6952 | 946 | 0.9159 | 0.8000 | 0.9159 | 0.9570 |
| 0.5626 | 2.7009 | 948 | 0.8307 | 0.8000 | 0.8307 | 0.9114 |
| 0.5626 | 2.7066 | 950 | 0.8029 | 0.8000 | 0.8029 | 0.8960 |
| 0.5626 | 2.7123 | 952 | 0.6425 | 0.8178 | 0.6425 | 0.8015 |
| 0.5626 | 2.7179 | 954 | 0.4931 | 0.7328 | 0.4931 | 0.7022 |
| 0.5626 | 2.7236 | 956 | 0.4708 | 0.7328 | 0.4708 | 0.6862 |
| 0.5626 | 2.7293 | 958 | 0.5522 | 0.7950 | 0.5522 | 0.7431 |
| 0.5626 | 2.7350 | 960 | 0.6393 | 0.8659 | 0.6393 | 0.7996 |
| 0.5626 | 2.7407 | 962 | 0.6982 | 0.8178 | 0.6982 | 0.8356 |
| 0.5626 | 2.7464 | 964 | 0.7750 | 0.8178 | 0.7750 | 0.8803 |
| 0.5626 | 2.7521 | 966 | 0.7277 | 0.8042 | 0.7277 | 0.8530 |
| 0.5626 | 2.7578 | 968 | 0.5716 | 0.8444 | 0.5716 | 0.7560 |
| 0.5626 | 2.7635 | 970 | 0.4467 | 0.7093 | 0.4467 | 0.6684 |
| 0.5626 | 2.7692 | 972 | 0.3999 | 0.7219 | 0.3999 | 0.6324 |
| 0.5626 | 2.7749 | 974 | 0.4525 | 0.7635 | 0.4525 | 0.6727 |
| 0.5626 | 2.7806 | 976 | 0.5929 | 0.8158 | 0.5929 | 0.7700 |
| 0.5626 | 2.7863 | 978 | 0.7621 | 0.8178 | 0.7621 | 0.8730 |
| 0.5626 | 2.7920 | 980 | 0.8207 | 0.7895 | 0.8207 | 0.9059 |
| 0.5626 | 2.7977 | 982 | 0.7511 | 0.7895 | 0.7511 | 0.8667 |
| 0.5626 | 2.8034 | 984 | 0.6027 | 0.7822 | 0.6027 | 0.7763 |
| 0.5626 | 2.8091 | 986 | 0.5619 | 0.7822 | 0.5619 | 0.7496 |
| 0.5626 | 2.8148 | 988 | 0.5958 | 0.7686 | 0.5958 | 0.7719 |
| 0.5626 | 2.8205 | 990 | 0.7191 | 0.8042 | 0.7191 | 0.8480 |
| 0.5626 | 2.8262 | 992 | 0.7939 | 0.8383 | 0.7939 | 0.8910 |
| 0.5626 | 2.8319 | 994 | 0.8282 | 0.8217 | 0.8282 | 0.9101 |
| 0.5626 | 2.8376 | 996 | 0.7593 | 0.8383 | 0.7593 | 0.8713 |
| 0.5626 | 2.8433 | 998 | 0.6803 | 0.7812 | 0.6803 | 0.8248 |
| 0.1793 | 2.8490 | 1000 | 0.5920 | 0.8050 | 0.5920 | 0.7694 |
| 0.1793 | 2.8547 | 1002 | 0.4933 | 0.7372 | 0.4933 | 0.7024 |
| 0.1793 | 2.8604 | 1004 | 0.4472 | 0.7518 | 0.4472 | 0.6687 |
| 0.1793 | 2.8661 | 1006 | 0.5081 | 0.8151 | 0.5081 | 0.7128 |
| 0.1793 | 2.8718 | 1008 | 0.6156 | 0.8304 | 0.6156 | 0.7846 |
| 0.1793 | 2.8775 | 1010 | 0.7464 | 0.7774 | 0.7464 | 0.8639 |
| 0.1793 | 2.8832 | 1012 | 0.7260 | 0.7774 | 0.7260 | 0.8521 |
| 0.1793 | 2.8889 | 1014 | 0.6702 | 0.7774 | 0.6702 | 0.8187 |
| 0.1793 | 2.8946 | 1016 | 0.5402 | 0.8521 | 0.5402 | 0.7350 |
| 0.1793 | 2.9003 | 1018 | 0.4784 | 0.8151 | 0.4784 | 0.6917 |
| 0.1793 | 2.9060 | 1020 | 0.4896 | 0.8151 | 0.4896 | 0.6997 |
| 0.1793 | 2.9117 | 1022 | 0.5572 | 0.8521 | 0.5572 | 0.7465 |
| 0.1793 | 2.9174 | 1024 | 0.7038 | 0.7774 | 0.7038 | 0.8390 |
| 0.1793 | 2.9231 | 1026 | 0.8911 | 0.7143 | 0.8911 | 0.9440 |
| 0.1793 | 2.9288 | 1028 | 0.9649 | 0.6957 | 0.9649 | 0.9823 |
| 0.1793 | 2.9345 | 1030 | 0.8485 | 0.7774 | 0.8485 | 0.9212 |
| 0.1793 | 2.9402 | 1032 | 0.6482 | 0.8414 | 0.6482 | 0.8051 |
| 0.1793 | 2.9459 | 1034 | 0.4537 | 0.7219 | 0.4537 | 0.6736 |
| 0.1793 | 2.9516 | 1036 | 0.3920 | 0.7391 | 0.3920 | 0.6261 |
| 0.1793 | 2.9573 | 1038 | 0.4153 | 0.7391 | 0.4153 | 0.6445 |
| 0.1793 | 2.9630 | 1040 | 0.5306 | 0.7284 | 0.5306 | 0.7284 |
| 0.1793 | 2.9687 | 1042 | 0.7785 | 0.8133 | 0.7785 | 0.8823 |
| 0.1793 | 2.9744 | 1044 | 1.0306 | 0.7083 | 1.0306 | 1.0152 |
| 0.1793 | 2.9801 | 1046 | 1.1623 | 0.6957 | 1.1623 | 1.0781 |
| 0.1793 | 2.9858 | 1048 | 1.1018 | 0.7083 | 1.1018 | 1.0497 |
| 0.1793 | 2.9915 | 1050 | 0.9059 | 0.7895 | 0.9059 | 0.9518 |
| 0.1793 | 2.9972 | 1052 | 0.7210 | 0.7529 | 0.7210 | 0.8491 |
| 0.1793 | 3.0028 | 1054 | 0.5898 | 0.7014 | 0.5898 | 0.7680 |
| 0.1793 | 3.0085 | 1056 | 0.5227 | 0.7175 | 0.5227 | 0.7230 |
| 0.1793 | 3.0142 | 1058 | 0.4838 | 0.7667 | 0.4838 | 0.6956 |
| 0.1793 | 3.0199 | 1060 | 0.5166 | 0.8048 | 0.5166 | 0.7188 |
| 0.1793 | 3.0256 | 1062 | 0.6460 | 0.8028 | 0.6460 | 0.8037 |
| 0.1793 | 3.0313 | 1064 | 0.8813 | 0.7895 | 0.8813 | 0.9388 |
| 0.1793 | 3.0370 | 1066 | 0.9824 | 0.7083 | 0.9824 | 0.9912 |
| 0.1793 | 3.0427 | 1068 | 0.9073 | 0.7220 | 0.9073 | 0.9525 |
| 0.1793 | 3.0484 | 1070 | 0.7268 | 0.8178 | 0.7268 | 0.8525 |
| 0.1793 | 3.0541 | 1072 | 0.5002 | 0.7175 | 0.5002 | 0.7073 |
| 0.1793 | 3.0598 | 1074 | 0.4262 | 0.7667 | 0.4262 | 0.6528 |
| 0.1793 | 3.0655 | 1076 | 0.4563 | 0.7667 | 0.4563 | 0.6755 |
| 0.1793 | 3.0712 | 1078 | 0.5546 | 0.7879 | 0.5546 | 0.7447 |
| 0.1793 | 3.0769 | 1080 | 0.6295 | 0.8456 | 0.6295 | 0.7934 |
| 0.1793 | 3.0826 | 1082 | 0.6833 | 0.8456 | 0.6833 | 0.8266 |
| 0.1793 | 3.0883 | 1084 | 0.5928 | 0.8304 | 0.5928 | 0.7700 |
| 0.1793 | 3.0940 | 1086 | 0.5791 | 0.8304 | 0.5791 | 0.7610 |
| 0.1793 | 3.0997 | 1088 | 0.6597 | 0.8304 | 0.6597 | 0.8122 |
| 0.1793 | 3.1054 | 1090 | 0.8037 | 0.8456 | 0.8037 | 0.8965 |
| 0.1793 | 3.1111 | 1092 | 0.8206 | 0.8456 | 0.8206 | 0.9058 |
| 0.1793 | 3.1168 | 1094 | 0.7288 | 0.8456 | 0.7288 | 0.8537 |
| 0.1793 | 3.1225 | 1096 | 0.6532 | 0.8456 | 0.6532 | 0.8082 |
| 0.1793 | 3.1282 | 1098 | 0.6810 | 0.8456 | 0.6810 | 0.8252 |
| 0.1793 | 3.1339 | 1100 | 0.6908 | 0.8659 | 0.6908 | 0.8311 |
| 0.1793 | 3.1396 | 1102 | 0.8102 | 0.7895 | 0.8102 | 0.9001 |
| 0.1793 | 3.1453 | 1104 | 0.8628 | 0.7895 | 0.8628 | 0.9289 |
| 0.1793 | 3.1510 | 1106 | 0.8295 | 0.7895 | 0.8295 | 0.9108 |
| 0.1793 | 3.1567 | 1108 | 0.6852 | 0.8456 | 0.6852 | 0.8278 |
| 0.1793 | 3.1624 | 1110 | 0.5072 | 0.7607 | 0.5072 | 0.7122 |
| 0.1793 | 3.1681 | 1112 | 0.4378 | 0.7490 | 0.4378 | 0.6616 |
| 0.1793 | 3.1738 | 1114 | 0.4730 | 0.7490 | 0.4730 | 0.6877 |
| 0.1793 | 3.1795 | 1116 | 0.6275 | 0.7986 | 0.6275 | 0.7921 |
| 0.1793 | 3.1852 | 1118 | 0.7321 | 0.8178 | 0.7321 | 0.8556 |
| 0.1793 | 3.1909 | 1120 | 0.6960 | 0.8304 | 0.6960 | 0.8342 |
| 0.1793 | 3.1966 | 1122 | 0.6013 | 0.8304 | 0.6013 | 0.7755 |
| 0.1793 | 3.2023 | 1124 | 0.5316 | 0.7742 | 0.5316 | 0.7291 |
| 0.1793 | 3.2080 | 1126 | 0.5408 | 0.7568 | 0.5408 | 0.7354 |
| 0.1793 | 3.2137 | 1128 | 0.6798 | 0.8304 | 0.6798 | 0.8245 |
| 0.1793 | 3.2194 | 1130 | 0.7439 | 0.8042 | 0.7439 | 0.8625 |
| 0.1793 | 3.2251 | 1132 | 0.6718 | 0.8304 | 0.6718 | 0.8196 |
| 0.1793 | 3.2308 | 1134 | 0.6613 | 0.8042 | 0.6613 | 0.8132 |
| 0.1793 | 3.2365 | 1136 | 0.6558 | 0.8042 | 0.6558 | 0.8098 |
| 0.1793 | 3.2422 | 1138 | 0.6501 | 0.8042 | 0.6501 | 0.8063 |
| 0.1793 | 3.2479 | 1140 | 0.6507 | 0.8042 | 0.6507 | 0.8066 |
| 0.1793 | 3.2536 | 1142 | 0.5708 | 0.7758 | 0.5708 | 0.7555 |
| 0.1793 | 3.2593 | 1144 | 0.4999 | 0.7789 | 0.4999 | 0.7071 |
| 0.1793 | 3.2650 | 1146 | 0.4775 | 0.7789 | 0.4775 | 0.6910 |
| 0.1793 | 3.2707 | 1148 | 0.5087 | 0.7789 | 0.5087 | 0.7133 |
| 0.1793 | 3.2764 | 1150 | 0.5862 | 0.7961 | 0.5862 | 0.7656 |
| 0.1793 | 3.2821 | 1152 | 0.7124 | 0.8383 | 0.7124 | 0.8441 |
| 0.1793 | 3.2877 | 1154 | 0.7964 | 0.8133 | 0.7964 | 0.8924 |
| 0.1793 | 3.2934 | 1156 | 0.8767 | 0.7619 | 0.8767 | 0.9363 |
| 0.1793 | 3.2991 | 1158 | 0.8109 | 0.8133 | 0.8109 | 0.9005 |
| 0.1793 | 3.3048 | 1160 | 0.6483 | 0.8133 | 0.6483 | 0.8052 |
| 0.1793 | 3.3105 | 1162 | 0.5216 | 0.7219 | 0.5216 | 0.7222 |
| 0.1793 | 3.3162 | 1164 | 0.5004 | 0.7219 | 0.5004 | 0.7074 |
| 0.1793 | 3.3219 | 1166 | 0.5497 | 0.7162 | 0.5497 | 0.7414 |
| 0.1793 | 3.3276 | 1168 | 0.5952 | 0.7482 | 0.5952 | 0.7715 |
| 0.1793 | 3.3333 | 1170 | 0.6270 | 0.8133 | 0.6270 | 0.7919 |
| 0.1793 | 3.3390 | 1172 | 0.7012 | 0.7774 | 0.7012 | 0.8374 |
| 0.1793 | 3.3447 | 1174 | 0.7422 | 0.7774 | 0.7422 | 0.8615 |
| 0.1793 | 3.3504 | 1176 | 0.7014 | 0.7774 | 0.7014 | 0.8375 |
| 0.1793 | 3.3561 | 1178 | 0.6074 | 0.7640 | 0.6074 | 0.7793 |
| 0.1793 | 3.3618 | 1180 | 0.4894 | 0.7789 | 0.4894 | 0.6996 |
| 0.1793 | 3.3675 | 1182 | 0.4134 | 0.6873 | 0.4134 | 0.6430 |
| 0.1793 | 3.3732 | 1184 | 0.4156 | 0.6873 | 0.4156 | 0.6447 |
| 0.1793 | 3.3789 | 1186 | 0.4870 | 0.7789 | 0.4870 | 0.6978 |
| 0.1793 | 3.3846 | 1188 | 0.6283 | 0.7879 | 0.6283 | 0.7926 |
| 0.1793 | 3.3903 | 1190 | 0.8506 | 0.7774 | 0.8506 | 0.9223 |
| 0.1793 | 3.3960 | 1192 | 0.9733 | 0.7726 | 0.9733 | 0.9866 |
| 0.1793 | 3.4017 | 1194 | 0.9599 | 0.7726 | 0.9599 | 0.9797 |
| 0.1793 | 3.4074 | 1196 | 0.8351 | 0.7426 | 0.8351 | 0.9138 |
| 0.1793 | 3.4131 | 1198 | 0.6923 | 0.72 | 0.6923 | 0.8321 |
| 0.1793 | 3.4188 | 1200 | 0.5921 | 0.7364 | 0.5921 | 0.7695 |
| 0.1793 | 3.4245 | 1202 | 0.5064 | 0.7273 | 0.5064 | 0.7116 |
| 0.1793 | 3.4302 | 1204 | 0.4908 | 0.7273 | 0.4908 | 0.7006 |
| 0.1793 | 3.4359 | 1206 | 0.5256 | 0.7273 | 0.5256 | 0.7250 |
| 0.1793 | 3.4416 | 1208 | 0.6377 | 0.72 | 0.6377 | 0.7986 |
| 0.1793 | 3.4473 | 1210 | 0.6781 | 0.7426 | 0.6781 | 0.8235 |
| 0.1793 | 3.4530 | 1212 | 0.6353 | 0.7709 | 0.6353 | 0.7971 |
| 0.1793 | 3.4587 | 1214 | 0.5898 | 0.7651 | 0.5898 | 0.7680 |
| 0.1793 | 3.4644 | 1216 | 0.6123 | 0.7651 | 0.6123 | 0.7825 |
| 0.1793 | 3.4701 | 1218 | 0.6896 | 0.7651 | 0.6896 | 0.8304 |
| 0.1793 | 3.4758 | 1220 | 0.7340 | 0.7812 | 0.7340 | 0.8567 |
| 0.1793 | 3.4815 | 1222 | 0.7507 | 0.7812 | 0.7507 | 0.8664 |
| 0.1793 | 3.4872 | 1224 | 0.6923 | 0.8282 | 0.6923 | 0.8320 |
| 0.1793 | 3.4929 | 1226 | 0.6148 | 0.8 | 0.6148 | 0.7841 |
| 0.1793 | 3.4986 | 1228 | 0.6135 | 0.8 | 0.6135 | 0.7832 |
| 0.1793 | 3.5043 | 1230 | 0.6373 | 0.7651 | 0.6373 | 0.7983 |
| 0.1793 | 3.5100 | 1232 | 0.6832 | 0.7709 | 0.6832 | 0.8266 |
| 0.1793 | 3.5157 | 1234 | 0.7389 | 0.7426 | 0.7389 | 0.8596 |
| 0.1793 | 3.5214 | 1236 | 0.7623 | 0.7895 | 0.7623 | 0.8731 |
| 0.1793 | 3.5271 | 1238 | 0.8197 | 0.7726 | 0.8197 | 0.9054 |
| 0.1793 | 3.5328 | 1240 | 0.7863 | 0.7726 | 0.7863 | 0.8867 |
| 0.1793 | 3.5385 | 1242 | 0.6749 | 0.8042 | 0.6749 | 0.8215 |
| 0.1793 | 3.5442 | 1244 | 0.6405 | 0.8082 | 0.6405 | 0.8003 |
| 0.1793 | 3.5499 | 1246 | 0.7138 | 0.8082 | 0.7138 | 0.8449 |
| 0.1793 | 3.5556 | 1248 | 0.8351 | 0.7619 | 0.8351 | 0.9138 |
| 0.1793 | 3.5613 | 1250 | 0.8582 | 0.7879 | 0.8582 | 0.9264 |
| 0.1793 | 3.5670 | 1252 | 0.8566 | 0.7619 | 0.8566 | 0.9255 |
| 0.1793 | 3.5726 | 1254 | 0.7468 | 0.8042 | 0.7468 | 0.8642 |
| 0.1793 | 3.5783 | 1256 | 0.6162 | 0.8069 | 0.6162 | 0.7850 |
| 0.1793 | 3.5840 | 1258 | 0.5769 | 0.7635 | 0.5769 | 0.7595 |
| 0.1793 | 3.5897 | 1260 | 0.6206 | 0.8256 | 0.6206 | 0.7878 |
| 0.1793 | 3.5954 | 1262 | 0.7610 | 0.7619 | 0.7610 | 0.8723 |
| 0.1793 | 3.6011 | 1264 | 0.9405 | 0.7016 | 0.9405 | 0.9698 |
| 0.1793 | 3.6068 | 1266 | 1.1202 | 0.6957 | 1.1202 | 1.0584 |
| 0.1793 | 3.6125 | 1268 | 1.1071 | 0.6957 | 1.1071 | 1.0522 |
| 0.1793 | 3.6182 | 1270 | 0.9472 | 0.7016 | 0.9472 | 0.9732 |
| 0.1793 | 3.6239 | 1272 | 0.7238 | 0.7774 | 0.7238 | 0.8507 |
| 0.1793 | 3.6296 | 1274 | 0.5597 | 0.7586 | 0.5597 | 0.7482 |
| 0.1793 | 3.6353 | 1276 | 0.5188 | 0.7863 | 0.5188 | 0.7203 |
| 0.1793 | 3.6410 | 1278 | 0.5495 | 0.7429 | 0.5495 | 0.7413 |
| 0.1793 | 3.6467 | 1280 | 0.6222 | 0.72 | 0.6222 | 0.7888 |
| 0.1793 | 3.6524 | 1282 | 0.6865 | 0.72 | 0.6865 | 0.8285 |
| 0.1793 | 3.6581 | 1284 | 0.6755 | 0.72 | 0.6755 | 0.8219 |
| 0.1793 | 3.6638 | 1286 | 0.6584 | 0.7510 | 0.6584 | 0.8114 |
| 0.1793 | 3.6695 | 1288 | 0.6883 | 0.7879 | 0.6883 | 0.8296 |
| 0.1793 | 3.6752 | 1290 | 0.7856 | 0.8042 | 0.7856 | 0.8864 |
| 0.1793 | 3.6809 | 1292 | 0.8182 | 0.8042 | 0.8182 | 0.9045 |
| 0.1793 | 3.6866 | 1294 | 0.7741 | 0.8042 | 0.7741 | 0.8799 |
| 0.1793 | 3.6923 | 1296 | 0.7360 | 0.7879 | 0.7360 | 0.8579 |
| 0.1793 | 3.6980 | 1298 | 0.7137 | 0.7879 | 0.7137 | 0.8448 |
| 0.1793 | 3.7037 | 1300 | 0.6787 | 0.7510 | 0.6787 | 0.8238 |
| 0.1793 | 3.7094 | 1302 | 0.7473 | 0.7879 | 0.7473 | 0.8644 |
| 0.1793 | 3.7151 | 1304 | 0.8039 | 0.8042 | 0.8039 | 0.8966 |
| 0.1793 | 3.7208 | 1306 | 0.8255 | 0.8042 | 0.8255 | 0.9086 |
| 0.1793 | 3.7265 | 1308 | 0.7628 | 0.8042 | 0.7628 | 0.8734 |
| 0.1793 | 3.7322 | 1310 | 0.6981 | 0.8304 | 0.6981 | 0.8355 |
| 0.1793 | 3.7379 | 1312 | 0.6405 | 0.8082 | 0.6405 | 0.8003 |
| 0.1793 | 3.7436 | 1314 | 0.6563 | 0.8304 | 0.6563 | 0.8101 |
| 0.1793 | 3.7493 | 1316 | 0.6434 | 0.8304 | 0.6434 | 0.8021 |
| 0.1793 | 3.7550 | 1318 | 0.6741 | 0.8304 | 0.6741 | 0.8210 |
| 0.1793 | 3.7607 | 1320 | 0.6260 | 0.7812 | 0.6260 | 0.7912 |
| 0.1793 | 3.7664 | 1322 | 0.5836 | 0.8 | 0.5836 | 0.7639 |
| 0.1793 | 3.7721 | 1324 | 0.5924 | 0.7568 | 0.5924 | 0.7697 |
| 0.1793 | 3.7778 | 1326 | 0.6025 | 0.7758 | 0.6025 | 0.7762 |
| 0.1793 | 3.7835 | 1328 | 0.6267 | 0.8082 | 0.6267 | 0.7917 |
| 0.1793 | 3.7892 | 1330 | 0.6403 | 0.8304 | 0.6403 | 0.8002 |
| 0.1793 | 3.7949 | 1332 | 0.6937 | 0.8304 | 0.6937 | 0.8329 |
| 0.1793 | 3.8006 | 1334 | 0.6108 | 0.8165 | 0.6108 | 0.7815 |
| 0.1793 | 3.8063 | 1336 | 0.5345 | 0.7742 | 0.5345 | 0.7311 |
| 0.1793 | 3.8120 | 1338 | 0.5312 | 0.7568 | 0.5312 | 0.7289 |
| 0.1793 | 3.8177 | 1340 | 0.5176 | 0.7568 | 0.5176 | 0.7194 |
| 0.1793 | 3.8234 | 1342 | 0.5392 | 0.7812 | 0.5392 | 0.7343 |
| 0.1793 | 3.8291 | 1344 | 0.6049 | 0.7510 | 0.6049 | 0.7778 |
| 0.1793 | 3.8348 | 1346 | 0.6576 | 0.7879 | 0.6576 | 0.8109 |
| 0.1793 | 3.8405 | 1348 | 0.7505 | 0.8218 | 0.7505 | 0.8663 |
| 0.1793 | 3.8462 | 1350 | 0.8500 | 0.8042 | 0.8500 | 0.9220 |
| 0.1793 | 3.8519 | 1352 | 0.8290 | 0.8042 | 0.8290 | 0.9105 |
| 0.1793 | 3.8575 | 1354 | 0.7065 | 0.8218 | 0.7065 | 0.8405 |
| 0.1793 | 3.8632 | 1356 | 0.6315 | 0.7879 | 0.6315 | 0.7946 |
| 0.1793 | 3.8689 | 1358 | 0.5838 | 0.7812 | 0.5838 | 0.7640 |
| 0.1793 | 3.8746 | 1360 | 0.5511 | 0.8 | 0.5511 | 0.7424 |
| 0.1793 | 3.8803 | 1362 | 0.5399 | 0.7742 | 0.5399 | 0.7348 |
| 0.1793 | 3.8860 | 1364 | 0.5593 | 0.7568 | 0.5593 | 0.7478 |
| 0.1793 | 3.8917 | 1366 | 0.5943 | 0.7812 | 0.5943 | 0.7709 |
| 0.1793 | 3.8974 | 1368 | 0.5543 | 0.7742 | 0.5543 | 0.7445 |
| 0.1793 | 3.9031 | 1370 | 0.4942 | 0.7742 | 0.4942 | 0.7030 |
| 0.1793 | 3.9088 | 1372 | 0.4912 | 0.7742 | 0.4912 | 0.7008 |
| 0.1793 | 3.9145 | 1374 | 0.5195 | 0.7407 | 0.5195 | 0.7208 |
| 0.1793 | 3.9202 | 1376 | 0.5067 | 0.7568 | 0.5067 | 0.7118 |
| 0.1793 | 3.9259 | 1378 | 0.5227 | 0.7568 | 0.5227 | 0.7230 |
| 0.1793 | 3.9316 | 1380 | 0.6226 | 0.7758 | 0.6226 | 0.7891 |
| 0.1793 | 3.9373 | 1382 | 0.6845 | 0.8304 | 0.6845 | 0.8273 |
| 0.1793 | 3.9430 | 1384 | 0.6556 | 0.8304 | 0.6556 | 0.8097 |
| 0.1793 | 3.9487 | 1386 | 0.5990 | 0.7986 | 0.5990 | 0.7739 |
| 0.1793 | 3.9544 | 1388 | 0.5071 | 0.7568 | 0.5071 | 0.7121 |
| 0.1793 | 3.9601 | 1390 | 0.4253 | 0.8014 | 0.4253 | 0.6522 |
| 0.1793 | 3.9658 | 1392 | 0.4355 | 0.8014 | 0.4355 | 0.6600 |
| 0.1793 | 3.9715 | 1394 | 0.5207 | 0.7407 | 0.5207 | 0.7216 |
| 0.1793 | 3.9772 | 1396 | 0.5534 | 0.7986 | 0.5534 | 0.7439 |
| 0.1793 | 3.9829 | 1398 | 0.5646 | 0.7986 | 0.5646 | 0.7514 |
| 0.1793 | 3.9886 | 1400 | 0.5115 | 0.7742 | 0.5115 | 0.7152 |
| 0.1793 | 3.9943 | 1402 | 0.5223 | 0.8 | 0.5223 | 0.7227 |
| 0.1793 | 4.0 | 1404 | 0.5377 | 0.8 | 0.5377 | 0.7333 |
| 0.1793 | 4.0057 | 1406 | 0.5763 | 0.8165 | 0.5763 | 0.7591 |
| 0.1793 | 4.0114 | 1408 | 0.6100 | 0.7986 | 0.6100 | 0.7810 |
| 0.1793 | 4.0171 | 1410 | 0.5771 | 0.7986 | 0.5771 | 0.7597 |
| 0.1793 | 4.0228 | 1412 | 0.5929 | 0.8339 | 0.5929 | 0.7700 |
| 0.1793 | 4.0285 | 1414 | 0.6211 | 0.8627 | 0.6211 | 0.7881 |
| 0.1793 | 4.0342 | 1416 | 0.5873 | 0.8339 | 0.5873 | 0.7664 |
| 0.1793 | 4.0399 | 1418 | 0.6169 | 0.8627 | 0.6169 | 0.7854 |
| 0.1793 | 4.0456 | 1420 | 0.7217 | 0.8383 | 0.7217 | 0.8495 |
| 0.1793 | 4.0513 | 1422 | 0.8473 | 0.7774 | 0.8473 | 0.9205 |
| 0.1793 | 4.0570 | 1424 | 0.8045 | 0.7774 | 0.8045 | 0.8969 |
| 0.1793 | 4.0627 | 1426 | 0.6630 | 0.7879 | 0.6630 | 0.8143 |
| 0.1793 | 4.0684 | 1428 | 0.5876 | 0.8165 | 0.5876 | 0.7666 |
| 0.1793 | 4.0741 | 1430 | 0.5617 | 0.7812 | 0.5617 | 0.7495 |
| 0.1793 | 4.0798 | 1432 | 0.5057 | 0.8 | 0.5057 | 0.7111 |
| 0.1793 | 4.0855 | 1434 | 0.4792 | 0.7742 | 0.4792 | 0.6922 |
| 0.1793 | 4.0912 | 1436 | 0.4478 | 0.7742 | 0.4478 | 0.6692 |
| 0.1793 | 4.0969 | 1438 | 0.4418 | 0.7742 | 0.4418 | 0.6647 |
| 0.1793 | 4.1026 | 1440 | 0.4772 | 0.7971 | 0.4772 | 0.6908 |
| 0.1793 | 4.1083 | 1442 | 0.6005 | 0.8082 | 0.6005 | 0.7749 |
| 0.1793 | 4.1140 | 1444 | 0.7852 | 0.8133 | 0.7852 | 0.8861 |
| 0.1793 | 4.1197 | 1446 | 0.7928 | 0.8133 | 0.7928 | 0.8904 |
| 0.1793 | 4.1254 | 1448 | 0.6575 | 0.8304 | 0.6575 | 0.8109 |
| 0.1793 | 4.1311 | 1450 | 0.5169 | 0.7805 | 0.5169 | 0.7190 |
| 0.1793 | 4.1368 | 1452 | 0.4272 | 0.8151 | 0.4272 | 0.6536 |
| 0.1793 | 4.1425 | 1454 | 0.4165 | 0.8151 | 0.4165 | 0.6454 |
| 0.1793 | 4.1481 | 1456 | 0.4765 | 0.8151 | 0.4765 | 0.6903 |
| 0.1793 | 4.1538 | 1458 | 0.6359 | 0.7709 | 0.6359 | 0.7974 |
| 0.1793 | 4.1595 | 1460 | 0.7608 | 0.7895 | 0.7608 | 0.8723 |
| 0.1793 | 4.1652 | 1462 | 0.7586 | 0.8178 | 0.7586 | 0.8710 |
| 0.1793 | 4.1709 | 1464 | 0.6542 | 0.8178 | 0.6542 | 0.8088 |
| 0.1793 | 4.1766 | 1466 | 0.5603 | 0.7950 | 0.5603 | 0.7485 |
| 0.1793 | 4.1823 | 1468 | 0.5280 | 0.8306 | 0.5280 | 0.7267 |
| 0.1793 | 4.1880 | 1470 | 0.5847 | 0.8414 | 0.5847 | 0.7646 |
| 0.1793 | 4.1937 | 1472 | 0.6277 | 0.8414 | 0.6277 | 0.7923 |
| 0.1793 | 4.1994 | 1474 | 0.6474 | 0.8414 | 0.6474 | 0.8046 |
| 0.1793 | 4.2051 | 1476 | 0.6508 | 0.8414 | 0.6508 | 0.8067 |
| 0.1793 | 4.2108 | 1478 | 0.6316 | 0.8082 | 0.6316 | 0.7948 |
| 0.1793 | 4.2165 | 1480 | 0.5911 | 0.8218 | 0.5911 | 0.7688 |
| 0.1793 | 4.2222 | 1482 | 0.5056 | 0.7742 | 0.5056 | 0.7111 |
| 0.1793 | 4.2279 | 1484 | 0.4696 | 0.7742 | 0.4696 | 0.6853 |
| 0.1793 | 4.2336 | 1486 | 0.5054 | 0.7926 | 0.5054 | 0.7109 |
| 0.1793 | 4.2393 | 1488 | 0.4905 | 0.7926 | 0.4905 | 0.7003 |
| 0.1793 | 4.2450 | 1490 | 0.5047 | 0.8256 | 0.5047 | 0.7104 |
| 0.1793 | 4.2507 | 1492 | 0.5348 | 0.8256 | 0.5348 | 0.7313 |
| 0.1793 | 4.2564 | 1494 | 0.5911 | 0.8256 | 0.5911 | 0.7688 |
| 0.1793 | 4.2621 | 1496 | 0.5896 | 0.8256 | 0.5896 | 0.7678 |
| 0.1793 | 4.2678 | 1498 | 0.6016 | 0.8256 | 0.6016 | 0.7756 |
| 0.1173 | 4.2735 | 1500 | 0.5323 | 0.7926 | 0.5323 | 0.7296 |
| 0.1173 | 4.2792 | 1502 | 0.4833 | 0.7742 | 0.4833 | 0.6952 |
| 0.1173 | 4.2849 | 1504 | 0.5204 | 0.7926 | 0.5204 | 0.7214 |
| 0.1173 | 4.2906 | 1506 | 0.6377 | 0.8489 | 0.6377 | 0.7986 |
| 0.1173 | 4.2963 | 1508 | 0.6994 | 0.8304 | 0.6994 | 0.8363 |
| 0.1173 | 4.3020 | 1510 | 0.7490 | 0.8042 | 0.7490 | 0.8655 |
| 0.1173 | 4.3077 | 1512 | 0.6791 | 0.8304 | 0.6791 | 0.8241 |
| 0.1173 | 4.3134 | 1514 | 0.5419 | 0.7926 | 0.5419 | 0.7361 |
| 0.1173 | 4.3191 | 1516 | 0.4434 | 0.8151 | 0.4434 | 0.6659 |
| 0.1173 | 4.3248 | 1518 | 0.4182 | 0.8151 | 0.4182 | 0.6467 |
| 0.1173 | 4.3305 | 1520 | 0.4200 | 0.8151 | 0.4200 | 0.6481 |
| 0.1173 | 4.3362 | 1522 | 0.4506 | 0.7742 | 0.4506 | 0.6713 |
| 0.1173 | 4.3419 | 1524 | 0.5261 | 0.8165 | 0.5261 | 0.7253 |
| 0.1173 | 4.3476 | 1526 | 0.6364 | 0.7879 | 0.6364 | 0.7977 |
| 0.1173 | 4.3533 | 1528 | 0.7361 | 0.7774 | 0.7361 | 0.8580 |
| 0.1173 | 4.3590 | 1530 | 0.7550 | 0.7619 | 0.7550 | 0.8689 |
| 0.1173 | 4.3647 | 1532 | 0.7028 | 0.7619 | 0.7028 | 0.8383 |
| 0.1173 | 4.3704 | 1534 | 0.5906 | 0.8165 | 0.5906 | 0.7685 |
| 0.1173 | 4.3761 | 1536 | 0.4803 | 0.8521 | 0.4803 | 0.6930 |
| 0.1173 | 4.3818 | 1538 | 0.4104 | 0.7910 | 0.4104 | 0.6406 |
| 0.1173 | 4.3875 | 1540 | 0.3986 | 0.7219 | 0.3986 | 0.6314 |
| 0.1173 | 4.3932 | 1542 | 0.4599 | 0.8293 | 0.4599 | 0.6782 |
| 0.1173 | 4.3989 | 1544 | 0.5891 | 0.8627 | 0.5891 | 0.7675 |
| 0.1173 | 4.4046 | 1546 | 0.7371 | 0.7879 | 0.7371 | 0.8586 |
| 0.1173 | 4.4103 | 1548 | 0.7940 | 0.7879 | 0.7940 | 0.8911 |
| 0.1173 | 4.4160 | 1550 | 0.7489 | 0.7879 | 0.7489 | 0.8654 |
| 0.1173 | 4.4217 | 1552 | 0.6575 | 0.8133 | 0.6575 | 0.8109 |
| 0.1173 | 4.4274 | 1554 | 0.6295 | 0.8489 | 0.6295 | 0.7934 |
| 0.1173 | 4.4330 | 1556 | 0.5995 | 0.8489 | 0.5995 | 0.7743 |
| 0.1173 | 4.4387 | 1558 | 0.5647 | 0.8165 | 0.5647 | 0.7515 |
| 0.1173 | 4.4444 | 1560 | 0.5569 | 0.8165 | 0.5569 | 0.7463 |
| 0.1173 | 4.4501 | 1562 | 0.5799 | 0.8165 | 0.5799 | 0.7615 |
| 0.1173 | 4.4558 | 1564 | 0.5944 | 0.8165 | 0.5944 | 0.7710 |
| 0.1173 | 4.4615 | 1566 | 0.6265 | 0.8165 | 0.6265 | 0.7915 |
| 0.1173 | 4.4672 | 1568 | 0.6197 | 0.8165 | 0.6197 | 0.7872 |
| 0.1173 | 4.4729 | 1570 | 0.5169 | 0.8 | 0.5169 | 0.7190 |
| 0.1173 | 4.4786 | 1572 | 0.4361 | 0.7742 | 0.4361 | 0.6604 |
| 0.1173 | 4.4843 | 1574 | 0.3793 | 0.7742 | 0.3793 | 0.6158 |
| 0.1173 | 4.4900 | 1576 | 0.3994 | 0.7742 | 0.3994 | 0.6320 |
| 0.1173 | 4.4957 | 1578 | 0.4698 | 0.7742 | 0.4698 | 0.6854 |
| 0.1173 | 4.5014 | 1580 | 0.5918 | 0.8689 | 0.5918 | 0.7693 |
| 0.1173 | 4.5071 | 1582 | 0.7183 | 0.7774 | 0.7183 | 0.8475 |
| 0.1173 | 4.5128 | 1584 | 0.8364 | 0.7619 | 0.8364 | 0.9146 |
| 0.1173 | 4.5185 | 1586 | 0.8247 | 0.7619 | 0.8247 | 0.9081 |
| 0.1173 | 4.5242 | 1588 | 0.7448 | 0.7426 | 0.7448 | 0.8630 |
| 0.1173 | 4.5299 | 1590 | 0.6328 | 0.7686 | 0.6328 | 0.7955 |
| 0.1173 | 4.5356 | 1592 | 0.5934 | 0.8 | 0.5934 | 0.7703 |
| 0.1173 | 4.5413 | 1594 | 0.5970 | 0.8 | 0.5970 | 0.7727 |
| 0.1173 | 4.5470 | 1596 | 0.5441 | 0.8 | 0.5441 | 0.7376 |
| 0.1173 | 4.5527 | 1598 | 0.5291 | 0.7742 | 0.5291 | 0.7274 |
| 0.1173 | 4.5584 | 1600 | 0.5648 | 0.8359 | 0.5648 | 0.7515 |
| 0.1173 | 4.5641 | 1602 | 0.5778 | 0.7926 | 0.5778 | 0.7601 |
| 0.1173 | 4.5698 | 1604 | 0.5895 | 0.7758 | 0.5895 | 0.7678 |
| 0.1173 | 4.5755 | 1606 | 0.6585 | 0.8133 | 0.6585 | 0.8115 |
| 0.1173 | 4.5812 | 1608 | 0.7545 | 0.7879 | 0.7545 | 0.8686 |
| 0.1173 | 4.5869 | 1610 | 0.7436 | 0.7879 | 0.7436 | 0.8623 |
| 0.1173 | 4.5926 | 1612 | 0.6807 | 0.7921 | 0.6807 | 0.8251 |
| 0.1173 | 4.5983 | 1614 | 0.5468 | 0.8293 | 0.5468 | 0.7395 |
| 0.1173 | 4.6040 | 1616 | 0.4245 | 0.8151 | 0.4245 | 0.6515 |
| 0.1173 | 4.6097 | 1618 | 0.3982 | 0.8151 | 0.3982 | 0.6310 |
| 0.1173 | 4.6154 | 1620 | 0.4384 | 0.8151 | 0.4384 | 0.6621 |
| 0.1173 | 4.6211 | 1622 | 0.5612 | 0.7926 | 0.5612 | 0.7492 |
| 0.1173 | 4.6268 | 1624 | 0.6742 | 0.7667 | 0.6742 | 0.8211 |
| 0.1173 | 4.6325 | 1626 | 0.6962 | 0.7879 | 0.6962 | 0.8344 |
| 0.1173 | 4.6382 | 1628 | 0.6694 | 0.7482 | 0.6694 | 0.8181 |
| 0.1173 | 4.6439 | 1630 | 0.5757 | 0.8108 | 0.5757 | 0.7588 |
| 0.1173 | 4.6496 | 1632 | 0.5254 | 0.7742 | 0.5254 | 0.7249 |
| 0.1173 | 4.6553 | 1634 | 0.5129 | 0.7742 | 0.5129 | 0.7161 |
| 0.1173 | 4.6610 | 1636 | 0.4820 | 0.7742 | 0.4820 | 0.6943 |
| 0.1173 | 4.6667 | 1638 | 0.4749 | 0.7742 | 0.4749 | 0.6892 |
| 0.1173 | 4.6724 | 1640 | 0.5277 | 0.8108 | 0.5277 | 0.7264 |
| 0.1173 | 4.6781 | 1642 | 0.5481 | 0.7758 | 0.5481 | 0.7403 |
| 0.1173 | 4.6838 | 1644 | 0.5668 | 0.7758 | 0.5668 | 0.7529 |
| 0.1173 | 4.6895 | 1646 | 0.5142 | 0.8121 | 0.5142 | 0.7171 |
| 0.1173 | 4.6952 | 1648 | 0.4460 | 0.8805 | 0.4460 | 0.6678 |
| 0.1173 | 4.7009 | 1650 | 0.4140 | 0.8511 | 0.4140 | 0.6434 |
| 0.1173 | 4.7066 | 1652 | 0.4588 | 0.8478 | 0.4588 | 0.6773 |
| 0.1173 | 4.7123 | 1654 | 0.5184 | 0.7926 | 0.5184 | 0.7200 |
| 0.1173 | 4.7179 | 1656 | 0.5484 | 0.7926 | 0.5484 | 0.7405 |
| 0.1173 | 4.7236 | 1658 | 0.6409 | 0.7482 | 0.6409 | 0.8006 |
| 0.1173 | 4.7293 | 1660 | 0.7030 | 0.7774 | 0.7030 | 0.8384 |
| 0.1173 | 4.7350 | 1662 | 0.7641 | 0.7619 | 0.7641 | 0.8741 |
| 0.1173 | 4.7407 | 1664 | 0.7463 | 0.7774 | 0.7463 | 0.8639 |
| 0.1173 | 4.7464 | 1666 | 0.6478 | 0.776 | 0.6478 | 0.8048 |
| 0.1173 | 4.7521 | 1668 | 0.5427 | 0.7686 | 0.5427 | 0.7367 |
| 0.1173 | 4.7578 | 1670 | 0.4889 | 0.7429 | 0.4889 | 0.6992 |
| 0.1173 | 4.7635 | 1672 | 0.5029 | 0.7429 | 0.5029 | 0.7091 |
| 0.1173 | 4.7692 | 1674 | 0.5763 | 0.8063 | 0.5763 | 0.7592 |
| 0.1173 | 4.7749 | 1676 | 0.6668 | 0.7586 | 0.6668 | 0.8166 |
| 0.1173 | 4.7806 | 1678 | 0.6876 | 0.7426 | 0.6876 | 0.8292 |
| 0.1173 | 4.7863 | 1680 | 0.6252 | 0.7879 | 0.6252 | 0.7907 |
| 0.1173 | 4.7920 | 1682 | 0.5397 | 0.8205 | 0.5397 | 0.7346 |
| 0.1173 | 4.7977 | 1684 | 0.5489 | 0.8205 | 0.5489 | 0.7409 |
| 0.1173 | 4.8034 | 1686 | 0.5572 | 0.8256 | 0.5572 | 0.7465 |
| 0.1173 | 4.8091 | 1688 | 0.5262 | 0.8108 | 0.5262 | 0.7254 |
| 0.1173 | 4.8148 | 1690 | 0.5049 | 0.8108 | 0.5049 | 0.7105 |
| 0.1173 | 4.8205 | 1692 | 0.5629 | 0.7364 | 0.5629 | 0.7503 |
| 0.1173 | 4.8262 | 1694 | 0.5906 | 0.7364 | 0.5906 | 0.7685 |
| 0.1173 | 4.8319 | 1696 | 0.6019 | 0.776 | 0.6019 | 0.7758 |
| 0.1173 | 4.8376 | 1698 | 0.6360 | 0.7586 | 0.6360 | 0.7975 |
| 0.1173 | 4.8433 | 1700 | 0.6216 | 0.7586 | 0.6216 | 0.7884 |
| 0.1173 | 4.8490 | 1702 | 0.5611 | 0.8165 | 0.5611 | 0.7490 |
| 0.1173 | 4.8547 | 1704 | 0.5308 | 0.8108 | 0.5308 | 0.7285 |
| 0.1173 | 4.8604 | 1706 | 0.5336 | 0.8108 | 0.5336 | 0.7305 |
| 0.1173 | 4.8661 | 1708 | 0.6003 | 0.7820 | 0.6003 | 0.7748 |
| 0.1173 | 4.8718 | 1710 | 0.7344 | 0.7774 | 0.7344 | 0.8570 |
| 0.1173 | 4.8775 | 1712 | 0.7965 | 0.7619 | 0.7965 | 0.8925 |
| 0.1173 | 4.8832 | 1714 | 0.7393 | 0.7774 | 0.7393 | 0.8598 |
| 0.1173 | 4.8889 | 1716 | 0.6200 | 0.7586 | 0.6200 | 0.7874 |
| 0.1173 | 4.8946 | 1718 | 0.5773 | 0.7364 | 0.5773 | 0.7598 |
| 0.1173 | 4.9003 | 1720 | 0.5668 | 0.7364 | 0.5668 | 0.7529 |
| 0.1173 | 4.9060 | 1722 | 0.6187 | 0.72 | 0.6187 | 0.7866 |
| 0.1173 | 4.9117 | 1724 | 0.7031 | 0.7279 | 0.7031 | 0.8385 |
| 0.1173 | 4.9174 | 1726 | 0.7085 | 0.7279 | 0.7085 | 0.8417 |
| 0.1173 | 4.9231 | 1728 | 0.6227 | 0.7050 | 0.6227 | 0.7891 |
| 0.1173 | 4.9288 | 1730 | 0.5006 | 0.7429 | 0.5006 | 0.7075 |
| 0.1173 | 4.9345 | 1732 | 0.4545 | 0.8151 | 0.4545 | 0.6742 |
| 0.1173 | 4.9402 | 1734 | 0.4707 | 0.7742 | 0.4707 | 0.6861 |
| 0.1173 | 4.9459 | 1736 | 0.5216 | 0.7686 | 0.5216 | 0.7222 |
| 0.1173 | 4.9516 | 1738 | 0.5921 | 0.7364 | 0.5921 | 0.7695 |
| 0.1173 | 4.9573 | 1740 | 0.6978 | 0.6912 | 0.6978 | 0.8353 |
| 0.1173 | 4.9630 | 1742 | 0.7835 | 0.7726 | 0.7835 | 0.8851 |
| 0.1173 | 4.9687 | 1744 | 0.7600 | 0.7726 | 0.7600 | 0.8718 |
| 0.1173 | 4.9744 | 1746 | 0.6724 | 0.6912 | 0.6724 | 0.8200 |
| 0.1173 | 4.9801 | 1748 | 0.5593 | 0.7364 | 0.5593 | 0.7479 |
| 0.1173 | 4.9858 | 1750 | 0.4634 | 0.8151 | 0.4634 | 0.6808 |
| 0.1173 | 4.9915 | 1752 | 0.4061 | 0.8014 | 0.4061 | 0.6372 |
| 0.1173 | 4.9972 | 1754 | 0.4144 | 0.8014 | 0.4144 | 0.6437 |
| 0.1173 | 5.0028 | 1756 | 0.4888 | 0.7863 | 0.4888 | 0.6991 |
| 0.1173 | 5.0085 | 1758 | 0.5434 | 0.7640 | 0.5434 | 0.7371 |
| 0.1173 | 5.0142 | 1760 | 0.5635 | 0.7482 | 0.5635 | 0.7506 |
| 0.1173 | 5.0199 | 1762 | 0.5312 | 0.7640 | 0.5312 | 0.7289 |
| 0.1173 | 5.0256 | 1764 | 0.5086 | 0.8108 | 0.5086 | 0.7132 |
| 0.1173 | 5.0313 | 1766 | 0.5027 | 0.8108 | 0.5027 | 0.7090 |
| 0.1173 | 5.0370 | 1768 | 0.5377 | 0.7758 | 0.5377 | 0.7333 |
| 0.1173 | 5.0427 | 1770 | 0.5194 | 0.7926 | 0.5194 | 0.7207 |
| 0.1173 | 5.0484 | 1772 | 0.4680 | 0.8151 | 0.4680 | 0.6841 |
| 0.1173 | 5.0541 | 1774 | 0.4540 | 0.8151 | 0.4540 | 0.6738 |
| 0.1173 | 5.0598 | 1776 | 0.4713 | 0.8151 | 0.4713 | 0.6865 |
| 0.1173 | 5.0655 | 1778 | 0.4937 | 0.8151 | 0.4937 | 0.7027 |
| 0.1173 | 5.0712 | 1780 | 0.5235 | 0.8205 | 0.5235 | 0.7236 |
| 0.1173 | 5.0769 | 1782 | 0.5331 | 0.7971 | 0.5331 | 0.7301 |
| 0.1173 | 5.0826 | 1784 | 0.4920 | 0.7971 | 0.4920 | 0.7014 |
| 0.1173 | 5.0883 | 1786 | 0.4386 | 0.8014 | 0.4386 | 0.6623 |
| 0.1173 | 5.0940 | 1788 | 0.4014 | 0.8014 | 0.4014 | 0.6335 |
| 0.1173 | 5.0997 | 1790 | 0.4114 | 0.8014 | 0.4114 | 0.6414 |
| 0.1173 | 5.1054 | 1792 | 0.4702 | 0.8151 | 0.4702 | 0.6857 |
| 0.1173 | 5.1111 | 1794 | 0.5717 | 0.8108 | 0.5717 | 0.7561 |
| 0.1173 | 5.1168 | 1796 | 0.6442 | 0.7820 | 0.6442 | 0.8026 |
| 0.1173 | 5.1225 | 1798 | 0.6371 | 0.7820 | 0.6371 | 0.7982 |
| 0.1173 | 5.1282 | 1800 | 0.5779 | 0.7758 | 0.5779 | 0.7602 |
| 0.1173 | 5.1339 | 1802 | 0.4907 | 0.8151 | 0.4907 | 0.7005 |
| 0.1173 | 5.1396 | 1804 | 0.4131 | 0.8014 | 0.4131 | 0.6427 |
| 0.1173 | 5.1453 | 1806 | 0.4021 | 0.8014 | 0.4021 | 0.6341 |
| 0.1173 | 5.1510 | 1808 | 0.4466 | 0.8014 | 0.4466 | 0.6682 |
| 0.1173 | 5.1567 | 1810 | 0.5140 | 0.8293 | 0.5139 | 0.7169 |
| 0.1173 | 5.1624 | 1812 | 0.5398 | 0.8293 | 0.5398 | 0.7347 |
| 0.1173 | 5.1681 | 1814 | 0.5863 | 0.8627 | 0.5863 | 0.7657 |
| 0.1173 | 5.1738 | 1816 | 0.6030 | 0.8042 | 0.6030 | 0.7765 |
| 0.1173 | 5.1795 | 1818 | 0.5542 | 0.7926 | 0.5542 | 0.7445 |
| 0.1173 | 5.1852 | 1820 | 0.5002 | 0.8151 | 0.5002 | 0.7072 |
| 0.1173 | 5.1909 | 1822 | 0.4802 | 0.8151 | 0.4802 | 0.6929 |
| 0.1173 | 5.1966 | 1824 | 0.4826 | 0.8151 | 0.4826 | 0.6947 |
| 0.1173 | 5.2023 | 1826 | 0.4671 | 0.8151 | 0.4671 | 0.6834 |
| 0.1173 | 5.2080 | 1828 | 0.4828 | 0.8151 | 0.4828 | 0.6948 |
| 0.1173 | 5.2137 | 1830 | 0.5263 | 0.7971 | 0.5263 | 0.7254 |
| 0.1173 | 5.2194 | 1832 | 0.5797 | 0.8339 | 0.5797 | 0.7614 |
| 0.1173 | 5.2251 | 1834 | 0.5556 | 0.8339 | 0.5556 | 0.7454 |
| 0.1173 | 5.2308 | 1836 | 0.4861 | 0.8151 | 0.4861 | 0.6972 |
| 0.1173 | 5.2365 | 1838 | 0.4433 | 0.8014 | 0.4433 | 0.6658 |
| 0.1173 | 5.2422 | 1840 | 0.3996 | 0.8014 | 0.3996 | 0.6321 |
| 0.1173 | 5.2479 | 1842 | 0.3990 | 0.8014 | 0.3990 | 0.6316 |
| 0.1173 | 5.2536 | 1844 | 0.4063 | 0.8014 | 0.4063 | 0.6374 |
| 0.1173 | 5.2593 | 1846 | 0.4597 | 0.8014 | 0.4597 | 0.6780 |
| 0.1173 | 5.2650 | 1848 | 0.5740 | 0.8082 | 0.5740 | 0.7576 |
| 0.1173 | 5.2707 | 1850 | 0.6567 | 0.8383 | 0.6567 | 0.8104 |
| 0.1173 | 5.2764 | 1852 | 0.6679 | 0.8383 | 0.6679 | 0.8172 |
| 0.1173 | 5.2821 | 1854 | 0.6304 | 0.8383 | 0.6304 | 0.7940 |
| 0.1173 | 5.2877 | 1856 | 0.5515 | 0.7756 | 0.5515 | 0.7426 |
| 0.1173 | 5.2934 | 1858 | 0.4924 | 0.7697 | 0.4924 | 0.7017 |
| 0.1173 | 5.2991 | 1860 | 0.4816 | 0.7697 | 0.4816 | 0.6940 |
| 0.1173 | 5.3048 | 1862 | 0.4421 | 0.8014 | 0.4421 | 0.6649 |
| 0.1173 | 5.3105 | 1864 | 0.4213 | 0.8014 | 0.4213 | 0.6491 |
| 0.1173 | 5.3162 | 1866 | 0.4425 | 0.8014 | 0.4425 | 0.6652 |
| 0.1173 | 5.3219 | 1868 | 0.4948 | 0.7697 | 0.4948 | 0.7034 |
| 0.1173 | 5.3276 | 1870 | 0.5441 | 0.8121 | 0.5441 | 0.7376 |
| 0.1173 | 5.3333 | 1872 | 0.5516 | 0.8 | 0.5516 | 0.7427 |
| 0.1173 | 5.3390 | 1874 | 0.5101 | 0.8 | 0.5101 | 0.7142 |
| 0.1173 | 5.3447 | 1876 | 0.4926 | 0.8 | 0.4926 | 0.7019 |
| 0.1173 | 5.3504 | 1878 | 0.5047 | 0.8121 | 0.5047 | 0.7104 |
| 0.1173 | 5.3561 | 1880 | 0.5056 | 0.7971 | 0.5056 | 0.7110 |
| 0.1173 | 5.3618 | 1882 | 0.4850 | 0.7971 | 0.4850 | 0.6964 |
| 0.1173 | 5.3675 | 1884 | 0.4527 | 0.8151 | 0.4527 | 0.6728 |
| 0.1173 | 5.3732 | 1886 | 0.4303 | 0.8014 | 0.4303 | 0.6560 |
| 0.1173 | 5.3789 | 1888 | 0.4659 | 0.8151 | 0.4659 | 0.6826 |
| 0.1173 | 5.3846 | 1890 | 0.5358 | 0.7535 | 0.5358 | 0.7320 |
| 0.1173 | 5.3903 | 1892 | 0.5812 | 0.7864 | 0.5812 | 0.7624 |
| 0.1173 | 5.3960 | 1894 | 0.6348 | 0.7921 | 0.6348 | 0.7967 |
| 0.1173 | 5.4017 | 1896 | 0.6607 | 0.8217 | 0.6607 | 0.8128 |
| 0.1173 | 5.4074 | 1898 | 0.6189 | 0.8217 | 0.6189 | 0.7867 |
| 0.1173 | 5.4131 | 1900 | 0.5170 | 0.8121 | 0.5170 | 0.7191 |
| 0.1173 | 5.4188 | 1902 | 0.4347 | 0.8361 | 0.4347 | 0.6594 |
| 0.1173 | 5.4245 | 1904 | 0.4029 | 0.7785 | 0.4029 | 0.6347 |
| 0.1173 | 5.4302 | 1906 | 0.4105 | 0.7785 | 0.4105 | 0.6407 |
| 0.1173 | 5.4359 | 1908 | 0.4597 | 0.8361 | 0.4597 | 0.6780 |
| 0.1173 | 5.4416 | 1910 | 0.5579 | 0.8383 | 0.5579 | 0.7469 |
| 0.1173 | 5.4473 | 1912 | 0.6811 | 0.7879 | 0.6811 | 0.8253 |
| 0.1173 | 5.4530 | 1914 | 0.7445 | 0.7619 | 0.7445 | 0.8628 |
| 0.1173 | 5.4587 | 1916 | 0.7184 | 0.7619 | 0.7184 | 0.8476 |
| 0.1173 | 5.4644 | 1918 | 0.6382 | 0.7774 | 0.6382 | 0.7989 |
| 0.1173 | 5.4701 | 1920 | 0.5445 | 0.8063 | 0.5445 | 0.7379 |
| 0.1173 | 5.4758 | 1922 | 0.4672 | 0.8108 | 0.4672 | 0.6835 |
| 0.1173 | 5.4815 | 1924 | 0.4191 | 0.8151 | 0.4191 | 0.6474 |
| 0.1173 | 5.4872 | 1926 | 0.4257 | 0.8151 | 0.4257 | 0.6524 |
| 0.1173 | 5.4929 | 1928 | 0.4692 | 0.7686 | 0.4692 | 0.6850 |
| 0.1173 | 5.4986 | 1930 | 0.5055 | 0.8063 | 0.5055 | 0.7110 |
| 0.1173 | 5.5043 | 1932 | 0.5313 | 0.8063 | 0.5313 | 0.7289 |
| 0.1173 | 5.5100 | 1934 | 0.5092 | 0.8165 | 0.5092 | 0.7136 |
| 0.1173 | 5.5157 | 1936 | 0.4822 | 0.8293 | 0.4822 | 0.6944 |
| 0.1173 | 5.5214 | 1938 | 0.5124 | 0.7926 | 0.5124 | 0.7158 |
| 0.1173 | 5.5271 | 1940 | 0.5661 | 0.8165 | 0.5661 | 0.7524 |
| 0.1173 | 5.5328 | 1942 | 0.5931 | 0.8165 | 0.5931 | 0.7701 |
| 0.1173 | 5.5385 | 1944 | 0.5489 | 0.8165 | 0.5489 | 0.7409 |
| 0.1173 | 5.5442 | 1946 | 0.5454 | 0.8165 | 0.5454 | 0.7385 |
| 0.1173 | 5.5499 | 1948 | 0.5661 | 0.8165 | 0.5661 | 0.7524 |
| 0.1173 | 5.5556 | 1950 | 0.5837 | 0.8165 | 0.5837 | 0.7640 |
| 0.1173 | 5.5613 | 1952 | 0.5509 | 0.8165 | 0.5509 | 0.7422 |
| 0.1173 | 5.5670 | 1954 | 0.5401 | 0.8165 | 0.5401 | 0.7349 |
| 0.1173 | 5.5726 | 1956 | 0.5137 | 0.8521 | 0.5137 | 0.7167 |
| 0.1173 | 5.5783 | 1958 | 0.4746 | 0.8837 | 0.4746 | 0.6889 |
| 0.1173 | 5.5840 | 1960 | 0.4565 | 0.8404 | 0.4565 | 0.6757 |
| 0.1173 | 5.5897 | 1962 | 0.4638 | 0.8272 | 0.4638 | 0.6811 |
| 0.1173 | 5.5954 | 1964 | 0.5192 | 0.8521 | 0.5192 | 0.7205 |
| 0.1173 | 5.6011 | 1966 | 0.5204 | 0.8679 | 0.5204 | 0.7214 |
| 0.1173 | 5.6068 | 1968 | 0.5377 | 0.8444 | 0.5377 | 0.7333 |
| 0.1173 | 5.6125 | 1970 | 0.5746 | 0.8383 | 0.5746 | 0.7580 |
| 0.1173 | 5.6182 | 1972 | 0.5793 | 0.8082 | 0.5793 | 0.7611 |
| 0.1173 | 5.6239 | 1974 | 0.5851 | 0.8256 | 0.5851 | 0.7649 |
| 0.1173 | 5.6296 | 1976 | 0.5552 | 0.8256 | 0.5552 | 0.7451 |
| 0.1173 | 5.6353 | 1978 | 0.5014 | 0.8444 | 0.5014 | 0.7081 |
| 0.1173 | 5.6410 | 1980 | 0.4465 | 0.8108 | 0.4465 | 0.6682 |
| 0.1173 | 5.6467 | 1982 | 0.4183 | 0.8151 | 0.4183 | 0.6468 |
| 0.1173 | 5.6524 | 1984 | 0.4393 | 0.8397 | 0.4393 | 0.6628 |
| 0.1173 | 5.6581 | 1986 | 0.4421 | 0.8397 | 0.4421 | 0.6649 |
| 0.1173 | 5.6638 | 1988 | 0.4655 | 0.8718 | 0.4655 | 0.6823 |
| 0.1173 | 5.6695 | 1990 | 0.5384 | 0.8444 | 0.5384 | 0.7338 |
| 0.1173 | 5.6752 | 1992 | 0.6399 | 0.7426 | 0.6399 | 0.7999 |
| 0.1173 | 5.6809 | 1994 | 0.6697 | 0.7774 | 0.6697 | 0.8184 |
| 0.1173 | 5.6866 | 1996 | 0.6165 | 0.7586 | 0.6165 | 0.7852 |
| 0.1173 | 5.6923 | 1998 | 0.5171 | 0.8718 | 0.5171 | 0.7191 |
| 0.0821 | 5.6980 | 2000 | 0.4439 | 0.8397 | 0.4439 | 0.6663 |
| 0.0821 | 5.7037 | 2002 | 0.4164 | 0.8511 | 0.4164 | 0.6453 |
| 0.0821 | 5.7094 | 2004 | 0.4364 | 0.8397 | 0.4364 | 0.6606 |
| 0.0821 | 5.7151 | 2006 | 0.4533 | 0.8397 | 0.4533 | 0.6733 |
| 0.0821 | 5.7208 | 2008 | 0.4957 | 0.8718 | 0.4957 | 0.7040 |
| 0.0821 | 5.7265 | 2010 | 0.5572 | 0.8521 | 0.5572 | 0.7465 |
| 0.0821 | 5.7322 | 2012 | 0.6754 | 0.8217 | 0.6754 | 0.8218 |
| 0.0821 | 5.7379 | 2014 | 0.7225 | 0.7974 | 0.7225 | 0.8500 |
| 0.0821 | 5.7436 | 2016 | 0.6766 | 0.7974 | 0.6766 | 0.8225 |
| 0.0821 | 5.7493 | 2018 | 0.6019 | 0.8082 | 0.6019 | 0.7758 |
| 0.0821 | 5.7550 | 2020 | 0.5395 | 0.8444 | 0.5395 | 0.7345 |
| 0.0821 | 5.7607 | 2022 | 0.4792 | 0.8581 | 0.4792 | 0.6922 |
| 0.0821 | 5.7664 | 2024 | 0.4483 | 0.8361 | 0.4483 | 0.6696 |
| 0.0821 | 5.7721 | 2026 | 0.4576 | 0.8581 | 0.4576 | 0.6764 |
| 0.0821 | 5.7778 | 2028 | 0.5105 | 0.8618 | 0.5105 | 0.7145 |
| 0.0821 | 5.7835 | 2030 | 0.5830 | 0.8082 | 0.5830 | 0.7635 |
| 0.0821 | 5.7892 | 2032 | 0.5904 | 0.8383 | 0.5904 | 0.7684 |
| 0.0821 | 5.7949 | 2034 | 0.5980 | 0.8383 | 0.5980 | 0.7733 |
| 0.0821 | 5.8006 | 2036 | 0.5631 | 0.8282 | 0.5631 | 0.7504 |
| 0.0821 | 5.8063 | 2038 | 0.5328 | 0.8282 | 0.5328 | 0.7299 |
| 0.0821 | 5.8120 | 2040 | 0.4940 | 0.8361 | 0.4940 | 0.7028 |
| 0.0821 | 5.8177 | 2042 | 0.4715 | 0.8361 | 0.4715 | 0.6866 |
| 0.0821 | 5.8234 | 2044 | 0.4872 | 0.8361 | 0.4872 | 0.6980 |
| 0.0821 | 5.8291 | 2046 | 0.4961 | 0.8361 | 0.4961 | 0.7044 |
| 0.0821 | 5.8348 | 2048 | 0.5279 | 0.7510 | 0.5279 | 0.7266 |
| 0.0821 | 5.8405 | 2050 | 0.5249 | 0.7686 | 0.5249 | 0.7245 |
| 0.0821 | 5.8462 | 2052 | 0.5055 | 0.7429 | 0.5055 | 0.7110 |
| 0.0821 | 5.8519 | 2054 | 0.4918 | 0.7429 | 0.4918 | 0.7013 |
| 0.0821 | 5.8575 | 2056 | 0.4936 | 0.7742 | 0.4936 | 0.7025 |
| 0.0821 | 5.8632 | 2058 | 0.5111 | 0.7429 | 0.5111 | 0.7149 |
| 0.0821 | 5.8689 | 2060 | 0.4929 | 0.8151 | 0.4929 | 0.7021 |
| 0.0821 | 5.8746 | 2062 | 0.5160 | 0.7692 | 0.5160 | 0.7183 |
| 0.0821 | 5.8803 | 2064 | 0.5549 | 0.8218 | 0.5549 | 0.7449 |
| 0.0821 | 5.8860 | 2066 | 0.5873 | 0.8042 | 0.5873 | 0.7664 |
| 0.0821 | 5.8917 | 2068 | 0.5784 | 0.8383 | 0.5784 | 0.7605 |
| 0.0821 | 5.8974 | 2070 | 0.5664 | 0.8562 | 0.5664 | 0.7526 |
| 0.0821 | 5.9031 | 2072 | 0.5450 | 0.7926 | 0.5450 | 0.7382 |
| 0.0821 | 5.9088 | 2074 | 0.5194 | 0.8108 | 0.5194 | 0.7207 |
| 0.0821 | 5.9145 | 2076 | 0.5524 | 0.8108 | 0.5524 | 0.7432 |
| 0.0821 | 5.9202 | 2078 | 0.6004 | 0.8042 | 0.6004 | 0.7748 |
| 0.0821 | 5.9259 | 2080 | 0.5854 | 0.8082 | 0.5854 | 0.7651 |
| 0.0821 | 5.9316 | 2082 | 0.5346 | 0.8108 | 0.5346 | 0.7312 |
| 0.0821 | 5.9373 | 2084 | 0.4969 | 0.8108 | 0.4969 | 0.7049 |
| 0.0821 | 5.9430 | 2086 | 0.5009 | 0.8108 | 0.5009 | 0.7077 |
| 0.0821 | 5.9487 | 2088 | 0.5122 | 0.7971 | 0.5122 | 0.7157 |
| 0.0821 | 5.9544 | 2090 | 0.5059 | 0.7971 | 0.5059 | 0.7113 |
| 0.0821 | 5.9601 | 2092 | 0.5039 | 0.7971 | 0.5039 | 0.7098 |
| 0.0821 | 5.9658 | 2094 | 0.4779 | 0.8581 | 0.4779 | 0.6913 |
| 0.0821 | 5.9715 | 2096 | 0.4838 | 0.8581 | 0.4838 | 0.6955 |
| 0.0821 | 5.9772 | 2098 | 0.5222 | 0.8404 | 0.5222 | 0.7226 |
| 0.0821 | 5.9829 | 2100 | 0.5448 | 0.8511 | 0.5448 | 0.7381 |
| 0.0821 | 5.9886 | 2102 | 0.5450 | 0.8511 | 0.5450 | 0.7382 |
| 0.0821 | 5.9943 | 2104 | 0.5751 | 0.825 | 0.5751 | 0.7583 |
| 0.0821 | 6.0 | 2106 | 0.6201 | 0.7667 | 0.6201 | 0.7875 |
| 0.0821 | 6.0057 | 2108 | 0.6445 | 0.7774 | 0.6445 | 0.8028 |
| 0.0821 | 6.0114 | 2110 | 0.6584 | 0.7774 | 0.6584 | 0.8114 |
| 0.0821 | 6.0171 | 2112 | 0.6264 | 0.7774 | 0.6264 | 0.7915 |
| 0.0821 | 6.0228 | 2114 | 0.5805 | 0.7364 | 0.5805 | 0.7619 |
| 0.0821 | 6.0285 | 2116 | 0.5044 | 0.8 | 0.5044 | 0.7102 |
| 0.0821 | 6.0342 | 2118 | 0.4349 | 0.8151 | 0.4349 | 0.6594 |
| 0.0821 | 6.0399 | 2120 | 0.3846 | 0.7445 | 0.3846 | 0.6201 |
| 0.0821 | 6.0456 | 2122 | 0.3762 | 0.8146 | 0.3762 | 0.6133 |
| 0.0821 | 6.0513 | 2124 | 0.4046 | 0.8014 | 0.4046 | 0.6361 |
| 0.0821 | 6.0570 | 2126 | 0.4818 | 0.8151 | 0.4818 | 0.6941 |
| 0.0821 | 6.0627 | 2128 | 0.5449 | 0.8121 | 0.5449 | 0.7382 |
| 0.0821 | 6.0684 | 2130 | 0.6084 | 0.8627 | 0.6084 | 0.7800 |
| 0.0821 | 6.0741 | 2132 | 0.6484 | 0.8217 | 0.6484 | 0.8052 |
| 0.0821 | 6.0798 | 2134 | 0.6575 | 0.8217 | 0.6575 | 0.8109 |
| 0.0821 | 6.0855 | 2136 | 0.6283 | 0.8383 | 0.6283 | 0.7927 |
| 0.0821 | 6.0912 | 2138 | 0.5752 | 0.8082 | 0.5752 | 0.7584 |
| 0.0821 | 6.0969 | 2140 | 0.5257 | 0.8151 | 0.5257 | 0.7250 |
| 0.0821 | 6.1026 | 2142 | 0.4798 | 0.8151 | 0.4798 | 0.6927 |
| 0.0821 | 6.1083 | 2144 | 0.4764 | 0.8151 | 0.4764 | 0.6902 |
| 0.0821 | 6.1140 | 2146 | 0.5120 | 0.8151 | 0.5120 | 0.7156 |
| 0.0821 | 6.1197 | 2148 | 0.5908 | 0.7758 | 0.5908 | 0.7686 |
| 0.0821 | 6.1254 | 2150 | 0.6546 | 0.8000 | 0.6546 | 0.8091 |
| 0.0821 | 6.1311 | 2152 | 0.6437 | 0.8000 | 0.6437 | 0.8023 |
| 0.0821 | 6.1368 | 2154 | 0.5775 | 0.8108 | 0.5775 | 0.7599 |
| 0.0821 | 6.1425 | 2156 | 0.5319 | 0.8397 | 0.5319 | 0.7293 |
| 0.0821 | 6.1481 | 2158 | 0.5381 | 0.8397 | 0.5381 | 0.7336 |
| 0.0821 | 6.1538 | 2160 | 0.6096 | 0.7552 | 0.6096 | 0.7807 |
| 0.0821 | 6.1595 | 2162 | 0.6385 | 0.7879 | 0.6385 | 0.7991 |
| 0.0821 | 6.1652 | 2164 | 0.6049 | 0.7552 | 0.6049 | 0.7778 |
| 0.0821 | 6.1709 | 2166 | 0.5320 | 0.8397 | 0.5320 | 0.7294 |
| 0.0821 | 6.1766 | 2168 | 0.4670 | 0.8397 | 0.4670 | 0.6834 |
| 0.0821 | 6.1823 | 2170 | 0.4403 | 0.8151 | 0.4403 | 0.6635 |
| 0.0821 | 6.1880 | 2172 | 0.4496 | 0.8397 | 0.4496 | 0.6705 |
| 0.0821 | 6.1937 | 2174 | 0.4869 | 0.8397 | 0.4869 | 0.6978 |
| 0.0821 | 6.1994 | 2176 | 0.5165 | 0.8108 | 0.5165 | 0.7187 |
| 0.0821 | 6.2051 | 2178 | 0.5626 | 0.7758 | 0.5626 | 0.7501 |
| 0.0821 | 6.2108 | 2180 | 0.5552 | 0.7921 | 0.5552 | 0.7451 |
| 0.0821 | 6.2165 | 2182 | 0.5204 | 0.7758 | 0.5204 | 0.7214 |
| 0.0821 | 6.2222 | 2184 | 0.4908 | 0.8552 | 0.4908 | 0.7006 |
| 0.0821 | 6.2279 | 2186 | 0.4262 | 0.8361 | 0.4262 | 0.6528 |
| 0.0821 | 6.2336 | 2188 | 0.4128 | 0.8361 | 0.4128 | 0.6425 |
| 0.0821 | 6.2393 | 2190 | 0.4230 | 0.8581 | 0.4230 | 0.6504 |
| 0.0821 | 6.2450 | 2192 | 0.4570 | 0.8328 | 0.4570 | 0.6760 |
| 0.0821 | 6.2507 | 2194 | 0.5288 | 0.8082 | 0.5288 | 0.7272 |
| 0.0821 | 6.2564 | 2196 | 0.5930 | 0.8217 | 0.5930 | 0.7701 |
| 0.0821 | 6.2621 | 2198 | 0.6184 | 0.7974 | 0.6184 | 0.7864 |
| 0.0821 | 6.2678 | 2200 | 0.6127 | 0.7820 | 0.6127 | 0.7828 |
| 0.0821 | 6.2735 | 2202 | 0.5644 | 0.7926 | 0.5644 | 0.7513 |
| 0.0821 | 6.2792 | 2204 | 0.5203 | 0.8108 | 0.5203 | 0.7213 |
| 0.0821 | 6.2849 | 2206 | 0.5049 | 0.8108 | 0.5049 | 0.7106 |
| 0.0821 | 6.2906 | 2208 | 0.5119 | 0.8108 | 0.5119 | 0.7155 |
| 0.0821 | 6.2963 | 2210 | 0.5401 | 0.7926 | 0.5401 | 0.7349 |
| 0.0821 | 6.3020 | 2212 | 0.5870 | 0.7926 | 0.5870 | 0.7661 |
| 0.0821 | 6.3077 | 2214 | 0.6300 | 0.8217 | 0.6300 | 0.7937 |
| 0.0821 | 6.3134 | 2216 | 0.6986 | 0.8217 | 0.6986 | 0.8358 |
| 0.0821 | 6.3191 | 2218 | 0.7041 | 0.8217 | 0.7041 | 0.8391 |
| 0.0821 | 6.3248 | 2220 | 0.6683 | 0.8217 | 0.6683 | 0.8175 |
| 0.0821 | 6.3305 | 2222 | 0.6671 | 0.8217 | 0.6671 | 0.8168 |
| 0.0821 | 6.3362 | 2224 | 0.6475 | 0.8217 | 0.6475 | 0.8047 |
| 0.0821 | 6.3419 | 2226 | 0.6154 | 0.8562 | 0.6154 | 0.7844 |
| 0.0821 | 6.3476 | 2228 | 0.5449 | 0.7926 | 0.5449 | 0.7382 |
| 0.0821 | 6.3533 | 2230 | 0.4704 | 0.8108 | 0.4704 | 0.6859 |
| 0.0821 | 6.3590 | 2232 | 0.4644 | 0.8108 | 0.4644 | 0.6815 |
| 0.0821 | 6.3647 | 2234 | 0.5060 | 0.7686 | 0.5060 | 0.7113
| | 0.0821 | 6.3704 | 2236 | 0.5420 | 0.7686 | 0.5420 | 0.7362 | | 0.0821 | 6.3761 | 2238 | 0.6106 | 0.7510 | 0.6106 | 0.7814 | | 0.0821 | 6.3818 | 2240 | 0.7119 | 0.7941 | 0.7119 | 0.8438 | | 0.0821 | 6.3875 | 2242 | 0.7674 | 0.7619 | 0.7674 | 0.8760 | | 0.0821 | 6.3932 | 2244 | 0.7494 | 0.7619 | 0.7494 | 0.8657 | | 0.0821 | 6.3989 | 2246 | 0.6843 | 0.7941 | 0.6843 | 0.8272 | | 0.0821 | 6.4046 | 2248 | 0.6285 | 0.72 | 0.6285 | 0.7928 | | 0.0821 | 6.4103 | 2250 | 0.5671 | 0.7510 | 0.5671 | 0.7531 | | 0.0821 | 6.4160 | 2252 | 0.5472 | 0.7926 | 0.5472 | 0.7398 | | 0.0821 | 6.4217 | 2254 | 0.5586 | 0.7510 | 0.5586 | 0.7474 | | 0.0821 | 6.4274 | 2256 | 0.5602 | 0.7926 | 0.5602 | 0.7484 | | 0.0821 | 6.4330 | 2258 | 0.5531 | 0.7926 | 0.5531 | 0.7437 | | 0.0821 | 6.4387 | 2260 | 0.5309 | 0.7926 | 0.5309 | 0.7286 | | 0.0821 | 6.4444 | 2262 | 0.5566 | 0.7926 | 0.5566 | 0.7461 | | 0.0821 | 6.4501 | 2264 | 0.5497 | 0.7926 | 0.5497 | 0.7414 | | 0.0821 | 6.4558 | 2266 | 0.5694 | 0.7926 | 0.5694 | 0.7546 | | 0.0821 | 6.4615 | 2268 | 0.5832 | 0.7926 | 0.5832 | 0.7636 | | 0.0821 | 6.4672 | 2270 | 0.6474 | 0.8383 | 0.6474 | 0.8046 | | 0.0821 | 6.4729 | 2272 | 0.6893 | 0.7974 | 0.6893 | 0.8302 | | 0.0821 | 6.4786 | 2274 | 0.6975 | 0.7619 | 0.6975 | 0.8351 | | 0.0821 | 6.4843 | 2276 | 0.6439 | 0.8562 | 0.6439 | 0.8024 | | 0.0821 | 6.4900 | 2278 | 0.5599 | 0.7926 | 0.5599 | 0.7483 | | 0.0821 | 6.4957 | 2280 | 0.5199 | 0.7926 | 0.5199 | 0.7210 | | 0.0821 | 6.5014 | 2282 | 0.5259 | 0.7926 | 0.5259 | 0.7252 | | 0.0821 | 6.5071 | 2284 | 0.5570 | 0.7926 | 0.5570 | 0.7463 | | 0.0821 | 6.5128 | 2286 | 0.5795 | 0.7510 | 0.5795 | 0.7613 | | 0.0821 | 6.5185 | 2288 | 0.5960 | 0.7510 | 0.5960 | 0.7720 | | 0.0821 | 6.5242 | 2290 | 0.5815 | 0.7510 | 0.5815 | 0.7626 | | 0.0821 | 6.5299 | 2292 | 0.5576 | 0.7926 | 0.5576 | 0.7467 | | 0.0821 | 6.5356 | 2294 | 0.5441 | 0.7926 | 0.5441 | 0.7376 | | 0.0821 | 6.5413 | 2296 | 0.5376 | 0.7926 | 0.5376 | 0.7332 | | 0.0821 | 6.5470 | 2298 | 0.5181 | 0.7926 | 0.5181 | 0.7198 | | 0.0821 | 6.5527 | 2300 | 0.5026 | 0.7926 | 0.5026 | 0.7089 | | 0.0821 | 6.5584 | 2302 | 0.5114 | 0.7926 | 0.5114 | 0.7151 | | 0.0821 | 6.5641 | 2304 | 0.5448 | 0.7926 | 0.5448 | 0.7381 | | 0.0821 | 6.5698 | 2306 | 0.5737 | 0.7926 | 0.5737 | 0.7574 | | 0.0821 | 6.5755 | 2308 | 0.6152 | 0.7709 | 0.6152 | 0.7843 | | 0.0821 | 6.5812 | 2310 | 0.6427 | 0.7709 | 0.6427 | 0.8017 | | 0.0821 | 6.5869 | 2312 | 0.6076 | 0.7510 | 0.6076 | 0.7795 | | 0.0821 | 6.5926 | 2314 | 0.5426 | 0.7510 | 0.5426 | 0.7366 | | 0.0821 | 6.5983 | 2316 | 0.4768 | 0.8108 | 0.4768 | 0.6905 | | 0.0821 | 6.6040 | 2318 | 0.4428 | 0.8397 | 0.4428 | 0.6655 | | 0.0821 | 6.6097 | 2320 | 0.4647 | 0.8108 | 0.4647 | 0.6817 | | 0.0821 | 6.6154 | 2322 | 0.5349 | 0.7510 | 0.5349 | 0.7313 | | 0.0821 | 6.6211 | 2324 | 0.6020 | 0.8042 | 0.6020 | 0.7759 | | 0.0821 | 6.6268 | 2326 | 0.6936 | 0.8042 | 0.6936 | 0.8328 | | 0.0821 | 6.6325 | 2328 | 0.7383 | 0.7879 | 0.7383 | 0.8593 | | 0.0821 | 6.6382 | 2330 | 0.7157 | 0.8042 | 0.7157 | 0.8460 | | 0.0821 | 6.6439 | 2332 | 0.6353 | 0.8042 | 0.6353 | 0.7970 | | 0.0821 | 6.6496 | 2334 | 0.5444 | 0.7686 | 0.5444 | 0.7378 | | 0.0821 | 6.6553 | 2336 | 0.5194 | 0.7686 | 0.5194 | 0.7207 | | 0.0821 | 6.6610 | 2338 | 0.5059 | 0.7686 | 0.5059 | 0.7113 | | 0.0821 | 6.6667 | 2340 | 0.5304 | 0.7879 | 0.5304 | 0.7283 | | 0.0821 | 6.6724 | 2342 | 0.6085 | 0.8042 | 0.6085 | 0.7800 | | 0.0821 | 6.6781 | 2344 | 0.6665 | 0.8042 | 0.6665 | 0.8164 | | 0.0821 | 6.6838 | 2346 | 0.6547 | 0.8042 | 0.6547 | 0.8092 | | 0.0821 | 6.6895 | 2348 | 
0.6302 | 0.8042 | 0.6302 | 0.7939 | | 0.0821 | 6.6952 | 2350 | 0.6391 | 0.8042 | 0.6391 | 0.7994 | | 0.0821 | 6.7009 | 2352 | 0.6575 | 0.8042 | 0.6575 | 0.8109 | | 0.0821 | 6.7066 | 2354 | 0.6341 | 0.8042 | 0.6341 | 0.7963 | | 0.0821 | 6.7123 | 2356 | 0.6016 | 0.8042 | 0.6016 | 0.7756 | | 0.0821 | 6.7179 | 2358 | 0.5391 | 0.8218 | 0.5391 | 0.7342 | | 0.0821 | 6.7236 | 2360 | 0.5002 | 0.8205 | 0.5002 | 0.7072 | | 0.0821 | 6.7293 | 2362 | 0.5137 | 0.7640 | 0.5137 | 0.7167 | | 0.0821 | 6.7350 | 2364 | 0.5700 | 0.8218 | 0.5700 | 0.7550 | | 0.0821 | 6.7407 | 2366 | 0.6031 | 0.8218 | 0.6031 | 0.7766 | | 0.0821 | 6.7464 | 2368 | 0.5808 | 0.8218 | 0.5808 | 0.7621 | | 0.0821 | 6.7521 | 2370 | 0.5566 | 0.7986 | 0.5566 | 0.7461 | | 0.0821 | 6.7578 | 2372 | 0.4928 | 0.8028 | 0.4928 | 0.7020 | | 0.0821 | 6.7635 | 2374 | 0.4429 | 0.8151 | 0.4429 | 0.6655 | | 0.0821 | 6.7692 | 2376 | 0.4331 | 0.7910 | 0.4331 | 0.6581 | | 0.0821 | 6.7749 | 2378 | 0.4407 | 0.7910 | 0.4407 | 0.6639 | | 0.0821 | 6.7806 | 2380 | 0.4106 | 0.7910 | 0.4106 | 0.6408 | | 0.0821 | 6.7863 | 2382 | 0.3872 | 0.7789 | 0.3872 | 0.6223 | | 0.0821 | 6.7920 | 2384 | 0.4053 | 0.7910 | 0.4053 | 0.6367 | | 0.0821 | 6.7977 | 2386 | 0.4650 | 0.7742 | 0.4650 | 0.6819 | | 0.0821 | 6.8034 | 2388 | 0.5722 | 0.7820 | 0.5722 | 0.7565 | | 0.0821 | 6.8091 | 2390 | 0.6939 | 0.8000 | 0.6939 | 0.8330 | | 0.0821 | 6.8148 | 2392 | 0.7438 | 0.8000 | 0.7438 | 0.8624 | | 0.0821 | 6.8205 | 2394 | 0.7375 | 0.8000 | 0.7375 | 0.8588 | | 0.0821 | 6.8262 | 2396 | 0.6914 | 0.8042 | 0.6914 | 0.8315 | | 0.0821 | 6.8319 | 2398 | 0.6067 | 0.8042 | 0.6067 | 0.7789 | | 0.0821 | 6.8376 | 2400 | 0.5213 | 0.7864 | 0.5213 | 0.7220 | | 0.0821 | 6.8433 | 2402 | 0.4461 | 0.7742 | 0.4461 | 0.6679 | | 0.0821 | 6.8490 | 2404 | 0.4241 | 0.7789 | 0.4241 | 0.6512 | | 0.0821 | 6.8547 | 2406 | 0.4445 | 0.7492 | 0.4445 | 0.6667 | | 0.0821 | 6.8604 | 2408 | 0.4890 | 0.7907 | 0.4890 | 0.6993 | | 0.0821 | 6.8661 | 2410 | 0.5498 | 0.8082 | 0.5498 | 0.7415 | | 0.0821 | 6.8718 | 2412 | 0.5783 | 0.8082 | 0.5783 | 0.7604 | | 0.0821 | 6.8775 | 2414 | 0.5973 | 0.8042 | 0.5973 | 0.7728 | | 0.0821 | 6.8832 | 2416 | 0.5952 | 0.8042 | 0.5952 | 0.7715 | | 0.0821 | 6.8889 | 2418 | 0.6025 | 0.8042 | 0.6025 | 0.7762 | | 0.0821 | 6.8946 | 2420 | 0.6130 | 0.8042 | 0.6130 | 0.7829 | | 0.0821 | 6.9003 | 2422 | 0.6080 | 0.8042 | 0.6080 | 0.7798 | | 0.0821 | 6.9060 | 2424 | 0.5599 | 0.8082 | 0.5599 | 0.7483 | | 0.0821 | 6.9117 | 2426 | 0.4853 | 0.7586 | 0.4853 | 0.6966 | | 0.0821 | 6.9174 | 2428 | 0.4441 | 0.7910 | 0.4441 | 0.6664 | | 0.0821 | 6.9231 | 2430 | 0.4318 | 0.7789 | 0.4318 | 0.6571 | | 0.0821 | 6.9288 | 2432 | 0.4306 | 0.7789 | 0.4306 | 0.6562 | | 0.0821 | 6.9345 | 2434 | 0.4623 | 0.7910 | 0.4623 | 0.6800 | | 0.0821 | 6.9402 | 2436 | 0.5232 | 0.8121 | 0.5232 | 0.7233 | | 0.0821 | 6.9459 | 2438 | 0.5487 | 0.8383 | 0.5487 | 0.7407 | | 0.0821 | 6.9516 | 2440 | 0.5359 | 0.8082 | 0.5359 | 0.7320 | | 0.0821 | 6.9573 | 2442 | 0.5048 | 0.8293 | 0.5048 | 0.7105 | | 0.0821 | 6.9630 | 2444 | 0.4987 | 0.7971 | 0.4987 | 0.7062 | | 0.0821 | 6.9687 | 2446 | 0.4888 | 0.8151 | 0.4888 | 0.6992 | | 0.0821 | 6.9744 | 2448 | 0.4629 | 0.7910 | 0.4629 | 0.6804 | | 0.0821 | 6.9801 | 2450 | 0.4238 | 0.7910 | 0.4238 | 0.6510 | | 0.0821 | 6.9858 | 2452 | 0.4196 | 0.7910 | 0.4196 | 0.6477 | | 0.0821 | 6.9915 | 2454 | 0.4375 | 0.7910 | 0.4375 | 0.6615 | | 0.0821 | 6.9972 | 2456 | 0.4808 | 0.7742 | 0.4808 | 0.6934 | | 0.0821 | 7.0028 | 2458 | 0.4896 | 0.7971 | 0.4896 | 0.6997 | | 0.0821 | 7.0085 | 2460 | 0.5021 | 0.8814 | 0.5021 | 
0.7086 | | 0.0821 | 7.0142 | 2462 | 0.5232 | 0.8814 | 0.5232 | 0.7233 | | 0.0821 | 7.0199 | 2464 | 0.5493 | 0.8627 | 0.5493 | 0.7411 | | 0.0821 | 7.0256 | 2466 | 0.5554 | 0.8304 | 0.5554 | 0.7453 | | 0.0821 | 7.0313 | 2468 | 0.5847 | 0.8042 | 0.5847 | 0.7647 | | 0.0821 | 7.0370 | 2470 | 0.6143 | 0.8042 | 0.6143 | 0.7838 | | 0.0821 | 7.0427 | 2472 | 0.6221 | 0.8042 | 0.6221 | 0.7887 | | 0.0821 | 7.0484 | 2474 | 0.6194 | 0.8042 | 0.6194 | 0.7870 | | 0.0821 | 7.0541 | 2476 | 0.6158 | 0.8042 | 0.6158 | 0.7847 | | 0.0821 | 7.0598 | 2478 | 0.6025 | 0.8042 | 0.6025 | 0.7762 | | 0.0821 | 7.0655 | 2480 | 0.5669 | 0.8489 | 0.5669 | 0.7529 | | 0.0821 | 7.0712 | 2482 | 0.5187 | 0.8 | 0.5187 | 0.7202 | | 0.0821 | 7.0769 | 2484 | 0.4633 | 0.7742 | 0.4633 | 0.6807 | | 0.0821 | 7.0826 | 2486 | 0.4253 | 0.7490 | 0.4253 | 0.6521 | | 0.0821 | 7.0883 | 2488 | 0.4136 | 0.7490 | 0.4136 | 0.6431 | | 0.0821 | 7.0940 | 2490 | 0.4328 | 0.7490 | 0.4328 | 0.6579 | | 0.0821 | 7.0997 | 2492 | 0.4784 | 0.7742 | 0.4784 | 0.6917 | | 0.0821 | 7.1054 | 2494 | 0.5400 | 0.8489 | 0.5400 | 0.7348 | | 0.0821 | 7.1111 | 2496 | 0.6026 | 0.8218 | 0.6026 | 0.7763 | | 0.0821 | 7.1168 | 2498 | 0.6658 | 0.8042 | 0.6658 | 0.8160 | | 0.0671 | 7.1225 | 2500 | 0.7069 | 0.7879 | 0.7069 | 0.8408 | | 0.0671 | 7.1282 | 2502 | 0.7307 | 0.7879 | 0.7307 | 0.8548 | | 0.0671 | 7.1339 | 2504 | 0.6949 | 0.7879 | 0.6949 | 0.8336 | | 0.0671 | 7.1396 | 2506 | 0.6386 | 0.8042 | 0.6386 | 0.7991 | | 0.0671 | 7.1453 | 2508 | 0.5803 | 0.8042 | 0.5803 | 0.7618 | | 0.0671 | 7.1510 | 2510 | 0.5088 | 0.8165 | 0.5088 | 0.7133 | | 0.0671 | 7.1567 | 2512 | 0.4642 | 0.8397 | 0.4642 | 0.6813 | | 0.0671 | 7.1624 | 2514 | 0.4445 | 0.8151 | 0.4445 | 0.6667 | | 0.0671 | 7.1681 | 2516 | 0.4225 | 0.8014 | 0.4225 | 0.6500 | | 0.0671 | 7.1738 | 2518 | 0.4286 | 0.8151 | 0.4286 | 0.6547 | | 0.0671 | 7.1795 | 2520 | 0.4697 | 0.8397 | 0.4697 | 0.6853 | | 0.0671 | 7.1852 | 2522 | 0.5458 | 0.7879 | 0.5458 | 0.7388 | | 0.0671 | 7.1909 | 2524 | 0.5951 | 0.8218 | 0.5951 | 0.7714 | | 0.0671 | 7.1966 | 2526 | 0.6202 | 0.8218 | 0.6202 | 0.7875 | | 0.0671 | 7.2023 | 2528 | 0.6697 | 0.8178 | 0.6697 | 0.8183 | | 0.0671 | 7.2080 | 2530 | 0.6900 | 0.8000 | 0.6900 | 0.8307 | | 0.0671 | 7.2137 | 2532 | 0.6665 | 0.7879 | 0.6665 | 0.8164 | | 0.0671 | 7.2194 | 2534 | 0.6038 | 0.8042 | 0.6038 | 0.7770 | | 0.0671 | 7.2251 | 2536 | 0.5437 | 0.8218 | 0.5437 | 0.7373 | | 0.0671 | 7.2308 | 2538 | 0.5115 | 0.8562 | 0.5115 | 0.7152 | | 0.0671 | 7.2365 | 2540 | 0.4923 | 0.8205 | 0.4923 | 0.7017 | | 0.0671 | 7.2422 | 2542 | 0.4929 | 0.8205 | 0.4929 | 0.7020 | | 0.0671 | 7.2479 | 2544 | 0.4605 | 0.8205 | 0.4605 | 0.6786 | | 0.0671 | 7.2536 | 2546 | 0.4369 | 0.8244 | 0.4369 | 0.6610 | | 0.0671 | 7.2593 | 2548 | 0.4520 | 0.8069 | 0.4520 | 0.6723 | | 0.0671 | 7.2650 | 2550 | 0.4595 | 0.8205 | 0.4595 | 0.6779 | | 0.0671 | 7.2707 | 2552 | 0.4363 | 0.8069 | 0.4363 | 0.6605 | | 0.0671 | 7.2764 | 2554 | 0.4261 | 0.8069 | 0.4261 | 0.6527 | | 0.0671 | 7.2821 | 2556 | 0.4523 | 0.8521 | 0.4523 | 0.6725 | | 0.0671 | 7.2877 | 2558 | 0.4893 | 0.8521 | 0.4893 | 0.6995 | | 0.0671 | 7.2934 | 2560 | 0.5306 | 0.8562 | 0.5306 | 0.7284 | | 0.0671 | 7.2991 | 2562 | 0.5729 | 0.7879 | 0.5729 | 0.7569 | | 0.0671 | 7.3048 | 2564 | 0.5917 | 0.7879 | 0.5917 | 0.7692 | | 0.0671 | 7.3105 | 2566 | 0.5686 | 0.8217 | 0.5686 | 0.7541 | | 0.0671 | 7.3162 | 2568 | 0.5168 | 0.8256 | 0.5168 | 0.7189 | | 0.0671 | 7.3219 | 2570 | 0.4700 | 0.8521 | 0.4700 | 0.6855 | | 0.0671 | 7.3276 | 2572 | 0.4553 | 0.8205 | 0.4553 | 0.6747 | | 0.0671 | 7.3333 | 
2574 | 0.4558 | 0.8205 | 0.4558 | 0.6751 | | 0.0671 | 7.3390 | 2576 | 0.4752 | 0.7926 | 0.4752 | 0.6893 | | 0.0671 | 7.3447 | 2578 | 0.4849 | 0.7926 | 0.4849 | 0.6964 | | 0.0671 | 7.3504 | 2580 | 0.4945 | 0.8256 | 0.4945 | 0.7032 | | 0.0671 | 7.3561 | 2582 | 0.5074 | 0.7879 | 0.5074 | 0.7123 | | 0.0671 | 7.3618 | 2584 | 0.5467 | 0.7879 | 0.5467 | 0.7394 | | 0.0671 | 7.3675 | 2586 | 0.5986 | 0.8042 | 0.5986 | 0.7737 | | 0.0671 | 7.3732 | 2588 | 0.6169 | 0.8042 | 0.6169 | 0.7854 | | 0.0671 | 7.3789 | 2590 | 0.5998 | 0.8042 | 0.5998 | 0.7744 | | 0.0671 | 7.3846 | 2592 | 0.6162 | 0.8042 | 0.6162 | 0.7850 | | 0.0671 | 7.3903 | 2594 | 0.6197 | 0.8042 | 0.6197 | 0.7872 | | 0.0671 | 7.3960 | 2596 | 0.5785 | 0.8218 | 0.5785 | 0.7606 | | 0.0671 | 7.4017 | 2598 | 0.5492 | 0.8218 | 0.5492 | 0.7411 | | 0.0671 | 7.4074 | 2600 | 0.5088 | 0.8256 | 0.5088 | 0.7133 | | 0.0671 | 7.4131 | 2602 | 0.4930 | 0.7926 | 0.4930 | 0.7021 | | 0.0671 | 7.4188 | 2604 | 0.4733 | 0.7926 | 0.4733 | 0.6880 | | 0.0671 | 7.4245 | 2606 | 0.4559 | 0.7926 | 0.4559 | 0.6752 | | 0.0671 | 7.4302 | 2608 | 0.4604 | 0.7926 | 0.4604 | 0.6785 | | 0.0671 | 7.4359 | 2610 | 0.4885 | 0.7926 | 0.4885 | 0.6989 | | 0.0671 | 7.4416 | 2612 | 0.5392 | 0.8562 | 0.5392 | 0.7343 | | 0.0671 | 7.4473 | 2614 | 0.5606 | 0.8562 | 0.5606 | 0.7487 | | 0.0671 | 7.4530 | 2616 | 0.5870 | 0.8562 | 0.5870 | 0.7662 | | 0.0671 | 7.4587 | 2618 | 0.6051 | 0.8383 | 0.6051 | 0.7779 | | 0.0671 | 7.4644 | 2620 | 0.6412 | 0.8042 | 0.6412 | 0.8007 | | 0.0671 | 7.4701 | 2622 | 0.6622 | 0.8042 | 0.6622 | 0.8138 | | 0.0671 | 7.4758 | 2624 | 0.6385 | 0.8042 | 0.6385 | 0.7990 | | 0.0671 | 7.4815 | 2626 | 0.5805 | 0.8562 | 0.5805 | 0.7619 | | 0.0671 | 7.4872 | 2628 | 0.5302 | 0.8256 | 0.5302 | 0.7282 | | 0.0671 | 7.4929 | 2630 | 0.4982 | 0.7926 | 0.4982 | 0.7058 | | 0.0671 | 7.4986 | 2632 | 0.4761 | 0.7926 | 0.4761 | 0.6900 | | 0.0671 | 7.5043 | 2634 | 0.4682 | 0.7926 | 0.4682 | 0.6843 | | 0.0671 | 7.5100 | 2636 | 0.4712 | 0.7926 | 0.4712 | 0.6864 | | 0.0671 | 7.5157 | 2638 | 0.4959 | 0.7926 | 0.4959 | 0.7042 | | 0.0671 | 7.5214 | 2640 | 0.5443 | 0.8562 | 0.5443 | 0.7377 | | 0.0671 | 7.5271 | 2642 | 0.5919 | 0.8383 | 0.5919 | 0.7694 | | 0.0671 | 7.5328 | 2644 | 0.5901 | 0.8383 | 0.5901 | 0.7682 | | 0.0671 | 7.5385 | 2646 | 0.5689 | 0.8383 | 0.5689 | 0.7543 | | 0.0671 | 7.5442 | 2648 | 0.5528 | 0.8562 | 0.5528 | 0.7435 | | 0.0671 | 7.5499 | 2650 | 0.5137 | 0.8562 | 0.5137 | 0.7167 | | 0.0671 | 7.5556 | 2652 | 0.4901 | 0.8256 | 0.4901 | 0.7000 | | 0.0671 | 7.5613 | 2654 | 0.4766 | 0.7926 | 0.4766 | 0.6903 | | 0.0671 | 7.5670 | 2656 | 0.4785 | 0.8121 | 0.4785 | 0.6918 | | 0.0671 | 7.5726 | 2658 | 0.5160 | 0.8562 | 0.5160 | 0.7183 | | 0.0671 | 7.5783 | 2660 | 0.5816 | 0.8383 | 0.5816 | 0.7627 | | 0.0671 | 7.5840 | 2662 | 0.6475 | 0.8042 | 0.6475 | 0.8047 | | 0.0671 | 7.5897 | 2664 | 0.6861 | 0.7879 | 0.6861 | 0.8283 | | 0.0671 | 7.5954 | 2666 | 0.7035 | 0.8000 | 0.7035 | 0.8388 | | 0.0671 | 7.6011 | 2668 | 0.6774 | 0.8178 | 0.6774 | 0.8231 | | 0.0671 | 7.6068 | 2670 | 0.6359 | 0.8178 | 0.6359 | 0.7974 | | 0.0671 | 7.6125 | 2672 | 0.5812 | 0.7627 | 0.5812 | 0.7623 | | 0.0671 | 7.6182 | 2674 | 0.5080 | 0.7686 | 0.5080 | 0.7127 | | 0.0671 | 7.6239 | 2676 | 0.4452 | 0.8108 | 0.4452 | 0.6672 | | 0.0671 | 7.6296 | 2678 | 0.4145 | 0.8151 | 0.4145 | 0.6439 | | 0.0671 | 7.6353 | 2680 | 0.4173 | 0.8151 | 0.4173 | 0.6460 | | 0.0671 | 7.6410 | 2682 | 0.4385 | 0.8151 | 0.4385 | 0.6622 | | 0.0671 | 7.6467 | 2684 | 0.4774 | 0.7926 | 0.4774 | 0.6909 | | 0.0671 | 7.6524 | 2686 | 0.5308 | 0.7510 | 
0.5308 | 0.7285 | | 0.0671 | 7.6581 | 2688 | 0.5901 | 0.8042 | 0.5901 | 0.7682 | | 0.0671 | 7.6638 | 2690 | 0.6409 | 0.8042 | 0.6409 | 0.8006 | | 0.0671 | 7.6695 | 2692 | 0.6650 | 0.8042 | 0.6650 | 0.8155 | | 0.0671 | 7.6752 | 2694 | 0.6469 | 0.8042 | 0.6469 | 0.8043 | | 0.0671 | 7.6809 | 2696 | 0.5961 | 0.8042 | 0.5961 | 0.7721 | | 0.0671 | 7.6866 | 2698 | 0.5255 | 0.8082 | 0.5255 | 0.7249 | | 0.0671 | 7.6923 | 2700 | 0.4685 | 0.7697 | 0.4685 | 0.6845 | | 0.0671 | 7.6980 | 2702 | 0.4580 | 0.7697 | 0.4580 | 0.6767 | | 0.0671 | 7.7037 | 2704 | 0.4790 | 0.7697 | 0.4790 | 0.6921 | | 0.0671 | 7.7094 | 2706 | 0.5188 | 0.8082 | 0.5188 | 0.7203 | | 0.0671 | 7.7151 | 2708 | 0.5760 | 0.8383 | 0.5760 | 0.7590 | | 0.0671 | 7.7208 | 2710 | 0.6267 | 0.8042 | 0.6267 | 0.7917 | | 0.0671 | 7.7265 | 2712 | 0.6309 | 0.8042 | 0.6309 | 0.7943 | | 0.0671 | 7.7322 | 2714 | 0.6106 | 0.8042 | 0.6106 | 0.7814 | | 0.0671 | 7.7379 | 2716 | 0.6006 | 0.8042 | 0.6006 | 0.7750 | | 0.0671 | 7.7436 | 2718 | 0.5686 | 0.8383 | 0.5686 | 0.7541 | | 0.0671 | 7.7493 | 2720 | 0.5588 | 0.8383 | 0.5588 | 0.7475 | | 0.0671 | 7.7550 | 2722 | 0.5771 | 0.8383 | 0.5771 | 0.7597 | | 0.0671 | 7.7607 | 2724 | 0.6268 | 0.8383 | 0.6268 | 0.7917 | | 0.0671 | 7.7664 | 2726 | 0.6967 | 0.8042 | 0.6967 | 0.8347 | | 0.0671 | 7.7721 | 2728 | 0.7393 | 0.7619 | 0.7393 | 0.8598 | | 0.0671 | 7.7778 | 2730 | 0.7487 | 0.7619 | 0.7487 | 0.8653 | | 0.0671 | 7.7835 | 2732 | 0.7160 | 0.7879 | 0.7160 | 0.8462 | | 0.0671 | 7.7892 | 2734 | 0.6513 | 0.8042 | 0.6513 | 0.8070 | | 0.0671 | 7.7949 | 2736 | 0.5828 | 0.8383 | 0.5828 | 0.7634 | | 0.0671 | 7.8006 | 2738 | 0.5200 | 0.7758 | 0.5200 | 0.7211 | | 0.0671 | 7.8063 | 2740 | 0.4618 | 0.7926 | 0.4618 | 0.6796 | | 0.0671 | 7.8120 | 2742 | 0.4351 | 0.8108 | 0.4351 | 0.6596 | | 0.0671 | 7.8177 | 2744 | 0.4427 | 0.8108 | 0.4427 | 0.6653 | | 0.0671 | 7.8234 | 2746 | 0.4738 | 0.7926 | 0.4738 | 0.6884 | | 0.0671 | 7.8291 | 2748 | 0.5166 | 0.7758 | 0.5166 | 0.7188 | | 0.0671 | 7.8348 | 2750 | 0.5420 | 0.7758 | 0.5420 | 0.7362 | | 0.0671 | 7.8405 | 2752 | 0.5385 | 0.7758 | 0.5385 | 0.7339 | | 0.0671 | 7.8462 | 2754 | 0.5143 | 0.7758 | 0.5143 | 0.7172 | | 0.0671 | 7.8519 | 2756 | 0.5038 | 0.7758 | 0.5038 | 0.7098 | | 0.0671 | 7.8575 | 2758 | 0.4902 | 0.7758 | 0.4902 | 0.7002 | | 0.0671 | 7.8632 | 2760 | 0.4792 | 0.7758 | 0.4792 | 0.6923 | | 0.0671 | 7.8689 | 2762 | 0.4871 | 0.7758 | 0.4871 | 0.6979 | | 0.0671 | 7.8746 | 2764 | 0.5030 | 0.8082 | 0.5030 | 0.7093 | | 0.0671 | 7.8803 | 2766 | 0.5271 | 0.8383 | 0.5271 | 0.7260 | | 0.0671 | 7.8860 | 2768 | 0.5588 | 0.8383 | 0.5588 | 0.7475 | | 0.0671 | 7.8917 | 2770 | 0.6028 | 0.8042 | 0.6028 | 0.7764 | | 0.0671 | 7.8974 | 2772 | 0.6379 | 0.8042 | 0.6379 | 0.7987 | | 0.0671 | 7.9031 | 2774 | 0.6368 | 0.8042 | 0.6368 | 0.7980 | | 0.0671 | 7.9088 | 2776 | 0.5960 | 0.8042 | 0.5960 | 0.7720 | | 0.0671 | 7.9145 | 2778 | 0.5605 | 0.8042 | 0.5605 | 0.7486 | | 0.0671 | 7.9202 | 2780 | 0.5311 | 0.8042 | 0.5311 | 0.7288 | | 0.0671 | 7.9259 | 2782 | 0.4921 | 0.8082 | 0.4921 | 0.7015 | | 0.0671 | 7.9316 | 2784 | 0.4730 | 0.8521 | 0.4730 | 0.6878 | | 0.0671 | 7.9373 | 2786 | 0.4597 | 0.8293 | 0.4597 | 0.6780 | | 0.0671 | 7.9430 | 2788 | 0.4630 | 0.8293 | 0.4630 | 0.6805 | | 0.0671 | 7.9487 | 2790 | 0.4868 | 0.8383 | 0.4868 | 0.6977 | | 0.0671 | 7.9544 | 2792 | 0.4995 | 0.8383 | 0.4995 | 0.7068 | | 0.0671 | 7.9601 | 2794 | 0.5075 | 0.8383 | 0.5075 | 0.7124 | | 0.0671 | 7.9658 | 2796 | 0.5323 | 0.8383 | 0.5323 | 0.7296 | | 0.0671 | 7.9715 | 2798 | 0.5656 | 0.8383 | 0.5656 | 0.7521 | | 0.0671 
| 7.9772 | 2800 | 0.5873 | 0.8383 | 0.5873 | 0.7664 | | 0.0671 | 7.9829 | 2802 | 0.6028 | 0.8383 | 0.6028 | 0.7764 | | 0.0671 | 7.9886 | 2804 | 0.5896 | 0.8383 | 0.5896 | 0.7679 | | 0.0671 | 7.9943 | 2806 | 0.5817 | 0.8383 | 0.5817 | 0.7627 | | 0.0671 | 8.0 | 2808 | 0.5749 | 0.8383 | 0.5749 | 0.7582 | | 0.0671 | 8.0057 | 2810 | 0.5730 | 0.8383 | 0.5730 | 0.7570 | | 0.0671 | 8.0114 | 2812 | 0.5660 | 0.8383 | 0.5660 | 0.7523 | | 0.0671 | 8.0171 | 2814 | 0.5528 | 0.8383 | 0.5528 | 0.7435 | | 0.0671 | 8.0228 | 2816 | 0.5585 | 0.8383 | 0.5585 | 0.7473 | | 0.0671 | 8.0285 | 2818 | 0.5548 | 0.8383 | 0.5548 | 0.7448 | | 0.0671 | 8.0342 | 2820 | 0.5489 | 0.8383 | 0.5489 | 0.7409 | | 0.0671 | 8.0399 | 2822 | 0.5218 | 0.8383 | 0.5218 | 0.7223 | | 0.0671 | 8.0456 | 2824 | 0.4788 | 0.8121 | 0.4788 | 0.6920 | | 0.0671 | 8.0513 | 2826 | 0.4443 | 0.8158 | 0.4443 | 0.6666 | | 0.0671 | 8.0570 | 2828 | 0.4350 | 0.8328 | 0.4350 | 0.6595 | | 0.0671 | 8.0627 | 2830 | 0.4318 | 0.8328 | 0.4318 | 0.6571 | | 0.0671 | 8.0684 | 2832 | 0.4327 | 0.8328 | 0.4327 | 0.6578 | | 0.0671 | 8.0741 | 2834 | 0.4251 | 0.8328 | 0.4251 | 0.6520 | | 0.0671 | 8.0798 | 2836 | 0.4330 | 0.8478 | 0.4330 | 0.6580 | | 0.0671 | 8.0855 | 2838 | 0.4586 | 0.8293 | 0.4586 | 0.6772 | | 0.0671 | 8.0912 | 2840 | 0.4866 | 0.8256 | 0.4866 | 0.6975 | | 0.0671 | 8.0969 | 2842 | 0.4984 | 0.8082 | 0.4984 | 0.7060 | | 0.0671 | 8.1026 | 2844 | 0.5108 | 0.8082 | 0.5108 | 0.7147 | | 0.0671 | 8.1083 | 2846 | 0.5147 | 0.8082 | 0.5147 | 0.7175 | | 0.0671 | 8.1140 | 2848 | 0.5069 | 0.8256 | 0.5069 | 0.7120 | | 0.0671 | 8.1197 | 2850 | 0.4933 | 0.8444 | 0.4933 | 0.7024 | | 0.0671 | 8.1254 | 2852 | 0.4708 | 0.8444 | 0.4708 | 0.6862 | | 0.0671 | 8.1311 | 2854 | 0.4473 | 0.8108 | 0.4473 | 0.6688 | | 0.0671 | 8.1368 | 2856 | 0.4230 | 0.8151 | 0.4230 | 0.6504 | | 0.0671 | 8.1425 | 2858 | 0.4169 | 0.8151 | 0.4169 | 0.6457 | | 0.0671 | 8.1481 | 2860 | 0.4279 | 0.8151 | 0.4279 | 0.6541 | | 0.0671 | 8.1538 | 2862 | 0.4487 | 0.8108 | 0.4487 | 0.6698 | | 0.0671 | 8.1595 | 2864 | 0.4559 | 0.8108 | 0.4559 | 0.6752 | | 0.0671 | 8.1652 | 2866 | 0.4839 | 0.8444 | 0.4839 | 0.6957 | | 0.0671 | 8.1709 | 2868 | 0.5128 | 0.8256 | 0.5128 | 0.7161 | | 0.0671 | 8.1766 | 2870 | 0.5505 | 0.7709 | 0.5505 | 0.7420 | | 0.0671 | 8.1823 | 2872 | 0.5714 | 0.7709 | 0.5714 | 0.7559 | | 0.0671 | 8.1880 | 2874 | 0.5895 | 0.8042 | 0.5895 | 0.7678 | | 0.0671 | 8.1937 | 2876 | 0.6125 | 0.8042 | 0.6125 | 0.7826 | | 0.0671 | 8.1994 | 2878 | 0.6359 | 0.8042 | 0.6359 | 0.7974 | | 0.0671 | 8.2051 | 2880 | 0.6265 | 0.8042 | 0.6265 | 0.7915 | | 0.0671 | 8.2108 | 2882 | 0.6088 | 0.8042 | 0.6088 | 0.7803 | | 0.0671 | 8.2165 | 2884 | 0.5672 | 0.8383 | 0.5672 | 0.7531 | | 0.0671 | 8.2222 | 2886 | 0.5263 | 0.8082 | 0.5263 | 0.7255 | | 0.0671 | 8.2279 | 2888 | 0.4708 | 0.8444 | 0.4708 | 0.6861 | | 0.0671 | 8.2336 | 2890 | 0.4089 | 0.8581 | 0.4089 | 0.6395 | | 0.0671 | 8.2393 | 2892 | 0.3760 | 0.8146 | 0.3760 | 0.6132 | | 0.0671 | 8.2450 | 2894 | 0.3652 | 0.8146 | 0.3652 | 0.6043 | | 0.0671 | 8.2507 | 2896 | 0.3705 | 0.8146 | 0.3705 | 0.6087 | | 0.0671 | 8.2564 | 2898 | 0.3903 | 0.8146 | 0.3903 | 0.6247 | | 0.0671 | 8.2621 | 2900 | 0.4206 | 0.8860 | 0.4206 | 0.6485 | | 0.0671 | 8.2678 | 2902 | 0.4654 | 0.8444 | 0.4654 | 0.6822 | | 0.0671 | 8.2735 | 2904 | 0.5251 | 0.8082 | 0.5251 | 0.7246 | | 0.0671 | 8.2792 | 2906 | 0.5762 | 0.8042 | 0.5762 | 0.7591 | | 0.0671 | 8.2849 | 2908 | 0.6059 | 0.8042 | 0.6059 | 0.7784 | | 0.0671 | 8.2906 | 2910 | 0.6200 | 0.8042 | 0.6200 | 0.7874 | | 0.0671 | 8.2963 | 2912 | 0.6123 | 
0.8042 | 0.6123 | 0.7825 | | 0.0671 | 8.3020 | 2914 | 0.5879 | 0.8042 | 0.5879 | 0.7667 | | 0.0671 | 8.3077 | 2916 | 0.5499 | 0.8082 | 0.5499 | 0.7416 | | 0.0671 | 8.3134 | 2918 | 0.5050 | 0.8256 | 0.5050 | 0.7106 | | 0.0671 | 8.3191 | 2920 | 0.4698 | 0.7926 | 0.4698 | 0.6854 | | 0.0671 | 8.3248 | 2922 | 0.4648 | 0.7926 | 0.4648 | 0.6818 | | 0.0671 | 8.3305 | 2924 | 0.4811 | 0.7926 | 0.4811 | 0.6936 | | 0.0671 | 8.3362 | 2926 | 0.4936 | 0.8256 | 0.4936 | 0.7026 | | 0.0671 | 8.3419 | 2928 | 0.5039 | 0.8256 | 0.5039 | 0.7099 | | 0.0671 | 8.3476 | 2930 | 0.5160 | 0.7709 | 0.5160 | 0.7183 | | 0.0671 | 8.3533 | 2932 | 0.5333 | 0.7709 | 0.5333 | 0.7303 | | 0.0671 | 8.3590 | 2934 | 0.5570 | 0.8042 | 0.5570 | 0.7463 | | 0.0671 | 8.3647 | 2936 | 0.5600 | 0.8042 | 0.5600 | 0.7484 | | 0.0671 | 8.3704 | 2938 | 0.5584 | 0.8042 | 0.5584 | 0.7473 | | 0.0671 | 8.3761 | 2940 | 0.5421 | 0.8042 | 0.5421 | 0.7363 | | 0.0671 | 8.3818 | 2942 | 0.5405 | 0.8042 | 0.5405 | 0.7352 | | 0.0671 | 8.3875 | 2944 | 0.5330 | 0.8042 | 0.5330 | 0.7301 | | 0.0671 | 8.3932 | 2946 | 0.5437 | 0.8042 | 0.5437 | 0.7373 | | 0.0671 | 8.3989 | 2948 | 0.5714 | 0.8042 | 0.5714 | 0.7559 | | 0.0671 | 8.4046 | 2950 | 0.5867 | 0.8042 | 0.5867 | 0.7660 | | 0.0671 | 8.4103 | 2952 | 0.5907 | 0.8042 | 0.5907 | 0.7686 | | 0.0671 | 8.4160 | 2954 | 0.6103 | 0.8042 | 0.6103 | 0.7812 | | 0.0671 | 8.4217 | 2956 | 0.6117 | 0.8042 | 0.6117 | 0.7821 | | 0.0671 | 8.4274 | 2958 | 0.5894 | 0.8042 | 0.5894 | 0.7677 | | 0.0671 | 8.4330 | 2960 | 0.5714 | 0.8383 | 0.5714 | 0.7559 | | 0.0671 | 8.4387 | 2962 | 0.5458 | 0.8383 | 0.5458 | 0.7388 | | 0.0671 | 8.4444 | 2964 | 0.5382 | 0.8383 | 0.5382 | 0.7336 | | 0.0671 | 8.4501 | 2966 | 0.5126 | 0.8082 | 0.5126 | 0.7160 | | 0.0671 | 8.4558 | 2968 | 0.4975 | 0.7758 | 0.4975 | 0.7053 | | 0.0671 | 8.4615 | 2970 | 0.4918 | 0.7926 | 0.4918 | 0.7013 | | 0.0671 | 8.4672 | 2972 | 0.4981 | 0.7926 | 0.4981 | 0.7057 | | 0.0671 | 8.4729 | 2974 | 0.5179 | 0.8082 | 0.5179 | 0.7196 | | 0.0671 | 8.4786 | 2976 | 0.5490 | 0.8042 | 0.5490 | 0.7409 | | 0.0671 | 8.4843 | 2978 | 0.5744 | 0.8042 | 0.5744 | 0.7579 | | 0.0671 | 8.4900 | 2980 | 0.6091 | 0.8042 | 0.6091 | 0.7804 | | 0.0671 | 8.4957 | 2982 | 0.6467 | 0.8042 | 0.6467 | 0.8042 | | 0.0671 | 8.5014 | 2984 | 0.6554 | 0.8042 | 0.6554 | 0.8095 | | 0.0671 | 8.5071 | 2986 | 0.6537 | 0.8042 | 0.6537 | 0.8085 | | 0.0671 | 8.5128 | 2988 | 0.6366 | 0.8042 | 0.6366 | 0.7979 | | 0.0671 | 8.5185 | 2990 | 0.6067 | 0.8042 | 0.6067 | 0.7789 | | 0.0671 | 8.5242 | 2992 | 0.5656 | 0.8042 | 0.5656 | 0.7520 | | 0.0671 | 8.5299 | 2994 | 0.5139 | 0.7812 | 0.5139 | 0.7168 | | 0.0671 | 8.5356 | 2996 | 0.4721 | 0.8205 | 0.4721 | 0.6871 | | 0.0671 | 8.5413 | 2998 | 0.4580 | 0.8205 | 0.4580 | 0.6767 | | 0.0571 | 8.5470 | 3000 | 0.4459 | 0.8205 | 0.4459 | 0.6678 | | 0.0571 | 8.5527 | 3002 | 0.4471 | 0.8205 | 0.4471 | 0.6687 | | 0.0571 | 8.5584 | 3004 | 0.4598 | 0.8205 | 0.4598 | 0.6781 | | 0.0571 | 8.5641 | 3006 | 0.4827 | 0.8205 | 0.4827 | 0.6948 | | 0.0571 | 8.5698 | 3008 | 0.4959 | 0.8205 | 0.4959 | 0.7042 | | 0.0571 | 8.5755 | 3010 | 0.5051 | 0.8521 | 0.5051 | 0.7107 | | 0.0571 | 8.5812 | 3012 | 0.5134 | 0.8521 | 0.5134 | 0.7165 | | 0.0571 | 8.5869 | 3014 | 0.5203 | 0.8304 | 0.5203 | 0.7213 | | 0.0571 | 8.5926 | 3016 | 0.5142 | 0.8814 | 0.5142 | 0.7171 | | 0.0571 | 8.5983 | 3018 | 0.5178 | 0.8304 | 0.5178 | 0.7196 | | 0.0571 | 8.6040 | 3020 | 0.5194 | 0.8304 | 0.5194 | 0.7207 | | 0.0571 | 8.6097 | 3022 | 0.5067 | 0.8814 | 0.5067 | 0.7118 | | 0.0571 | 8.6154 | 3024 | 0.5088 | 0.8814 | 0.5088 | 0.7133 | 
| 0.0571 | 8.6211 | 3026 | 0.5274 | 0.8042 | 0.5274 | 0.7263 | | 0.0571 | 8.6268 | 3028 | 0.5604 | 0.8042 | 0.5604 | 0.7486 | | 0.0571 | 8.6325 | 3030 | 0.5828 | 0.8042 | 0.5828 | 0.7634 | | 0.0571 | 8.6382 | 3032 | 0.5913 | 0.8042 | 0.5913 | 0.7690 | | 0.0571 | 8.6439 | 3034 | 0.5919 | 0.8042 | 0.5919 | 0.7694 | | 0.0571 | 8.6496 | 3036 | 0.5925 | 0.8042 | 0.5925 | 0.7698 | | 0.0571 | 8.6553 | 3038 | 0.5734 | 0.8042 | 0.5734 | 0.7572 | | 0.0571 | 8.6610 | 3040 | 0.5691 | 0.8042 | 0.5691 | 0.7544 | | 0.0571 | 8.6667 | 3042 | 0.5618 | 0.8042 | 0.5618 | 0.7495 | | 0.0571 | 8.6724 | 3044 | 0.5450 | 0.8042 | 0.5450 | 0.7382 | | 0.0571 | 8.6781 | 3046 | 0.5221 | 0.8042 | 0.5221 | 0.7226 | | 0.0571 | 8.6838 | 3048 | 0.5165 | 0.7709 | 0.5165 | 0.7187 | | 0.0571 | 8.6895 | 3050 | 0.5181 | 0.8042 | 0.5181 | 0.7198 | | 0.0571 | 8.6952 | 3052 | 0.5255 | 0.8042 | 0.5255 | 0.7249 | | 0.0571 | 8.7009 | 3054 | 0.5202 | 0.8042 | 0.5202 | 0.7212 | | 0.0571 | 8.7066 | 3056 | 0.5243 | 0.8042 | 0.5243 | 0.7241 | | 0.0571 | 8.7123 | 3058 | 0.5400 | 0.8042 | 0.5400 | 0.7348 | | 0.0571 | 8.7179 | 3060 | 0.5409 | 0.8042 | 0.5409 | 0.7354 | | 0.0571 | 8.7236 | 3062 | 0.5455 | 0.8042 | 0.5455 | 0.7386 | | 0.0571 | 8.7293 | 3064 | 0.5356 | 0.8042 | 0.5356 | 0.7319 | | 0.0571 | 8.7350 | 3066 | 0.5111 | 0.7986 | 0.5111 | 0.7149 | | 0.0571 | 8.7407 | 3068 | 0.4786 | 0.8359 | 0.4786 | 0.6918 | | 0.0571 | 8.7464 | 3070 | 0.4562 | 0.8397 | 0.4562 | 0.6754 | | 0.0571 | 8.7521 | 3072 | 0.4468 | 0.8397 | 0.4468 | 0.6684 | | 0.0571 | 8.7578 | 3074 | 0.4509 | 0.8397 | 0.4509 | 0.6715 | | 0.0571 | 8.7635 | 3076 | 0.4712 | 0.8718 | 0.4712 | 0.6865 | | 0.0571 | 8.7692 | 3078 | 0.4972 | 0.8627 | 0.4972 | 0.7051 | | 0.0571 | 8.7749 | 3080 | 0.5361 | 0.8042 | 0.5361 | 0.7322 | | 0.0571 | 8.7806 | 3082 | 0.5789 | 0.8042 | 0.5789 | 0.7608 | | 0.0571 | 8.7863 | 3084 | 0.6080 | 0.8042 | 0.6080 | 0.7797 | | 0.0571 | 8.7920 | 3086 | 0.6359 | 0.8042 | 0.6359 | 0.7974 | | 0.0571 | 8.7977 | 3088 | 0.6370 | 0.8042 | 0.6370 | 0.7981 | | 0.0571 | 8.8034 | 3090 | 0.6187 | 0.8042 | 0.6187 | 0.7866 | | 0.0571 | 8.8091 | 3092 | 0.5882 | 0.8042 | 0.5882 | 0.7669 | | 0.0571 | 8.8148 | 3094 | 0.5557 | 0.8042 | 0.5557 | 0.7455 | | 0.0571 | 8.8205 | 3096 | 0.5341 | 0.7709 | 0.5341 | 0.7308 | | 0.0571 | 8.8262 | 3098 | 0.5216 | 0.7879 | 0.5216 | 0.7222 | | 0.0571 | 8.8319 | 3100 | 0.5255 | 0.7879 | 0.5255 | 0.7249 | | 0.0571 | 8.8376 | 3102 | 0.5290 | 0.7879 | 0.5290 | 0.7273 | | 0.0571 | 8.8433 | 3104 | 0.5213 | 0.7510 | 0.5213 | 0.7220 | | 0.0571 | 8.8490 | 3106 | 0.5176 | 0.7510 | 0.5176 | 0.7194 | | 0.0571 | 8.8547 | 3108 | 0.5189 | 0.7510 | 0.5189 | 0.7203 | | 0.0571 | 8.8604 | 3110 | 0.5180 | 0.7879 | 0.5180 | 0.7197 | | 0.0571 | 8.8661 | 3112 | 0.5148 | 0.7510 | 0.5148 | 0.7175 | | 0.0571 | 8.8718 | 3114 | 0.4966 | 0.7926 | 0.4966 | 0.7047 | | 0.0571 | 8.8775 | 3116 | 0.4816 | 0.8108 | 0.4816 | 0.6939 | | 0.0571 | 8.8832 | 3118 | 0.4790 | 0.8108 | 0.4790 | 0.6921 | | 0.0571 | 8.8889 | 3120 | 0.4900 | 0.7926 | 0.4900 | 0.7000 | | 0.0571 | 8.8946 | 3122 | 0.5131 | 0.8082 | 0.5131 | 0.7163 | | 0.0571 | 8.9003 | 3124 | 0.5395 | 0.8383 | 0.5395 | 0.7345 | | 0.0571 | 8.9060 | 3126 | 0.5492 | 0.8383 | 0.5492 | 0.7411 | | 0.0571 | 8.9117 | 3128 | 0.5569 | 0.8042 | 0.5569 | 0.7462 | | 0.0571 | 8.9174 | 3130 | 0.5759 | 0.8042 | 0.5759 | 0.7589 | | 0.0571 | 8.9231 | 3132 | 0.5791 | 0.8042 | 0.5791 | 0.7610 | | 0.0571 | 8.9288 | 3134 | 0.5638 | 0.8042 | 0.5638 | 0.7509 | | 0.0571 | 8.9345 | 3136 | 0.5427 | 0.8042 | 0.5427 | 0.7367 | | 0.0571 | 8.9402 | 3138 | 
0.5290 | 0.8042 | 0.5290 | 0.7273 | | 0.0571 | 8.9459 | 3140 | 0.5252 | 0.8042 | 0.5252 | 0.7247 | | 0.0571 | 8.9516 | 3142 | 0.5341 | 0.8042 | 0.5341 | 0.7308 | | 0.0571 | 8.9573 | 3144 | 0.5447 | 0.8042 | 0.5447 | 0.7380 | | 0.0571 | 8.9630 | 3146 | 0.5473 | 0.8042 | 0.5473 | 0.7398 | | 0.0571 | 8.9687 | 3148 | 0.5439 | 0.8042 | 0.5439 | 0.7375 | | 0.0571 | 8.9744 | 3150 | 0.5317 | 0.8042 | 0.5317 | 0.7291 | | 0.0571 | 8.9801 | 3152 | 0.5319 | 0.8042 | 0.5319 | 0.7293 | | 0.0571 | 8.9858 | 3154 | 0.5328 | 0.8042 | 0.5328 | 0.7299 | | 0.0571 | 8.9915 | 3156 | 0.5347 | 0.8042 | 0.5347 | 0.7312 | | 0.0571 | 8.9972 | 3158 | 0.5344 | 0.8042 | 0.5344 | 0.7310 | | 0.0571 | 9.0028 | 3160 | 0.5274 | 0.8042 | 0.5274 | 0.7262 | | 0.0571 | 9.0085 | 3162 | 0.5187 | 0.8042 | 0.5187 | 0.7202 | | 0.0571 | 9.0142 | 3164 | 0.5104 | 0.8383 | 0.5104 | 0.7144 | | 0.0571 | 9.0199 | 3166 | 0.5049 | 0.8082 | 0.5049 | 0.7106 | | 0.0571 | 9.0256 | 3168 | 0.5026 | 0.8082 | 0.5026 | 0.7090 | | 0.0571 | 9.0313 | 3170 | 0.5005 | 0.8256 | 0.5005 | 0.7074 | | 0.0571 | 9.0370 | 3172 | 0.5134 | 0.8082 | 0.5134 | 0.7165 | | 0.0571 | 9.0427 | 3174 | 0.5335 | 0.8042 | 0.5335 | 0.7304 | | 0.0571 | 9.0484 | 3176 | 0.5497 | 0.8042 | 0.5497 | 0.7414 | | 0.0571 | 9.0541 | 3178 | 0.5579 | 0.8042 | 0.5579 | 0.7469 | | 0.0571 | 9.0598 | 3180 | 0.5678 | 0.8042 | 0.5678 | 0.7535 | | 0.0571 | 9.0655 | 3182 | 0.5643 | 0.8042 | 0.5643 | 0.7512 | | 0.0571 | 9.0712 | 3184 | 0.5588 | 0.8042 | 0.5588 | 0.7475 | | 0.0571 | 9.0769 | 3186 | 0.5410 | 0.8042 | 0.5410 | 0.7356 | | 0.0571 | 9.0826 | 3188 | 0.5193 | 0.8042 | 0.5193 | 0.7206 | | 0.0571 | 9.0883 | 3190 | 0.4945 | 0.8521 | 0.4945 | 0.7032 | | 0.0571 | 9.0940 | 3192 | 0.4665 | 0.7742 | 0.4665 | 0.6830 | | 0.0571 | 9.0997 | 3194 | 0.4517 | 0.7742 | 0.4517 | 0.6721 | | 0.0571 | 9.1054 | 3196 | 0.4507 | 0.7742 | 0.4507 | 0.6714 | | 0.0571 | 9.1111 | 3198 | 0.4619 | 0.7742 | 0.4619 | 0.6796 | | 0.0571 | 9.1168 | 3200 | 0.4770 | 0.8521 | 0.4770 | 0.6906 | | 0.0571 | 9.1225 | 3202 | 0.4895 | 0.8521 | 0.4895 | 0.6996 | | 0.0571 | 9.1282 | 3204 | 0.4962 | 0.8165 | 0.4962 | 0.7044 | | 0.0571 | 9.1339 | 3206 | 0.4978 | 0.8165 | 0.4978 | 0.7056 | | 0.0571 | 9.1396 | 3208 | 0.4932 | 0.8165 | 0.4932 | 0.7023 | | 0.0571 | 9.1453 | 3210 | 0.4886 | 0.8521 | 0.4886 | 0.6990 | | 0.0571 | 9.1510 | 3212 | 0.4884 | 0.8293 | 0.4884 | 0.6989 | | 0.0571 | 9.1567 | 3214 | 0.4949 | 0.8521 | 0.4949 | 0.7035 | | 0.0571 | 9.1624 | 3216 | 0.5050 | 0.8521 | 0.5050 | 0.7107 | | 0.0571 | 9.1681 | 3218 | 0.5151 | 0.8304 | 0.5151 | 0.7177 | | 0.0571 | 9.1738 | 3220 | 0.5204 | 0.8304 | 0.5204 | 0.7214 | | 0.0571 | 9.1795 | 3222 | 0.5176 | 0.8304 | 0.5176 | 0.7194 | | 0.0571 | 9.1852 | 3224 | 0.5043 | 0.8627 | 0.5043 | 0.7102 | | 0.0571 | 9.1909 | 3226 | 0.4870 | 0.8521 | 0.4870 | 0.6978 | | 0.0571 | 9.1966 | 3228 | 0.4730 | 0.7971 | 0.4730 | 0.6877 | | 0.0571 | 9.2023 | 3230 | 0.4654 | 0.7971 | 0.4654 | 0.6822 | | 0.0571 | 9.2080 | 3232 | 0.4636 | 0.7971 | 0.4636 | 0.6809 | | 0.0571 | 9.2137 | 3234 | 0.4668 | 0.8205 | 0.4668 | 0.6832 | | 0.0571 | 9.2194 | 3236 | 0.4738 | 0.8205 | 0.4738 | 0.6883 | | 0.0571 | 9.2251 | 3238 | 0.4849 | 0.8205 | 0.4849 | 0.6963 | | 0.0571 | 9.2308 | 3240 | 0.4974 | 0.7879 | 0.4974 | 0.7052 | | 0.0571 | 9.2365 | 3242 | 0.5128 | 0.7879 | 0.5128 | 0.7161 | | 0.0571 | 9.2422 | 3244 | 0.5154 | 0.7709 | 0.5154 | 0.7179 | | 0.0571 | 9.2479 | 3246 | 0.5152 | 0.7879 | 0.5152 | 0.7178 | | 0.0571 | 9.2536 | 3248 | 0.5104 | 0.7879 | 0.5104 | 0.7144 | | 0.0571 | 9.2593 | 3250 | 0.4983 | 0.7510 | 0.4983 | 
0.7059 | | 0.0571 | 9.2650 | 3252 | 0.4930 | 0.7510 | 0.4930 | 0.7022 | | 0.0571 | 9.2707 | 3254 | 0.4963 | 0.7510 | 0.4963 | 0.7045 | | 0.0571 | 9.2764 | 3256 | 0.5012 | 0.7510 | 0.5012 | 0.7080 | | 0.0571 | 9.2821 | 3258 | 0.5080 | 0.7510 | 0.5080 | 0.7128 | | 0.0571 | 9.2877 | 3260 | 0.5179 | 0.7879 | 0.5179 | 0.7197 | | 0.0571 | 9.2934 | 3262 | 0.5211 | 0.7879 | 0.5211 | 0.7219 | | 0.0571 | 9.2991 | 3264 | 0.5142 | 0.7879 | 0.5142 | 0.7171 | | 0.0571 | 9.3048 | 3266 | 0.5110 | 0.7879 | 0.5110 | 0.7149 | | 0.0571 | 9.3105 | 3268 | 0.5134 | 0.7879 | 0.5134 | 0.7165 | | 0.0571 | 9.3162 | 3270 | 0.5183 | 0.7879 | 0.5183 | 0.7199 | | 0.0571 | 9.3219 | 3272 | 0.5128 | 0.7879 | 0.5128 | 0.7161 | | 0.0571 | 9.3276 | 3274 | 0.5054 | 0.7879 | 0.5054 | 0.7109 | | 0.0571 | 9.3333 | 3276 | 0.4917 | 0.7926 | 0.4917 | 0.7012 | | 0.0571 | 9.3390 | 3278 | 0.4854 | 0.7926 | 0.4854 | 0.6967 | | 0.0571 | 9.3447 | 3280 | 0.4863 | 0.7926 | 0.4863 | 0.6973 | | 0.0571 | 9.3504 | 3282 | 0.4810 | 0.8205 | 0.4810 | 0.6935 | | 0.0571 | 9.3561 | 3284 | 0.4732 | 0.8205 | 0.4732 | 0.6879 | | 0.0571 | 9.3618 | 3286 | 0.4685 | 0.8205 | 0.4685 | 0.6845 | | 0.0571 | 9.3675 | 3288 | 0.4665 | 0.8205 | 0.4665 | 0.6830 | | 0.0571 | 9.3732 | 3290 | 0.4658 | 0.8205 | 0.4658 | 0.6825 | | 0.0571 | 9.3789 | 3292 | 0.4646 | 0.8205 | 0.4646 | 0.6816 | | 0.0571 | 9.3846 | 3294 | 0.4687 | 0.8205 | 0.4687 | 0.6846 | | 0.0571 | 9.3903 | 3296 | 0.4798 | 0.8205 | 0.4798 | 0.6927 | | 0.0571 | 9.3960 | 3298 | 0.4890 | 0.7926 | 0.4890 | 0.6993 | | 0.0571 | 9.4017 | 3300 | 0.4955 | 0.7510 | 0.4955 | 0.7039 | | 0.0571 | 9.4074 | 3302 | 0.4935 | 0.7510 | 0.4935 | 0.7025 | | 0.0571 | 9.4131 | 3304 | 0.4859 | 0.7926 | 0.4859 | 0.6971 | | 0.0571 | 9.4188 | 3306 | 0.4821 | 0.7926 | 0.4821 | 0.6944 | | 0.0571 | 9.4245 | 3308 | 0.4775 | 0.8205 | 0.4775 | 0.6910 | | 0.0571 | 9.4302 | 3310 | 0.4813 | 0.7926 | 0.4813 | 0.6938 | | 0.0571 | 9.4359 | 3312 | 0.4877 | 0.7510 | 0.4877 | 0.6983 | | 0.0571 | 9.4416 | 3314 | 0.4963 | 0.7510 | 0.4963 | 0.7045 | | 0.0571 | 9.4473 | 3316 | 0.5111 | 0.7510 | 0.5111 | 0.7149 | | 0.0571 | 9.4530 | 3318 | 0.5224 | 0.7510 | 0.5224 | 0.7228 | | 0.0571 | 9.4587 | 3320 | 0.5351 | 0.7879 | 0.5351 | 0.7315 | | 0.0571 | 9.4644 | 3322 | 0.5387 | 0.7879 | 0.5387 | 0.7340 | | 0.0571 | 9.4701 | 3324 | 0.5374 | 0.7879 | 0.5374 | 0.7330 | | 0.0571 | 9.4758 | 3326 | 0.5335 | 0.7879 | 0.5335 | 0.7304 | | 0.0571 | 9.4815 | 3328 | 0.5254 | 0.7879 | 0.5254 | 0.7248 | | 0.0571 | 9.4872 | 3330 | 0.5159 | 0.7510 | 0.5159 | 0.7183 | | 0.0571 | 9.4929 | 3332 | 0.5067 | 0.7510 | 0.5067 | 0.7118 | | 0.0571 | 9.4986 | 3334 | 0.5036 | 0.7510 | 0.5036 | 0.7097 | | 0.0571 | 9.5043 | 3336 | 0.5013 | 0.7510 | 0.5013 | 0.7080 | | 0.0571 | 9.5100 | 3338 | 0.5037 | 0.7510 | 0.5037 | 0.7097 | | 0.0571 | 9.5157 | 3340 | 0.5060 | 0.7510 | 0.5060 | 0.7114 | | 0.0571 | 9.5214 | 3342 | 0.5092 | 0.7510 | 0.5092 | 0.7136 | | 0.0571 | 9.5271 | 3344 | 0.5126 | 0.7510 | 0.5126 | 0.7160 | | 0.0571 | 9.5328 | 3346 | 0.5191 | 0.7879 | 0.5191 | 0.7205 | | 0.0571 | 9.5385 | 3348 | 0.5279 | 0.7879 | 0.5279 | 0.7266 | | 0.0571 | 9.5442 | 3350 | 0.5356 | 0.7879 | 0.5356 | 0.7318 | | 0.0571 | 9.5499 | 3352 | 0.5385 | 0.7879 | 0.5385 | 0.7338 | | 0.0571 | 9.5556 | 3354 | 0.5404 | 0.7879 | 0.5404 | 0.7351 | | 0.0571 | 9.5613 | 3356 | 0.5379 | 0.7879 | 0.5379 | 0.7334 | | 0.0571 | 9.5670 | 3358 | 0.5388 | 0.7709 | 0.5388 | 0.7340 | | 0.0571 | 9.5726 | 3360 | 0.5388 | 0.7709 | 0.5388 | 0.7340 | | 0.0571 | 9.5783 | 3362 | 0.5359 | 0.7709 | 0.5359 | 0.7321 | | 0.0571 | 9.5840 
| 3364 | 0.5301 | 0.7879 | 0.5301 | 0.7281 | | 0.0571 | 9.5897 | 3366 | 0.5246 | 0.8256 | 0.5246 | 0.7243 | | 0.0571 | 9.5954 | 3368 | 0.5217 | 0.8256 | 0.5217 | 0.7223 | | 0.0571 | 9.6011 | 3370 | 0.5139 | 0.8256 | 0.5139 | 0.7169 | | 0.0571 | 9.6068 | 3372 | 0.5029 | 0.7926 | 0.5029 | 0.7091 | | 0.0571 | 9.6125 | 3374 | 0.4924 | 0.7926 | 0.4924 | 0.7017 | | 0.0571 | 9.6182 | 3376 | 0.4856 | 0.7926 | 0.4856 | 0.6969 | | 0.0571 | 9.6239 | 3378 | 0.4815 | 0.7926 | 0.4815 | 0.6939 | | 0.0571 | 9.6296 | 3380 | 0.4795 | 0.8205 | 0.4795 | 0.6925 | | 0.0571 | 9.6353 | 3382 | 0.4770 | 0.8205 | 0.4770 | 0.6907 | | 0.0571 | 9.6410 | 3384 | 0.4795 | 0.8205 | 0.4795 | 0.6925 | | 0.0571 | 9.6467 | 3386 | 0.4861 | 0.7926 | 0.4861 | 0.6972 | | 0.0571 | 9.6524 | 3388 | 0.4954 | 0.7926 | 0.4954 | 0.7038 | | 0.0571 | 9.6581 | 3390 | 0.5052 | 0.7926 | 0.5052 | 0.7108 | | 0.0571 | 9.6638 | 3392 | 0.5139 | 0.7926 | 0.5139 | 0.7168 | | 0.0571 | 9.6695 | 3394 | 0.5235 | 0.7879 | 0.5235 | 0.7235 | | 0.0571 | 9.6752 | 3396 | 0.5319 | 0.7879 | 0.5319 | 0.7293 | | 0.0571 | 9.6809 | 3398 | 0.5383 | 0.7879 | 0.5383 | 0.7337 | | 0.0571 | 9.6866 | 3400 | 0.5432 | 0.7709 | 0.5432 | 0.7370 | | 0.0571 | 9.6923 | 3402 | 0.5471 | 0.7709 | 0.5471 | 0.7397 | | 0.0571 | 9.6980 | 3404 | 0.5505 | 0.7709 | 0.5505 | 0.7420 | | 0.0571 | 9.7037 | 3406 | 0.5543 | 0.7709 | 0.5543 | 0.7445 | | 0.0571 | 9.7094 | 3408 | 0.5611 | 0.7709 | 0.5611 | 0.7491 | | 0.0571 | 9.7151 | 3410 | 0.5648 | 0.7709 | 0.5648 | 0.7515 | | 0.0571 | 9.7208 | 3412 | 0.5640 | 0.7709 | 0.5640 | 0.7510 | | 0.0571 | 9.7265 | 3414 | 0.5604 | 0.7709 | 0.5604 | 0.7486 | | 0.0571 | 9.7322 | 3416 | 0.5533 | 0.7709 | 0.5533 | 0.7438 | | 0.0571 | 9.7379 | 3418 | 0.5451 | 0.7879 | 0.5451 | 0.7383 | | 0.0571 | 9.7436 | 3420 | 0.5379 | 0.7879 | 0.5379 | 0.7334 | | 0.0571 | 9.7493 | 3422 | 0.5342 | 0.7879 | 0.5342 | 0.7309 | | 0.0571 | 9.7550 | 3424 | 0.5300 | 0.7879 | 0.5300 | 0.7280 | | 0.0571 | 9.7607 | 3426 | 0.5257 | 0.7510 | 0.5257 | 0.7251 | | 0.0571 | 9.7664 | 3428 | 0.5214 | 0.7510 | 0.5214 | 0.7221 | | 0.0571 | 9.7721 | 3430 | 0.5154 | 0.7510 | 0.5154 | 0.7179 | | 0.0571 | 9.7778 | 3432 | 0.5085 | 0.7510 | 0.5085 | 0.7131 | | 0.0571 | 9.7835 | 3434 | 0.5025 | 0.7926 | 0.5025 | 0.7089 | | 0.0571 | 9.7892 | 3436 | 0.4992 | 0.7926 | 0.4992 | 0.7066 | | 0.0571 | 9.7949 | 3438 | 0.4984 | 0.7926 | 0.4984 | 0.7059 | | 0.0571 | 9.8006 | 3440 | 0.4988 | 0.7926 | 0.4988 | 0.7063 | | 0.0571 | 9.8063 | 3442 | 0.5015 | 0.7926 | 0.5015 | 0.7082 | | 0.0571 | 9.8120 | 3444 | 0.5032 | 0.7926 | 0.5032 | 0.7094 | | 0.0571 | 9.8177 | 3446 | 0.5036 | 0.7926 | 0.5036 | 0.7097 | | 0.0571 | 9.8234 | 3448 | 0.5055 | 0.7926 | 0.5055 | 0.7110 | | 0.0571 | 9.8291 | 3450 | 0.5064 | 0.7926 | 0.5064 | 0.7116 | | 0.0571 | 9.8348 | 3452 | 0.5082 | 0.7926 | 0.5082 | 0.7129 | | 0.0571 | 9.8405 | 3454 | 0.5110 | 0.7510 | 0.5110 | 0.7148 | | 0.0571 | 9.8462 | 3456 | 0.5134 | 0.7879 | 0.5134 | 0.7165 | | 0.0571 | 9.8519 | 3458 | 0.5138 | 0.7879 | 0.5138 | 0.7168 | | 0.0571 | 9.8575 | 3460 | 0.5139 | 0.7879 | 0.5139 | 0.7169 | | 0.0571 | 9.8632 | 3462 | 0.5129 | 0.8256 | 0.5129 | 0.7161 | | 0.0571 | 9.8689 | 3464 | 0.5111 | 0.8256 | 0.5111 | 0.7149 | | 0.0571 | 9.8746 | 3466 | 0.5100 | 0.8256 | 0.5100 | 0.7142 | | 0.0571 | 9.8803 | 3468 | 0.5091 | 0.7926 | 0.5091 | 0.7135 | | 0.0571 | 9.8860 | 3470 | 0.5085 | 0.7926 | 0.5085 | 0.7131 | | 0.0571 | 9.8917 | 3472 | 0.5089 | 0.7926 | 0.5089 | 0.7134 | | 0.0571 | 9.8974 | 3474 | 0.5092 | 0.7926 | 0.5092 | 0.7136 | | 0.0571 | 9.9031 | 3476 | 0.5095 | 0.7510 | 
0.5095 | 0.7138 | | 0.0571 | 9.9088 | 3478 | 0.5093 | 0.7510 | 0.5093 | 0.7136 | | 0.0571 | 9.9145 | 3480 | 0.5102 | 0.7510 | 0.5102 | 0.7143 | | 0.0571 | 9.9202 | 3482 | 0.5117 | 0.7510 | 0.5117 | 0.7153 | | 0.0571 | 9.9259 | 3484 | 0.5123 | 0.7510 | 0.5123 | 0.7158 | | 0.0571 | 9.9316 | 3486 | 0.5134 | 0.7879 | 0.5134 | 0.7166 | | 0.0571 | 9.9373 | 3488 | 0.5147 | 0.7879 | 0.5147 | 0.7174 | | 0.0571 | 9.9430 | 3490 | 0.5161 | 0.7879 | 0.5161 | 0.7184 | | 0.0571 | 9.9487 | 3492 | 0.5174 | 0.7879 | 0.5174 | 0.7193 | | 0.0571 | 9.9544 | 3494 | 0.5185 | 0.7879 | 0.5185 | 0.7201 | | 0.0571 | 9.9601 | 3496 | 0.5197 | 0.7879 | 0.5197 | 0.7209 | | 0.0571 | 9.9658 | 3498 | 0.5208 | 0.7879 | 0.5208 | 0.7216 | | 0.0456 | 9.9715 | 3500 | 0.5216 | 0.7879 | 0.5216 | 0.7222 | | 0.0456 | 9.9772 | 3502 | 0.5219 | 0.7879 | 0.5219 | 0.7224 | | 0.0456 | 9.9829 | 3504 | 0.5219 | 0.7879 | 0.5219 | 0.7225 | | 0.0456 | 9.9886 | 3506 | 0.5220 | 0.7879 | 0.5220 | 0.7225 | | 0.0456 | 9.9943 | 3508 | 0.5218 | 0.7879 | 0.5218 | 0.7224 | | 0.0456 | 10.0 | 3510 | 0.5218 | 0.7879 | 0.5218 | 0.7224 |

### Framework versions

- Transformers 4.44.2
- Pytorch 2.4.0+cu118
- Datasets 2.21.0
- Tokenizers 0.19.1
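The training log above tracks three evaluation metrics per step: Qwk (quadratic weighted kappa), Mse, and Rmse. The card does not include evaluation code, so the following is only a hedged sketch of how such values can be reproduced with scikit-learn; `y_true` and `y_pred` are hypothetical placeholders, and rounding regression outputs to the nearest integer class before computing the kappa is an assumed convention for ordinal scoring, not something stated in the card.

```python
# Hedged sketch: computing Qwk / MSE / RMSE as reported in the training log.
# y_true / y_pred are placeholder arrays; the rounding step is an assumption.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = np.array([0, 1, 2, 2, 3])            # gold ordinal scores (placeholder)
y_pred = np.array([0.2, 1.1, 1.8, 2.4, 2.9])  # model outputs (placeholder)

mse = mean_squared_error(y_true, y_pred)                 # mean squared error
rmse = np.sqrt(mse)                                      # root mean squared error
qwk = cohen_kappa_score(                                 # quadratic weighted kappa
    y_true, np.rint(y_pred).astype(int), weights="quadratic"
)

print(f"Qwk: {qwk:.4f}  Mse: {mse:.4f}  Rmse: {rmse:.4f}")
```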
Rich-J/subnet29_upload_c02_N13_0
Rich-J
2024-11-14T04:23:10Z
35
0
transformers
[ "transformers", "safetensors", "phi3", "text-generation", "arxiv:1910.09700", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2024-11-14T04:20:07Z
---
library_name: transformers
tags: []
---

# Model Card for Model ID

<!-- Provide a quick summary of what the model is/does. -->

## Model Details

### Model Description

<!-- Provide a longer summary of what this model is. -->

This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.

- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]

### Model Sources [optional]

<!-- Provide the basic links for the model. -->

- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]

## Uses

<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->

### Direct Use

<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->

[More Information Needed]

### Downstream Use [optional]

<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->

[More Information Needed]

### Out-of-Scope Use

<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->

[More Information Needed]

## Bias, Risks, and Limitations

<!-- This section is meant to convey both technical and sociotechnical limitations. -->

[More Information Needed]

### Recommendations

<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->

Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.

## How to Get Started with the Model

Use the code below to get started with the model.

[More Information Needed]

## Training Details

### Training Data

<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->

[More Information Needed]

### Training Procedure

<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->

#### Preprocessing [optional]

[More Information Needed]

#### Training Hyperparameters

- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->

#### Speeds, Sizes, Times [optional]

<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->

[More Information Needed]

## Evaluation

<!-- This section describes the evaluation protocols and provides the results. -->

### Testing Data, Factors & Metrics

#### Testing Data

<!-- This should link to a Dataset Card if possible. -->

[More Information Needed]

#### Factors

<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->

[More Information Needed]

#### Metrics

<!-- These are the evaluation metrics being used, ideally with a description of why. -->

[More Information Needed]

### Results

[More Information Needed]

#### Summary

## Model Examination [optional]

<!-- Relevant interpretability work for the model goes here -->

[More Information Needed]

## Environmental Impact

<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->

Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).

- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]

## Technical Specifications [optional]

### Model Architecture and Objective

[More Information Needed]

### Compute Infrastructure

[More Information Needed]

#### Hardware

[More Information Needed]

#### Software

[More Information Needed]

## Citation [optional]

<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->

**BibTeX:**

[More Information Needed]

**APA:**

[More Information Needed]

## Glossary [optional]

<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->

[More Information Needed]

## More Information [optional]

[More Information Needed]

## Model Card Authors [optional]

[More Information Needed]

## Model Card Contact

[More Information Needed]
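The quick-start section above is left as [More Information Needed]. Below is a minimal, hedged sketch of loading this checkpoint with the standard transformers causal-LM API, which the row's tags (`phi3`, `text-generation`) suggest should apply; the prompt is illustrative only, and whether the checkpoint actually loads this way is an assumption.

```python
# Hedged quick-start sketch; assumes the checkpoint supports the standard
# transformers causal-LM API implied by the phi3 / text-generation tags.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Rich-J/subnet29_upload_c02_N13_0"  # repo id from this entry
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto")

inputs = tokenizer("The quick brown fox", return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```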
MayBashendy/Arabic_FineTuningAraBERT_AugV4-trial2_k20_task1_organization_fold0
MayBashendy
2024-11-14T04:18:14Z
164
0
transformers
[ "transformers", "safetensors", "bert", "text-classification", "generated_from_trainer", "base_model:aubmindlab/bert-base-arabertv02", "base_model:finetune:aubmindlab/bert-base-arabertv02", "autotrain_compatible", "endpoints_compatible", "region:us" ]
text-classification
2024-11-14T03:58:01Z
--- library_name: transformers base_model: aubmindlab/bert-base-arabertv02 tags: - generated_from_trainer model-index: - name: Arabic_FineTuningAraBERT_AugV4-trial2_k20_task1_organization_fold0 results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # Arabic_FineTuningAraBERT_AugV4-trial2_k20_task1_organization_fold0 This model is a fine-tuned version of [aubmindlab/bert-base-arabertv02](https://huggingface.co/aubmindlab/bert-base-arabertv02) on the None dataset. It achieves the following results on the evaluation set: - Loss: 0.7729 - Qwk: 0.6951 - Mse: 0.7729 - Rmse: 0.8792 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 10 ### Training results | Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse | |:-------------:|:------:|:----:|:---------------:|:-------:|:------:|:------:| | No log | 0.0051 | 2 | 4.8349 | -0.0064 | 4.8349 | 2.1988 | | No log | 0.0102 | 4 | 2.6648 | 0.0968 | 2.6648 | 1.6324 | | No log | 0.0153 | 6 | 1.6244 | 0.1548 | 1.6244 | 1.2745 | | No log | 0.0205 | 8 | 1.3794 | 0.0950 | 1.3794 | 1.1745 | | No log | 0.0256 | 10 | 1.2051 | 0.3693 | 1.2051 | 1.0978 | | No log | 0.0307 | 12 | 1.1561 | 0.3623 | 1.1561 | 1.0752 | | No log | 0.0358 | 14 | 1.4367 | 0.2957 | 1.4367 | 1.1986 | | No log | 0.0409 | 16 | 2.0532 | 0.1873 | 2.0532 | 1.4329 | | No log | 0.0460 | 18 | 2.3036 | 0.1600 | 2.3036 | 1.5178 | | No log | 0.0512 | 20 | 2.0619 | 0.1600 | 2.0619 | 1.4359 | | No log | 0.0563 | 22 | 2.0155 | 0.0801 | 2.0155 | 1.4197 | | No log | 0.0614 | 24 | 1.9389 | 0.1075 | 1.9389 | 1.3925 | | No log | 0.0665 | 26 | 2.0406 | 0.1075 | 2.0406 | 1.4285 | | No log | 0.0716 | 28 | 1.9531 | 0.1075 | 1.9531 | 1.3975 | | No log | 0.0767 | 30 | 1.8806 | 0.1873 | 1.8806 | 1.3714 | | No log | 0.0818 | 32 | 1.8895 | 0.1873 | 1.8895 | 1.3746 | | No log | 0.0870 | 34 | 1.8246 | 0.2145 | 1.8246 | 1.3508 | | No log | 0.0921 | 36 | 1.7821 | 0.2145 | 1.7821 | 1.3349 | | No log | 0.0972 | 38 | 2.0287 | 0.4324 | 2.0287 | 1.4243 | | No log | 0.1023 | 40 | 2.1866 | 0.3465 | 2.1866 | 1.4787 | | No log | 0.1074 | 42 | 2.1923 | 0.3465 | 2.1923 | 1.4807 | | No log | 0.1125 | 44 | 1.9613 | 0.5404 | 1.9613 | 1.4005 | | No log | 0.1176 | 46 | 1.6150 | 0.6265 | 1.6150 | 1.2708 | | No log | 0.1228 | 48 | 1.3209 | 0.4592 | 1.3209 | 1.1493 | | No log | 0.1279 | 50 | 1.3936 | 0.4592 | 1.3936 | 1.1805 | | No log | 0.1330 | 52 | 1.9210 | 0.5440 | 1.9210 | 1.3860 | | No log | 0.1381 | 54 | 2.3070 | 0.3977 | 2.3070 | 1.5189 | | No log | 0.1432 | 56 | 2.3246 | 0.3425 | 2.3246 | 1.5246 | | No log | 0.1483 | 58 | 2.5462 | 0.3425 | 2.5462 | 1.5957 | | No log | 0.1535 | 60 | 2.3681 | 0.3368 | 2.3681 | 1.5389 | | No log | 0.1586 | 62 | 1.9791 | 0.4783 | 1.9791 | 1.4068 | | No log | 0.1637 | 64 | 1.6492 | 0.5028 | 1.6492 | 1.2842 | | No log | 0.1688 | 66 | 1.6259 | 0.5028 | 1.6259 | 1.2751 | | No log | 0.1739 | 68 | 1.9838 | 0.4786 | 1.9838 | 1.4085 | | No log | 0.1790 | 70 | 2.6579 | 0.3478 | 2.6579 | 1.6303 | | No log | 0.1841 | 72 | 2.7509 | 0.3478 | 2.7509 | 1.6586 | | No log | 
0.1893 | 74 | 2.2229 | 0.4786 | 2.2229 | 1.4909 | | No log | 0.1944 | 76 | 1.7089 | 0.5661 | 1.7089 | 1.3072 | | No log | 0.1995 | 78 | 1.3284 | 0.5670 | 1.3284 | 1.1526 | | No log | 0.2046 | 80 | 1.2327 | 0.5670 | 1.2327 | 1.1103 | | No log | 0.2097 | 82 | 1.4338 | 0.5882 | 1.4338 | 1.1974 | | No log | 0.2148 | 84 | 1.8134 | 0.4786 | 1.8134 | 1.3466 | | No log | 0.2199 | 86 | 2.3598 | 0.4011 | 2.3598 | 1.5362 | | No log | 0.2251 | 88 | 2.4987 | 0.3104 | 2.4987 | 1.5807 | | No log | 0.2302 | 90 | 2.0012 | 0.2743 | 2.0012 | 1.4147 | | No log | 0.2353 | 92 | 1.4163 | 0.5484 | 1.4163 | 1.1901 | | No log | 0.2404 | 94 | 1.3715 | 0.6165 | 1.3715 | 1.1711 | | No log | 0.2455 | 96 | 1.6667 | 0.4381 | 1.6667 | 1.2910 | | No log | 0.2506 | 98 | 1.9179 | 0.4606 | 1.9179 | 1.3849 | | No log | 0.2558 | 100 | 1.7097 | 0.4595 | 1.7097 | 1.3075 | | No log | 0.2609 | 102 | 1.6445 | 0.4595 | 1.6445 | 1.2824 | | No log | 0.2660 | 104 | 1.7876 | 0.4389 | 1.7876 | 1.3370 | | No log | 0.2711 | 106 | 1.7800 | 0.4389 | 1.7800 | 1.3342 | | No log | 0.2762 | 108 | 1.8974 | 0.4408 | 1.8974 | 1.3774 | | No log | 0.2813 | 110 | 1.5581 | 0.4995 | 1.5581 | 1.2483 | | No log | 0.2864 | 112 | 1.3174 | 0.7929 | 1.3174 | 1.1478 | | No log | 0.2916 | 114 | 1.2772 | 0.7929 | 1.2772 | 1.1302 | | No log | 0.2967 | 116 | 1.2373 | 0.7929 | 1.2373 | 1.1123 | | No log | 0.3018 | 118 | 1.5449 | 0.5200 | 1.5449 | 1.2429 | | No log | 0.3069 | 120 | 1.7477 | 0.4606 | 1.7477 | 1.3220 | | No log | 0.3120 | 122 | 1.6385 | 0.5188 | 1.6385 | 1.2800 | | No log | 0.3171 | 124 | 1.2969 | 0.7412 | 1.2969 | 1.1388 | | No log | 0.3223 | 126 | 1.0825 | 0.7626 | 1.0825 | 1.0404 | | No log | 0.3274 | 128 | 1.2306 | 0.7626 | 1.2306 | 1.1093 | | No log | 0.3325 | 130 | 1.7058 | 0.5200 | 1.7058 | 1.3061 | | No log | 0.3376 | 132 | 1.8015 | 0.6201 | 1.8015 | 1.3422 | | No log | 0.3427 | 134 | 1.5803 | 0.5737 | 1.5803 | 1.2571 | | No log | 0.3478 | 136 | 1.3957 | 0.7042 | 1.3957 | 1.1814 | | No log | 0.3529 | 138 | 1.4369 | 0.5404 | 1.4369 | 1.1987 | | No log | 0.3581 | 140 | 1.4721 | 0.5200 | 1.4721 | 1.2133 | | No log | 0.3632 | 142 | 1.7918 | 0.5200 | 1.7918 | 1.3386 | | No log | 0.3683 | 144 | 1.7818 | 0.4595 | 1.7818 | 1.3349 | | No log | 0.3734 | 146 | 1.9216 | 0.4595 | 1.9216 | 1.3862 | | No log | 0.3785 | 148 | 2.0893 | 0.4793 | 2.0893 | 1.4454 | | No log | 0.3836 | 150 | 1.6658 | 0.5200 | 1.6658 | 1.2907 | | No log | 0.3887 | 152 | 1.2029 | 0.7519 | 1.2029 | 1.0967 | | No log | 0.3939 | 154 | 1.2409 | 0.6975 | 1.2409 | 1.1140 | | No log | 0.3990 | 156 | 1.5974 | 0.5426 | 1.5974 | 1.2639 | | No log | 0.4041 | 158 | 2.0837 | 0.4991 | 2.0837 | 1.4435 | | No log | 0.4092 | 160 | 2.2363 | 0.5502 | 2.2363 | 1.4954 | | No log | 0.4143 | 162 | 1.8709 | 0.6065 | 1.8709 | 1.3678 | | No log | 0.4194 | 164 | 1.3125 | 0.7211 | 1.3125 | 1.1457 | | No log | 0.4246 | 166 | 1.2602 | 0.6890 | 1.2602 | 1.1226 | | No log | 0.4297 | 168 | 1.3720 | 0.7211 | 1.3720 | 1.1713 | | No log | 0.4348 | 170 | 1.6872 | 0.4595 | 1.6872 | 1.2989 | | No log | 0.4399 | 172 | 2.0884 | 0.4606 | 2.0884 | 1.4451 | | No log | 0.4450 | 174 | 1.9520 | 0.4011 | 1.9520 | 1.3971 | | No log | 0.4501 | 176 | 1.6423 | 0.4381 | 1.6423 | 1.2815 | | No log | 0.4552 | 178 | 1.4222 | 0.6157 | 1.4222 | 1.1926 | | No log | 0.4604 | 180 | 1.3405 | 0.6290 | 1.3405 | 1.1578 | | No log | 0.4655 | 182 | 1.3006 | 0.6290 | 1.3006 | 1.1404 | | No log | 0.4706 | 184 | 1.5053 | 0.5474 | 1.5053 | 1.2269 | | No log | 0.4757 | 186 | 1.5494 | 0.5449 | 1.5494 | 1.2447 | | No log | 0.4808 | 188 | 1.5028 | 0.5638 | 
1.5028 | 1.2259 |
| No log | 0.4859 | 190 | 1.3690 | 0.7042 | 1.3690 | 1.1700 |
| No log | 0.4910 | 192 | 1.1758 | 0.7599 | 1.1758 | 1.0843 |
| No log | 0.4962 | 194 | 1.1233 | 0.7599 | 1.1233 | 1.0598 |
| No log | 0.5013 | 196 | 1.0566 | 0.7590 | 1.0566 | 1.0279 |
| No log | 0.5064 | 198 | 1.3227 | 0.7905 | 1.3227 | 1.1501 |
| No log | 0.5115 | 200 | 1.7937 | 0.7115 | 1.7937 | 1.3393 |
| No log | 0.5166 | 202 | 2.1526 | 0.5653 | 2.1526 | 1.4672 |
| No log | 0.5217 | 204 | 2.1276 | 0.5166 | 2.1276 | 1.4586 |
| No log | 0.5269 | 206 | 1.8828 | 0.4381 | 1.8828 | 1.3721 |
| No log | 0.5320 | 208 | 1.5447 | 0.5474 | 1.5447 | 1.2429 |
| No log | 0.5371 | 210 | 1.5262 | 0.6157 | 1.5262 | 1.2354 |
| No log | 0.5422 | 212 | 1.5911 | 0.5638 | 1.5911 | 1.2614 |
| No log | 0.5473 | 214 | 1.5599 | 0.5638 | 1.5599 | 1.2490 |
| No log | 0.5524 | 216 | 1.2361 | 0.6265 | 1.2361 | 1.1118 |
| No log | 0.5575 | 218 | 1.0876 | 0.6982 | 1.0876 | 1.0429 |
| No log | 0.5627 | 220 | 1.0005 | 0.6988 | 1.0005 | 1.0002 |
| No log | 0.5678 | 222 | 1.1702 | 0.6061 | 1.1702 | 1.0817 |
| No log | 0.5729 | 224 | 1.4755 | 0.6008 | 1.4755 | 1.2147 |
| No log | 0.5780 | 226 | 1.6009 | 0.6349 | 1.6009 | 1.2652 |
| No log | 0.5831 | 228 | 1.5692 | 0.6349 | 1.5692 | 1.2527 |
| No log | 0.5882 | 230 | 1.3451 | 0.5638 | 1.3451 | 1.1598 |
| No log | 0.5934 | 232 | 1.1034 | 0.6975 | 1.1034 | 1.0504 |
| No log | 0.5985 | 234 | 1.0618 | 0.7519 | 1.0618 | 1.0304 |
| No log | 0.6036 | 236 | 1.2523 | 0.6265 | 1.2523 | 1.1191 |
| No log | 0.6087 | 238 | 1.3015 | 0.6265 | 1.3015 | 1.1408 |
| No log | 0.6138 | 240 | 1.3130 | 0.6265 | 1.3130 | 1.1459 |
| No log | 0.6189 | 242 | 1.2943 | 0.6265 | 1.2943 | 1.1377 |
| No log | 0.6240 | 244 | 1.3726 | 0.5638 | 1.3726 | 1.1716 |
| No log | 0.6292 | 246 | 1.4792 | 0.5638 | 1.4792 | 1.2162 |
| No log | 0.6343 | 248 | 1.3838 | 0.5638 | 1.3838 | 1.1764 |
| No log | 0.6394 | 250 | 1.6283 | 0.5404 | 1.6283 | 1.2760 |
| No log | 0.6445 | 252 | 1.6280 | 0.5404 | 1.6280 | 1.2759 |
| No log | 0.6496 | 254 | 1.5809 | 0.5404 | 1.5809 | 1.2573 |
| No log | 0.6547 | 256 | 1.3672 | 0.6265 | 1.3672 | 1.1693 |
| No log | 0.6598 | 258 | 1.0624 | 0.7296 | 1.0624 | 1.0307 |
| No log | 0.6650 | 260 | 1.0751 | 0.7296 | 1.0751 | 1.0369 |
| No log | 0.6701 | 262 | 1.4643 | 0.6008 | 1.4643 | 1.2101 |
| No log | 0.6752 | 264 | 1.6173 | 0.5965 | 1.6173 | 1.2717 |
| No log | 0.6803 | 266 | 1.4403 | 0.6008 | 1.4403 | 1.2001 |
| No log | 0.6854 | 268 | 1.2346 | 0.7211 | 1.2346 | 1.1111 |
| No log | 0.6905 | 270 | 1.2515 | 0.7211 | 1.2515 | 1.1187 |
| No log | 0.6957 | 272 | 1.3066 | 0.7211 | 1.3066 | 1.1431 |
| No log | 0.7008 | 274 | 1.4740 | 0.6485 | 1.4740 | 1.2141 |
| No log | 0.7059 | 276 | 1.3580 | 0.7211 | 1.3580 | 1.1653 |
| No log | 0.7110 | 278 | 1.3045 | 0.6890 | 1.3045 | 1.1421 |
| No log | 0.7161 | 280 | 1.4184 | 0.7042 | 1.4184 | 1.1909 |
| No log | 0.7212 | 282 | 1.6800 | 0.6446 | 1.6800 | 1.2962 |
| No log | 0.7263 | 284 | 1.5553 | 0.6446 | 1.5553 | 1.2471 |
| No log | 0.7315 | 286 | 1.3539 | 0.6008 | 1.3539 | 1.1636 |
| No log | 0.7366 | 288 | 1.2274 | 0.6890 | 1.2274 | 1.1079 |
| No log | 0.7417 | 290 | 1.0952 | 0.6818 | 1.0952 | 1.0465 |
| No log | 0.7468 | 292 | 1.0540 | 0.6818 | 1.0540 | 1.0267 |
| No log | 0.7519 | 294 | 1.0954 | 0.7129 | 1.0954 | 1.0466 |
| No log | 0.7570 | 296 | 1.0829 | 0.7115 | 1.0829 | 1.0406 |
| No log | 0.7621 | 298 | 1.2256 | 0.7583 | 1.2256 | 1.1071 |
| No log | 0.7673 | 300 | 1.4730 | 0.7031 | 1.4730 | 1.2137 |
| No log | 0.7724 | 302 | 1.3904 | 0.6457 | 1.3904 | 1.1792 |
| No log | 0.7775 | 304 | 1.2508 | 0.5540 | 1.2508 | 1.1184 |
| No log | 0.7826 | 306 | 1.2461 | 0.5608 | 1.2461 | 1.1163 |
| No log | 0.7877 | 308 | 1.2609 | 0.5965 | 1.2609 | 1.1229 |
| No log | 0.7928 | 310 | 1.0147 | 0.6763 | 1.0147 | 1.0073 |
| No log | 0.7980 | 312 | 0.8106 | 0.72 | 0.8106 | 0.9003 |
| No log | 0.8031 | 314 | 0.9081 | 0.6623 | 0.9081 | 0.9530 |
| No log | 0.8082 | 316 | 1.1449 | 0.7328 | 1.1449 | 1.0700 |
| No log | 0.8133 | 318 | 1.2634 | 0.7123 | 1.2634 | 1.1240 |
| No log | 0.8184 | 320 | 1.1884 | 0.7123 | 1.1884 | 1.0902 |
| No log | 0.8235 | 322 | 1.0369 | 0.6763 | 1.0369 | 1.0183 |
| No log | 0.8286 | 324 | 1.1726 | 0.6890 | 1.1726 | 1.0829 |
| No log | 0.8338 | 326 | 1.5457 | 0.6115 | 1.5457 | 1.2433 |
| No log | 0.8389 | 328 | 1.5769 | 0.6115 | 1.5769 | 1.2557 |
| No log | 0.8440 | 330 | 1.2436 | 0.6491 | 1.2436 | 1.1152 |
| No log | 0.8491 | 332 | 0.9340 | 0.6940 | 0.9340 | 0.9664 |
| No log | 0.8542 | 334 | 0.8555 | 0.7327 | 0.8555 | 0.9249 |
| No log | 0.8593 | 336 | 0.8988 | 0.7239 | 0.8988 | 0.9480 |
| No log | 0.8645 | 338 | 1.0635 | 0.7239 | 1.0635 | 1.0313 |
| No log | 0.8696 | 340 | 0.9573 | 0.7239 | 0.9573 | 0.9784 |
| No log | 0.8747 | 342 | 0.9880 | 0.7239 | 0.9880 | 0.9940 |
| No log | 0.8798 | 344 | 1.0130 | 0.7239 | 1.0130 | 1.0065 |
| No log | 0.8849 | 346 | 1.0970 | 0.7157 | 1.0970 | 1.0474 |
| No log | 0.8900 | 348 | 1.1216 | 0.7239 | 1.1216 | 1.0590 |
| No log | 0.8951 | 350 | 0.9248 | 0.7433 | 0.9248 | 0.9617 |
| No log | 0.9003 | 352 | 0.7307 | 0.7685 | 0.7307 | 0.8548 |
| No log | 0.9054 | 354 | 0.7261 | 0.7685 | 0.7261 | 0.8521 |
| No log | 0.9105 | 356 | 0.7789 | 0.7685 | 0.7789 | 0.8826 |
| No log | 0.9156 | 358 | 0.9725 | 0.7602 | 0.9725 | 0.9861 |
| No log | 0.9207 | 360 | 1.2739 | 0.7701 | 1.2739 | 1.1287 |
| No log | 0.9258 | 362 | 1.2980 | 0.7504 | 1.2980 | 1.1393 |
| No log | 0.9309 | 364 | 1.1806 | 0.7684 | 1.1806 | 1.0866 |
| No log | 0.9361 | 366 | 1.0025 | 0.7614 | 1.0025 | 1.0013 |
| No log | 0.9412 | 368 | 0.9132 | 0.7519 | 0.9132 | 0.9556 |
| No log | 0.9463 | 370 | 0.9538 | 0.7519 | 0.9538 | 0.9766 |
| No log | 0.9514 | 372 | 1.0991 | 0.7305 | 1.0991 | 1.0484 |
| No log | 0.9565 | 374 | 1.3855 | 0.6545 | 1.3855 | 1.1771 |
| No log | 0.9616 | 376 | 1.4673 | 0.6545 | 1.4673 | 1.2113 |
| No log | 0.9668 | 378 | 1.5124 | 0.6545 | 1.5124 | 1.2298 |
| No log | 0.9719 | 380 | 1.5253 | 0.6545 | 1.5253 | 1.2350 |
| No log | 0.9770 | 382 | 1.2814 | 0.7123 | 1.2814 | 1.1320 |
| No log | 0.9821 | 384 | 0.9290 | 0.7614 | 0.9290 | 0.9638 |
| No log | 0.9872 | 386 | 0.8922 | 0.7516 | 0.8922 | 0.9446 |
| No log | 0.9923 | 388 | 1.0415 | 0.7417 | 1.0415 | 1.0205 |
| No log | 0.9974 | 390 | 1.4723 | 0.8067 | 1.4723 | 1.2134 |
| No log | 1.0026 | 392 | 1.7271 | 0.6328 | 1.7271 | 1.3142 |
| No log | 1.0077 | 394 | 1.5359 | 0.5385 | 1.5359 | 1.2393 |
| No log | 1.0128 | 396 | 1.1490 | 0.7519 | 1.1490 | 1.0719 |
| No log | 1.0179 | 398 | 0.9212 | 0.7524 | 0.9212 | 0.9598 |
| No log | 1.0230 | 400 | 0.7470 | 0.7118 | 0.7470 | 0.8643 |
| No log | 1.0281 | 402 | 0.7112 | 0.7258 | 0.7112 | 0.8433 |
| No log | 1.0332 | 404 | 0.8524 | 0.6877 | 0.8524 | 0.9232 |
| No log | 1.0384 | 406 | 1.2090 | 0.7786 | 1.2090 | 1.0996 |
| No log | 1.0435 | 408 | 1.4343 | 0.6727 | 1.4343 | 1.1976 |
| No log | 1.0486 | 410 | 1.4046 | 0.7148 | 1.4046 | 1.1852 |
| No log | 1.0537 | 412 | 1.1984 | 0.7786 | 1.1984 | 1.0947 |
| No log | 1.0588 | 414 | 1.0933 | 0.7688 | 1.0933 | 1.0456 |
| No log | 1.0639 | 416 | 0.9940 | 0.7337 | 0.9940 | 0.9970 |
| No log | 1.0691 | 418 | 1.1435 | 0.7688 | 1.1435 | 1.0693 |
| No log | 1.0742 | 420 | 1.3131 | 0.7594 | 1.3131 | 1.1459 |
| No log | 1.0793 | 422 | 1.4282 | 0.7412 | 1.4282 | 1.1951 |
| No log | 1.0844 | 424 | 1.6065 | 0.6296 | 1.6065 | 1.2675 |
| No log | 1.0895 | 426 | 1.5965 | 0.6296 | 1.5965 | 1.2635 |
| No log | 1.0946 | 428 | 1.4905 | 0.5608 | 1.4905 | 1.2209 |
| No log | 1.0997 | 430 | 1.5453 | 0.5404 | 1.5453 | 1.2431 |
| No log | 1.1049 | 432 | 1.4746 | 0.5608 | 1.4746 | 1.2143 |
| No log | 1.1100 | 434 | 1.3179 | 0.6103 | 1.3179 | 1.1480 |
| No log | 1.1151 | 436 | 1.2484 | 0.5638 | 1.2484 | 1.1173 |
| No log | 1.1202 | 438 | 1.1734 | 0.6211 | 1.1734 | 1.0833 |
| No log | 1.1253 | 440 | 1.0710 | 0.7424 | 1.0710 | 1.0349 |
| No log | 1.1304 | 442 | 1.0806 | 0.7424 | 1.0806 | 1.0395 |
| No log | 1.1355 | 444 | 1.2556 | 0.7333 | 1.2556 | 1.1205 |
| No log | 1.1407 | 446 | 1.2766 | 0.6967 | 1.2766 | 1.1299 |
| No log | 1.1458 | 448 | 1.3521 | 0.6967 | 1.3521 | 1.1628 |
| No log | 1.1509 | 450 | 1.2611 | 0.6211 | 1.2611 | 1.1230 |
| No log | 1.1560 | 452 | 1.1527 | 0.6265 | 1.1527 | 1.0736 |
| No log | 1.1611 | 454 | 1.1408 | 0.6265 | 1.1408 | 1.0681 |
| No log | 1.1662 | 456 | 1.1167 | 0.6812 | 1.1167 | 1.0568 |
| No log | 1.1714 | 458 | 1.1761 | 0.7123 | 1.1761 | 1.0845 |
| No log | 1.1765 | 460 | 1.1650 | 0.7123 | 1.1650 | 1.0793 |
| No log | 1.1816 | 462 | 1.1737 | 0.7504 | 1.1737 | 1.0834 |
| No log | 1.1867 | 464 | 1.3572 | 0.7502 | 1.3572 | 1.1650 |
| No log | 1.1918 | 466 | 1.3875 | 0.7502 | 1.3875 | 1.1779 |
| No log | 1.1969 | 468 | 1.1708 | 0.7502 | 1.1708 | 1.0820 |
| No log | 1.2020 | 470 | 0.9057 | 0.6945 | 0.9057 | 0.9517 |
| No log | 1.2072 | 472 | 0.8584 | 0.6813 | 0.8584 | 0.9265 |
| No log | 1.2123 | 474 | 0.9388 | 0.6945 | 0.9388 | 0.9689 |
| No log | 1.2174 | 476 | 1.1480 | 0.7417 | 1.1480 | 1.0714 |
| No log | 1.2225 | 478 | 1.3034 | 0.7123 | 1.3034 | 1.1417 |
| No log | 1.2276 | 480 | 1.2237 | 0.7123 | 1.2237 | 1.1062 |
| No log | 1.2327 | 482 | 0.9916 | 0.7310 | 0.9916 | 0.9958 |
| No log | 1.2379 | 484 | 0.7953 | 0.7525 | 0.7953 | 0.8918 |
| No log | 1.2430 | 486 | 0.7772 | 0.7025 | 0.7772 | 0.8816 |
| No log | 1.2481 | 488 | 0.9794 | 0.7053 | 0.9794 | 0.9897 |
| No log | 1.2532 | 490 | 1.2996 | 0.6557 | 1.2996 | 1.1400 |
| No log | 1.2583 | 492 | 1.5185 | 0.6129 | 1.5185 | 1.2323 |
| No log | 1.2634 | 494 | 1.3329 | 0.6260 | 1.3329 | 1.1545 |
| No log | 1.2685 | 496 | 0.9421 | 0.7429 | 0.9421 | 0.9706 |
| No log | 1.2737 | 498 | 0.6690 | 0.7172 | 0.6691 | 0.8180 |
| 0.5075 | 1.2788 | 500 | 0.6050 | 0.7862 | 0.6050 | 0.7778 |
| 0.5075 | 1.2839 | 502 | 0.6545 | 0.7289 | 0.6545 | 0.8090 |
| 0.5075 | 1.2890 | 504 | 0.7535 | 0.7095 | 0.7535 | 0.8680 |
| 0.5075 | 1.2941 | 506 | 0.8276 | 0.7244 | 0.8276 | 0.9097 |
| 0.5075 | 1.2992 | 508 | 1.1053 | 0.7157 | 1.1053 | 1.0513 |
| 0.5075 | 1.3043 | 510 | 1.5103 | 0.6296 | 1.5103 | 1.2290 |
| 0.5075 | 1.3095 | 512 | 1.6041 | 0.6296 | 1.6041 | 1.2665 |
| 0.5075 | 1.3146 | 514 | 1.4010 | 0.6545 | 1.4010 | 1.1836 |
| 0.5075 | 1.3197 | 516 | 1.2154 | 0.7123 | 1.2154 | 1.1025 |
| 0.5075 | 1.3248 | 518 | 1.0488 | 0.6763 | 1.0488 | 1.0241 |
| 0.5075 | 1.3299 | 520 | 1.0842 | 0.6686 | 1.0842 | 1.0413 |
| 0.5075 | 1.3350 | 522 | 1.2987 | 0.7412 | 1.2987 | 1.1396 |
| 0.5075 | 1.3402 | 524 | 1.5701 | 0.6855 | 1.5701 | 1.2530 |
| 0.5075 | 1.3453 | 526 | 1.6355 | 0.6296 | 1.6355 | 1.2789 |
| 0.5075 | 1.3504 | 528 | 1.6912 | 0.5889 | 1.6912 | 1.3005 |
| 0.5075 | 1.3555 | 530 | 1.5611 | 0.6296 | 1.5611 | 1.2494 |
| 0.5075 | 1.3606 | 532 | 1.4316 | 0.6855 | 1.4316 | 1.1965 |
| 0.5075 | 1.3657 | 534 | 1.5025 | 0.6108 | 1.5025 | 1.2258 |
| 0.5075 | 1.3708 | 536 | 1.5531 | 0.5769 | 1.5531 | 1.2463 |
| 0.5075 | 1.3760 | 538 | 1.7146 | 0.5548 | 1.7146 | 1.3094 |
| 0.5075 | 1.3811 | 540 | 1.8542 | 0.5166 | 1.8542 | 1.3617 |
| 0.5075 | 1.3862 | 542 | 1.8613 | 0.5166 | 1.8612 | 1.3643 |
| 0.5075 | 1.3913 | 544 | 1.6929 | 0.6928 | 1.6929 | 1.3011 |
| 0.5075 | 1.3964 | 546 | 1.3209 | 0.6860 | 1.3209 | 1.1493 |
| 0.5075 | 1.4015 | 548 | 1.0328 | 0.7216 | 1.0328 | 1.0162 |
| 0.5075 | 1.4066 | 550 | 0.8930 | 0.7244 | 0.8930 | 0.9450 |
| 0.5075 | 1.4118 | 552 | 0.8647 | 0.7244 | 0.8647 | 0.9299 |
| 0.5075 | 1.4169 | 554 | 0.9753 | 0.7421 | 0.9753 | 0.9876 |
| 0.5075 | 1.4220 | 556 | 1.0389 | 0.7421 | 1.0389 | 1.0193 |
| 0.5075 | 1.4271 | 558 | 0.9714 | 0.7020 | 0.9714 | 0.9856 |
| 0.5075 | 1.4322 | 560 | 1.0200 | 0.7688 | 1.0200 | 1.0100 |
| 0.5075 | 1.4373 | 562 | 0.9233 | 0.7517 | 0.9233 | 0.9609 |
| 0.5075 | 1.4425 | 564 | 0.7317 | 0.7358 | 0.7317 | 0.8554 |
| 0.5075 | 1.4476 | 566 | 0.6643 | 0.7441 | 0.6643 | 0.8150 |
| 0.5075 | 1.4527 | 568 | 0.7304 | 0.7601 | 0.7304 | 0.8546 |
| 0.5075 | 1.4578 | 570 | 0.8961 | 0.7517 | 0.8961 | 0.9466 |
| 0.5075 | 1.4629 | 572 | 1.0690 | 0.7333 | 1.0690 | 1.0339 |
| 0.5075 | 1.4680 | 574 | 1.0145 | 0.7421 | 1.0145 | 1.0072 |
| 0.5075 | 1.4731 | 576 | 0.8286 | 0.7332 | 0.8286 | 0.9103 |
| 0.5075 | 1.4783 | 578 | 0.6859 | 0.7441 | 0.6859 | 0.8282 |
| 0.5075 | 1.4834 | 580 | 0.6679 | 0.7441 | 0.6679 | 0.8172 |
| 0.5075 | 1.4885 | 582 | 0.7510 | 0.7167 | 0.7510 | 0.8666 |
| 0.5075 | 1.4936 | 584 | 0.9088 | 0.7692 | 0.9088 | 0.9533 |
| 0.5075 | 1.4987 | 586 | 1.0046 | 0.7598 | 1.0046 | 1.0023 |
| 0.5075 | 1.5038 | 588 | 1.0991 | 0.7081 | 1.0991 | 1.0484 |
| 0.5075 | 1.5090 | 590 | 1.0170 | 0.7598 | 1.0170 | 1.0085 |
| 0.5075 | 1.5141 | 592 | 0.8817 | 0.7602 | 0.8817 | 0.9390 |
| 0.5075 | 1.5192 | 594 | 0.7671 | 0.7606 | 0.7671 | 0.8759 |
| 0.5075 | 1.5243 | 596 | 0.7125 | 0.7525 | 0.7125 | 0.8441 |
| 0.5075 | 1.5294 | 598 | 0.7283 | 0.7700 | 0.7283 | 0.8534 |
| 0.5075 | 1.5345 | 600 | 0.8319 | 0.7602 | 0.8319 | 0.9121 |
| 0.5075 | 1.5396 | 602 | 0.9640 | 0.7333 | 0.9640 | 0.9819 |
| 0.5075 | 1.5448 | 604 | 1.0797 | 0.7908 | 1.0797 | 1.0391 |
| 0.5075 | 1.5499 | 606 | 0.9749 | 0.7255 | 0.9749 | 0.9874 |
| 0.5075 | 1.5550 | 608 | 0.8725 | 0.7255 | 0.8725 | 0.9341 |
| 0.5075 | 1.5601 | 610 | 0.9486 | 0.7333 | 0.9486 | 0.9740 |
| 0.5075 | 1.5652 | 612 | 1.0052 | 0.7333 | 1.0052 | 1.0026 |
| 0.5075 | 1.5703 | 614 | 1.0609 | 0.7333 | 1.0609 | 1.0300 |
| 0.5075 | 1.5754 | 616 | 1.1211 | 0.7417 | 1.1211 | 1.0588 |
| 0.5075 | 1.5806 | 618 | 1.3273 | 0.7123 | 1.3273 | 1.1521 |
| 0.5075 | 1.5857 | 620 | 1.3902 | 0.6545 | 1.3902 | 1.1791 |
| 0.5075 | 1.5908 | 622 | 1.2558 | 0.7211 | 1.2558 | 1.1206 |
| 0.5075 | 1.5959 | 624 | 1.1108 | 0.7211 | 1.1108 | 1.0539 |
| 0.5075 | 1.6010 | 626 | 0.9813 | 0.7511 | 0.9813 | 0.9906 |
| 0.5075 | 1.6061 | 628 | 0.9265 | 0.7239 | 0.9265 | 0.9625 |
| 0.5075 | 1.6113 | 630 | 0.9584 | 0.7259 | 0.9584 | 0.9790 |
| 0.5075 | 1.6164 | 632 | 1.1368 | 0.7720 | 1.1368 | 1.0662 |
| 0.5075 | 1.6215 | 634 | 1.2595 | 0.7882 | 1.2595 | 1.1223 |
| 0.5075 | 1.6266 | 636 | 1.3397 | 0.7342 | 1.3397 | 1.1575 |
| 0.5075 | 1.6317 | 638 | 1.3930 | 0.6831 | 1.3930 | 1.1803 |
| 0.5075 | 1.6368 | 640 | 1.2310 | 0.6855 | 1.2310 | 1.1095 |
| 0.5075 | 1.6419 | 642 | 1.0432 | 0.6818 | 1.0432 | 1.0214 |
| 0.5075 | 1.6471 | 644 | 1.0380 | 0.7129 | 1.0380 | 1.0188 |
| 0.5075 | 1.6522 | 646 | 1.1224 | 0.7129 | 1.1224 | 1.0594 |
| 0.5075 | 1.6573 | 648 | 1.0584 | 0.7129 | 1.0584 | 1.0288 |
| 0.5075 | 1.6624 | 650 | 0.9872 | 0.7129 | 0.9872 | 0.9936 |
| 0.5075 | 1.6675 | 652 | 0.9018 | 0.7518 | 0.9018 | 0.9496 |
| 0.5075 | 1.6726 | 654 | 1.0147 | 0.7511 | 1.0147 | 1.0073 |
| 0.5075 | 1.6777 | 656 | 0.9905 | 0.7509 | 0.9905 | 0.9952 |
| 0.5075 | 1.6829 | 658 | 0.9473 | 0.7724 | 0.9473 | 0.9733 |
| 0.5075 | 1.6880 | 660 | 0.9219 | 0.7583 | 0.9219 | 0.9602 |
| 0.5075 | 1.6931 | 662 | 0.9036 | 0.7583 | 0.9036 | 0.9506 |
| 0.5075 | 1.6982 | 664 | 0.9355 | 0.7660 | 0.9355 | 0.9672 |
| 0.5075 | 1.7033 | 666 | 0.9677 | 0.7354 | 0.9677 | 0.9837 |
| 0.5075 | 1.7084 | 668 | 0.9230 | 0.7354 | 0.9230 | 0.9607 |
| 0.5075 | 1.7136 | 670 | 0.8581 | 0.7601 | 0.8581 | 0.9263 |
| 0.5075 | 1.7187 | 672 | 0.8767 | 0.7601 | 0.8767 | 0.9363 |
| 0.5075 | 1.7238 | 674 | 0.8683 | 0.7522 | 0.8683 | 0.9318 |
| 0.5075 | 1.7289 | 676 | 0.9119 | 0.7759 | 0.9119 | 0.9549 |
| 0.5075 | 1.7340 | 678 | 0.8048 | 0.7522 | 0.8048 | 0.8971 |
| 0.5075 | 1.7391 | 680 | 0.7859 | 0.7882 | 0.7859 | 0.8865 |
| 0.5075 | 1.7442 | 682 | 0.8602 | 0.7354 | 0.8602 | 0.9275 |
| 0.5075 | 1.7494 | 684 | 0.9239 | 0.7354 | 0.9239 | 0.9612 |
| 0.5075 | 1.7545 | 686 | 1.0329 | 0.7259 | 1.0329 | 1.0163 |
| 0.5075 | 1.7596 | 688 | 1.1704 | 0.6797 | 1.1704 | 1.0819 |
| 0.5075 | 1.7647 | 690 | 1.1014 | 0.6866 | 1.1014 | 1.0495 |
| 0.5075 | 1.7698 | 692 | 0.8593 | 0.7437 | 0.8593 | 0.9270 |
| 0.5075 | 1.7749 | 694 | 0.7551 | 0.7529 | 0.7551 | 0.8690 |
| 0.5075 | 1.7801 | 696 | 0.7765 | 0.7100 | 0.7765 | 0.8812 |
| 0.5075 | 1.7852 | 698 | 0.9332 | 0.7354 | 0.9332 | 0.9660 |
| 0.5075 | 1.7903 | 700 | 0.9644 | 0.7019 | 0.9644 | 0.9821 |
| 0.5075 | 1.7954 | 702 | 0.9534 | 0.7277 | 0.9534 | 0.9764 |
| 0.5075 | 1.8005 | 704 | 0.8824 | 0.7208 | 0.8824 | 0.9394 |
| 0.5075 | 1.8056 | 706 | 0.8378 | 0.7208 | 0.8378 | 0.9153 |
| 0.5075 | 1.8107 | 708 | 0.8983 | 0.7208 | 0.8983 | 0.9478 |
| 0.5075 | 1.8159 | 710 | 0.8336 | 0.7208 | 0.8336 | 0.9130 |
| 0.5075 | 1.8210 | 712 | 0.8422 | 0.7208 | 0.8422 | 0.9177 |
| 0.5075 | 1.8261 | 714 | 0.7818 | 0.7208 | 0.7818 | 0.8842 |
| 0.5075 | 1.8312 | 716 | 0.7732 | 0.7281 | 0.7732 | 0.8793 |
| 0.5075 | 1.8363 | 718 | 0.8908 | 0.7208 | 0.8908 | 0.9438 |
| 0.5075 | 1.8414 | 720 | 0.8871 | 0.7208 | 0.8871 | 0.9419 |
| 0.5075 | 1.8465 | 722 | 0.7572 | 0.7281 | 0.7572 | 0.8702 |
| 0.5075 | 1.8517 | 724 | 0.6228 | 0.7786 | 0.6228 | 0.7891 |
| 0.5075 | 1.8568 | 726 | 0.6091 | 0.7786 | 0.6091 | 0.7804 |
| 0.5075 | 1.8619 | 728 | 0.7122 | 0.7363 | 0.7122 | 0.8439 |
| 0.5075 | 1.8670 | 730 | 0.9815 | 0.7255 | 0.9815 | 0.9907 |
| 0.5075 | 1.8721 | 732 | 1.3285 | 0.6841 | 1.3285 | 1.1526 |
| 0.5075 | 1.8772 | 734 | 1.4493 | 0.7119 | 1.4493 | 1.2039 |
| 0.5075 | 1.8824 | 736 | 1.3106 | 0.6841 | 1.3106 | 1.1448 |
| 0.5075 | 1.8875 | 738 | 1.0109 | 0.7421 | 1.0109 | 1.0054 |
| 0.5075 | 1.8926 | 740 | 0.8258 | 0.7902 | 0.8258 | 0.9087 |
| 0.5075 | 1.8977 | 742 | 0.7337 | 0.8164 | 0.7337 | 0.8565 |
| 0.5075 | 1.9028 | 744 | 0.6861 | 0.8185 | 0.6861 | 0.8283 |
| 0.5075 | 1.9079 | 746 | 0.7383 | 0.7675 | 0.7383 | 0.8593 |
| 0.5075 | 1.9130 | 748 | 0.8563 | 0.7593 | 0.8563 | 0.9254 |
| 0.5075 | 1.9182 | 750 | 1.0207 | 0.7514 | 1.0207 | 1.0103 |
| 0.5075 | 1.9233 | 752 | 0.9610 | 0.7514 | 0.9610 | 0.9803 |
| 0.5075 | 1.9284 | 754 | 0.8452 | 0.7948 | 0.8452 | 0.9193 |
| 0.5075 | 1.9335 | 756 | 0.7150 | 0.7955 | 0.7150 | 0.8456 |
| 0.5075 | 1.9386 | 758 | 0.6680 | 0.7786 | 0.6680 | 0.8173 |
| 0.5075 | 1.9437 | 760 | 0.7380 | 0.7852 | 0.7380 | 0.8590 |
| 0.5075 | 1.9488 | 762 | 0.9634 | 0.7514 | 0.9634 | 0.9816 |
| 0.5075 | 1.9540 | 764 | 1.0908 | 0.6738 | 1.0908 | 1.0444 |
| 0.5075 | 1.9591 | 766 | 1.0150 | 0.7514 | 1.0150 | 1.0075 |
| 0.5075 | 1.9642 | 768 | 0.8832 | 0.7602 | 0.8832 | 0.9398 |
| 0.5075 | 1.9693 | 770 | 0.8007 | 0.7906 | 0.8007 | 0.8948 |
| 0.5075 | 1.9744 | 772 | 0.7222 | 0.7986 | 0.7222 | 0.8498 |
| 0.5075 | 1.9795 | 774 | 0.6887 | 0.7986 | 0.6887 | 0.8299 |
| 0.5075 | 1.9847 | 776 | 0.7453 | 0.7700 | 0.7453 | 0.8633 |
| 0.5075 | 1.9898 | 778 | 0.8388 | 0.7756 | 0.8388 | 0.9158 |
| 0.5075 | 1.9949 | 780 | 1.0073 | 0.7014 | 1.0073 | 1.0036 |
| 0.5075 | 2.0 | 782 | 0.9634 | 0.7081 | 0.9634 | 0.9816 |
| 0.5075 | 2.0051 | 784 | 0.7917 | 0.7430 | 0.7917 | 0.8898 |
| 0.5075 | 2.0102 | 786 | 0.6532 | 0.8171 | 0.6532 | 0.8082 |
| 0.5075 | 2.0153 | 788 | 0.6354 | 0.7990 | 0.6354 | 0.7971 |
| 0.5075 | 2.0205 | 790 | 0.7059 | 0.7910 | 0.7059 | 0.8402 |
| 0.5075 | 2.0256 | 792 | 0.9409 | 0.7421 | 0.9409 | 0.9700 |
| 0.5075 | 2.0307 | 794 | 1.1616 | 0.7502 | 1.1616 | 1.0778 |
| 0.5075 | 2.0358 | 796 | 1.1253 | 0.7944 | 1.1253 | 1.0608 |
| 0.5075 | 2.0409 | 798 | 0.9609 | 0.7947 | 0.9609 | 0.9802 |
| 0.5075 | 2.0460 | 800 | 0.8319 | 0.7259 | 0.8319 | 0.9121 |
| 0.5075 | 2.0512 | 802 | 0.8681 | 0.7244 | 0.8681 | 0.9317 |
| 0.5075 | 2.0563 | 804 | 0.9234 | 0.7337 | 0.9234 | 0.9609 |
| 0.5075 | 2.0614 | 806 | 0.8479 | 0.7522 | 0.8479 | 0.9208 |
| 0.5075 | 2.0665 | 808 | 0.8113 | 0.7519 | 0.8113 | 0.9007 |
| 0.5075 | 2.0716 | 810 | 0.8768 | 0.7849 | 0.8768 | 0.9364 |
| 0.5075 | 2.0767 | 812 | 0.8101 | 0.7716 | 0.8101 | 0.9001 |
| 0.5075 | 2.0818 | 814 | 0.8197 | 0.7445 | 0.8197 | 0.9054 |
| 0.5075 | 2.0870 | 816 | 0.8325 | 0.7593 | 0.8325 | 0.9124 |
| 0.5075 | 2.0921 | 818 | 0.9011 | 0.7759 | 0.9011 | 0.9493 |
| 0.5075 | 2.0972 | 820 | 1.0210 | 0.6866 | 1.0210 | 1.0104 |
| 0.5075 | 2.1023 | 822 | 0.9564 | 0.6802 | 0.9564 | 0.9779 |
| 0.5075 | 2.1074 | 824 | 0.8065 | 0.8293 | 0.8065 | 0.8981 |
| 0.5075 | 2.1125 | 826 | 0.7455 | 0.8123 | 0.7455 | 0.8634 |
| 0.5075 | 2.1176 | 828 | 0.8014 | 0.7852 | 0.8014 | 0.8952 |
| 0.5075 | 2.1228 | 830 | 0.8530 | 0.7518 | 0.8530 | 0.9236 |
| 0.5075 | 2.1279 | 832 | 1.0189 | 0.6738 | 1.0189 | 1.0094 |
| 0.5075 | 2.1330 | 834 | 1.0800 | 0.6797 | 1.0800 | 1.0392 |
| 0.5075 | 2.1381 | 836 | 0.9920 | 0.6866 | 0.9920 | 0.9960 |
| 0.5075 | 2.1432 | 838 | 0.7895 | 0.7518 | 0.7895 | 0.8885 |
| 0.5075 | 2.1483 | 840 | 0.6395 | 0.7886 | 0.6395 | 0.7997 |
| 0.5075 | 2.1535 | 842 | 0.6168 | 0.7789 | 0.6168 | 0.7853 |
| 0.5075 | 2.1586 | 844 | 0.6903 | 0.8295 | 0.6903 | 0.8309 |
| 0.5075 | 2.1637 | 846 | 0.8306 | 0.7518 | 0.8306 | 0.9114 |
| 0.5075 | 2.1688 | 848 | 0.9566 | 0.7337 | 0.9566 | 0.9780 |
| 0.5075 | 2.1739 | 850 | 0.9045 | 0.7337 | 0.9045 | 0.9511 |
| 0.5075 | 2.1790 | 852 | 0.9188 | 0.7421 | 0.9188 | 0.9585 |
| 0.5075 | 2.1841 | 854 | 1.0379 | 0.7511 | 1.0379 | 1.0188 |
| 0.5075 | 2.1893 | 856 | 1.0469 | 0.7511 | 1.0469 | 1.0232 |
| 0.5075 | 2.1944 | 858 | 1.1013 | 0.7511 | 1.1013 | 1.0494 |
| 0.5075 | 2.1995 | 860 | 1.0348 | 0.7511 | 1.0348 | 1.0173 |
| 0.5075 | 2.2046 | 862 | 0.8443 | 0.7337 | 0.8443 | 0.9189 |
| 0.5075 | 2.2097 | 864 | 0.7072 | 0.7346 | 0.7072 | 0.8409 |
| 0.5075 | 2.2148 | 866 | 0.7338 | 0.7346 | 0.7338 | 0.8566 |
| 0.5075 | 2.2199 | 868 | 0.8855 | 0.7421 | 0.8855 | 0.9410 |
| 0.5075 | 2.2251 | 870 | 1.0990 | 0.7417 | 1.0990 | 1.0483 |
| 0.5075 | 2.2302 | 872 | 1.2641 | 0.6860 | 1.2641 | 1.1243 |
| 0.5075 | 2.2353 | 874 | 1.2085 | 0.6860 | 1.2085 | 1.0993 |
| 0.5075 | 2.2404 | 876 | 1.0439 | 0.7511 | 1.0439 | 1.0217 |
| 0.5075 | 2.2455 | 878 | 0.9690 | 0.7511 | 0.9690 | 0.9844 |
| 0.5075 | 2.2506 | 880 | 0.8315 | 0.7729 | 0.8315 | 0.9119 |
| 0.5075 | 2.2558 | 882 | 0.7510 | 0.7623 | 0.7510 | 0.8666 |
| 0.5075 | 2.2609 | 884 | 0.7671 | 0.7902 | 0.7671 | 0.8759 |
| 0.5075 | 2.2660 | 886 | 0.8357 | 0.7337 | 0.8357 | 0.9142 |
| 0.5075 | 2.2711 | 888 | 0.9812 | 0.7333 | 0.9812 | 0.9906 |
| 0.5075 | 2.2762 | 890 | 0.9855 | 0.7333 | 0.9855 | 0.9927 |
| 0.5075 | 2.2813 | 892 | 0.9255 | 0.7255 | 0.9255 | 0.9620 |
| 0.5075 | 2.2864 | 894 | 0.7733 | 0.7759 | 0.7733 | 0.8794 |
| 0.5075 | 2.2916 | 896 | 0.7467 | 0.7852 | 0.7467 | 0.8641 |
| 0.5075 | 2.2967 | 898 | 0.8576 | 0.7259 | 0.8576 | 0.9261 |
| 0.5075 | 2.3018 | 900 | 0.9894 | 0.7255 | 0.9894 | 0.9947 |
| 0.5075 | 2.3069 | 902 | 1.0213 | 0.7255 | 1.0213 | 1.0106 |
| 0.5075 | 2.3120 | 904 | 0.9987 | 0.7255 | 0.9987 | 0.9994 |
| 0.5075 | 2.3171 | 906 | 0.9179 | 0.7337 | 0.9179 | 0.9581 |
| 0.5075 | 2.3223 | 908 | 0.8079 | 0.7342 | 0.8079 | 0.8989 |
| 0.5075 | 2.3274 | 910 | 0.7996 | 0.7342 | 0.7996 | 0.8942 |
| 0.5075 | 2.3325 | 912 | 0.8753 | 0.7259 | 0.8753 | 0.9356 |
| 0.5075 | 2.3376 | 914 | 1.0428 | 0.7255 | 1.0428 | 1.0212 |
| 0.5075 | 2.3427 | 916 | 1.2219 | 0.6797 | 1.2219 | 1.1054 |
| 0.5075 | 2.3478 | 918 | 1.2670 | 0.6260 | 1.2670 | 1.1256 |
| 0.5075 | 2.3529 | 920 | 1.1984 | 0.6303 | 1.1984 | 1.0947 |
| 0.5075 | 2.3581 | 922 | 1.0727 | 0.7511 | 1.0727 | 1.0357 |
| 0.5075 | 2.3632 | 924 | 1.0132 | 0.7511 | 1.0132 | 1.0066 |
| 0.5075 | 2.3683 | 926 | 1.0931 | 0.6934 | 1.0931 | 1.0455 |
| 0.5075 | 2.3734 | 928 | 1.2974 | 0.6303 | 1.2974 | 1.1390 |
| 0.5075 | 2.3785 | 930 | 1.3794 | 0.6303 | 1.3794 | 1.1745 |
| 0.5075 | 2.3836 | 932 | 1.2617 | 0.6303 | 1.2617 | 1.1233 |
| 0.5075 | 2.3887 | 934 | 1.1074 | 0.6797 | 1.1074 | 1.0523 |
| 0.5075 | 2.3939 | 936 | 1.1169 | 0.6797 | 1.1169 | 1.0568 |
| 0.5075 | 2.3990 | 938 | 1.1822 | 0.6797 | 1.1822 | 1.0873 |
| 0.5075 | 2.4041 | 940 | 1.1475 | 0.6797 | 1.1475 | 1.0712 |
| 0.5075 | 2.4092 | 942 | 1.0258 | 0.6738 | 1.0258 | 1.0128 |
| 0.5075 | 2.4143 | 944 | 0.8731 | 0.7346 | 0.8731 | 0.9344 |
| 0.5075 | 2.4194 | 946 | 0.7770 | 0.7610 | 0.7770 | 0.8815 |
| 0.5075 | 2.4246 | 948 | 0.7857 | 0.7610 | 0.7857 | 0.8864 |
| 0.5075 | 2.4297 | 950 | 0.8307 | 0.7522 | 0.8307 | 0.9114 |
| 0.5075 | 2.4348 | 952 | 0.8700 | 0.7672 | 0.8700 | 0.9327 |
| 0.5075 | 2.4399 | 954 | 0.8679 | 0.7672 | 0.8679 | 0.9316 |
| 0.5075 | 2.4450 | 956 | 0.8410 | 0.7672 | 0.8410 | 0.9171 |
| 0.5075 | 2.4501 | 958 | 0.8271 | 0.7610 | 0.8271 | 0.9094 |
| 0.5075 | 2.4552 | 960 | 0.8548 | 0.7852 | 0.8548 | 0.9246 |
| 0.5075 | 2.4604 | 962 | 0.8927 | 0.7672 | 0.8927 | 0.9448 |
| 0.5075 | 2.4655 | 964 | 0.8678 | 0.7672 | 0.8678 | 0.9316 |
| 0.5075 | 2.4706 | 966 | 0.8360 | 0.7672 | 0.8360 | 0.9143 |
| 0.5075 | 2.4757 | 968 | 0.8320 | 0.7759 | 0.8320 | 0.9121 |
| 0.5075 | 2.4808 | 970 | 0.8198 | 0.7700 | 0.8198 | 0.9055 |
| 0.5075 | 2.4859 | 972 | 0.7304 | 0.7610 | 0.7304 | 0.8546 |
| 0.5075 | 2.4910 | 974 | 0.6352 | 0.7529 | 0.6352 | 0.7970 |
| 0.5075 | 2.4962 | 976 | 0.6208 | 0.7529 | 0.6208 | 0.7879 |
| 0.5075 | 2.5013 | 978 | 0.7065 | 0.7525 | 0.7065 | 0.8405 |
| 0.5075 | 2.5064 | 980 | 0.8503 | 0.7255 | 0.8503 | 0.9221 |
| 0.5075 | 2.5115 | 982 | 0.9419 | 0.7333 | 0.9419 | 0.9705 |
| 0.5075 | 2.5166 | 984 | 0.9286 | 0.7333 | 0.9286 | 0.9637 |
| 0.5075 | 2.5217 | 986 | 0.8852 | 0.7333 | 0.8852 | 0.9408 |
| 0.5075 | 2.5269 | 988 | 0.8849 | 0.7333 | 0.8849 | 0.9407 |
| 0.5075 | 2.5320 | 990 | 0.8290 | 0.7255 | 0.8290 | 0.9105 |
| 0.5075 | 2.5371 | 992 | 0.7753 | 0.7350 | 0.7753 | 0.8805 |
| 0.5075 | 2.5422 | 994 | 0.7986 | 0.7086 | 0.7986 | 0.8937 |
| 0.5075 | 2.5473 | 996 | 0.9149 | 0.7333 | 0.9149 | 0.9565 |
| 0.5075 | 2.5524 | 998 | 0.8668 | 0.7598 | 0.8668 | 0.9310 |
| 0.1762 | 2.5575 | 1000 | 0.7745 | 0.7597 | 0.7745 | 0.8800 |
| 0.1762 | 2.5627 | 1002 | 0.6135 | 0.8022 | 0.6135 | 0.7832 |
| 0.1762 | 2.5678 | 1004 | 0.5577 | 0.7773 | 0.5577 | 0.7468 |
| 0.1762 | 2.5729 | 1006 | 0.5899 | 0.7603 | 0.5899 | 0.7681 |
| 0.1762 | 2.5780 | 1008 | 0.7404 | 0.7445 | 0.7404 | 0.8605 |
| 0.1762 | 2.5831 | 1010 | 0.8310 | 0.7593 | 0.8310 | 0.9116 |
| 0.1762 | 2.5882 | 1012 | 0.7640 | 0.7593 | 0.7640 | 0.8741 |
| 0.1762 | 2.5934 | 1014 | 0.6595 | 0.7522 | 0.6595 | 0.8121 |
| 0.1762 | 2.5985 | 1016 | 0.6780 | 0.7601 | 0.6780 | 0.8234 |
| 0.1762 | 2.6036 | 1018 | 0.8036 | 0.7849 | 0.8036 | 0.8965 |
| 0.1762 | 2.6087 | 1020 | 0.9673 | 0.7594 | 0.9673 | 0.9835 |
| 0.1762 | 2.6138 | 1022 | 1.1890 | 0.7586 | 1.1890 | 1.0904 |
| 0.1762 | 2.6189 | 1024 | 1.2370 | 0.7882 | 1.2370 | 1.1122 |
| 0.1762 | 2.6240 | 1026 | 1.0680 | 0.7586 | 1.0680 | 1.0334 |
| 0.1762 | 2.6292 | 1028 | 0.7920 | 0.7852 | 0.7920 | 0.8899 |
| 0.1762 | 2.6343 | 1030 | 0.6052 | 0.7689 | 0.6052 | 0.7779 |
| 0.1762 | 2.6394 | 1032 | 0.5520 | 0.7367 | 0.5520 | 0.7430 |
| 0.1762 | 2.6445 | 1034 | 0.5925 | 0.7766 | 0.5925 | 0.7697 |
| 0.1762 | 2.6496 | 1036 | 0.7418 | 0.7601 | 0.7418 | 0.8613 |
| 0.1762 | 2.6547 | 1038 | 1.0496 | 0.7882 | 1.0496 | 1.0245 |
| 0.1762 | 2.6598 | 1040 | 1.2622 | 0.8144 | 1.2622 | 1.1235 |
| 0.1762 | 2.6650 | 1042 | 1.2282 | 0.7882 | 1.2282 | 1.1083 |
| 0.1762 | 2.6701 | 1044 | 1.0325 | 0.7908 | 1.0325 | 1.0161 |
| 0.1762 | 2.6752 | 1046 | 0.9060 | 0.7421 | 0.9060 | 0.9518 |
| 0.1762 | 2.6803 | 1048 | 0.8154 | 0.7516 | 0.8154 | 0.9030 |
| 0.1762 | 2.6854 | 1050 | 0.7909 | 0.7433 | 0.7909 | 0.8893 |
| 0.1762 | 2.6905 | 1052 | 0.7434 | 0.7601 | 0.7434 | 0.8622 |
| 0.1762 | 2.6957 | 1054 | 0.6838 | 0.7601 | 0.6838 | 0.8269 |
| 0.1762 | 2.7008 | 1056 | 0.7232 | 0.7601 | 0.7232 | 0.8504 |
| 0.1762 | 2.7059 | 1058 | 0.7077 | 0.7522 | 0.7077 | 0.8413 |
| 0.1762 | 2.7110 | 1060 | 0.6500 | 0.7605 | 0.6500 | 0.8062 |
| 0.1762 | 2.7161 | 1062 | 0.6557 | 0.7605 | 0.6557 | 0.8097 |
| 0.1762 | 2.7212 | 1064 | 0.7320 | 0.7522 | 0.7320 | 0.8556 |
| 0.1762 | 2.7263 | 1066 | 0.8764 | 0.7672 | 0.8764 | 0.9361 |
| 0.1762 | 2.7315 | 1068 | 0.9354 | 0.7953 | 0.9354 | 0.9672 |
| 0.1762 | 2.7366 | 1070 | 0.8874 | 0.7435 | 0.8874 | 0.9420 |
| 0.1762 | 2.7417 | 1072 | 0.8196 | 0.7759 | 0.8196 | 0.9053 |
| 0.1762 | 2.7468 | 1074 | 0.7181 | 0.7689 | 0.7181 | 0.8474 |
| 0.1762 | 2.7519 | 1076 | 0.7195 | 0.7689 | 0.7195 | 0.8482 |
| 0.1762 | 2.7570 | 1078 | 0.7736 | 0.7852 | 0.7736 | 0.8796 |
| 0.1762 | 2.7621 | 1080 | 0.8289 | 0.7435 | 0.8289 | 0.9104 |
| 0.1762 | 2.7673 | 1082 | 0.8751 | 0.7435 | 0.8751 | 0.9354 |
| 0.1762 | 2.7724 | 1084 | 0.9846 | 0.7255 | 0.9846 | 0.9922 |
| 0.1762 | 2.7775 | 1086 | 1.1573 | 0.7342 | 1.1573 | 1.0758 |
| 0.1762 | 2.7826 | 1088 | 1.1705 | 0.7342 | 1.1705 | 1.0819 |
| 0.1762 | 2.7877 | 1090 | 1.0142 | 0.7255 | 1.0142 | 1.0071 |
| 0.1762 | 2.7928 | 1092 | 0.7830 | 0.7606 | 0.7830 | 0.8848 |
| 0.1762 | 2.7980 | 1094 | 0.6257 | 0.7786 | 0.6257 | 0.7910 |
| 0.1762 | 2.8031 | 1096 | 0.5682 | 0.7786 | 0.5682 | 0.7538 |
| 0.1762 | 2.8082 | 1098 | 0.5395 | 0.7696 | 0.5395 | 0.7345 |
| 0.1762 | 2.8133 | 1100 | 0.5744 | 0.7696 | 0.5744 | 0.7579 |
| 0.1762 | 2.8184 | 1102 | 0.6883 | 0.7605 | 0.6883 | 0.8297 |
| 0.1762 | 2.8235 | 1104 | 0.8674 | 0.7672 | 0.8674 | 0.9313 |
| 0.1762 | 2.8286 | 1106 | 1.0343 | 0.7514 | 1.0343 | 1.0170 |
| 0.1762 | 2.8338 | 1108 | 1.0090 | 0.7514 | 1.0090 | 1.0045 |
| 0.1762 | 2.8389 | 1110 | 0.9103 | 0.7696 | 0.9103 | 0.9541 |
| 0.1762 | 2.8440 | 1112 | 0.7739 | 0.7430 | 0.7739 | 0.8797 |
| 0.1762 | 2.8491 | 1114 | 0.6572 | 0.7882 | 0.6572 | 0.8107 |
| 0.1762 | 2.8542 | 1116 | 0.6428 | 0.7882 | 0.6428 | 0.8018 |
| 0.1762 | 2.8593 | 1118 | 0.6597 | 0.7882 | 0.6597 | 0.8122 |
| 0.1762 | 2.8645 | 1120 | 0.7645 | 0.7437 | 0.7645 | 0.8744 |
| 0.1762 | 2.8696 | 1122 | 0.8515 | 0.7437 | 0.8515 | 0.9228 |
| 0.1762 | 2.8747 | 1124 | 0.8358 | 0.7724 | 0.8358 | 0.9142 |
| 0.1762 | 2.8798 | 1126 | 0.7281 | 0.7605 | 0.7281 | 0.8533 |
| 0.1762 | 2.8849 | 1128 | 0.6854 | 0.7605 | 0.6854 | 0.8279 |
| 0.1762 | 2.8900 | 1130 | 0.6179 | 0.7605 | 0.6179 | 0.7861 |
| 0.1762 | 2.8951 | 1132 | 0.6147 | 0.7605 | 0.6147 | 0.7840 |
| 0.1762 | 2.9003 | 1134 | 0.6183 | 0.7752 | 0.6183 | 0.7863 |
| 0.1762 | 2.9054 | 1136 | 0.6226 | 0.7605 | 0.6226 | 0.7891 |
| 0.1762 | 2.9105 | 1138 | 0.6341 | 0.7605 | 0.6341 | 0.7963 |
| 0.1762 | 2.9156 | 1140 | 0.6974 | 0.7522 | 0.6974 | 0.8351 |
| 0.1762 | 2.9207 | 1142 | 0.7980 | 0.7675 | 0.7980 | 0.8933 |
| 0.1762 | 2.9258 | 1144 | 0.7815 | 0.7675 | 0.7815 | 0.8840 |
| 0.1762 | 2.9309 | 1146 | 0.7238 | 0.8057 | 0.7238 | 0.8508 |
| 0.1762 | 2.9361 | 1148 | 0.6517 | 0.7786 | 0.6517 | 0.8073 |
| 0.1762 | 2.9412 | 1150 | 0.6555 | 0.7882 | 0.6555 | 0.8096 |
| 0.1762 | 2.9463 | 1152 | 0.6543 | 0.7882 | 0.6543 | 0.8089 |
| 0.1762 | 2.9514 | 1154 | 0.6846 | 0.8057 | 0.6846 | 0.8274 |
| 0.1762 | 2.9565 | 1156 | 0.7083 | 0.8057 | 0.7083 | 0.8416 |
| 0.1762 | 2.9616 | 1158 | 0.7207 | 0.7526 | 0.7207 | 0.8489 |
| 0.1762 | 2.9668 | 1160 | 0.7315 | 0.7852 | 0.7315 | 0.8553 |
| 0.1762 | 2.9719 | 1162 | 0.7724 | 0.7663 | 0.7724 | 0.8789 |
| 0.1762 | 2.9770 | 1164 | 0.7468 | 0.7663 | 0.7468 | 0.8642 |
| 0.1762 | 2.9821 | 1166 | 0.6953 | 0.7675 | 0.6953 | 0.8339 |
| 0.1762 | 2.9872 | 1168 | 0.7460 | 0.7675 | 0.7460 | 0.8637 |
| 0.1762 | 2.9923 | 1170 | 0.8607 | 0.7518 | 0.8607 | 0.9277 |
| 0.1762 | 2.9974 | 1172 | 0.8725 | 0.7518 | 0.8725 | 0.9341 |
| 0.1762 | 3.0026 | 1174 | 0.8641 | 0.7518 | 0.8641 | 0.9296 |
| 0.1762 | 3.0077 | 1176 | 0.8186 | 0.7606 | 0.8186 | 0.9047 |
| 0.1762 | 3.0128 | 1178 | 0.8826 | 0.7342 | 0.8826 | 0.9395 |
| 0.1762 | 3.0179 | 1180 | 1.0208 | 0.7421 | 1.0208 | 1.0104 |
| 0.1762 | 3.0230 | 1182 | 1.1625 | 0.6860 | 1.1625 | 1.0782 |
| 0.1762 | 3.0281 | 1184 | 1.1418 | 0.6860 | 1.1418 | 1.0686 |
| 0.1762 | 3.0332 | 1186 | 1.0299 | 0.7333 | 1.0299 | 1.0149 |
| 0.1762 | 3.0384 | 1188 | 1.0102 | 0.7333 | 1.0102 | 1.0051 |
| 0.1762 | 3.0435 | 1190 | 0.9015 | 0.7337 | 0.9015 | 0.9495 |
| 0.1762 | 3.0486 | 1192 | 0.8093 | 0.7063 | 0.8093 | 0.8996 |
| 0.1762 | 3.0537 | 1194 | 0.8355 | 0.7342 | 0.8355 | 0.9141 |
| 0.1762 | 3.0588 | 1196 | 0.9180 | 0.7426 | 0.9180 | 0.9581 |
| 0.1762 | 3.0639 | 1198 | 0.8930 | 0.7221 | 0.8930 | 0.9450 |
| 0.1762 | 3.0691 | 1200 | 0.8618 | 0.7729 | 0.8618 | 0.9283 |
| 0.1762 | 3.0742 | 1202 | 0.8570 | 0.7729 | 0.8570 | 0.9258 |
| 0.1762 | 3.0793 | 1204 | 0.8210 | 0.7729 | 0.8210 | 0.9061 |
| 0.1762 | 3.0844 | 1206 | 0.7620 | 0.7525 | 0.7620 | 0.8729 |
| 0.1762 | 3.0895 | 1208 | 0.7238 | 0.7525 | 0.7238 | 0.8507 |
| 0.1762 | 3.0946 | 1210 | 0.7358 | 0.7801 | 0.7358 | 0.8578 |
| 0.1762 | 3.0997 | 1212 | 0.8017 | 0.6987 | 0.8017 | 0.8954 |
| 0.1762 | 3.1049 | 1214 | 0.8454 | 0.7182 | 0.8454 | 0.9195 |
| 0.1762 | 3.1100 | 1216 | 0.9564 | 0.7255 | 0.9564 | 0.9780 |
| 0.1762 | 3.1151 | 1218 | 0.8981 | 0.7182 | 0.8981 | 0.9477 |
| 0.1762 | 3.1202 | 1220 | 0.7293 | 0.7522 | 0.7293 | 0.8540 |
| 0.1762 | 3.1253 | 1222 | 0.6318 | 0.7358 | 0.6318 | 0.7949 |
| 0.1762 | 3.1304 | 1224 | 0.6443 | 0.7522 | 0.6443 | 0.8027 |
| 0.1762 | 3.1355 | 1226 | 0.6319 | 0.7358 | 0.6319 | 0.7949 |
| 0.1762 | 3.1407 | 1228 | 0.6628 | 0.7522 | 0.6628 | 0.8141 |
| 0.1762 | 3.1458 | 1230 | 0.7275 | 0.7672 | 0.7275 | 0.8529 |
| 0.1762 | 3.1509 | 1232 | 0.8678 | 0.7672 | 0.8678 | 0.9316 |
| 0.1762 | 3.1560 | 1234 | 0.9162 | 0.7891 | 0.9162 | 0.9572 |
| 0.1762 | 3.1611 | 1236 | 0.8168 | 0.7672 | 0.8168 | 0.9037 |
| 0.1762 | 3.1662 | 1238 | 0.6908 | 0.7593 | 0.6908 | 0.8312 |
| 0.1762 | 3.1714 | 1240 | 0.5865 | 0.7786 | 0.5865 | 0.7658 |
| 0.1762 | 3.1765 | 1242 | 0.5540 | 0.7862 | 0.5540 | 0.7443 |
| 0.1762 | 3.1816 | 1244 | 0.5885 | 0.7786 | 0.5885 | 0.7672 |
| 0.1762 | 3.1867 | 1246 | 0.6732 | 0.7692 | 0.6732 | 0.8205 |
| 0.1762 | 3.1918 | 1248 | 0.7324 | 0.7518 | 0.7324 | 0.8558 |
| 0.1762 | 3.1969 | 1250 | 0.8167 | 0.7672 | 0.8167 | 0.9037 |
| 0.1762 | 3.2020 | 1252 | 0.8068 | 0.7672 | 0.8068 | 0.8982 |
| 0.1762 | 3.2072 | 1254 | 0.7380 | 0.7518 | 0.7380 | 0.8591 |
| 0.1762 | 3.2123 | 1256 | 0.6137 | 0.7786 | 0.6137 | 0.7834 |
| 0.1762 | 3.2174 | 1258 | 0.5258 | 0.7862 | 0.5258 | 0.7251 |
| 0.1762 | 3.2225 | 1260 | 0.5037 | 0.7933 | 0.5037 | 0.7097 |
| 0.1762 | 3.2276 | 1262 | 0.5186 | 0.7933 | 0.5186 | 0.7201 |
| 0.1762 | 3.2327 | 1264 | 0.5850 | 0.7608 | 0.5850 | 0.7649 |
| 0.1762 | 3.2379 | 1266 | 0.7252 | 0.7445 | 0.7252 | 0.8516 |
| 0.1762 | 3.2430 | 1268 | 0.8301 | 0.7445 | 0.8301 | 0.9111 |
| 0.1762 | 3.2481 | 1270 | 0.8080 | 0.7445 | 0.8080 | 0.8989 |
| 0.1762 | 3.2532 | 1272 | 0.6896 | 0.7924 | 0.6896 | 0.8304 |
| 0.1762 | 3.2583 | 1274 | 0.6581 | 0.7786 | 0.6581 | 0.8112 |
| 0.1762 | 3.2634 | 1276 | 0.7478 | 0.8120 | 0.7478 | 0.8648 |
| 0.1762 | 3.2685 | 1278 | 0.8356 | 0.8051 | 0.8356 | 0.9141 |
| 0.1762 | 3.2737 | 1280 | 0.9472 | 0.7255 | 0.9472 | 0.9732 |
| 0.1762 | 3.2788 | 1282 | 0.9352 | 0.7255 | 0.9352 | 0.9670 |
| 0.1762 | 3.2839 | 1284 | 0.8363 | 0.7139 | 0.8363 | 0.9145 |
| 0.1762 | 3.2890 | 1286 | 0.7191 | 0.7337 | 0.7191 | 0.8480 |
| 0.1762 | 3.2941 | 1288 | 0.6807 | 0.7882 | 0.6807 | 0.8250 |
| 0.1762 | 3.2992 | 1290 | 0.7235 | 0.7801 | 0.7235 | 0.8506 |
| 0.1762 | 3.3043 | 1292 | 0.8575 | 0.7182 | 0.8575 | 0.9260 |
| 0.1762 | 3.3095 | 1294 | 1.0346 | 0.7182 | 1.0346 | 1.0171 |
| 0.1762 | 3.3146 | 1296 | 1.1295 | 0.7510 | 1.1295 | 1.0628 |
| 0.1762 | 3.3197 | 1298 | 1.0640 | 0.7431 | 1.0640 | 1.0315 |
| 0.1762 | 3.3248 | 1300 | 1.0225 | 0.7182 | 1.0225 | 1.0112 |
| 0.1762 | 3.3299 | 1302 | 0.9866 | 0.7182 | 0.9866 | 0.9933 |
| 0.1762 | 3.3350 | 1304 | 0.9609 | 0.7182 | 0.9609 | 0.9803 |
| 0.1762 | 3.3402 | 1306 | 0.8950 | 0.7435 | 0.8950 | 0.9461 |
| 0.1762 | 3.3453 | 1308 | 0.9036 | 0.7259 | 0.9036 | 0.9506 |
| 0.1762 | 3.3504 | 1310 | 0.9497 | 0.7134 | 0.9497 | 0.9745 |
| 0.1762 | 3.3555 | 1312 | 0.9502 | 0.7134 | 0.9502 | 0.9748 |
| 0.1762 | 3.3606 | 1314 | 0.8725 | 0.7058 | 0.8725 | 0.9341 |
| 0.1762 | 3.3657 | 1316 | 0.7518 | 0.7620 | 0.7518 | 0.8671 |
| 0.1762 | 3.3708 | 1318 | 0.6653 | 0.7786 | 0.6653 | 0.8157 |
| 0.1762 | 3.3760 | 1320 | 0.6617 | 0.7363 | 0.6617 | 0.8134 |
| 0.1762 | 3.3811 | 1322 | 0.7116 | 0.7522 | 0.7116 | 0.8436 |
| 0.1762 | 3.3862 | 1324 | 0.7657 | 0.7445 | 0.7657 | 0.8750 |
| 0.1762 | 3.3913 | 1326 | 0.7931 | 0.7435 | 0.7931 | 0.8905 |
| 0.1762 | 3.3964 | 1328 | 0.8109 | 0.7350 | 0.8109 | 0.9005 |
| 0.1762 | 3.4015 | 1330 | 0.7371 | 0.7521 | 0.7371 | 0.8586 |
| 0.1762 | 3.4066 | 1332 | 0.6703 | 0.7717 | 0.6703 | 0.8187 |
| 0.1762 | 3.4118 | 1334 | 0.6886 | 0.7332 | 0.6886 | 0.8298 |
| 0.1762 | 3.4169 | 1336 | 0.7884 | 0.7612 | 0.7884 | 0.8879 |
| 0.1762 | 3.4220 | 1338 | 0.8466 | 0.7598 | 0.8466 | 0.9201 |
| 0.1762 | 3.4271 | 1340 | 0.8541 | 0.7598 | 0.8541 | 0.9242 |
| 0.1762 | 3.4322 | 1342 | 0.7976 | 0.7429 | 0.7976 | 0.8931 |
| 0.1762 | 3.4373 | 1344 | 0.7129 | 0.7350 | 0.7129 | 0.8443 |
| 0.1762 | 3.4425 | 1346 | 0.6697 | 0.6927 | 0.6697 | 0.8183 |
| 0.1762 | 3.4476 | 1348 | 0.6439 | 0.7196 | 0.6439 | 0.8025 |
| 0.1762 | 3.4527 | 1350 | 0.6521 | 0.7692 | 0.6521 | 0.8075 |
| 0.1762 | 3.4578 | 1352 | 0.6399 | 0.7786 | 0.6399 | 0.7999 |
| 0.1762 | 3.4629 | 1354 | 0.6444 | 0.7618 | 0.6444 | 0.8027 |
| 0.1762 | 3.4680 | 1356 | 0.7005 | 0.7525 | 0.7005 | 0.8370 |
| 0.1762 | 3.4731 | 1358 | 0.7631 | 0.7095 | 0.7631 | 0.8736 |
| 0.1762 | 3.4783 | 1360 | 0.7567 | 0.7028 | 0.7567 | 0.8699 |
| 0.1762 | 3.4834 | 1362 | 0.7677 | 0.7281 | 0.7677 | 0.8762 |
| 0.1762 | 3.4885 | 1364 | 0.7160 | 0.7522 | 0.7160 | 0.8462 |
| 0.1762 | 3.4936 | 1366 | 0.6629 | 0.7128 | 0.6629 | 0.8142 |
| 0.1762 | 3.4987 | 1368 | 0.6394 | 0.7128 | 0.6394 | 0.7996 |
| 0.1762 | 3.5038 | 1370 | 0.6440 | 0.7128 | 0.6440 | 0.8025 |
| 0.1762 | 3.5090 | 1372 | 0.6968 | 0.7522 | 0.6968 | 0.8347 |
| 0.1762 | 3.5141 | 1374 | 0.6984 | 0.7522 | 0.6984 | 0.8357 |
| 0.1762 | 3.5192 | 1376 | 0.6757 | 0.7522 | 0.6757 | 0.8220 |
| 0.1762 | 3.5243 | 1378 | 0.6720 | 0.7748 | 0.6720 | 0.8197 |
| 0.1762 | 3.5294 | 1380 | 0.6158 | 0.7451 | 0.6158 | 0.7847 |
| 0.1762 | 3.5345 | 1382 | 0.5916 | 0.7451 | 0.5916 | 0.7691 |
| 0.1762 | 3.5396 | 1384 | 0.6089 | 0.7451 | 0.6089 | 0.7803 |
| 0.1762 | 3.5448 | 1386 | 0.6600 | 0.7526 | 0.6600 | 0.8124 |
| 0.1762 | 3.5499 | 1388 | 0.7308 | 0.7522 | 0.7308 | 0.8548 |
| 0.1762 | 3.5550 | 1390 | 0.8602 | 0.7350 | 0.8602 | 0.9275 |
| 0.1762 | 3.5601 | 1392 | 0.9134 | 0.7424 | 0.9134 | 0.9557 |
| 0.1762 | 3.5652 | 1394 | 0.8665 | 0.7337 | 0.8665 | 0.9309 |
| 0.1762 | 3.5703 | 1396 | 0.8472 | 0.7139 | 0.8472 | 0.9205 |
| 0.1762 | 3.5754 | 1398 | 0.7720 | 0.7717 | 0.7720 | 0.8786 |
| 0.1762 | 3.5806 | 1400 | 0.7343 | 0.7717 | 0.7343 | 0.8569 |
| 0.1762 | 3.5857 | 1402 | 0.7411 | 0.7717 | 0.7411 | 0.8609 |
| 0.1762 | 3.5908 | 1404 | 0.7369 | 0.7437 | 0.7369 | 0.8584 |
| 0.1762 | 3.5959 | 1406 | 0.7270 | 0.7437 | 0.7270 | 0.8526 |
| 0.1762 | 3.6010 | 1408 | 0.7005 | 0.7358 | 0.7005 | 0.8370 |
| 0.1762 | 3.6061 | 1410 | 0.6892 | 0.7605 | 0.6892 | 0.8302 |
| 0.1762 | 3.6113 | 1412 | 0.7153 | 0.7605 | 0.7153 | 0.8458 |
| 0.1762 | 3.6164 | 1414 | 0.7779 | 0.7354 | 0.7779 | 0.8820 |
| 0.1762 | 3.6215 | 1416 | 0.8957 | 0.7358 | 0.8957 | 0.9464 |
| 0.1762 | 3.6266 | 1418 | 1.0643 | 0.7427 | 1.0643 | 1.0317 |
| 0.1762 | 3.6317 | 1420 | 1.1203 | 0.7427 | 1.1203 | 1.0585 |
| 0.1762 | 3.6368 | 1422 | 1.0501 | 0.7081 | 1.0501 | 1.0248 |
| 0.1762 | 3.6419 | 1424 | 0.9627 | 0.7239 | 0.9627 | 0.9812 |
| 0.1762 | 3.6471 | 1426 | 0.9293 | 0.6696 | 0.9293 | 0.9640 |
| 0.1762 | 3.6522 | 1428 | 0.9547 | 0.6877 | 0.9547 | 0.9771 |
| 0.1762 | 3.6573 | 1430 | 1.0672 | 0.7412 | 1.0672 | 1.0330 |
| 0.1762 | 3.6624 | 1432 | 1.2003 | 0.7318 | 1.2003 | 1.0956 |
| 0.1762 | 3.6675 | 1434 | 1.2330 | 0.7504 | 1.2330 | 1.1104 |
| 0.1762 | 3.6726 | 1436 | 1.2227 | 0.7504 | 1.2227 | 1.1058 |
| 0.1762 | 3.6777 | 1438 | 1.0861 | 0.7318 | 1.0861 | 1.0422 |
| 0.1762 | 3.6829 | 1440 | 0.9666 | 0.6945 | 0.9666 | 0.9832 |
| 0.1762 | 3.6880 | 1442 | 0.9155 | 0.6945 | 0.9155 | 0.9568 |
| 0.1762 | 3.6931 | 1444 | 0.9278 | 0.6945 | 0.9278 | 0.9632 |
| 0.1762 | 3.6982 | 1446 | 0.9729 | 0.6945 | 0.9729 | 0.9864 |
| 0.1762 | 3.7033 | 1448 | 1.0218 | 0.7081 | 1.0218 | 1.0108 |
| 0.1762 | 3.7084 | 1450 | 1.1359 | 0.7420 | 1.1359 | 1.0658 |
| 0.1762 | 3.7136 | 1452 | 1.1585 | 0.7420 | 1.1585 | 1.0763 |
| 0.1762 | 3.7187 | 1454 | 1.0834 | 0.7342 | 1.0834 | 1.0409 |
| 0.1762 | 3.7238 | 1456 | 0.9253 | 0.6912 | 0.9253 | 0.9619 |
| 0.1762 | 3.7289 | 1458 | 0.8109 | 0.6696 | 0.8109 | 0.9005 |
| 0.1762 | 3.7340 | 1460 | 0.8175 | 0.6696 | 0.8175 | 0.9042 |
| 0.1762 | 3.7391 | 1462 | 0.9153 | 0.6982 | 0.9153 | 0.9567 |
| 0.1762 | 3.7442 | 1464 | 1.0306 | 0.7047 | 1.0306 | 1.0152 |
| 0.1762 | 3.7494 | 1466 | 1.0783 | 0.7230 | 1.0783 | 1.0384 |
| 0.1762 | 3.7545 | 1468 | 1.1952 | 0.7230 | 1.1952 | 1.0932 |
| 0.1762 | 3.7596 | 1470 | 1.3541 | 0.7225 | 1.3541 | 1.1636 |
| 0.1762 | 3.7647 | 1472 | 1.3835 | 0.7225 | 1.3835 | 1.1762 |
| 0.1762 | 3.7698 | 1474 | 1.3052 | 0.7225 | 1.3052 | 1.1425 |
| 0.1762 | 3.7749 | 1476 | 1.1851 | 0.7230 | 1.1851 | 1.0886 |
| 0.1762 | 3.7801 | 1478 | 1.0369 | 0.7318 | 1.0369 | 1.0183 |
| 0.1762 | 3.7852 | 1480 | 0.8844 | 0.7316 | 0.8844 | 0.9404 |
| 0.1762 | 3.7903 | 1482 | 0.8134 | 0.7525 | 0.8134 | 0.9019 |
| 0.1762 | 3.7954 | 1484 | 0.7765 | 0.7525 | 0.7765 | 0.8812 |
| 0.1762 | 3.8005 | 1486 | 0.8185 | 0.7025 | 0.8185 | 0.9047 |
| 0.1762 | 3.8056 | 1488 | 0.9420 | 0.7053 | 0.9420 | 0.9706 |
| 0.1762 | 3.8107 | 1490 | 1.0412 | 0.7153 | 1.0412 | 1.0204 |
| 0.1762 | 3.8159 | 1492 | 1.0121 | 0.7153 | 1.0121 | 1.0060 |
| 0.1762 | 3.8210 | 1494 | 0.9320 | 0.7239 | 0.9320 | 0.9654 |
| 0.1762 | 3.8261 | 1496 | 0.8567 | 0.7162 | 0.8567 | 0.9256 |
| 0.1762 | 3.8312 | 1498 | 0.8572 | 0.7090 | 0.8572 | 0.9259 |
| 0.1269 | 3.8363 | 1500 | 0.9071 | 0.6982 | 0.9071 | 0.9524 |
| 0.1269 | 3.8414 | 1502 | 0.9121 | 0.6982 | 0.9121 | 0.9551 |
| 0.1269 | 3.8465 | 1504 | 0.8209 | 0.7616 | 0.8209 | 0.9060 |
| 0.1269 | 3.8517 | 1506 | 0.7319 | 0.7782 | 0.7319 | 0.8555 |
| 0.1269 | 3.8568 | 1508 | 0.6703 | 0.7786 | 0.6703 | 0.8187 |
| 0.1269 | 3.8619 | 1510 | 0.6286 | 0.7786 | 0.6286 | 0.7928 |
| 0.1269 | 3.8670 | 1512 | 0.6360 | 0.7786 | 0.6360 | 0.7975 |
| 0.1269 | 3.8721 | 1514 | 0.6812 | 0.8022 | 0.6812 | 0.8253 |
| 0.1269 | 3.8772 | 1516 | 0.7280 | 0.7924 | 0.7280 | 0.8532 |
| 0.1269 | 3.8824 | 1518 | 0.8286 | 0.7259 | 0.8286 | 0.9103 |
| 0.1269 | 3.8875 | 1520 | 0.8883 | 0.7607 | 0.8883 | 0.9425 |
| 0.1269 | 3.8926 | 1522 | 0.8527 | 0.7618 | 0.8527 | 0.9234 |
| 0.1269 | 3.8977 | 1524 | 0.8034 | 0.7623 | 0.8034 | 0.8963 |
| 0.1269 | 3.9028 | 1526 | 0.7260 | 0.7620 | 0.7260 | 0.8520 |
| 0.1269 | 3.9079 | 1528 | 0.6599 | 0.7882 | 0.6599 | 0.8123 |
| 0.1269 | 3.9130 | 1530 | 0.6254 | 0.7882 | 0.6254 | 0.7908 |
| 0.1269 | 3.9182 | 1532 | 0.6485 | 0.7882 | 0.6485 | 0.8053 |
| 0.1269 | 3.9233 | 1534 | 0.7010 | 0.7689 | 0.7010 | 0.8373 |
| 0.1269 | 3.9284 | 1536 | 0.7547 | 0.7518 | 0.7547 | 0.8688 |
| 0.1269 | 3.9335 | 1538 | 0.8057 | 0.7672 | 0.8057 | 0.8976 |
| 0.1269 | 3.9386 | 1540 | 0.8677 | 0.7255 | 0.8677 | 0.9315 |
| 0.1269 | 3.9437 | 1542 | 0.8687 | 0.7255 | 0.8687 | 0.9320 |
| 0.1269 | 3.9488 | 1544 | 0.8944 | 0.7255 | 0.8944 | 0.9458 |
| 0.1269 | 3.9540 | 1546 | 0.8497 | 0.6982 | 0.8497 | 0.9218 |
| 0.1269 | 3.9591 | 1548 | 0.7934 | 0.7139 | 0.7934 | 0.8907 |
| 0.1269 | 3.9642 | 1550 | 0.7341 | 0.7427 | 0.7341 | 0.8568 |
| 0.1269 | 3.9693 | 1552 | 0.7399 | 0.7427 | 0.7399 | 0.8602 |
| 0.1269 | 3.9744 | 1554 | 0.7356 | 0.7427 | 0.7356 | 0.8577 |
| 0.1269 | 3.9795 | 1556 | 0.7739 | 0.6951 | 0.7739 | 0.8797 |
| 0.1269 | 3.9847 | 1558 | 0.8425 | 0.7182 | 0.8425 | 0.9179 |
| 0.1269 | 3.9898 | 1560 | 0.9220 | 0.7255 | 0.9220 | 0.9602 |
| 0.1269 | 3.9949 | 1562 | 0.8883 | 0.7435 | 0.8883 | 0.9425 |
| 0.1269 | 4.0 | 1564 | 0.8493 | 0.7435 | 0.8493 | 0.9216 |
| 0.1269 | 4.0051 | 1566 | 0.7599 | 0.7354 | 0.7599 | 0.8717 |
| 0.1269 | 4.0102 | 1568 | 0.7029 | 0.7172 | 0.7029 | 0.8384 |
| 0.1269 | 4.0153 | 1570 | 0.7278 | 0.6882 | 0.7278 | 0.8531 |
| 0.1269 | 4.0205 | 1572 | 0.8168 | 0.7139 | 0.8168 | 0.9038 |
| 0.1269 | 4.0256 | 1574 | 0.9023 | 0.7421 | 0.9023 | 0.9499 |
| 0.1269 | 4.0307 | 1576 | 0.9632 | 0.7333 | 0.9632 | 0.9814 |
| 0.1269 | 4.0358 | 1578 | 0.9352 | 0.7421 | 0.9352 | 0.9670 |
| 0.1269 | 4.0409 | 1580 | 0.8946 | 0.7221 | 0.8946 | 0.9458 |
| 0.1269 | 4.0460 | 1582 | 0.8564 | 0.7139 | 0.8564 | 0.9254 |
| 0.1269 | 4.0512 | 1584 | 0.8290 | 0.6951 | 0.8290 | 0.9105 |
| 0.1269 | 4.0563 | 1586 | 0.8488 | 0.7090 | 0.8488 | 0.9213 |
| 0.1269 | 4.0614 | 1588 | 0.8823 | 0.7259 | 0.8823 | 0.9393 |
| 0.1269 | 4.0665 | 1590 | 0.9405 | 0.7081 | 0.9405 | 0.9698 |
| 0.1269 | 4.0716 | 1592 | 1.0076 | 0.7081 | 1.0076 | 1.0038 |
| 0.1269 | 4.0767 | 1594 | 0.9525 | 0.7081 | 0.9525 | 0.9760 |
| 0.1269 | 4.0818 | 1596 | 0.8584 | 0.7086 | 0.8584 | 0.9265 |
| 0.1269 | 4.0870 | 1598 | 0.8591 | 0.7086 | 0.8591 | 0.9269 |
| 0.1269 | 4.0921 | 1600 | 0.9064 | 0.6871 | 0.9064 | 0.9520 |
| 0.1269 | 4.0972 | 1602 | 0.9485 | 0.7520 | 0.9485 | 0.9739 |
| 0.1269 | 4.1023 | 1604 | 1.0550 | 0.6414 | 1.0550 | 1.0271 |
| 0.1269 | 4.1074 | 1606 | 1.1320 | 0.6414 | 1.1320 | 1.0640 |
| 0.1269 | 4.1125 | 1608 | 1.1302 | 0.6414 | 1.1302 | 1.0631 |
| 0.1269 | 4.1176 | 1610 | 1.0511 | 0.7626 | 1.0511 | 1.0252 |
| 0.1269 | 4.1228 | 1612 | 0.9673 | 0.7413 | 0.9673 | 0.9835 |
| 0.1269 | 4.1279 | 1614 | 0.9126 | 0.7413 | 0.9126 | 0.9553 |
| 0.1269 | 4.1330 | 1616 | 0.8841 | 0.7316 | 0.8841 | 0.9403 |
| 0.1269 | 4.1381 | 1618 | 0.8973 | 0.6982 | 0.8973 | 0.9472 |
| 0.1269 | 4.1432 | 1620 | 0.8819 | 0.7162 | 0.8820 | 0.9391 |
| 0.1269 | 4.1483 | 1622 | 0.8350 | 0.7019 | 0.8350 | 0.9138 |
| 0.1269 | 4.1535 | 1624 | 0.8177 | 0.7277 | 0.8177 | 0.9043 |
| 0.1269 | 4.1586 | 1626 | 0.8025 | 0.7445 | 0.8025 | 0.8958 |
| 0.1269 | 4.1637 | 1628 | 0.7644 | 0.7445 | 0.7644 | 0.8743 |
| 0.1269 | 4.1688 | 1630 | 0.6919 | 0.7522 | 0.6919 | 0.8318 |
| 0.1269 | 4.1739 | 1632 | 0.6520 | 0.7786 | 0.6520 | 0.8074 |
| 0.1269 | 4.1790 | 1634 | 0.6963 | 0.7522 | 0.6963 | 0.8344 |
| 0.1269 | 4.1841 | 1636 | 0.7780 | 0.7672 | 0.7780 | 0.8820 |
| 0.1269 | 4.1893 | 1638 | 0.7524 | 0.7759 | 0.7524 | 0.8674 |
| 0.1269 | 4.1944 | 1640 | 0.6932 | 0.7882 | 0.6932 | 0.8326 |
| 0.1269 | 4.1995 | 1642 | 0.6483 | 0.7786 | 0.6483 | 0.8052 |
| 0.1269 | 4.2046 | 1644 | 0.6142 | 0.7786 | 0.6142 | 0.7837 |
| 0.1269 | 4.2097 | 1646 | 0.6276 | 0.7786 | 0.6276 | 0.7922 |
| 0.1269 | 4.2148 | 1648 | 0.6993 | 0.7601 | 0.6993 | 0.8363 |
| 0.1269 | 4.2199 | 1650 | 0.7088 | 0.7601 | 0.7088 | 0.8419 |
| 0.1269 | 4.2251 | 1652 | 0.6891 | 0.7601 | 0.6891 | 0.8301 |
| 0.1269 | 4.2302 | 1654 | 0.6975 | 0.7601 | 0.6975 | 0.8352 |
| 0.1269 | 4.2353 | 1656 | 0.6864 | 0.7522 | 0.6864 | 0.8285 |
| 0.1269 | 4.2404 | 1658 | 0.6360 | 0.7605 | 0.6360 | 0.7975 |
| 0.1269 | 4.2455 | 1660 | 0.6603 | 0.7522 | 0.6603 | 0.8126 |
| 0.1269 | 4.2506 | 1662 | 0.7684 | 0.7593 | 0.7684 | 0.8766 |
| 0.1269 | 4.2558 | 1664 | 0.8728 | 0.7358 | 0.8728 | 0.9342 |
| 0.1269 | 4.2609 | 1666 | 0.8560 | 0.7433 | 0.8560 | 0.9252 |
| 0.1269 | 4.2660 | 1668 | 0.7499 | 0.7601 | 0.7499 | 0.8660 |
| 0.1269 | 4.2711 | 1670 | 0.6427 | 0.7689 | 0.6427 | 0.8017 |
| 0.1269 | 4.2762 | 1672 | 0.6167 | 0.7529 | 0.6167 | 0.7853 |
| 0.1269 | 4.2813 | 1674 | 0.6194 | 0.7363 | 0.6194 | 0.7870 |
| 0.1269 | 4.2864 | 1676 | 0.6514 | 0.7358 | 0.6514 | 0.8071 |
| 0.1269 | 4.2916 | 1678 | 0.6974 | 0.7601 | 0.6974 | 0.8351 |
| 0.1269 | 4.2967 | 1680 | 0.7901 | 0.7601 | 0.7901 | 0.8889 |
| 0.1269 | 4.3018 | 1682 | 0.8115 | 0.7759 | 0.8115 | 0.9008 |
| 0.1269 | 4.3069 | 1684 | 0.8022 | 0.7601 | 0.8022 | 0.8956 |
| 0.1269 | 4.3120 | 1686 | 0.7929 | 0.7601 | 0.7929 | 0.8905 |
| 0.1269 | 4.3171 | 1688 | 0.7339 | 0.7358 | 0.7339 | 0.8567 |
| 0.1269 | 4.3223 | 1690 | 0.6794 | 0.7285 | 0.6794 | 0.8242 |
| 0.1269 | 4.3274 | 1692 | 0.6940 | 0.7285 | 0.6940 | 0.8331 |
| 0.1269 | 4.3325 | 1694 | 0.7745 | 0.7358 | 0.7745 | 0.8801 |
| 0.1269 | 4.3376 | 1696 | 0.8005 | 0.7358 | 0.8005 | 0.8947 |
| 0.1269 | 4.3427 | 1698 | 0.7742 | 0.7358 | 0.7742 | 0.8799 |
| 0.1269 | 4.3478 | 1700 | 0.7755 | 0.7358 | 0.7755 | 0.8806 |
| 0.1269 | 4.3529 | 1702 | 0.7297 | 0.7358 | 0.7297 | 0.8542 |
| 0.1269 | 4.3581 | 1704 | 0.6995 | 0.7358 | 0.6995 | 0.8364 |
| 0.1269 | 4.3632 | 1706 | 0.6939 | 0.7358 | 0.6939 | 0.8330 |
| 0.1269 | 4.3683 | 1708 | 0.7300 | 0.7358 | 0.7300 | 0.8544 |
| 0.1269 | 4.3734 | 1710 | 0.8482 | 0.7759 | 0.8482 | 0.9210 |
| 0.1269 | 4.3785 | 1712 | 0.9853 | 0.7435 | 0.9853 | 0.9926 |
| 0.1269 | 4.3836 | 1714 | 1.0422 | 0.7435 | 1.0422 | 1.0209 |
| 0.1269 | 4.3887 | 1716 | 0.9629 | 0.7672 | 0.9629 | 0.9813 |
| 0.1269 | 4.3939 | 1718 | 0.8371 | 0.7518 | 0.8371 | 0.9149 |
| 0.1269 | 4.3990 | 1720 | 0.7034 | 0.7358 | 0.7034 | 0.8387 |
| 0.1269 | 4.4041 | 1722 | 0.6235 | 0.8060 | 0.6235 | 0.7896 |
| 0.1269 | 4.4092 | 1724 | 0.6197 | 0.8060 | 0.6197 | 0.7872 |
| 0.1269 | 4.4143 | 1726 | 0.6731 | 0.7441 | 0.6731 | 0.8204 |
| 0.1269 | 4.4194 | 1728 | 0.7900 | 0.7759 | 0.7900 | 0.8888 |
| 0.1269 | 4.4246 | 1730 | 0.9764 | 0.7602 | 0.9764 | 0.9881 |
| 0.1269 | 4.4297 | 1732 | 1.0711 | 0.7081 | 1.0711 | 1.0349 |
| 0.1269 | 4.4348 | 1734 | 1.0372 | 0.7081 | 1.0372 | 1.0184 |
| 0.1269 | 4.4399 | 1736 | 0.9084 | 0.7602 | 0.9084 | 0.9531 |
| 0.1269 | 4.4450 | 1738 | 0.7624 | 0.7249 | 0.7624 | 0.8732 |
| 0.1269 | 4.4501 | 1740 | 0.7148 | 0.7441 | 0.7148 | 0.8455 |
| 0.1269 | 4.4552 | 1742 | 0.7484 | 0.7441 | 0.7484 | 0.8651 |
| 0.1269 | 4.4604 | 1744 | 0.8763 | 0.7602 | 0.8763 | 0.9361 |
| 0.1269 | 4.4655 | 1746 | 1.0640 | 0.7106 | 1.0640 | 1.0315 |
| 0.1269 | 4.4706 | 1748 | 1.1311 | 0.7106 | 1.1311 | 1.0635 |
| 0.1269 | 4.4757 | 1750 | 1.0763 | 0.7106 | 1.0763 | 1.0374 |
| 0.1269 | 4.4808 | 1752 | 0.9425 | 0.7255 | 0.9425 | 0.9708 |
| 0.1269 | 4.4859 | 1754 | 0.8203 | 0.7342 | 0.8203 | 0.9057 |
| 0.1269 | 4.4910 | 1756 | 0.7024 | 0.7882 | 0.7024 | 0.8381 |
| 0.1269 | 4.4962 | 1758 | 0.6656 | 0.8060 | 0.6656 | 0.8158 |
| 0.1269 | 4.5013 | 1760 | 0.6846 | 0.7882 | 0.6846 | 0.8274 |
| 0.1269 | 4.5064 | 1762 | 0.7556 | 0.7986 | 0.7556 | 0.8693 |
| 0.1269 | 4.5115 | 1764 | 0.8827 | 0.7342 | 0.8827 | 0.9395 |
| 0.1269 | 4.5166 | 1766 | 0.9816 | 0.7157 | 0.9816 | 0.9908 |
| 0.1269 | 4.5217 | 1768 | 1.0033 | 0.7429 | 1.0033 | 1.0016 |
| 0.1269 | 4.5269 | 1770 | 0.9713 | 0.7429 | 0.9713 | 0.9856 |
| 0.1269 | 4.5320 | 1772 | 0.9610 | 0.7429 | 0.9610 | 0.9803 |
| 0.1269 | 4.5371 | 1774 | 0.8919 | 0.7433 | 0.8919 | 0.9444 |
| 0.1269 | 4.5422 | 1776 | 0.8489 | 0.7606 | 0.8489 | 0.9214 |
| 0.1269 | 4.5473 | 1778 | 0.8685 | 0.7433 | 0.8685 | 0.9319 |
| 0.1269 | 4.5524 | 1780 | 0.8797 | 0.7162 | 0.8797 | 0.9379 |
| 0.1269 | 4.5575 | 1782 | 0.9565 | 0.7429 | 0.9565 | 0.9780 |
| 0.1269 | 4.5627 | 1784 | 0.9927 | 0.7081 | 0.9927 | 0.9964 |
| 0.1269 | 4.5678 | 1786 | 0.9423 | 0.7429 | 0.9423 | 0.9707 |
| 0.1269 | 4.5729 | 1788 | 0.8208 | 0.7358 | 0.8208 | 0.9060 |
| 0.1269 | 4.5780 | 1790 | 0.7681 | 0.7358 | 0.7681 | 0.8764 |
| 0.1269 | 4.5831 | 1792 | 0.7607 | 0.7358 | 0.7607 | 0.8722 |
| 0.1269 | 4.5882 | 1794 | 0.7908 | 0.7285 | 0.7908 | 0.8892 |
| 0.1269 | 4.5934 | 1796 | 0.8751 | 0.7518 | 0.8751 | 0.9355 |
| 0.1269 | 4.5985 | 1798 | 0.9146 | 0.7518 | 0.9146 | 0.9563 |
| 0.1269 | 4.6036 | 1800 | 0.9443 | 0.7672 | 0.9443 | 0.9717 |
| 0.1269 | 4.6087 | 1802 | 0.9017 | 0.7672 | 0.9017 | 0.9496 |
| 0.1269 | 4.6138 | 1804 | 0.8424 | 0.7601 | 0.8424 | 0.9178 |
| 0.1269 | 4.6189 | 1806 | 0.7827 | 0.7358 | 0.7827 | 0.8847 |
| 0.1269 | 4.6240 | 1808 | 0.7620 | 0.7358 | 0.7620 | 0.8729 |
| 0.1269 | 4.6292 | 1810 | 0.7994 | 0.7358 | 0.7994 | 0.8941 |
| 0.1269 | 4.6343 | 1812 | 0.8423 | 0.7354 | 0.8423 | 0.9178 |
| 0.1269 | 4.6394 | 1814 | 0.8664 | 0.7433 | 0.8664 | 0.9308 |
| 0.1269 | 4.6445 | 1816 | 0.8670 | 0.7167 | 0.8670 | 0.9311 |
| 0.1269 | 4.6496 | 1818 | 0.8187 | 0.7167 | 0.8187 | 0.9048 |
| 0.1269 | 4.6547 | 1820 | 0.8302 | 0.7167 | 0.8302 | 0.9111 |
| 0.1269 | 4.6598 | 1822 | 0.8753 | 0.7433 | 0.8753 | 0.9356 |
| 0.1269 | 4.6650 | 1824 | 0.8525 | 0.7433 | 0.8525 | 0.9233 |
| 0.1269 | 4.6701 | 1826 | 0.7716 | 0.7095 | 0.7716 | 0.8784 |
| 0.1269 | 4.6752 | 1828 | 0.7038 | 0.7441 | 0.7038 | 0.8389 |
| 0.1269 | 4.6803 | 1830 | 0.7078 | 0.7441 | 0.7078 | 0.8413 |
| 0.1269 | 4.6854 | 1832 | 0.7642 | 0.7095 | 0.7642 | 0.8742 |
| 0.1269 | 4.6905 | 1834 | 0.8551 | 0.7167 | 0.8551 | 0.9247 |
| 0.1269 | 4.6957 | 1836 | 0.9546 | 0.7421 | 0.9546 | 0.9770 |
| 0.1269 | 4.7008 | 1838 | 0.9957 | 0.7333 | 0.9957 | 0.9978 |
| 0.1269 | 4.7059 | 1840 | 1.0314 | 0.7333 | 1.0314 | 1.0156 |
| 0.1269 | 4.7110 | 1842 | 1.0058 | 0.7333 | 1.0058 | 1.0029 |
| 0.1269 | 4.7161 | 1844 | 0.9430 | 0.7255 | 0.9430 | 0.9711 |
| 0.1269 | 4.7212 | 1846 | 0.8425 | 0.7167 | 0.8425 | 0.9179 |
| 0.1269 | 4.7263 | 1848 | 0.7877 | 0.7167 | 0.7877 | 0.8875 |
| 0.1269 | 4.7315 | 1850 | 0.7981 | 0.7167 | 0.7981 | 0.8934 |
| 0.1269 | 4.7366 | 1852 | 0.7999 | 0.7167 | 0.7999 | 0.8944 |
| 0.1269 | 4.7417 | 1854 | 0.7846 | 0.7616 | 0.7846 | 0.8858 |
| 0.1269 | 4.7468 | 1856 | 0.8144 | 0.7342 | 0.8144 | 0.9024 |
| 0.1269 | 4.7519 | 1858 | 0.8664 | 0.7342 | 0.8664 | 0.9308 |
| 0.1269 | 4.7570 | 1860 | 0.9371 | 0.7602 | 0.9371 | 0.9680 |
| 0.1269 | 4.7621 | 1862 | 0.9463 | 0.7514 | 0.9463 | 0.9728 |
| 0.1269 | 4.7673 | 1864 | 0.9062 | 0.7514 | 0.9062 | 0.9520 |
| 0.1269 | 4.7724 | 1866 | 0.8576 | 0.7435 | 0.8576 | 0.9261 |
| 0.1269 | 4.7775 | 1868 | 0.7661 | 0.7095 | 0.7661 | 0.8753 |
| 0.1269 | 4.7826 | 1870 | 0.7257 | 0.7095 | 0.7257 | 0.8519 |
| 0.1269 | 4.7877 | 1872 | 0.7520 | 0.7525 | 0.7520 | 0.8672 |
| 0.1269 | 4.7928 | 1874 | 0.8140 | 0.7525 | 0.8140 | 0.9022 |
| 0.1269 | 4.7980 | 1876 | 0.8499 | 0.6877 | 0.8499 | 0.9219 |
| 0.1269 | 4.8031 | 1878 | 0.8507 | 0.7162 | 0.8507 | 0.9223 |
| 0.1269 | 4.8082 | 1880 | 0.8060 | 0.7616 | 0.8060 | 0.8978 |
| 0.1269 | 4.8133 | 1882 | 0.7811 | 0.7616 | 0.7811 | 0.8838 |
| 0.1269 | 4.8184 | 1884 | 0.8010 | 0.7162 | 0.8010 | 0.8950 |
| 0.1269 | 4.8235 | 1886 | 0.8606 | 0.7162 | 0.8606 | 0.9277 |
| 0.1269 | 4.8286 | 1888 | 0.9405 | 0.7162 | 0.9405 | 0.9698 |
| 0.1269 | 4.8338 | 1890 | 0.9968 | 0.7337 | 0.9968 | 0.9984 |
| 0.1269 | 4.8389 | 1892 | 0.9980 | 0.7337 | 0.9980 | 0.9990 |
| 0.1269 | 4.8440 | 1894 | 0.9146 | 0.7162 | 0.9146 | 0.9563 |
| 0.1269 | 4.8491 | 1896 | 0.8103 | 0.6877 | 0.8103 | 0.9001 |
| 0.1269 | 4.8542 | 1898 | 0.7573 | 0.7427 | 0.7573 | 0.8702 |
| 0.1269 | 4.8593 | 1900 | 0.7601 | 0.7167 | 0.7601 | 0.8719 |
| 0.1269 | 4.8645 | 1902 | 0.7555 | 0.7167 | 0.7555 | 0.8692 |
| 0.1269 | 4.8696 | 1904 | 0.7836 | 0.7433 | 0.7836 | 0.8852 |
| 0.1269 | 4.8747 | 1906 | 0.8101 | 0.7433 | 0.8101 | 0.9001 |
| 0.1269 | 4.8798 | 1908 | 0.8188 | 0.7433 | 0.8188 | 0.9049 |
| 0.1269 | 4.8849 | 1910 | 0.8105 | 0.7433 | 0.8105 | 0.9003 |
| 0.1269 | 4.8900 | 1912 | 0.8320 | 0.7433 | 0.8320 | 0.9121 |
| 0.1269 | 4.8951 | 1914 | 0.8051 | 0.7433 | 0.8051 | 0.8973 |
| 0.1269 | 4.9003 | 1916 | 0.7499 | 0.7167 | 0.7499 | 0.8659 |
| 0.1269 | 4.9054 | 1918 | 0.7434 | 0.7249 | 0.7434 | 0.8622 |
| 0.1269 | 4.9105 | 1920 | 0.7467 | 0.7249 | 0.7467 | 0.8641 |
| 0.1269 | 4.9156 | 1922 | 0.7921 | 0.7162 | 0.7921 | 0.8900 |
| 0.1269 | 4.9207 | 1924 | 0.8953 | 0.7421 | 0.8953 | 0.9462 |
| 0.1269 | 4.9258 | 1926 | 1.0408 | 0.7333 | 1.0408 | 1.0202 |
| 0.1269 | 4.9309 | 1928 | 1.0764 | 0.7333 | 1.0764 | 1.0375 |
| 0.1269 | 4.9361 | 1930 | 1.0085 | 0.7333 | 1.0085 | 1.0043 |
| 0.1269 | 4.9412 | 1932 | 0.9150 | 0.7421 | 0.9150 | 0.9565 |
| 0.1269 | 4.9463 | 1934 | 0.8573 | 0.7337 | 0.8573 | 0.9259 |
| 0.1269 | 4.9514 | 1936 | 0.8032 | 0.7342 | 0.8032 | 0.8962 |
| 0.1269 | 4.9565 | 1938 | 0.8063 | 0.7342 | 0.8063 | 0.8979 |
| 0.1269 | 4.9616 | 1940 | 0.8507 | 0.7602 | 0.8507 | 0.9224 |
| 0.1269 | 4.9668 | 1942 | 0.9234 | 0.7514 | 0.9234 | 0.9609 |
| 0.1269 | 4.9719 | 1944 | 0.9734 | 0.7514 | 0.9734 | 0.9866 |
| 0.1269 | 4.9770 | 1946 | 0.9670 | 0.7514 | 0.9670 | 0.9834 |
| 0.1269 | 4.9821 | 1948 | 0.8922 | 0.7514 | 0.8922 | 0.9446 |
| 0.1269 | 4.9872 | 1950 | 0.8126 | 0.7602 | 0.8126 | 0.9014 |
| 0.1269 | 4.9923 | 1952 | 0.7956 | 0.7342 | 0.7956 | 0.8920 |
| 0.1269 | 4.9974 | 1954 | 0.8052 | 0.7342 | 0.8052 | 0.8973 |
| 0.1269 | 5.0026 | 1956 | 0.8300 | 0.7342 | 0.8300 | 0.9110 |
| 0.1269 | 5.0077 | 1958 | 0.8364 | 0.7430 | 0.8364 | 0.9145 |
| 0.1269 | 5.0128 | 1960 | 0.8045 | 0.7906 | 0.8045 | 0.8969 |
| 0.1269 | 5.0179 | 1962 | 0.8227 | 0.7906 | 0.8227 | 0.9070 |
| 0.1269 | 5.0230 | 1964 | 0.8469 | 0.7342 | 0.8469 | 0.9203 |
| 0.1269 | 5.0281 | 1966 | 0.8955 | 0.7342 | 0.8955 | 0.9463 |
| 0.1269 | 5.0332 | 1968 | 0.8909 | 0.7602 | 0.8909 | 0.9439 |
| 0.1269 | 5.0384 | 1970 | 0.8186 | 0.7602 | 0.8186 | 0.9048 |
| 0.1269 | 5.0435 | 1972 | 0.7871 | 0.7602 | 0.7871 | 0.8872 |
| 0.1269 | 5.0486 | 1974 | 0.7808 | 0.7602 | 0.7808 | 0.8836 |
| 0.1269 | 5.0537 | 1976 | 0.7596 | 0.7354 | 0.7596 | 0.8715 |
| 0.1269 | 5.0588 | 1978 | 0.7960 | 0.7354 | 0.7960 | 0.8922 |
| 0.1269 | 5.0639 | 1980 | 0.8873 | 0.7514 | 0.8873 | 0.9420 |
| 0.1269 | 5.0691 | 1982 | 1.0179 | 0.7255 | 1.0179 | 1.0089 |
| 0.1269 | 5.0742 | 1984 | 1.0724 | 0.7333 | 1.0724 | 1.0356 |
| 0.1269 | 5.0793 | 1986 | 1.0372 | 0.7333 | 1.0372 | 1.0184 |
| 0.1269 | 5.0844 | 1988 | 0.9306 | 0.7337 | 0.9306 | 0.9647 |
| 0.1269 | 5.0895 | 1990 | 0.8055 | 0.7167 | 0.8055 | 0.8975 |
| 0.1269 | 5.0946 | 1992 | 0.6943 | 0.7441 | 0.6943 | 0.8332 |
| 0.1269 | 5.0997 | 1994 | 0.6684 | 0.7363 | 0.6684 | 0.8175 |
| 0.1269 | 5.1049 | 1996 | 0.6743 | 0.7363 | 0.6743 | 0.8211 |
| 0.1269 | 5.1100 | 1998 | 0.7293 | 0.7522 | 0.7293 | 0.8540 |
| 0.0963 | 5.1151 | 2000 | 0.8083 | 0.7277 | 0.8083 | 0.8990 |
| 0.0963 | 5.1202 | 2002 | 0.8817 | 0.7514 | 0.8817 | 0.9390 |
| 0.0963 | 5.1253 | 2004 | 0.8705 | 0.7514 | 0.8705 | 0.9330 |
| 0.0963 | 5.1304 | 2006 | 0.7975 | 0.7350 | 0.7975 | 0.8930 |
| 0.0963 | 5.1355 | 2008 | 0.7256 | 0.7285 | 0.7256 | 0.8518 |
| 0.0963 | 5.1407 | 2010 | 0.7151 | 0.7285 | 0.7151 | 0.8456 |
| 0.0963 | 5.1458 | 2012 | 0.7327 | 0.7522 | 0.7327 | 0.8560 |
| 0.0963 | 5.1509 | 2014 | 0.7117 | 0.7285 | 0.7117 | 0.8436 |
| 0.0963 | 5.1560 | 2016 | 0.6862 | 0.7285 | 0.6862 | 0.8284 |
| 0.0963 | 5.1611 | 2018 | 0.7195 | 0.7358 | 0.7195 | 0.8482 |
| 0.0963 | 5.1662 | 2020 | 0.7400 | 0.7264 | 0.7400 | 0.8602 |
| 0.0963 | 5.1714 | 2022 | 0.8003 | 0.7514 | 0.8003 | 0.8946 |
| 0.0963 | 5.1765 | 2024 | 0.9075 | 0.7514 | 0.9075 | 0.9526 |
| 0.0963 | 5.1816 | 2026 | 1.0166 | 0.7514 | 1.0166 | 1.0083 |
| 0.0963 | 5.1867 | 2028 | 1.0543 | 0.7688 | 1.0543 | 1.0268 |
| 0.0963 | 5.1918 | 2030 | 1.0220 | 0.7511 | 1.0220 | 1.0109 |
| 0.0963 | 5.1969 | 2032 | 0.9248 | 0.7520 | 0.9248 | 0.9617 |
| 0.0963 | 5.2020 | 2034 | 0.8426 | 0.7430 | 0.8426 | 0.9179 |
| 0.0963 | 5.2072 | 2036 | 0.7736 | 0.7717 | 0.7736 | 0.8796 |
| 0.0963 | 5.2123 | 2038 | 0.6919 | 0.7529 | 0.6919 | 0.8318 |
| 0.0963 | 5.2174 | 2040 | 0.6568 | 0.7439 | 0.6568 | 0.8104 |
| 0.0963 | 5.2225 | 2042 | 0.6765 | 0.7172 | 0.6765 | 0.8225 |
| 0.0963 | 5.2276 | 2044 | 0.7547 | 0.7342 | 0.7547 | 0.8687 |
| 0.0963 | 5.2327 | 2046 | 0.8557 | 0.7602 | 0.8557 | 0.9250 |
| 0.0963 | 5.2379 | 2048 | 0.9642 | 0.7514 | 0.9642 | 0.9819 |
| 0.0963 | 5.2430 | 2050 | 0.9939 | 0.7514 | 0.9939 | 0.9969 |
| 0.0963 | 5.2481 | 2052 | 0.9578 | 0.7602 | 0.9578 | 0.9787 |
| 0.0963 | 5.2532 | 2054 | 0.8790 | 0.7342 | 0.8790 | 0.9376 |
| 0.0963 | 5.2583 | 2056 | 0.8005 | 0.7430 | 0.8005 | 0.8947 |
| 0.0963 | 5.2634 | 2058 | 0.7781 | 0.7906 | 0.7781 | 0.8821 |
| 0.0963 | 5.2685 | 2060 | 0.7883 | 0.7906 | 0.7883 | 0.8879 |
| 0.0963 | 5.2737 | 2062 | 0.8087 | 0.7906 | 0.8087 | 0.8993 |
| 0.0963 | 5.2788 | 2064 | 0.8683 | 0.7520 | 0.8683 | 0.9318 |
| 0.0963 | 5.2839 | 2066 | 0.9673 | 0.7101 | 0.9673 | 0.9835 |
| 0.0963 | 5.2890 | 2068 | 1.0424 | 0.7014 | 1.0424 | 1.0210 |
| 0.0963 | 5.2941 | 2070 | 1.1259 | 0.7318 | 1.1260 | 1.0611 |
| 0.0963 | 5.2992 | 2072 | 1.1521 | 0.7230 | 1.1521 | 1.0734 |
| 0.0963 | 5.3043 | 2074 | 1.0859 | 0.7230 | 1.0859 | 1.0421 |
| 0.0963 | 5.3095 | 2076 | 0.9749 | 0.7234 | 0.9749 | 0.9874 |
| 0.0963 | 5.3146 | 2078 | 0.8752 | 0.6945 | 0.8752 | 0.9355 |
| 0.0963 | 5.3197 | 2080 | 0.8543 | 0.7025 | 0.8543 | 0.9243 |
| 0.0963 | 5.3248 | 2082 | 0.8408 | 0.7025 | 0.8408 | 0.9170 |
| 0.0963 | 5.3299 | 2084 | 0.8874 | 0.6945 | 0.8874 | 0.9420 |
| 0.0963 | 5.3350 | 2086 | 0.9396 | 0.6940 | 0.9396 | 0.9693 |
| 0.0963 | 5.3402 | 2088 | 1.0272 | 0.7318 | 1.0272 | 1.0135 |
| 0.0963 | 5.3453 | 2090 | 1.0952 | 0.7318 | 1.0952 | 1.0465 |
| 0.0963 | 5.3504 | 2092 | 1.1961 | 0.7230 | 1.1961 | 1.0936 |
| 0.0963 | 5.3555 | 2094 | 1.2271 | 0.7230 | 1.2271 | 1.1077 |
| 0.0963 | 5.3606 | 2096 | 1.1730 | 0.7230 | 1.1730 | 1.0831 |
| 0.0963 | 5.3657 | 2098 | 1.1235 | 0.7318 | 1.1235 | 1.0600 |
| 0.0963 | 5.3708 | 2100 | 1.0137 | 0.7318 | 1.0137 | 1.0068 |
| 0.0963 | 5.3760 | 2102 | 0.8984 | 0.7134 | 0.8984 | 0.9479 |
| 0.0963 | 5.3811 | 2104 | 0.8350 | 0.7221 | 0.8350 | 0.9138 |
| 0.0963 | 5.3862 | 2106 | 0.8346 | 0.7221 | 0.8346 | 0.9136 |
| 0.0963 | 5.3913 | 2108 | 0.8729 | 0.7421 | 0.8729 | 0.9343 |
| 0.0963 | 5.3964 | 2110 | 0.9138 | 0.7421 | 0.9138 | 0.9559 |
| 0.0963 | 5.4015 | 2112 | 0.9037 | 0.7421 | 0.9037 | 0.9506 |
| 0.0963 | 5.4066 | 2114 | 0.9200 | 0.7421 | 0.9200 | 0.9592 |
| 0.0963 | 5.4118 | 2116 | 0.8836 | 0.7221 | 0.8836 | 0.9400 |
| 0.0963 | 5.4169 | 2118 | 0.8634 | 0.7844 | 0.8634 | 0.9292 |
| 0.0963 | 5.4220 | 2120 | 0.8466 | 0.7729 | 0.8466 | 0.9201 |
| 0.0963 | 5.4271 | 2122 | 0.8333 | 0.7729 | 0.8333 | 0.9128 |
| 0.0963 | 5.4322 | 2124 | 0.8507 | 0.7844 | 0.8507 | 0.9223 |
| 0.0963 | 5.4373 | 2126 | 0.9117 | 0.7614 | 0.9117 | 0.9548 |
| 0.0963 | 5.4425 | 2128 | 0.9555 | 0.7318 | 0.9555 | 0.9775 |
| 0.0963 | 5.4476 | 2130 | 0.9911 | 0.7318 | 0.9911 | 0.9955 |
| 0.0963 | 5.4527 | 2132 | 1.0108 | 0.7318 | 1.0108 | 1.0054 |
| 0.0963 | 5.4578 | 2134 | 0.9777 | 0.7412 | 0.9777 | 0.9888 |
| 0.0963 | 5.4629 | 2136 | 0.9459 | 0.7310 | 0.9459 | 0.9726 |
| 0.0963 | 5.4680 | 2138 | 0.9033 | 0.7844 | 0.9033 | 0.9504 |
| 0.0963 | 5.4731 | 2140 | 0.8336 | 0.7844 | 0.8336 | 0.9130 |
| 0.0963 | 5.4783 | 2142 | 0.8030 | 0.7623 | 0.8030 | 0.8961 |
| 0.0963 | 5.4834 | 2144 | 0.8059 | 0.7696 | 0.8059 | 0.8977 |
| 0.0963 | 5.4885 | 2146 | 0.8783 | 0.7514 | 0.8783 | 0.9372 |
| 0.0963 | 5.4936 | 2148 | 0.9942 | 0.7346 | 0.9942 | 0.9971 |
| 0.0963 | 5.4987 | 2150 | 1.0392 | 0.7153 | 1.0392 | 1.0194 |
| 0.0963 | 5.5038 | 2152 | 1.0253 | 0.7153 | 1.0253 | 1.0126 |
| 0.0963 | 5.5090 | 2154 | 0.9528 | 0.7346 | 0.9528 | 0.9761 |
| 0.0963 | 5.5141 | 2156 | 0.9083 | 0.7514 | 0.9083 | 0.9531 |
| 0.0963 | 5.5192 | 2158 | 0.8464 | 0.7602 | 0.8464 | 0.9200 |
| 0.0963 | 5.5243 | 2160 | 0.8434 | 0.7602 | 0.8434 | 0.9184 |
| 0.0963 | 5.5294 | 2162 | 0.8720 | 0.7516 | 0.8720 | 0.9338 |
| 0.0963 | 5.5345 | 2164 | 0.9235 | 0.7412 | 0.9235 | 0.9610 |
| 0.0963 | 5.5396 | 2166 | 0.9728 | 0.7412 | 0.9728 | 0.9863 |
| 0.0963 | 5.5448 | 2168 | 1.0069 | 0.7318 | 1.0069 | 1.0034 |
| 0.0963 | 5.5499 | 2170 | 1.0035 | 0.7318 | 1.0035 | 1.0018 |
| 0.0963 | 5.5550 | 2172 | 0.9740 | 0.7412 | 0.9740 | 0.9869 |
| 0.0963 | 5.5601 | 2174 | 0.9828 | 0.7318 | 0.9828 | 0.9913 |
| 0.0963 | 5.5652 | 2176 | 1.0285 | 0.7230 | 1.0285 | 1.0141 |
| 0.0963 | 5.5703 | 2178 | 1.0423 | 0.7230 | 1.0423 | 1.0209 |
| 0.0963 | 5.5754 | 2180 | 0.9953 | 0.7230 | 0.9953 | 0.9977 |
| 0.0963 | 5.5806 | 2182 | 0.9577 | 0.7230 | 0.9577 | 0.9786 |
| 0.0963 | 5.5857 | 2184 | 0.8868 | 0.7333 | 0.8868 | 0.9417 |
| 0.0963 | 5.5908 | 2186 | 0.8295 | 0.7255 | 0.8295 | 0.9108 |
| 0.0963 | 5.5959 | 2188 | 0.8020 | 0.7255 | 0.8020 | 0.8956 |
| 0.0963 | 5.6010 | 2190 | 0.7977 | 0.7255 | 0.7977 | 0.8931 |
| 0.0963 | 5.6061 | 2192 | 0.8590 | 0.7255 | 0.8590 | 0.9268 |
| 0.0963 | 5.6113 | 2194 | 0.8623 | 0.7333 | 0.8623 | 0.9286 |
| 0.0963 | 5.6164 | 2196 | 0.7940 | 0.7692 | 0.7940 | 0.8911 |
| 0.0963 | 5.6215 | 2198 | 0.6892 | 0.7525 | 0.6892 | 0.8302 |
| 0.0963 | 5.6266 | 2200 | 0.6424 | 0.7441 | 0.6424 | 0.8015 |
| 0.0963 | 5.6317 | 2202 | 0.6548 | 0.7441 | 0.6548 | 0.8092 |
| 0.0963 | 5.6368 | 2204 | 0.6862 | 0.7441 | 0.6862 | 0.8284 |
| 0.0963 | 5.6419 | 2206 | 0.7513 | 0.7945 | 0.7513 | 0.8668 |
| 0.0963 | 5.6471 | 2208 | 0.8089 | 0.7845 | 0.8089 | 0.8994 |
| 0.0963 | 5.6522 | 2210 | 0.8216 | 0.7845 | 0.8216 | 0.9064 |
| 0.0963 | 5.6573 | 2212 | 0.7985 | 0.7845 | 0.7985 | 0.8936 |
| 0.0963 | 5.6624 | 2214 | 0.7483 | 0.7775 | 0.7483 | 0.8650 |
| 0.0963 | 5.6675 | 2216 | 0.7035 | 0.7616 | 0.7035 | 0.8388 |
| 0.0963 | 5.6726 | 2218 | 0.6824 | 0.7616 | 0.6824 | 0.8261 |
| 0.0963 | 5.6777 | 2220 | 0.6923 | 0.7779 | 0.6923 | 0.8320 |
| 0.0963 | 5.6829 | 2222 | 0.6938 | 0.7689 | 0.6938 | 0.8329 |
| 0.0963 | 5.6880 | 2224 | 0.7239 | 0.7597 | 0.7239 | 0.8508 |
| 0.0963 | 5.6931 | 2226 | 0.7186 | 0.7518 | 0.7186 | 0.8477 |
| 0.0963 | 5.6982 | 2228 | 0.7240 | 0.7597 | 0.7240 | 0.8509 |
| 0.0963 | 5.7033 | 2230 | 0.7847 | 0.7598 | 0.7847 | 0.8858 |
| 0.0963 | 5.7084 | 2232 | 0.7986 | 0.7598 | 0.7986 | 0.8936 |
| 0.0963 | 5.7136 | 2234 | 0.7550 | 0.7598 | 0.7550 | 0.8689 |
| 0.0963 | 5.7187 | 2236 | 0.6919 | 0.7689 | 0.6919 | 0.8318 |
| 0.0963 | 5.7238 | 2238 | 0.6606 | 0.7689 | 0.6606 | 0.8128 |
| 0.0963 | 5.7289 | 2240 | 0.6261 | 0.7441 | 0.6261 | 0.7913 |
| 0.0963 | 5.7340 | 2242 | 0.6134 | 0.7441 | 0.6134 | 0.7832 |
| 0.0963 | 5.7391 | 2244 | 0.6319 | 0.7441 | 0.6319 | 0.7949 |
| 0.0963 | 5.7442 | 2246 | 0.6976 | 0.7689 | 0.6976 | 0.8352 |
| 0.0963 | 5.7494 | 2248 | 0.7555 | 0.7517 | 0.7555 | 0.8692 |
| 0.0963 | 5.7545 | 2250 | 0.8268 | 0.7333 | 0.8268 | 0.9093 |
| 0.0963 | 5.7596 | 2252 | 0.8212 | 0.7333 | 0.8212 | 0.9062 |
| 0.0963 | 5.7647 | 2254 | 0.7518 | 0.7521 | 0.7518 | 0.8671 |
| 0.0963 | 5.7698 | 2256 | 0.7197 | 0.7521 | 0.7197 | 0.8483 |
| 0.0963 | 5.7749 | 2258 | 0.6595 | 0.7986 | 0.6595 | 0.8121 |
| 0.0963 | 5.7801 | 2260 | 0.6269 | 0.7986 | 0.6269 | 0.7917 |
| 0.0963 | 5.7852 | 2262 | 0.6031 | 0.7882 | 0.6031 | 0.7766 |
| 0.0963 | 5.7903 | 2264 | 0.6187 | 0.7986 | 0.6187 | 0.7866 |
| 0.0963 | 5.7954 | 2266 | 0.6752 | 0.7525 | 0.6752 | 0.8217 |
| 0.0963 | 5.8005 | 2268 | 0.7584 | 0.7433 | 0.7584 | 0.8709 |
| 0.0963 | 5.8056 | 2270 | 0.8385 | 0.7514 | 0.8385 | 0.9157 |
| 0.0963 | 5.8107 | 2272 | 0.8792 | 0.7514 | 0.8792 | 0.9377 |
| 0.0963 | 5.8159 | 2274 | 0.8593 | 0.7514 | 0.8593 | 0.9270 |
| 0.0963 | 5.8210 | 2276 | 0.8159 | 0.7514 | 0.8159 | 0.9033 |
| 0.0963 | 5.8261 | 2278 | 0.7697 | 0.7514 | 0.7697 | 0.8773 |
| 0.0963 | 5.8312 | 2280 | 0.7381 | 0.7350 | 0.7381 | 0.8591 |
| 0.0963 | 5.8363 | 2282 | 0.7075 | 0.7597 | 0.7075 | 0.8411 |
| 0.0963 | 5.8414 | 2284 | 0.6793 | 0.7779 | 0.6793 | 0.8242 |
| 0.0963 | 5.8465 | 2286 | 0.6973 | 0.7779 | 0.6973 | 0.8350 |
| 0.0963 | 5.8517 | 2288 | 0.7614 | 0.7602 | 0.7614 | 0.8726 |
| 0.0963 | 5.8568 | 2290 | 0.8066 | 0.7602 | 0.8066 | 0.8981 |
| 0.0963 | 5.8619 | 2292 | 0.7988 | 0.7602 | 0.7988 | 0.8938 |
| 0.0963 | 5.8670 | 2294 | 0.7496 | 0.7249 | 0.7496 | 0.8658 |
| 0.0963 | 5.8721 | 2296 | 0.7453 | 0.7249 | 0.7453 | 0.8633 |
| 0.0963 | 5.8772 | 2298 | 0.7397 | 0.7249 | 0.7397 | 0.8601 |
| 0.0963 | 5.8824 | 2300 | 0.7636 | 0.7167 | 0.7636 | 0.8738 |
| 0.0963 | 5.8875 | 2302 | 0.7578 | 0.7167 | 0.7578 | 0.8705 |
| 0.0963 | 5.8926 | 2304 | 0.7266 | 0.7249 | 0.7266 | 0.8524 |
| 0.0963 | 5.8977 | 2306 | 0.6669 | 0.7249 | 0.6669 | 0.8166 |
| 0.0963 | 5.9028 | 2308 | 0.6456 | 0.7704 | 0.6456 | 0.8035 |
| 0.0963 | 5.9079 | 2310 | 0.6683 | 0.7525 | 0.6683 | 0.8175 |
| 0.0963 | 5.9130 | 2312 | 0.6681 | 0.7525 | 0.6681 | 0.8174 |
| 0.0963 | 5.9182 | 2314 | 0.6781 | 0.7525 | 0.6781 | 0.8235 |
| 0.0963 | 5.9233 | 2316 | 0.7099 | 0.7437 | 0.7099 | 0.8425 |
| 0.0963 | 5.9284 | 2318 | 0.7091 | 0.7437 | 0.7091 | 0.8421 |
| 0.0963 | 5.9335 | 2320 | 0.7437 | 0.7597 | 0.7437 | 0.8624 |
| 0.0963 | 5.9386 | 2322 | 0.7948 | 0.7514 | 0.7948 | 0.8915 |
| 0.0963 | 5.9437 | 2324 | 0.8139 | 0.7514 | 0.8139 | 0.9021 |
| 0.0963 | 5.9488 | 2326 | 0.7680 | 0.7597 | 0.7680 | 0.8764 |
| 0.0963 | 5.9540 | 2328 | 0.6926 | 0.7437 | 0.6926 | 0.8322 |
| 0.0963 | 5.9591 | 2330 | 0.6611 | 0.7525 | 0.6611 | 0.8131 |
| 0.0963 | 5.9642 | 2332 | 0.6504 | 0.7525 | 0.6504 | 0.8065 |
| 0.0963 | 5.9693 | 2334 | 0.6701 | 0.7525 | 0.6701 | 0.8186 |
| 0.0963 | 5.9744 | 2336 | 0.7314 | 0.7437 | 0.7314 | 0.8552 |
| 0.0963 | 5.9795 | 2338 | 0.7774 | 0.7058 | 0.7774 | 0.8817 |
| 0.0963 | 5.9847 | 2340 | 0.8086 | 0.7337 | 0.8086 | 0.8992 |
| 0.0963 | 5.9898 | 2342 | 0.7877 | 0.7342 | 0.7877 | 0.8875 |
| 0.0963 | 5.9949 | 2344 | 0.7568 | 0.7606 | 0.7568 | 0.8699 |
| 0.0963
| 6.0 | 2346 | 0.7399 | 0.7606 | 0.7399 | 0.8602 | | 0.0963 | 6.0051 | 2348 | 0.7026 | 0.7700 | 0.7026 | 0.8382 | | 0.0963 | 6.0102 | 2350 | 0.7014 | 0.7700 | 0.7014 | 0.8375 | | 0.0963 | 6.0153 | 2352 | 0.6815 | 0.7700 | 0.6815 | 0.8255 | | 0.0963 | 6.0205 | 2354 | 0.6883 | 0.7700 | 0.6883 | 0.8296 | | 0.0963 | 6.0256 | 2356 | 0.7422 | 0.7849 | 0.7422 | 0.8615 | | 0.0963 | 6.0307 | 2358 | 0.8133 | 0.7514 | 0.8133 | 0.9018 | | 0.0963 | 6.0358 | 2360 | 0.8294 | 0.7514 | 0.8294 | 0.9107 | | 0.0963 | 6.0409 | 2362 | 0.7764 | 0.7514 | 0.7764 | 0.8811 | | 0.0963 | 6.0460 | 2364 | 0.6977 | 0.7779 | 0.6977 | 0.8353 | | 0.0963 | 6.0512 | 2366 | 0.6428 | 0.7525 | 0.6428 | 0.8018 | | 0.0963 | 6.0563 | 2368 | 0.6411 | 0.7525 | 0.6411 | 0.8007 | | 0.0963 | 6.0614 | 2370 | 0.6696 | 0.7525 | 0.6696 | 0.8183 | | 0.0963 | 6.0665 | 2372 | 0.6981 | 0.7249 | 0.6981 | 0.8355 | | 0.0963 | 6.0716 | 2374 | 0.6897 | 0.7249 | 0.6897 | 0.8305 | | 0.0963 | 6.0767 | 2376 | 0.6952 | 0.7249 | 0.6952 | 0.8338 | | 0.0963 | 6.0818 | 2378 | 0.7256 | 0.7249 | 0.7256 | 0.8518 | | 0.0963 | 6.0870 | 2380 | 0.7773 | 0.7421 | 0.7773 | 0.8816 | | 0.0963 | 6.0921 | 2382 | 0.8327 | 0.7421 | 0.8327 | 0.9125 | | 0.0963 | 6.0972 | 2384 | 0.8280 | 0.7421 | 0.8280 | 0.9099 | | 0.0963 | 6.1023 | 2386 | 0.7770 | 0.7692 | 0.7770 | 0.8815 | | 0.0963 | 6.1074 | 2388 | 0.7027 | 0.7332 | 0.7027 | 0.8383 | | 0.0963 | 6.1125 | 2390 | 0.6678 | 0.7525 | 0.6678 | 0.8172 | | 0.0963 | 6.1176 | 2392 | 0.6405 | 0.7525 | 0.6405 | 0.8003 | | 0.0963 | 6.1228 | 2394 | 0.6085 | 0.7525 | 0.6085 | 0.7801 | | 0.0963 | 6.1279 | 2396 | 0.6212 | 0.7525 | 0.6212 | 0.7882 | | 0.0963 | 6.1330 | 2398 | 0.6264 | 0.7525 | 0.6264 | 0.7914 | | 0.0963 | 6.1381 | 2400 | 0.6493 | 0.7525 | 0.6493 | 0.8058 | | 0.0963 | 6.1432 | 2402 | 0.6700 | 0.7525 | 0.6700 | 0.8186 | | 0.0963 | 6.1483 | 2404 | 0.6884 | 0.7249 | 0.6884 | 0.8297 | | 0.0963 | 6.1535 | 2406 | 0.6811 | 0.7525 | 0.6811 | 0.8253 | | 0.0963 | 6.1586 | 2408 | 0.6945 | 0.7437 | 0.6945 | 0.8333 | | 0.0963 | 6.1637 | 2410 | 0.7158 | 0.7433 | 0.7158 | 0.8461 | | 0.0963 | 6.1688 | 2412 | 0.7145 | 0.7167 | 0.7145 | 0.8453 | | 0.0963 | 6.1739 | 2414 | 0.6731 | 0.7525 | 0.6731 | 0.8204 | | 0.0963 | 6.1790 | 2416 | 0.6301 | 0.7441 | 0.6301 | 0.7938 | | 0.0963 | 6.1841 | 2418 | 0.6196 | 0.7614 | 0.6196 | 0.7871 | | 0.0963 | 6.1893 | 2420 | 0.6248 | 0.7614 | 0.6248 | 0.7904 | | 0.0963 | 6.1944 | 2422 | 0.6633 | 0.7525 | 0.6633 | 0.8145 | | 0.0963 | 6.1995 | 2424 | 0.7443 | 0.7517 | 0.7443 | 0.8627 | | 0.0963 | 6.2046 | 2426 | 0.8255 | 0.7239 | 0.8255 | 0.9086 | | 0.0963 | 6.2097 | 2428 | 0.8527 | 0.7239 | 0.8527 | 0.9234 | | 0.0963 | 6.2148 | 2430 | 0.8238 | 0.7239 | 0.8238 | 0.9076 | | 0.0963 | 6.2199 | 2432 | 0.7555 | 0.6945 | 0.7555 | 0.8692 | | 0.0963 | 6.2251 | 2434 | 0.6841 | 0.7249 | 0.6841 | 0.8271 | | 0.0963 | 6.2302 | 2436 | 0.6545 | 0.7249 | 0.6545 | 0.8090 | | 0.0963 | 6.2353 | 2438 | 0.6435 | 0.7525 | 0.6435 | 0.8022 | | 0.0963 | 6.2404 | 2440 | 0.6615 | 0.7437 | 0.6615 | 0.8133 | | 0.0963 | 6.2455 | 2442 | 0.6953 | 0.7685 | 0.6953 | 0.8338 | | 0.0963 | 6.2506 | 2444 | 0.7294 | 0.7601 | 0.7294 | 0.8540 | | 0.0963 | 6.2558 | 2446 | 0.7405 | 0.7601 | 0.7405 | 0.8605 | | 0.0963 | 6.2609 | 2448 | 0.7668 | 0.7685 | 0.7668 | 0.8757 | | 0.0963 | 6.2660 | 2450 | 0.7503 | 0.7685 | 0.7503 | 0.8662 | | 0.0963 | 6.2711 | 2452 | 0.7211 | 0.7685 | 0.7211 | 0.8492 | | 0.0963 | 6.2762 | 2454 | 0.7277 | 0.7167 | 0.7277 | 0.8530 | | 0.0963 | 6.2813 | 2456 | 0.7546 | 0.7162 | 0.7546 | 0.8687 | | 0.0963 | 6.2864 | 2458 | 0.7588 | 
0.7162 | 0.7588 | 0.8711 | | 0.0963 | 6.2916 | 2460 | 0.7411 | 0.7167 | 0.7411 | 0.8609 | | 0.0963 | 6.2967 | 2462 | 0.7508 | 0.7433 | 0.7508 | 0.8665 | | 0.0963 | 6.3018 | 2464 | 0.7669 | 0.7162 | 0.7669 | 0.8758 | | 0.0963 | 6.3069 | 2466 | 0.7884 | 0.7162 | 0.7884 | 0.8879 | | 0.0963 | 6.3120 | 2468 | 0.8209 | 0.7337 | 0.8209 | 0.9060 | | 0.0963 | 6.3171 | 2470 | 0.8658 | 0.7337 | 0.8658 | 0.9305 | | 0.0963 | 6.3223 | 2472 | 0.8920 | 0.7234 | 0.8920 | 0.9445 | | 0.0963 | 6.3274 | 2474 | 0.8626 | 0.7426 | 0.8626 | 0.9288 | | 0.0963 | 6.3325 | 2476 | 0.8207 | 0.6951 | 0.8207 | 0.9059 | | 0.0963 | 6.3376 | 2478 | 0.8315 | 0.7139 | 0.8315 | 0.9119 | | 0.0963 | 6.3427 | 2480 | 0.8302 | 0.7139 | 0.8302 | 0.9112 | | 0.0963 | 6.3478 | 2482 | 0.8296 | 0.7139 | 0.8296 | 0.9108 | | 0.0963 | 6.3529 | 2484 | 0.8206 | 0.7139 | 0.8206 | 0.9059 | | 0.0963 | 6.3581 | 2486 | 0.7790 | 0.7430 | 0.7790 | 0.8826 | | 0.0963 | 6.3632 | 2488 | 0.7542 | 0.7249 | 0.7542 | 0.8684 | | 0.0963 | 6.3683 | 2490 | 0.7381 | 0.7249 | 0.7381 | 0.8591 | | 0.0963 | 6.3734 | 2492 | 0.7309 | 0.7249 | 0.7309 | 0.8549 | | 0.0963 | 6.3785 | 2494 | 0.7653 | 0.7521 | 0.7653 | 0.8748 | | 0.0963 | 6.3836 | 2496 | 0.8320 | 0.7162 | 0.8320 | 0.9122 | | 0.0963 | 6.3887 | 2498 | 0.8976 | 0.7337 | 0.8976 | 0.9474 | | 0.0759 | 6.3939 | 2500 | 0.9393 | 0.7157 | 0.9393 | 0.9692 | | 0.0759 | 6.3990 | 2502 | 0.9397 | 0.7157 | 0.9397 | 0.9694 | | 0.0759 | 6.4041 | 2504 | 0.8762 | 0.7337 | 0.8762 | 0.9361 | | 0.0759 | 6.4092 | 2506 | 0.8132 | 0.7162 | 0.8132 | 0.9018 | | 0.0759 | 6.4143 | 2508 | 0.7801 | 0.7433 | 0.7801 | 0.8832 | | 0.0759 | 6.4194 | 2510 | 0.7909 | 0.7602 | 0.7909 | 0.8893 | | 0.0759 | 6.4246 | 2512 | 0.8030 | 0.7342 | 0.8030 | 0.8961 | | 0.0759 | 6.4297 | 2514 | 0.8151 | 0.7342 | 0.8151 | 0.9028 | | 0.0759 | 6.4348 | 2516 | 0.8221 | 0.7058 | 0.8221 | 0.9067 | | 0.0759 | 6.4399 | 2518 | 0.8090 | 0.6877 | 0.8090 | 0.8995 | | 0.0759 | 6.4450 | 2520 | 0.7847 | 0.6951 | 0.7847 | 0.8858 | | 0.0759 | 6.4501 | 2522 | 0.7812 | 0.6951 | 0.7812 | 0.8839 | | 0.0759 | 6.4552 | 2524 | 0.7613 | 0.7249 | 0.7613 | 0.8725 | | 0.0759 | 6.4604 | 2526 | 0.7615 | 0.7249 | 0.7615 | 0.8727 | | 0.0759 | 6.4655 | 2528 | 0.7880 | 0.7249 | 0.7880 | 0.8877 | | 0.0759 | 6.4706 | 2530 | 0.7796 | 0.7249 | 0.7796 | 0.8830 | | 0.0759 | 6.4757 | 2532 | 0.7704 | 0.7249 | 0.7704 | 0.8777 | | 0.0759 | 6.4808 | 2534 | 0.7846 | 0.6951 | 0.7846 | 0.8858 | | 0.0759 | 6.4859 | 2536 | 0.7922 | 0.6951 | 0.7922 | 0.8901 | | 0.0759 | 6.4910 | 2538 | 0.7850 | 0.6951 | 0.7850 | 0.8860 | | 0.0759 | 6.4962 | 2540 | 0.7545 | 0.6951 | 0.7545 | 0.8686 | | 0.0759 | 6.5013 | 2542 | 0.7376 | 0.7249 | 0.7376 | 0.8589 | | 0.0759 | 6.5064 | 2544 | 0.7029 | 0.7249 | 0.7029 | 0.8384 | | 0.0759 | 6.5115 | 2546 | 0.6993 | 0.7249 | 0.6993 | 0.8363 | | 0.0759 | 6.5166 | 2548 | 0.7394 | 0.7249 | 0.7394 | 0.8599 | | 0.0759 | 6.5217 | 2550 | 0.7898 | 0.7162 | 0.7898 | 0.8887 | | 0.0759 | 6.5269 | 2552 | 0.8622 | 0.7421 | 0.8622 | 0.9286 | | 0.0759 | 6.5320 | 2554 | 0.9140 | 0.7511 | 0.9140 | 0.9561 | | 0.0759 | 6.5371 | 2556 | 0.9239 | 0.7511 | 0.9239 | 0.9612 | | 0.0759 | 6.5422 | 2558 | 0.8915 | 0.7511 | 0.8915 | 0.9442 | | 0.0759 | 6.5473 | 2560 | 0.8447 | 0.7310 | 0.8447 | 0.9191 | | 0.0759 | 6.5524 | 2562 | 0.8443 | 0.7310 | 0.8443 | 0.9189 | | 0.0759 | 6.5575 | 2564 | 0.8387 | 0.7310 | 0.8387 | 0.9158 | | 0.0759 | 6.5627 | 2566 | 0.7994 | 0.7525 | 0.7994 | 0.8941 | | 0.0759 | 6.5678 | 2568 | 0.7698 | 0.7525 | 0.7698 | 0.8774 | | 0.0759 | 6.5729 | 2570 | 0.7685 | 0.7025 | 0.7685 | 0.8766 | 
| 0.0759 | 6.5780 | 2572 | 0.7879 | 0.6877 | 0.7879 | 0.8876 | | 0.0759 | 6.5831 | 2574 | 0.7864 | 0.7162 | 0.7864 | 0.8868 | | 0.0759 | 6.5882 | 2576 | 0.7684 | 0.7433 | 0.7684 | 0.8766 | | 0.0759 | 6.5934 | 2578 | 0.7198 | 0.7685 | 0.7198 | 0.8484 | | 0.0759 | 6.5985 | 2580 | 0.6848 | 0.7437 | 0.6848 | 0.8275 | | 0.0759 | 6.6036 | 2582 | 0.6657 | 0.7525 | 0.6657 | 0.8159 | | 0.0759 | 6.6087 | 2584 | 0.6791 | 0.7437 | 0.6791 | 0.8241 | | 0.0759 | 6.6138 | 2586 | 0.7003 | 0.7685 | 0.7003 | 0.8368 | | 0.0759 | 6.6189 | 2588 | 0.7564 | 0.7685 | 0.7564 | 0.8697 | | 0.0759 | 6.6240 | 2590 | 0.8311 | 0.7602 | 0.8311 | 0.9116 | | 0.0759 | 6.6292 | 2592 | 0.8466 | 0.7602 | 0.8466 | 0.9201 | | 0.0759 | 6.6343 | 2594 | 0.8212 | 0.7602 | 0.8212 | 0.9062 | | 0.0759 | 6.6394 | 2596 | 0.7840 | 0.7602 | 0.7840 | 0.8854 | | 0.0759 | 6.6445 | 2598 | 0.7495 | 0.7525 | 0.7495 | 0.8657 | | 0.0759 | 6.6496 | 2600 | 0.7425 | 0.7525 | 0.7425 | 0.8617 | | 0.0759 | 6.6547 | 2602 | 0.7093 | 0.7525 | 0.7093 | 0.8422 | | 0.0759 | 6.6598 | 2604 | 0.7038 | 0.7525 | 0.7038 | 0.8389 | | 0.0759 | 6.6650 | 2606 | 0.7009 | 0.7525 | 0.7009 | 0.8372 | | 0.0759 | 6.6701 | 2608 | 0.7093 | 0.7525 | 0.7093 | 0.8422 | | 0.0759 | 6.6752 | 2610 | 0.7136 | 0.7525 | 0.7136 | 0.8447 | | 0.0759 | 6.6803 | 2612 | 0.7484 | 0.7948 | 0.7484 | 0.8651 | | 0.0759 | 6.6854 | 2614 | 0.7888 | 0.7602 | 0.7888 | 0.8881 | | 0.0759 | 6.6905 | 2616 | 0.7802 | 0.7602 | 0.7802 | 0.8833 | | 0.0759 | 6.6957 | 2618 | 0.7530 | 0.7696 | 0.7530 | 0.8677 | | 0.0759 | 6.7008 | 2620 | 0.7311 | 0.7779 | 0.7311 | 0.8551 | | 0.0759 | 6.7059 | 2622 | 0.7426 | 0.7779 | 0.7426 | 0.8618 | | 0.0759 | 6.7110 | 2624 | 0.7348 | 0.7779 | 0.7348 | 0.8572 | | 0.0759 | 6.7161 | 2626 | 0.7132 | 0.7779 | 0.7132 | 0.8445 | | 0.0759 | 6.7212 | 2628 | 0.7322 | 0.7779 | 0.7322 | 0.8557 | | 0.0759 | 6.7263 | 2630 | 0.7597 | 0.7426 | 0.7597 | 0.8716 | | 0.0759 | 6.7315 | 2632 | 0.7754 | 0.7426 | 0.7754 | 0.8806 | | 0.0759 | 6.7366 | 2634 | 0.7852 | 0.7426 | 0.7852 | 0.8861 | | 0.0759 | 6.7417 | 2636 | 0.7885 | 0.7426 | 0.7885 | 0.8880 | | 0.0759 | 6.7468 | 2638 | 0.8048 | 0.7516 | 0.8048 | 0.8971 | | 0.0759 | 6.7519 | 2640 | 0.8011 | 0.7516 | 0.8011 | 0.8950 | | 0.0759 | 6.7570 | 2642 | 0.7861 | 0.7244 | 0.7861 | 0.8866 | | 0.0759 | 6.7621 | 2644 | 0.7750 | 0.7244 | 0.7750 | 0.8803 | | 0.0759 | 6.7673 | 2646 | 0.7791 | 0.7244 | 0.7791 | 0.8826 | | 0.0759 | 6.7724 | 2648 | 0.7991 | 0.7244 | 0.7991 | 0.8939 | | 0.0759 | 6.7775 | 2650 | 0.8172 | 0.7244 | 0.8172 | 0.9040 | | 0.0759 | 6.7826 | 2652 | 0.8464 | 0.7327 | 0.8464 | 0.9200 | | 0.0759 | 6.7877 | 2654 | 0.8329 | 0.7327 | 0.8329 | 0.9126 | | 0.0759 | 6.7928 | 2656 | 0.7832 | 0.7244 | 0.7832 | 0.8850 | | 0.0759 | 6.7980 | 2658 | 0.7348 | 0.6951 | 0.7348 | 0.8572 | | 0.0759 | 6.8031 | 2660 | 0.7202 | 0.6951 | 0.7202 | 0.8487 | | 0.0759 | 6.8082 | 2662 | 0.7029 | 0.6951 | 0.7029 | 0.8384 | | 0.0759 | 6.8133 | 2664 | 0.7160 | 0.6951 | 0.7160 | 0.8462 | | 0.0759 | 6.8184 | 2666 | 0.7677 | 0.7244 | 0.7677 | 0.8762 | | 0.0759 | 6.8235 | 2668 | 0.8430 | 0.7162 | 0.8430 | 0.9182 | | 0.0759 | 6.8286 | 2670 | 0.8864 | 0.7421 | 0.8864 | 0.9415 | | 0.0759 | 6.8338 | 2672 | 0.8890 | 0.7421 | 0.8890 | 0.9429 | | 0.0759 | 6.8389 | 2674 | 0.9118 | 0.7511 | 0.9118 | 0.9549 | | 0.0759 | 6.8440 | 2676 | 0.9072 | 0.7614 | 0.9072 | 0.9525 | | 0.0759 | 6.8491 | 2678 | 0.8638 | 0.7310 | 0.8638 | 0.9294 | | 0.0759 | 6.8542 | 2680 | 0.8022 | 0.7631 | 0.8022 | 0.8957 | | 0.0759 | 6.8593 | 2682 | 0.7306 | 0.7525 | 0.7306 | 0.8547 | | 0.0759 | 6.8645 | 2684 | 
0.6938 | 0.7427 | 0.6938 | 0.8330 | | 0.0759 | 6.8696 | 2686 | 0.6654 | 0.7427 | 0.6654 | 0.8157 | | 0.0759 | 6.8747 | 2688 | 0.6660 | 0.7427 | 0.6660 | 0.8161 | | 0.0759 | 6.8798 | 2690 | 0.6890 | 0.7427 | 0.6890 | 0.8300 | | 0.0759 | 6.8849 | 2692 | 0.7305 | 0.6951 | 0.7305 | 0.8547 | | 0.0759 | 6.8900 | 2694 | 0.7686 | 0.6951 | 0.7686 | 0.8767 | | 0.0759 | 6.8951 | 2696 | 0.7906 | 0.6951 | 0.7906 | 0.8892 | | 0.0759 | 6.9003 | 2698 | 0.7827 | 0.6951 | 0.7827 | 0.8847 | | 0.0759 | 6.9054 | 2700 | 0.7650 | 0.6951 | 0.7650 | 0.8746 | | 0.0759 | 6.9105 | 2702 | 0.7550 | 0.6951 | 0.7550 | 0.8689 | | 0.0759 | 6.9156 | 2704 | 0.7428 | 0.6951 | 0.7428 | 0.8618 | | 0.0759 | 6.9207 | 2706 | 0.7208 | 0.6951 | 0.7208 | 0.8490 | | 0.0759 | 6.9258 | 2708 | 0.7254 | 0.6951 | 0.7254 | 0.8517 | | 0.0759 | 6.9309 | 2710 | 0.7524 | 0.6951 | 0.7524 | 0.8674 | | 0.0759 | 6.9361 | 2712 | 0.7724 | 0.7139 | 0.7724 | 0.8788 | | 0.0759 | 6.9412 | 2714 | 0.7607 | 0.7139 | 0.7607 | 0.8722 | | 0.0759 | 6.9463 | 2716 | 0.7695 | 0.7139 | 0.7695 | 0.8772 | | 0.0759 | 6.9514 | 2718 | 0.8040 | 0.7139 | 0.8040 | 0.8966 | | 0.0759 | 6.9565 | 2720 | 0.8113 | 0.7139 | 0.8113 | 0.9007 | | 0.0759 | 6.9616 | 2722 | 0.8473 | 0.7139 | 0.8473 | 0.9205 | | 0.0759 | 6.9668 | 2724 | 0.8934 | 0.7134 | 0.8934 | 0.9452 | | 0.0759 | 6.9719 | 2726 | 0.9016 | 0.7134 | 0.9016 | 0.9495 | | 0.0759 | 6.9770 | 2728 | 0.8635 | 0.7221 | 0.8635 | 0.9292 | | 0.0759 | 6.9821 | 2730 | 0.7984 | 0.7139 | 0.7984 | 0.8935 | | 0.0759 | 6.9872 | 2732 | 0.7208 | 0.6951 | 0.7208 | 0.8490 | | 0.0759 | 6.9923 | 2734 | 0.6839 | 0.7717 | 0.6839 | 0.8270 | | 0.0759 | 6.9974 | 2736 | 0.6856 | 0.7427 | 0.6856 | 0.8280 | | 0.0759 | 7.0026 | 2738 | 0.7121 | 0.7427 | 0.7121 | 0.8439 | | 0.0759 | 7.0077 | 2740 | 0.7639 | 0.6951 | 0.7639 | 0.8740 | | 0.0759 | 7.0128 | 2742 | 0.8194 | 0.7139 | 0.8194 | 0.9052 | | 0.0759 | 7.0179 | 2744 | 0.8522 | 0.7221 | 0.8522 | 0.9232 | | 0.0759 | 7.0230 | 2746 | 0.8548 | 0.7221 | 0.8548 | 0.9246 | | 0.0759 | 7.0281 | 2748 | 0.8298 | 0.7139 | 0.8298 | 0.9109 | | 0.0759 | 7.0332 | 2750 | 0.7958 | 0.7139 | 0.7958 | 0.8921 | | 0.0759 | 7.0384 | 2752 | 0.7852 | 0.7139 | 0.7852 | 0.8861 | | 0.0759 | 7.0435 | 2754 | 0.7572 | 0.6951 | 0.7572 | 0.8702 | | 0.0759 | 7.0486 | 2756 | 0.7272 | 0.7525 | 0.7272 | 0.8528 | | 0.0759 | 7.0537 | 2758 | 0.7331 | 0.7525 | 0.7331 | 0.8562 | | 0.0759 | 7.0588 | 2760 | 0.7629 | 0.7433 | 0.7629 | 0.8735 | | 0.0759 | 7.0639 | 2762 | 0.7569 | 0.7433 | 0.7569 | 0.8700 | | 0.0759 | 7.0691 | 2764 | 0.7199 | 0.7779 | 0.7199 | 0.8485 | | 0.0759 | 7.0742 | 2766 | 0.6904 | 0.7779 | 0.6904 | 0.8309 | | 0.0759 | 7.0793 | 2768 | 0.6585 | 0.7441 | 0.6585 | 0.8115 | | 0.0759 | 7.0844 | 2770 | 0.6614 | 0.7441 | 0.6614 | 0.8133 | | 0.0759 | 7.0895 | 2772 | 0.6881 | 0.7525 | 0.6881 | 0.8295 | | 0.0759 | 7.0946 | 2774 | 0.7363 | 0.7249 | 0.7363 | 0.8581 | | 0.0759 | 7.0997 | 2776 | 0.8043 | 0.7162 | 0.8043 | 0.8968 | | 0.0759 | 7.1049 | 2778 | 0.8410 | 0.7421 | 0.8410 | 0.9171 | | 0.0759 | 7.1100 | 2780 | 0.8725 | 0.7421 | 0.8725 | 0.9341 | | 0.0759 | 7.1151 | 2782 | 0.9116 | 0.7511 | 0.9116 | 0.9548 | | 0.0759 | 7.1202 | 2784 | 0.9095 | 0.7511 | 0.9095 | 0.9537 | | 0.0759 | 7.1253 | 2786 | 0.8719 | 0.7421 | 0.8719 | 0.9337 | | 0.0759 | 7.1304 | 2788 | 0.8053 | 0.6951 | 0.8053 | 0.8974 | | 0.0759 | 7.1355 | 2790 | 0.7381 | 0.6951 | 0.7381 | 0.8591 | | 0.0759 | 7.1407 | 2792 | 0.6844 | 0.7986 | 0.6844 | 0.8273 | | 0.0759 | 7.1458 | 2794 | 0.6660 | 0.7986 | 0.6660 | 0.8161 | | 0.0759 | 7.1509 | 2796 | 0.6665 | 0.7986 | 0.6665 | 
0.8164 | | 0.0759 | 7.1560 | 2798 | 0.6902 | 0.7986 | 0.6902 | 0.8308 | | 0.0759 | 7.1611 | 2800 | 0.7276 | 0.7427 | 0.7276 | 0.8530 | | 0.0759 | 7.1662 | 2802 | 0.7456 | 0.6951 | 0.7456 | 0.8635 | | 0.0759 | 7.1714 | 2804 | 0.7722 | 0.7244 | 0.7722 | 0.8787 | | 0.0759 | 7.1765 | 2806 | 0.7818 | 0.7426 | 0.7818 | 0.8842 | | 0.0759 | 7.1816 | 2808 | 0.7629 | 0.7244 | 0.7629 | 0.8734 | | 0.0759 | 7.1867 | 2810 | 0.7145 | 0.7986 | 0.7145 | 0.8453 | | 0.0759 | 7.1918 | 2812 | 0.6661 | 0.7986 | 0.6661 | 0.8161 | | 0.0759 | 7.1969 | 2814 | 0.6503 | 0.7986 | 0.6503 | 0.8064 | | 0.0759 | 7.2020 | 2816 | 0.6641 | 0.7986 | 0.6641 | 0.8149 | | 0.0759 | 7.2072 | 2818 | 0.6906 | 0.7986 | 0.6906 | 0.8310 | | 0.0759 | 7.2123 | 2820 | 0.7290 | 0.6951 | 0.7290 | 0.8538 | | 0.0759 | 7.2174 | 2822 | 0.7743 | 0.7244 | 0.7743 | 0.8799 | | 0.0759 | 7.2225 | 2824 | 0.8037 | 0.7244 | 0.8037 | 0.8965 | | 0.0759 | 7.2276 | 2826 | 0.7886 | 0.7244 | 0.7886 | 0.8881 | | 0.0759 | 7.2327 | 2828 | 0.7490 | 0.7244 | 0.7490 | 0.8654 | | 0.0759 | 7.2379 | 2830 | 0.7047 | 0.6951 | 0.7047 | 0.8395 | | 0.0759 | 7.2430 | 2832 | 0.6731 | 0.7525 | 0.6731 | 0.8204 | | 0.0759 | 7.2481 | 2834 | 0.6684 | 0.7525 | 0.6684 | 0.8176 | | 0.0759 | 7.2532 | 2836 | 0.6960 | 0.6951 | 0.6960 | 0.8343 | | 0.0759 | 7.2583 | 2838 | 0.7507 | 0.7244 | 0.7507 | 0.8665 | | 0.0759 | 7.2634 | 2840 | 0.7859 | 0.7244 | 0.7859 | 0.8865 | | 0.0759 | 7.2685 | 2842 | 0.7884 | 0.7244 | 0.7884 | 0.8879 | | 0.0759 | 7.2737 | 2844 | 0.7979 | 0.7426 | 0.7979 | 0.8933 | | 0.0759 | 7.2788 | 2846 | 0.7856 | 0.7244 | 0.7856 | 0.8863 | | 0.0759 | 7.2839 | 2848 | 0.7766 | 0.7244 | 0.7766 | 0.8812 | | 0.0759 | 7.2890 | 2850 | 0.8080 | 0.7426 | 0.8080 | 0.8989 | | 0.0759 | 7.2941 | 2852 | 0.8371 | 0.7426 | 0.8371 | 0.9149 | | 0.0759 | 7.2992 | 2854 | 0.8530 | 0.7516 | 0.8530 | 0.9236 | | 0.0759 | 7.3043 | 2856 | 0.8588 | 0.7516 | 0.8588 | 0.9267 | | 0.0759 | 7.3095 | 2858 | 0.8386 | 0.7729 | 0.8386 | 0.9158 | | 0.0759 | 7.3146 | 2860 | 0.8471 | 0.7729 | 0.8471 | 0.9204 | | 0.0759 | 7.3197 | 2862 | 0.8616 | 0.7729 | 0.8616 | 0.9282 | | 0.0759 | 7.3248 | 2864 | 0.8623 | 0.7221 | 0.8623 | 0.9286 | | 0.0759 | 7.3299 | 2866 | 0.8528 | 0.7516 | 0.8528 | 0.9235 | | 0.0759 | 7.3350 | 2868 | 0.8499 | 0.7516 | 0.8499 | 0.9219 | | 0.0759 | 7.3402 | 2870 | 0.8174 | 0.7426 | 0.8174 | 0.9041 | | 0.0759 | 7.3453 | 2872 | 0.7900 | 0.7426 | 0.7900 | 0.8888 | | 0.0759 | 7.3504 | 2874 | 0.7679 | 0.7426 | 0.7679 | 0.8763 | | 0.0759 | 7.3555 | 2876 | 0.7425 | 0.6951 | 0.7425 | 0.8617 | | 0.0759 | 7.3606 | 2878 | 0.7474 | 0.6951 | 0.7474 | 0.8645 | | 0.0759 | 7.3657 | 2880 | 0.7710 | 0.6951 | 0.7710 | 0.8781 | | 0.0759 | 7.3708 | 2882 | 0.8098 | 0.7244 | 0.8098 | 0.8999 | | 0.0759 | 7.3760 | 2884 | 0.8453 | 0.7421 | 0.8453 | 0.9194 | | 0.0759 | 7.3811 | 2886 | 0.8920 | 0.7318 | 0.8920 | 0.9444 | | 0.0759 | 7.3862 | 2888 | 0.9115 | 0.7318 | 0.9115 | 0.9547 | | 0.0759 | 7.3913 | 2890 | 0.9134 | 0.7318 | 0.9134 | 0.9557 | | 0.0759 | 7.3964 | 2892 | 0.9371 | 0.7318 | 0.9371 | 0.9680 | | 0.0759 | 7.4015 | 2894 | 0.9357 | 0.7318 | 0.9357 | 0.9673 | | 0.0759 | 7.4066 | 2896 | 0.9191 | 0.7318 | 0.9191 | 0.9587 | | 0.0759 | 7.4118 | 2898 | 0.9116 | 0.7318 | 0.9116 | 0.9548 | | 0.0759 | 7.4169 | 2900 | 0.8957 | 0.7318 | 0.8957 | 0.9464 | | 0.0759 | 7.4220 | 2902 | 0.8747 | 0.7014 | 0.8747 | 0.9353 | | 0.0759 | 7.4271 | 2904 | 0.8351 | 0.7221 | 0.8351 | 0.9138 | | 0.0759 | 7.4322 | 2906 | 0.8151 | 0.7139 | 0.8151 | 0.9028 | | 0.0759 | 7.4373 | 2908 | 0.8149 | 0.7058 | 0.8149 | 0.9027 | | 0.0759 | 7.4425 
| 2910 | 0.8150 | 0.7058 | 0.8150 | 0.9028 | | 0.0759 | 7.4476 | 2912 | 0.8068 | 0.7058 | 0.8068 | 0.8982 | | 0.0759 | 7.4527 | 2914 | 0.7791 | 0.7337 | 0.7791 | 0.8827 | | 0.0759 | 7.4578 | 2916 | 0.7716 | 0.7162 | 0.7716 | 0.8784 | | 0.0759 | 7.4629 | 2918 | 0.7727 | 0.7162 | 0.7727 | 0.8790 | | 0.0759 | 7.4680 | 2920 | 0.7801 | 0.7337 | 0.7801 | 0.8832 | | 0.0759 | 7.4731 | 2922 | 0.7814 | 0.7337 | 0.7814 | 0.8840 | | 0.0759 | 7.4783 | 2924 | 0.7897 | 0.7337 | 0.7897 | 0.8887 | | 0.0759 | 7.4834 | 2926 | 0.7778 | 0.7427 | 0.7778 | 0.8819 | | 0.0759 | 7.4885 | 2928 | 0.7836 | 0.7427 | 0.7836 | 0.8852 | | 0.0759 | 7.4936 | 2930 | 0.8028 | 0.7729 | 0.8028 | 0.8960 | | 0.0759 | 7.4987 | 2932 | 0.7980 | 0.7729 | 0.7980 | 0.8933 | | 0.0759 | 7.5038 | 2934 | 0.7923 | 0.7525 | 0.7923 | 0.8901 | | 0.0759 | 7.5090 | 2936 | 0.7880 | 0.7525 | 0.7880 | 0.8877 | | 0.0759 | 7.5141 | 2938 | 0.7804 | 0.7525 | 0.7804 | 0.8834 | | 0.0759 | 7.5192 | 2940 | 0.7597 | 0.7427 | 0.7597 | 0.8716 | | 0.0759 | 7.5243 | 2942 | 0.7424 | 0.7427 | 0.7424 | 0.8616 | | 0.0759 | 7.5294 | 2944 | 0.7208 | 0.7427 | 0.7208 | 0.8490 | | 0.0759 | 7.5345 | 2946 | 0.7060 | 0.7983 | 0.7060 | 0.8402 | | 0.0759 | 7.5396 | 2948 | 0.6945 | 0.7983 | 0.6945 | 0.8334 | | 0.0759 | 7.5448 | 2950 | 0.7138 | 0.7521 | 0.7138 | 0.8448 | | 0.0759 | 7.5499 | 2952 | 0.7490 | 0.7162 | 0.7490 | 0.8654 | | 0.0759 | 7.5550 | 2954 | 0.7682 | 0.7162 | 0.7682 | 0.8765 | | 0.0759 | 7.5601 | 2956 | 0.7913 | 0.7162 | 0.7913 | 0.8895 | | 0.0759 | 7.5652 | 2958 | 0.7987 | 0.7162 | 0.7987 | 0.8937 | | 0.0759 | 7.5703 | 2960 | 0.8156 | 0.7162 | 0.8156 | 0.9031 | | 0.0759 | 7.5754 | 2962 | 0.8161 | 0.7162 | 0.8161 | 0.9034 | | 0.0759 | 7.5806 | 2964 | 0.7992 | 0.7162 | 0.7992 | 0.8940 | | 0.0759 | 7.5857 | 2966 | 0.7690 | 0.7244 | 0.7690 | 0.8769 | | 0.0759 | 7.5908 | 2968 | 0.7353 | 0.7713 | 0.7353 | 0.8575 | | 0.0759 | 7.5959 | 2970 | 0.7207 | 0.7427 | 0.7207 | 0.8490 | | 0.0759 | 7.6010 | 2972 | 0.7071 | 0.7427 | 0.7071 | 0.8409 | | 0.0759 | 7.6061 | 2974 | 0.7000 | 0.7427 | 0.7000 | 0.8367 | | 0.0759 | 7.6113 | 2976 | 0.7078 | 0.7427 | 0.7078 | 0.8413 | | 0.0759 | 7.6164 | 2978 | 0.7354 | 0.7427 | 0.7354 | 0.8575 | | 0.0759 | 7.6215 | 2980 | 0.7839 | 0.7162 | 0.7839 | 0.8854 | | 0.0759 | 7.6266 | 2982 | 0.8452 | 0.7421 | 0.8452 | 0.9193 | | 0.0759 | 7.6317 | 2984 | 0.8820 | 0.7421 | 0.8820 | 0.9391 | | 0.0759 | 7.6368 | 2986 | 0.9155 | 0.7421 | 0.9155 | 0.9568 | | 0.0759 | 7.6419 | 2988 | 0.9062 | 0.7421 | 0.9062 | 0.9519 | | 0.0759 | 7.6471 | 2990 | 0.8766 | 0.7421 | 0.8766 | 0.9363 | | 0.0759 | 7.6522 | 2992 | 0.8499 | 0.7421 | 0.8499 | 0.9219 | | 0.0759 | 7.6573 | 2994 | 0.8283 | 0.7421 | 0.8283 | 0.9101 | | 0.0759 | 7.6624 | 2996 | 0.7891 | 0.7337 | 0.7891 | 0.8883 | | 0.0759 | 7.6675 | 2998 | 0.7313 | 0.7717 | 0.7313 | 0.8551 | | 0.0613 | 7.6726 | 3000 | 0.6889 | 0.7717 | 0.6889 | 0.8300 | | 0.0613 | 7.6777 | 3002 | 0.6626 | 0.7986 | 0.6626 | 0.8140 | | 0.0613 | 7.6829 | 3004 | 0.6637 | 0.7986 | 0.6637 | 0.8147 | | 0.0613 | 7.6880 | 3006 | 0.6808 | 0.7986 | 0.6808 | 0.8251 | | 0.0613 | 7.6931 | 3008 | 0.7003 | 0.7986 | 0.7003 | 0.8368 | | 0.0613 | 7.6982 | 3010 | 0.7388 | 0.7521 | 0.7388 | 0.8595 | | 0.0613 | 7.7033 | 3012 | 0.7966 | 0.7602 | 0.7966 | 0.8925 | | 0.0613 | 7.7084 | 3014 | 0.8590 | 0.7337 | 0.8590 | 0.9268 | | 0.0613 | 7.7136 | 3016 | 0.9164 | 0.7255 | 0.9164 | 0.9573 | | 0.0613 | 7.7187 | 3018 | 0.9368 | 0.7255 | 0.9368 | 0.9679 | | 0.0613 | 7.7238 | 3020 | 0.9175 | 0.7255 | 0.9175 | 0.9578 | | 0.0613 | 7.7289 | 3022 | 0.8808 | 0.7337 | 
0.8808 | 0.9385 | | 0.0613 | 7.7340 | 3024 | 0.8375 | 0.7426 | 0.8375 | 0.9152 | | 0.0613 | 7.7391 | 3026 | 0.7938 | 0.7139 | 0.7938 | 0.8910 | | 0.0613 | 7.7442 | 3028 | 0.7596 | 0.7623 | 0.7596 | 0.8715 | | 0.0613 | 7.7494 | 3030 | 0.7282 | 0.7427 | 0.7282 | 0.8534 | | 0.0613 | 7.7545 | 3032 | 0.7068 | 0.7717 | 0.7068 | 0.8407 | | 0.0613 | 7.7596 | 3034 | 0.7082 | 0.7717 | 0.7082 | 0.8415 | | 0.0613 | 7.7647 | 3036 | 0.7111 | 0.7717 | 0.7111 | 0.8433 | | 0.0613 | 7.7698 | 3038 | 0.7256 | 0.7717 | 0.7256 | 0.8518 | | 0.0613 | 7.7749 | 3040 | 0.7415 | 0.7717 | 0.7415 | 0.8611 | | 0.0613 | 7.7801 | 3042 | 0.7530 | 0.7906 | 0.7530 | 0.8677 | | 0.0613 | 7.7852 | 3044 | 0.7586 | 0.7906 | 0.7586 | 0.8710 | | 0.0613 | 7.7903 | 3046 | 0.7590 | 0.7906 | 0.7590 | 0.8712 | | 0.0613 | 7.7954 | 3048 | 0.7437 | 0.7717 | 0.7437 | 0.8624 | | 0.0613 | 7.8005 | 3050 | 0.7393 | 0.7249 | 0.7393 | 0.8598 | | 0.0613 | 7.8056 | 3052 | 0.7513 | 0.7521 | 0.7513 | 0.8668 | | 0.0613 | 7.8107 | 3054 | 0.7456 | 0.7433 | 0.7456 | 0.8635 | | 0.0613 | 7.8159 | 3056 | 0.7247 | 0.7433 | 0.7247 | 0.8513 | | 0.0613 | 7.8210 | 3058 | 0.7156 | 0.7685 | 0.7156 | 0.8459 | | 0.0613 | 7.8261 | 3060 | 0.7054 | 0.7685 | 0.7054 | 0.8399 | | 0.0613 | 7.8312 | 3062 | 0.7050 | 0.7685 | 0.7050 | 0.8396 | | 0.0613 | 7.8363 | 3064 | 0.6971 | 0.7685 | 0.6971 | 0.8349 | | 0.0613 | 7.8414 | 3066 | 0.6945 | 0.7685 | 0.6945 | 0.8334 | | 0.0613 | 7.8465 | 3068 | 0.6947 | 0.7685 | 0.6947 | 0.8335 | | 0.0613 | 7.8517 | 3070 | 0.6813 | 0.7525 | 0.6813 | 0.8254 | | 0.0613 | 7.8568 | 3072 | 0.6750 | 0.7525 | 0.6750 | 0.8216 | | 0.0613 | 7.8619 | 3074 | 0.6644 | 0.7525 | 0.6644 | 0.8151 | | 0.0613 | 7.8670 | 3076 | 0.6657 | 0.7525 | 0.6657 | 0.8159 | | 0.0613 | 7.8721 | 3078 | 0.6738 | 0.7986 | 0.6738 | 0.8209 | | 0.0613 | 7.8772 | 3080 | 0.6850 | 0.7525 | 0.6850 | 0.8276 | | 0.0613 | 7.8824 | 3082 | 0.6922 | 0.7525 | 0.6922 | 0.8320 | | 0.0613 | 7.8875 | 3084 | 0.6792 | 0.7986 | 0.6792 | 0.8241 | | 0.0613 | 7.8926 | 3086 | 0.6725 | 0.7986 | 0.6725 | 0.8200 | | 0.0613 | 7.8977 | 3088 | 0.6641 | 0.7986 | 0.6641 | 0.8150 | | 0.0613 | 7.9028 | 3090 | 0.6603 | 0.7986 | 0.6603 | 0.8126 | | 0.0613 | 7.9079 | 3092 | 0.6586 | 0.7986 | 0.6586 | 0.8115 | | 0.0613 | 7.9130 | 3094 | 0.6494 | 0.7986 | 0.6494 | 0.8058 | | 0.0613 | 7.9182 | 3096 | 0.6446 | 0.7986 | 0.6446 | 0.8029 | | 0.0613 | 7.9233 | 3098 | 0.6326 | 0.7986 | 0.6326 | 0.7953 | | 0.0613 | 7.9284 | 3100 | 0.6283 | 0.7986 | 0.6283 | 0.7927 | | 0.0613 | 7.9335 | 3102 | 0.6303 | 0.7986 | 0.6303 | 0.7939 | | 0.0613 | 7.9386 | 3104 | 0.6502 | 0.7986 | 0.6502 | 0.8063 | | 0.0613 | 7.9437 | 3106 | 0.6622 | 0.7986 | 0.6622 | 0.8138 | | 0.0613 | 7.9488 | 3108 | 0.6792 | 0.7879 | 0.6792 | 0.8241 | | 0.0613 | 7.9540 | 3110 | 0.6934 | 0.7616 | 0.6934 | 0.8327 | | 0.0613 | 7.9591 | 3112 | 0.6834 | 0.7616 | 0.6834 | 0.8267 | | 0.0613 | 7.9642 | 3114 | 0.6609 | 0.7717 | 0.6609 | 0.8130 | | 0.0613 | 7.9693 | 3116 | 0.6446 | 0.7717 | 0.6446 | 0.8029 | | 0.0613 | 7.9744 | 3118 | 0.6494 | 0.7717 | 0.6494 | 0.8058 | | 0.0613 | 7.9795 | 3120 | 0.6611 | 0.7717 | 0.6611 | 0.8131 | | 0.0613 | 7.9847 | 3122 | 0.6900 | 0.7717 | 0.6900 | 0.8306 | | 0.0613 | 7.9898 | 3124 | 0.7294 | 0.7427 | 0.7294 | 0.8540 | | 0.0613 | 7.9949 | 3126 | 0.7475 | 0.7427 | 0.7475 | 0.8646 | | 0.0613 | 8.0 | 3128 | 0.7391 | 0.7427 | 0.7391 | 0.8597 | | 0.0613 | 8.0051 | 3130 | 0.7109 | 0.7427 | 0.7109 | 0.8431 | | 0.0613 | 8.0102 | 3132 | 0.6950 | 0.7427 | 0.6950 | 0.8337 | | 0.0613 | 8.0153 | 3134 | 0.6976 | 0.7427 | 0.6976 | 0.8352 | | 0.0613 | 
8.0205 | 3136 | 0.7166 | 0.7427 | 0.7166 | 0.8465 | | 0.0613 | 8.0256 | 3138 | 0.7396 | 0.7427 | 0.7396 | 0.8600 | | 0.0613 | 8.0307 | 3140 | 0.7467 | 0.7427 | 0.7467 | 0.8641 | | 0.0613 | 8.0358 | 3142 | 0.7674 | 0.6877 | 0.7674 | 0.8760 | | 0.0613 | 8.0409 | 3144 | 0.7843 | 0.6877 | 0.7843 | 0.8856 | | 0.0613 | 8.0460 | 3146 | 0.7841 | 0.6877 | 0.7841 | 0.8855 | | 0.0613 | 8.0512 | 3148 | 0.7945 | 0.7162 | 0.7945 | 0.8913 | | 0.0613 | 8.0563 | 3150 | 0.8034 | 0.7162 | 0.8034 | 0.8963 | | 0.0613 | 8.0614 | 3152 | 0.7924 | 0.7162 | 0.7924 | 0.8902 | | 0.0613 | 8.0665 | 3154 | 0.7576 | 0.7162 | 0.7576 | 0.8704 | | 0.0613 | 8.0716 | 3156 | 0.7407 | 0.7162 | 0.7407 | 0.8606 | | 0.0613 | 8.0767 | 3158 | 0.7149 | 0.7433 | 0.7149 | 0.8455 | | 0.0613 | 8.0818 | 3160 | 0.6858 | 0.7249 | 0.6858 | 0.8281 | | 0.0613 | 8.0870 | 3162 | 0.6674 | 0.7717 | 0.6674 | 0.8170 | | 0.0613 | 8.0921 | 3164 | 0.6610 | 0.7717 | 0.6610 | 0.8130 | | 0.0613 | 8.0972 | 3166 | 0.6628 | 0.7717 | 0.6628 | 0.8142 | | 0.0613 | 8.1023 | 3168 | 0.6831 | 0.7717 | 0.6831 | 0.8265 | | 0.0613 | 8.1074 | 3170 | 0.7159 | 0.7427 | 0.7159 | 0.8461 | | 0.0613 | 8.1125 | 3172 | 0.7610 | 0.7427 | 0.7610 | 0.8723 | | 0.0613 | 8.1176 | 3174 | 0.7988 | 0.7332 | 0.7988 | 0.8938 | | 0.0613 | 8.1228 | 3176 | 0.8152 | 0.7162 | 0.8152 | 0.9029 | | 0.0613 | 8.1279 | 3178 | 0.8033 | 0.7162 | 0.8033 | 0.8963 | | 0.0613 | 8.1330 | 3180 | 0.7668 | 0.7612 | 0.7668 | 0.8757 | | 0.0613 | 8.1381 | 3182 | 0.7335 | 0.7427 | 0.7335 | 0.8564 | | 0.0613 | 8.1432 | 3184 | 0.7105 | 0.7717 | 0.7105 | 0.8429 | | 0.0613 | 8.1483 | 3186 | 0.6864 | 0.7717 | 0.6864 | 0.8285 | | 0.0613 | 8.1535 | 3188 | 0.6645 | 0.7717 | 0.6645 | 0.8152 | | 0.0613 | 8.1586 | 3190 | 0.6520 | 0.7717 | 0.6520 | 0.8074 | | 0.0613 | 8.1637 | 3192 | 0.6573 | 0.7717 | 0.6573 | 0.8107 | | 0.0613 | 8.1688 | 3194 | 0.6755 | 0.7521 | 0.6755 | 0.8219 | | 0.0613 | 8.1739 | 3196 | 0.7029 | 0.7433 | 0.7029 | 0.8384 | | 0.0613 | 8.1790 | 3198 | 0.7232 | 0.7162 | 0.7232 | 0.8504 | | 0.0613 | 8.1841 | 3200 | 0.7301 | 0.7162 | 0.7301 | 0.8544 | | 0.0613 | 8.1893 | 3202 | 0.7246 | 0.7162 | 0.7246 | 0.8513 | | 0.0613 | 8.1944 | 3204 | 0.7157 | 0.7162 | 0.7157 | 0.8460 | | 0.0613 | 8.1995 | 3206 | 0.7171 | 0.7162 | 0.7171 | 0.8468 | | 0.0613 | 8.2046 | 3208 | 0.7300 | 0.7162 | 0.7300 | 0.8544 | | 0.0613 | 8.2097 | 3210 | 0.7369 | 0.7162 | 0.7369 | 0.8584 | | 0.0613 | 8.2148 | 3212 | 0.7390 | 0.7612 | 0.7390 | 0.8596 | | 0.0613 | 8.2199 | 3214 | 0.7452 | 0.7612 | 0.7452 | 0.8633 | | 0.0613 | 8.2251 | 3216 | 0.7550 | 0.7162 | 0.7550 | 0.8689 | | 0.0613 | 8.2302 | 3218 | 0.7657 | 0.7162 | 0.7657 | 0.8750 | | 0.0613 | 8.2353 | 3220 | 0.7954 | 0.7162 | 0.7954 | 0.8918 | | 0.0613 | 8.2404 | 3222 | 0.8190 | 0.7162 | 0.8190 | 0.9050 | | 0.0613 | 8.2455 | 3224 | 0.8163 | 0.7162 | 0.8163 | 0.9035 | | 0.0613 | 8.2506 | 3226 | 0.7957 | 0.7162 | 0.7957 | 0.8920 | | 0.0613 | 8.2558 | 3228 | 0.7586 | 0.7162 | 0.7586 | 0.8710 | | 0.0613 | 8.2609 | 3230 | 0.7370 | 0.6877 | 0.7370 | 0.8585 | | 0.0613 | 8.2660 | 3232 | 0.7337 | 0.6877 | 0.7337 | 0.8566 | | 0.0613 | 8.2711 | 3234 | 0.7308 | 0.7162 | 0.7308 | 0.8549 | | 0.0613 | 8.2762 | 3236 | 0.7298 | 0.7162 | 0.7298 | 0.8543 | | 0.0613 | 8.2813 | 3238 | 0.7393 | 0.7162 | 0.7393 | 0.8598 | | 0.0613 | 8.2864 | 3240 | 0.7379 | 0.7162 | 0.7379 | 0.8590 | | 0.0613 | 8.2916 | 3242 | 0.7382 | 0.7162 | 0.7382 | 0.8592 | | 0.0613 | 8.2967 | 3244 | 0.7409 | 0.7162 | 0.7409 | 0.8608 | | 0.0613 | 8.3018 | 3246 | 0.7414 | 0.7162 | 0.7414 | 0.8610 | | 0.0613 | 8.3069 | 3248 | 0.7406 | 
0.7162 | 0.7406 | 0.8606 | | 0.0613 | 8.3120 | 3250 | 0.7439 | 0.7162 | 0.7439 | 0.8625 | | 0.0613 | 8.3171 | 3252 | 0.7449 | 0.7162 | 0.7449 | 0.8631 | | 0.0613 | 8.3223 | 3254 | 0.7397 | 0.7162 | 0.7397 | 0.8601 | | 0.0613 | 8.3274 | 3256 | 0.7284 | 0.7162 | 0.7284 | 0.8535 | | 0.0613 | 8.3325 | 3258 | 0.7137 | 0.7244 | 0.7137 | 0.8448 | | 0.0613 | 8.3376 | 3260 | 0.7211 | 0.7162 | 0.7211 | 0.8492 | | 0.0613 | 8.3427 | 3262 | 0.7361 | 0.7162 | 0.7361 | 0.8580 | | 0.0613 | 8.3478 | 3264 | 0.7594 | 0.7162 | 0.7594 | 0.8714 | | 0.0613 | 8.3529 | 3266 | 0.7865 | 0.7162 | 0.7865 | 0.8868 | | 0.0613 | 8.3581 | 3268 | 0.8053 | 0.7255 | 0.8053 | 0.8974 | | 0.0613 | 8.3632 | 3270 | 0.8136 | 0.7255 | 0.8136 | 0.9020 | | 0.0613 | 8.3683 | 3272 | 0.8107 | 0.7337 | 0.8107 | 0.9004 | | 0.0613 | 8.3734 | 3274 | 0.7964 | 0.7337 | 0.7964 | 0.8924 | | 0.0613 | 8.3785 | 3276 | 0.7841 | 0.7162 | 0.7841 | 0.8855 | | 0.0613 | 8.3836 | 3278 | 0.7539 | 0.7162 | 0.7539 | 0.8683 | | 0.0613 | 8.3887 | 3280 | 0.7386 | 0.7244 | 0.7386 | 0.8594 | | 0.0613 | 8.3939 | 3282 | 0.7401 | 0.7244 | 0.7401 | 0.8603 | | 0.0613 | 8.3990 | 3284 | 0.7478 | 0.7244 | 0.7478 | 0.8648 | | 0.0613 | 8.4041 | 3286 | 0.7516 | 0.7244 | 0.7516 | 0.8670 | | 0.0613 | 8.4092 | 3288 | 0.7517 | 0.7244 | 0.7517 | 0.8670 | | 0.0613 | 8.4143 | 3290 | 0.7562 | 0.7244 | 0.7562 | 0.8696 | | 0.0613 | 8.4194 | 3292 | 0.7602 | 0.7244 | 0.7602 | 0.8719 | | 0.0613 | 8.4246 | 3294 | 0.7642 | 0.7244 | 0.7642 | 0.8742 | | 0.0613 | 8.4297 | 3296 | 0.7672 | 0.7162 | 0.7672 | 0.8759 | | 0.0613 | 8.4348 | 3298 | 0.7601 | 0.7162 | 0.7601 | 0.8719 | | 0.0613 | 8.4399 | 3300 | 0.7615 | 0.7162 | 0.7615 | 0.8726 | | 0.0613 | 8.4450 | 3302 | 0.7520 | 0.7244 | 0.7520 | 0.8672 | | 0.0613 | 8.4501 | 3304 | 0.7530 | 0.7244 | 0.7530 | 0.8677 | | 0.0613 | 8.4552 | 3306 | 0.7619 | 0.7244 | 0.7619 | 0.8729 | | 0.0613 | 8.4604 | 3308 | 0.7689 | 0.7162 | 0.7689 | 0.8768 | | 0.0613 | 8.4655 | 3310 | 0.7811 | 0.7162 | 0.7811 | 0.8838 | | 0.0613 | 8.4706 | 3312 | 0.7755 | 0.7244 | 0.7755 | 0.8806 | | 0.0613 | 8.4757 | 3314 | 0.7623 | 0.7244 | 0.7623 | 0.8731 | | 0.0613 | 8.4808 | 3316 | 0.7535 | 0.7244 | 0.7535 | 0.8681 | | 0.0613 | 8.4859 | 3318 | 0.7548 | 0.7244 | 0.7548 | 0.8688 | | 0.0613 | 8.4910 | 3320 | 0.7677 | 0.7244 | 0.7677 | 0.8762 | | 0.0613 | 8.4962 | 3322 | 0.7846 | 0.7244 | 0.7846 | 0.8858 | | 0.0613 | 8.5013 | 3324 | 0.8031 | 0.7162 | 0.8031 | 0.8962 | | 0.0613 | 8.5064 | 3326 | 0.8087 | 0.7162 | 0.8087 | 0.8993 | | 0.0613 | 8.5115 | 3328 | 0.7941 | 0.7162 | 0.7941 | 0.8911 | | 0.0613 | 8.5166 | 3330 | 0.7828 | 0.7162 | 0.7828 | 0.8847 | | 0.0613 | 8.5217 | 3332 | 0.7702 | 0.7244 | 0.7702 | 0.8776 | | 0.0613 | 8.5269 | 3334 | 0.7516 | 0.7244 | 0.7516 | 0.8669 | | 0.0613 | 8.5320 | 3336 | 0.7218 | 0.7521 | 0.7218 | 0.8496 | | 0.0613 | 8.5371 | 3338 | 0.6847 | 0.7249 | 0.6847 | 0.8274 | | 0.0613 | 8.5422 | 3340 | 0.6556 | 0.7249 | 0.6556 | 0.8097 | | 0.0613 | 8.5473 | 3342 | 0.6459 | 0.7525 | 0.6459 | 0.8037 | | 0.0613 | 8.5524 | 3344 | 0.6516 | 0.7525 | 0.6516 | 0.8072 | | 0.0613 | 8.5575 | 3346 | 0.6614 | 0.7525 | 0.6614 | 0.8133 | | 0.0613 | 8.5627 | 3348 | 0.6665 | 0.7249 | 0.6665 | 0.8164 | | 0.0613 | 8.5678 | 3350 | 0.6828 | 0.7521 | 0.6828 | 0.8263 | | 0.0613 | 8.5729 | 3352 | 0.7032 | 0.7521 | 0.7032 | 0.8386 | | 0.0613 | 8.5780 | 3354 | 0.7174 | 0.7521 | 0.7174 | 0.8470 | | 0.0613 | 8.5831 | 3356 | 0.7368 | 0.7433 | 0.7368 | 0.8584 | | 0.0613 | 8.5882 | 3358 | 0.7534 | 0.7433 | 0.7534 | 0.8680 | | 0.0613 | 8.5934 | 3360 | 0.7755 | 0.7162 | 0.7755 | 0.8806 | 
| 0.0613 | 8.5985 | 3362 | 0.7828 | 0.7162 | 0.7828 | 0.8848 | | 0.0613 | 8.6036 | 3364 | 0.7967 | 0.7162 | 0.7967 | 0.8926 | | 0.0613 | 8.6087 | 3366 | 0.7930 | 0.7162 | 0.7930 | 0.8905 | | 0.0613 | 8.6138 | 3368 | 0.7722 | 0.7433 | 0.7722 | 0.8788 | | 0.0613 | 8.6189 | 3370 | 0.7577 | 0.7433 | 0.7577 | 0.8704 | | 0.0613 | 8.6240 | 3372 | 0.7557 | 0.7433 | 0.7557 | 0.8693 | | 0.0613 | 8.6292 | 3374 | 0.7667 | 0.7162 | 0.7667 | 0.8756 | | 0.0613 | 8.6343 | 3376 | 0.7674 | 0.7162 | 0.7674 | 0.8760 | | 0.0613 | 8.6394 | 3378 | 0.7626 | 0.7162 | 0.7626 | 0.8732 | | 0.0613 | 8.6445 | 3380 | 0.7623 | 0.7162 | 0.7623 | 0.8731 | | 0.0613 | 8.6496 | 3382 | 0.7628 | 0.7162 | 0.7628 | 0.8734 | | 0.0613 | 8.6547 | 3384 | 0.7563 | 0.7162 | 0.7563 | 0.8697 | | 0.0613 | 8.6598 | 3386 | 0.7412 | 0.7244 | 0.7412 | 0.8610 | | 0.0613 | 8.6650 | 3388 | 0.7214 | 0.6951 | 0.7214 | 0.8494 | | 0.0613 | 8.6701 | 3390 | 0.7062 | 0.6951 | 0.7062 | 0.8404 | | 0.0613 | 8.6752 | 3392 | 0.6890 | 0.7249 | 0.6890 | 0.8301 | | 0.0613 | 8.6803 | 3394 | 0.6871 | 0.7249 | 0.6871 | 0.8289 | | 0.0613 | 8.6854 | 3396 | 0.6993 | 0.6951 | 0.6993 | 0.8363 | | 0.0613 | 8.6905 | 3398 | 0.7157 | 0.6951 | 0.7157 | 0.8460 | | 0.0613 | 8.6957 | 3400 | 0.7307 | 0.6951 | 0.7307 | 0.8548 | | 0.0613 | 8.7008 | 3402 | 0.7343 | 0.6951 | 0.7343 | 0.8569 | | 0.0613 | 8.7059 | 3404 | 0.7337 | 0.6951 | 0.7337 | 0.8566 | | 0.0613 | 8.7110 | 3406 | 0.7242 | 0.6951 | 0.7242 | 0.8510 | | 0.0613 | 8.7161 | 3408 | 0.7150 | 0.6951 | 0.7150 | 0.8456 | | 0.0613 | 8.7212 | 3410 | 0.7168 | 0.6951 | 0.7168 | 0.8466 | | 0.0613 | 8.7263 | 3412 | 0.7267 | 0.6951 | 0.7267 | 0.8525 | | 0.0613 | 8.7315 | 3414 | 0.7371 | 0.6951 | 0.7371 | 0.8585 | | 0.0613 | 8.7366 | 3416 | 0.7467 | 0.6951 | 0.7467 | 0.8641 | | 0.0613 | 8.7417 | 3418 | 0.7497 | 0.6951 | 0.7497 | 0.8658 | | 0.0613 | 8.7468 | 3420 | 0.7529 | 0.7162 | 0.7529 | 0.8677 | | 0.0613 | 8.7519 | 3422 | 0.7567 | 0.7162 | 0.7567 | 0.8699 | | 0.0613 | 8.7570 | 3424 | 0.7446 | 0.6951 | 0.7446 | 0.8629 | | 0.0613 | 8.7621 | 3426 | 0.7308 | 0.6951 | 0.7308 | 0.8548 | | 0.0613 | 8.7673 | 3428 | 0.7111 | 0.7249 | 0.7111 | 0.8432 | | 0.0613 | 8.7724 | 3430 | 0.7009 | 0.7249 | 0.7009 | 0.8372 | | 0.0613 | 8.7775 | 3432 | 0.7025 | 0.7249 | 0.7025 | 0.8381 | | 0.0613 | 8.7826 | 3434 | 0.7172 | 0.7249 | 0.7172 | 0.8469 | | 0.0613 | 8.7877 | 3436 | 0.7320 | 0.6951 | 0.7320 | 0.8556 | | 0.0613 | 8.7928 | 3438 | 0.7372 | 0.7244 | 0.7372 | 0.8586 | | 0.0613 | 8.7980 | 3440 | 0.7364 | 0.7244 | 0.7364 | 0.8581 | | 0.0613 | 8.8031 | 3442 | 0.7343 | 0.7244 | 0.7343 | 0.8569 | | 0.0613 | 8.8082 | 3444 | 0.7248 | 0.7249 | 0.7248 | 0.8514 | | 0.0613 | 8.8133 | 3446 | 0.7095 | 0.7249 | 0.7095 | 0.8423 | | 0.0613 | 8.8184 | 3448 | 0.7007 | 0.7249 | 0.7007 | 0.8371 | | 0.0613 | 8.8235 | 3450 | 0.6981 | 0.7249 | 0.6981 | 0.8355 | | 0.0613 | 8.8286 | 3452 | 0.6876 | 0.7525 | 0.6876 | 0.8292 | | 0.0613 | 8.8338 | 3454 | 0.6863 | 0.7525 | 0.6863 | 0.8284 | | 0.0613 | 8.8389 | 3456 | 0.6922 | 0.7249 | 0.6922 | 0.8320 | | 0.0613 | 8.8440 | 3458 | 0.6950 | 0.7249 | 0.6950 | 0.8337 | | 0.0613 | 8.8491 | 3460 | 0.6946 | 0.7525 | 0.6946 | 0.8334 | | 0.0613 | 8.8542 | 3462 | 0.7004 | 0.7525 | 0.7004 | 0.8369 | | 0.0613 | 8.8593 | 3464 | 0.7143 | 0.7249 | 0.7143 | 0.8452 | | 0.0613 | 8.8645 | 3466 | 0.7309 | 0.7249 | 0.7309 | 0.8549 | | 0.0613 | 8.8696 | 3468 | 0.7541 | 0.7162 | 0.7541 | 0.8684 | | 0.0613 | 8.8747 | 3470 | 0.7598 | 0.7162 | 0.7598 | 0.8717 | | 0.0613 | 8.8798 | 3472 | 0.7539 | 0.7162 | 0.7539 | 0.8683 | | 0.0613 | 8.8849 | 3474 | 
0.7450 | 0.7433 | 0.7450 | 0.8631 | | 0.0613 | 8.8900 | 3476 | 0.7375 | 0.7433 | 0.7375 | 0.8588 | | 0.0613 | 8.8951 | 3478 | 0.7449 | 0.7433 | 0.7449 | 0.8631 | | 0.0613 | 8.9003 | 3480 | 0.7588 | 0.7433 | 0.7588 | 0.8711 | | 0.0613 | 8.9054 | 3482 | 0.7644 | 0.7433 | 0.7644 | 0.8743 | | 0.0613 | 8.9105 | 3484 | 0.7652 | 0.7433 | 0.7652 | 0.8748 | | 0.0613 | 8.9156 | 3486 | 0.7719 | 0.7162 | 0.7719 | 0.8786 | | 0.0613 | 8.9207 | 3488 | 0.7669 | 0.7162 | 0.7669 | 0.8758 | | 0.0613 | 8.9258 | 3490 | 0.7569 | 0.7433 | 0.7569 | 0.8700 | | 0.0613 | 8.9309 | 3492 | 0.7563 | 0.7433 | 0.7563 | 0.8697 | | 0.0613 | 8.9361 | 3494 | 0.7564 | 0.7433 | 0.7564 | 0.8697 | | 0.0613 | 8.9412 | 3496 | 0.7632 | 0.7433 | 0.7632 | 0.8736 | | 0.0613 | 8.9463 | 3498 | 0.7738 | 0.7433 | 0.7738 | 0.8796 | | 0.0548 | 8.9514 | 3500 | 0.7775 | 0.7433 | 0.7775 | 0.8818 | | 0.0548 | 8.9565 | 3502 | 0.7792 | 0.7433 | 0.7792 | 0.8827 | | 0.0548 | 8.9616 | 3504 | 0.7880 | 0.7162 | 0.7880 | 0.8877 | | 0.0548 | 8.9668 | 3506 | 0.8036 | 0.7162 | 0.8036 | 0.8965 | | 0.0548 | 8.9719 | 3508 | 0.8080 | 0.7162 | 0.8080 | 0.8989 | | 0.0548 | 8.9770 | 3510 | 0.8002 | 0.7162 | 0.8002 | 0.8946 | | 0.0548 | 8.9821 | 3512 | 0.7952 | 0.7162 | 0.7952 | 0.8918 | | 0.0548 | 8.9872 | 3514 | 0.7780 | 0.7162 | 0.7780 | 0.8821 | | 0.0548 | 8.9923 | 3516 | 0.7644 | 0.7162 | 0.7644 | 0.8743 | | 0.0548 | 8.9974 | 3518 | 0.7466 | 0.6951 | 0.7466 | 0.8641 | | 0.0548 | 9.0026 | 3520 | 0.7386 | 0.6951 | 0.7386 | 0.8594 | | 0.0548 | 9.0077 | 3522 | 0.7305 | 0.6951 | 0.7305 | 0.8547 | | 0.0548 | 9.0128 | 3524 | 0.7306 | 0.6951 | 0.7306 | 0.8547 | | 0.0548 | 9.0179 | 3526 | 0.7367 | 0.6951 | 0.7367 | 0.8583 | | 0.0548 | 9.0230 | 3528 | 0.7425 | 0.6951 | 0.7425 | 0.8617 | | 0.0548 | 9.0281 | 3530 | 0.7411 | 0.6951 | 0.7411 | 0.8609 | | 0.0548 | 9.0332 | 3532 | 0.7282 | 0.6951 | 0.7282 | 0.8533 | | 0.0548 | 9.0384 | 3534 | 0.7213 | 0.7249 | 0.7213 | 0.8493 | | 0.0548 | 9.0435 | 3536 | 0.7218 | 0.7249 | 0.7218 | 0.8496 | | 0.0548 | 9.0486 | 3538 | 0.7281 | 0.6951 | 0.7281 | 0.8533 | | 0.0548 | 9.0537 | 3540 | 0.7399 | 0.6951 | 0.7399 | 0.8602 | | 0.0548 | 9.0588 | 3542 | 0.7458 | 0.6951 | 0.7458 | 0.8636 | | 0.0548 | 9.0639 | 3544 | 0.7432 | 0.6951 | 0.7432 | 0.8621 | | 0.0548 | 9.0691 | 3546 | 0.7440 | 0.6951 | 0.7440 | 0.8625 | | 0.0548 | 9.0742 | 3548 | 0.7409 | 0.6951 | 0.7409 | 0.8607 | | 0.0548 | 9.0793 | 3550 | 0.7403 | 0.7244 | 0.7403 | 0.8604 | | 0.0548 | 9.0844 | 3552 | 0.7396 | 0.6951 | 0.7396 | 0.8600 | | 0.0548 | 9.0895 | 3554 | 0.7420 | 0.7244 | 0.7420 | 0.8614 | | 0.0548 | 9.0946 | 3556 | 0.7455 | 0.7244 | 0.7455 | 0.8634 | | 0.0548 | 9.0997 | 3558 | 0.7455 | 0.7244 | 0.7455 | 0.8634 | | 0.0548 | 9.1049 | 3560 | 0.7520 | 0.7244 | 0.7520 | 0.8672 | | 0.0548 | 9.1100 | 3562 | 0.7528 | 0.7244 | 0.7528 | 0.8676 | | 0.0548 | 9.1151 | 3564 | 0.7601 | 0.7244 | 0.7601 | 0.8718 | | 0.0548 | 9.1202 | 3566 | 0.7649 | 0.7244 | 0.7649 | 0.8746 | | 0.0548 | 9.1253 | 3568 | 0.7727 | 0.7244 | 0.7727 | 0.8790 | | 0.0548 | 9.1304 | 3570 | 0.7746 | 0.7244 | 0.7746 | 0.8801 | | 0.0548 | 9.1355 | 3572 | 0.7792 | 0.7244 | 0.7792 | 0.8827 | | 0.0548 | 9.1407 | 3574 | 0.7759 | 0.7244 | 0.7759 | 0.8809 | | 0.0548 | 9.1458 | 3576 | 0.7787 | 0.7244 | 0.7787 | 0.8825 | | 0.0548 | 9.1509 | 3578 | 0.7785 | 0.7244 | 0.7785 | 0.8823 | | 0.0548 | 9.1560 | 3580 | 0.7774 | 0.7244 | 0.7774 | 0.8817 | | 0.0548 | 9.1611 | 3582 | 0.7815 | 0.7244 | 0.7815 | 0.8840 | | 0.0548 | 9.1662 | 3584 | 0.7805 | 0.6951 | 0.7805 | 0.8835 | | 0.0548 | 9.1714 | 3586 | 0.7839 | 0.6951 | 0.7839 | 
0.8854 | | 0.0548 | 9.1765 | 3588 | 0.7929 | 0.6951 | 0.7929 | 0.8904 | | 0.0548 | 9.1816 | 3590 | 0.8078 | 0.7426 | 0.8078 | 0.8988 | | 0.0548 | 9.1867 | 3592 | 0.8200 | 0.7426 | 0.8200 | 0.9056 | | 0.0548 | 9.1918 | 3594 | 0.8337 | 0.7337 | 0.8337 | 0.9130 | | 0.0548 | 9.1969 | 3596 | 0.8436 | 0.7337 | 0.8436 | 0.9185 | | 0.0548 | 9.2020 | 3598 | 0.8578 | 0.7337 | 0.8578 | 0.9262 | | 0.0548 | 9.2072 | 3600 | 0.8663 | 0.7337 | 0.8663 | 0.9307 | | 0.0548 | 9.2123 | 3602 | 0.8778 | 0.7337 | 0.8778 | 0.9369 | | 0.0548 | 9.2174 | 3604 | 0.8905 | 0.7337 | 0.8905 | 0.9437 | | 0.0548 | 9.2225 | 3606 | 0.8942 | 0.7337 | 0.8942 | 0.9456 | | 0.0548 | 9.2276 | 3608 | 0.8968 | 0.7337 | 0.8968 | 0.9470 | | 0.0548 | 9.2327 | 3610 | 0.8941 | 0.7337 | 0.8941 | 0.9456 | | 0.0548 | 9.2379 | 3612 | 0.8825 | 0.7337 | 0.8825 | 0.9394 | | 0.0548 | 9.2430 | 3614 | 0.8668 | 0.7337 | 0.8668 | 0.9310 | | 0.0548 | 9.2481 | 3616 | 0.8492 | 0.7337 | 0.8492 | 0.9215 | | 0.0548 | 9.2532 | 3618 | 0.8314 | 0.7337 | 0.8314 | 0.9118 | | 0.0548 | 9.2583 | 3620 | 0.8175 | 0.7337 | 0.8175 | 0.9042 | | 0.0548 | 9.2634 | 3622 | 0.8022 | 0.6951 | 0.8022 | 0.8957 | | 0.0548 | 9.2685 | 3624 | 0.7884 | 0.6951 | 0.7884 | 0.8879 | | 0.0548 | 9.2737 | 3626 | 0.7751 | 0.6951 | 0.7751 | 0.8804 | | 0.0548 | 9.2788 | 3628 | 0.7689 | 0.6951 | 0.7689 | 0.8768 | | 0.0548 | 9.2839 | 3630 | 0.7697 | 0.6951 | 0.7697 | 0.8773 | | 0.0548 | 9.2890 | 3632 | 0.7751 | 0.6951 | 0.7751 | 0.8804 | | 0.0548 | 9.2941 | 3634 | 0.7849 | 0.6951 | 0.7849 | 0.8860 | | 0.0548 | 9.2992 | 3636 | 0.8019 | 0.7139 | 0.8019 | 0.8955 | | 0.0548 | 9.3043 | 3638 | 0.8171 | 0.7337 | 0.8171 | 0.9039 | | 0.0548 | 9.3095 | 3640 | 0.8299 | 0.7337 | 0.8299 | 0.9110 | | 0.0548 | 9.3146 | 3642 | 0.8356 | 0.7337 | 0.8356 | 0.9141 | | 0.0548 | 9.3197 | 3644 | 0.8323 | 0.7337 | 0.8323 | 0.9123 | | 0.0548 | 9.3248 | 3646 | 0.8200 | 0.7337 | 0.8200 | 0.9055 | | 0.0548 | 9.3299 | 3648 | 0.8060 | 0.7139 | 0.8060 | 0.8978 | | 0.0548 | 9.3350 | 3650 | 0.7960 | 0.7139 | 0.7960 | 0.8922 | | 0.0548 | 9.3402 | 3652 | 0.7921 | 0.6951 | 0.7921 | 0.8900 | | 0.0548 | 9.3453 | 3654 | 0.7851 | 0.6951 | 0.7851 | 0.8861 | | 0.0548 | 9.3504 | 3656 | 0.7769 | 0.6951 | 0.7769 | 0.8814 | | 0.0548 | 9.3555 | 3658 | 0.7684 | 0.6951 | 0.7684 | 0.8766 | | 0.0548 | 9.3606 | 3660 | 0.7688 | 0.6951 | 0.7688 | 0.8768 | | 0.0548 | 9.3657 | 3662 | 0.7742 | 0.6951 | 0.7742 | 0.8799 | | 0.0548 | 9.3708 | 3664 | 0.7772 | 0.6951 | 0.7772 | 0.8816 | | 0.0548 | 9.3760 | 3666 | 0.7861 | 0.6951 | 0.7861 | 0.8866 | | 0.0548 | 9.3811 | 3668 | 0.7952 | 0.7139 | 0.7952 | 0.8917 | | 0.0548 | 9.3862 | 3670 | 0.8014 | 0.7139 | 0.8014 | 0.8952 | | 0.0548 | 9.3913 | 3672 | 0.8073 | 0.7058 | 0.8073 | 0.8985 | | 0.0548 | 9.3964 | 3674 | 0.8079 | 0.7337 | 0.8079 | 0.8988 | | 0.0548 | 9.4015 | 3676 | 0.8014 | 0.7058 | 0.8014 | 0.8952 | | 0.0548 | 9.4066 | 3678 | 0.7886 | 0.6951 | 0.7886 | 0.8880 | | 0.0548 | 9.4118 | 3680 | 0.7797 | 0.6951 | 0.7797 | 0.8830 | | 0.0548 | 9.4169 | 3682 | 0.7709 | 0.6951 | 0.7709 | 0.8780 | | 0.0548 | 9.4220 | 3684 | 0.7600 | 0.6951 | 0.7600 | 0.8718 | | 0.0548 | 9.4271 | 3686 | 0.7523 | 0.6951 | 0.7523 | 0.8673 | | 0.0548 | 9.4322 | 3688 | 0.7493 | 0.6951 | 0.7493 | 0.8656 | | 0.0548 | 9.4373 | 3690 | 0.7515 | 0.6951 | 0.7515 | 0.8669 | | 0.0548 | 9.4425 | 3692 | 0.7596 | 0.6951 | 0.7596 | 0.8715 | | 0.0548 | 9.4476 | 3694 | 0.7649 | 0.6951 | 0.7649 | 0.8746 | | 0.0548 | 9.4527 | 3696 | 0.7715 | 0.6951 | 0.7715 | 0.8783 | | 0.0548 | 9.4578 | 3698 | 0.7714 | 0.6951 | 0.7714 | 0.8783 | | 0.0548 | 9.4629 
| 3700 | 0.7733 | 0.6951 | 0.7733 | 0.8793 | | 0.0548 | 9.4680 | 3702 | 0.7801 | 0.6951 | 0.7801 | 0.8832 | | 0.0548 | 9.4731 | 3704 | 0.7894 | 0.6877 | 0.7894 | 0.8885 | | 0.0548 | 9.4783 | 3706 | 0.7959 | 0.7162 | 0.7959 | 0.8921 | | 0.0548 | 9.4834 | 3708 | 0.7984 | 0.7337 | 0.7984 | 0.8935 | | 0.0548 | 9.4885 | 3710 | 0.8011 | 0.7337 | 0.8011 | 0.8950 | | 0.0548 | 9.4936 | 3712 | 0.8014 | 0.7058 | 0.8014 | 0.8952 | | 0.0548 | 9.4987 | 3714 | 0.7981 | 0.6877 | 0.7981 | 0.8933 | | 0.0548 | 9.5038 | 3716 | 0.7927 | 0.6877 | 0.7927 | 0.8903 | | 0.0548 | 9.5090 | 3718 | 0.7947 | 0.6877 | 0.7947 | 0.8915 | | 0.0548 | 9.5141 | 3720 | 0.7959 | 0.6877 | 0.7959 | 0.8922 | | 0.0548 | 9.5192 | 3722 | 0.7966 | 0.6877 | 0.7966 | 0.8925 | | 0.0548 | 9.5243 | 3724 | 0.7943 | 0.6877 | 0.7943 | 0.8912 | | 0.0548 | 9.5294 | 3726 | 0.7920 | 0.6877 | 0.7920 | 0.8899 | | 0.0548 | 9.5345 | 3728 | 0.7883 | 0.6951 | 0.7883 | 0.8879 | | 0.0548 | 9.5396 | 3730 | 0.7837 | 0.6951 | 0.7837 | 0.8853 | | 0.0548 | 9.5448 | 3732 | 0.7791 | 0.6951 | 0.7791 | 0.8827 | | 0.0548 | 9.5499 | 3734 | 0.7787 | 0.6951 | 0.7787 | 0.8825 | | 0.0548 | 9.5550 | 3736 | 0.7773 | 0.6951 | 0.7773 | 0.8816 | | 0.0548 | 9.5601 | 3738 | 0.7812 | 0.6951 | 0.7812 | 0.8838 | | 0.0548 | 9.5652 | 3740 | 0.7856 | 0.7139 | 0.7856 | 0.8864 | | 0.0548 | 9.5703 | 3742 | 0.7869 | 0.7139 | 0.7869 | 0.8871 | | 0.0548 | 9.5754 | 3744 | 0.7895 | 0.7139 | 0.7895 | 0.8886 | | 0.0548 | 9.5806 | 3746 | 0.7902 | 0.7139 | 0.7902 | 0.8889 | | 0.0548 | 9.5857 | 3748 | 0.7945 | 0.7139 | 0.7945 | 0.8913 | | 0.0548 | 9.5908 | 3750 | 0.7954 | 0.7139 | 0.7954 | 0.8918 | | 0.0548 | 9.5959 | 3752 | 0.7952 | 0.7139 | 0.7952 | 0.8917 | | 0.0548 | 9.6010 | 3754 | 0.7928 | 0.7139 | 0.7928 | 0.8904 | | 0.0548 | 9.6061 | 3756 | 0.7928 | 0.7139 | 0.7928 | 0.8904 | | 0.0548 | 9.6113 | 3758 | 0.7962 | 0.7058 | 0.7962 | 0.8923 | | 0.0548 | 9.6164 | 3760 | 0.7968 | 0.7058 | 0.7968 | 0.8926 | | 0.0548 | 9.6215 | 3762 | 0.7949 | 0.7058 | 0.7949 | 0.8916 | | 0.0548 | 9.6266 | 3764 | 0.7925 | 0.7058 | 0.7925 | 0.8902 | | 0.0548 | 9.6317 | 3766 | 0.7893 | 0.7058 | 0.7893 | 0.8884 | | 0.0548 | 9.6368 | 3768 | 0.7888 | 0.7337 | 0.7888 | 0.8882 | | 0.0548 | 9.6419 | 3770 | 0.7905 | 0.7337 | 0.7905 | 0.8891 | | 0.0548 | 9.6471 | 3772 | 0.7943 | 0.7337 | 0.7943 | 0.8912 | | 0.0548 | 9.6522 | 3774 | 0.7944 | 0.7337 | 0.7944 | 0.8913 | | 0.0548 | 9.6573 | 3776 | 0.7940 | 0.7337 | 0.7940 | 0.8910 | | 0.0548 | 9.6624 | 3778 | 0.7901 | 0.7337 | 0.7901 | 0.8889 | | 0.0548 | 9.6675 | 3780 | 0.7880 | 0.7337 | 0.7880 | 0.8877 | | 0.0548 | 9.6726 | 3782 | 0.7848 | 0.6877 | 0.7848 | 0.8859 | | 0.0548 | 9.6777 | 3784 | 0.7855 | 0.7337 | 0.7855 | 0.8863 | | 0.0548 | 9.6829 | 3786 | 0.7870 | 0.7337 | 0.7870 | 0.8871 | | 0.0548 | 9.6880 | 3788 | 0.7845 | 0.6877 | 0.7845 | 0.8857 | | 0.0548 | 9.6931 | 3790 | 0.7815 | 0.6951 | 0.7815 | 0.8840 | | 0.0548 | 9.6982 | 3792 | 0.7804 | 0.6951 | 0.7804 | 0.8834 | | 0.0548 | 9.7033 | 3794 | 0.7783 | 0.6951 | 0.7783 | 0.8822 | | 0.0548 | 9.7084 | 3796 | 0.7799 | 0.6951 | 0.7799 | 0.8831 | | 0.0548 | 9.7136 | 3798 | 0.7824 | 0.6951 | 0.7824 | 0.8845 | | 0.0548 | 9.7187 | 3800 | 0.7851 | 0.7162 | 0.7851 | 0.8861 | | 0.0548 | 9.7238 | 3802 | 0.7860 | 0.7162 | 0.7860 | 0.8865 | | 0.0548 | 9.7289 | 3804 | 0.7851 | 0.7162 | 0.7851 | 0.8861 | | 0.0548 | 9.7340 | 3806 | 0.7837 | 0.6877 | 0.7837 | 0.8853 | | 0.0548 | 9.7391 | 3808 | 0.7823 | 0.6877 | 0.7823 | 0.8845 | | 0.0548 | 9.7442 | 3810 | 0.7844 | 0.7162 | 0.7844 | 0.8857 | | 0.0548 | 9.7494 | 3812 | 0.7880 | 0.7162 | 
0.7880 | 0.8877 | | 0.0548 | 9.7545 | 3814 | 0.7921 | 0.7337 | 0.7921 | 0.8900 | | 0.0548 | 9.7596 | 3816 | 0.7953 | 0.7337 | 0.7953 | 0.8918 | | 0.0548 | 9.7647 | 3818 | 0.7962 | 0.7337 | 0.7962 | 0.8923 | | 0.0548 | 9.7698 | 3820 | 0.7968 | 0.7337 | 0.7968 | 0.8927 | | 0.0548 | 9.7749 | 3822 | 0.7943 | 0.7337 | 0.7943 | 0.8912 | | 0.0548 | 9.7801 | 3824 | 0.7894 | 0.6877 | 0.7894 | 0.8885 | | 0.0548 | 9.7852 | 3826 | 0.7829 | 0.6951 | 0.7829 | 0.8848 | | 0.0548 | 9.7903 | 3828 | 0.7752 | 0.6951 | 0.7752 | 0.8804 | | 0.0548 | 9.7954 | 3830 | 0.7691 | 0.6951 | 0.7691 | 0.8770 | | 0.0548 | 9.8005 | 3832 | 0.7651 | 0.6951 | 0.7651 | 0.8747 | | 0.0548 | 9.8056 | 3834 | 0.7608 | 0.6951 | 0.7608 | 0.8722 | | 0.0548 | 9.8107 | 3836 | 0.7564 | 0.6951 | 0.7564 | 0.8697 | | 0.0548 | 9.8159 | 3838 | 0.7549 | 0.6951 | 0.7549 | 0.8689 | | 0.0548 | 9.8210 | 3840 | 0.7542 | 0.6951 | 0.7542 | 0.8684 | | 0.0548 | 9.8261 | 3842 | 0.7530 | 0.6951 | 0.7530 | 0.8677 | | 0.0548 | 9.8312 | 3844 | 0.7525 | 0.6951 | 0.7525 | 0.8675 | | 0.0548 | 9.8363 | 3846 | 0.7532 | 0.6951 | 0.7532 | 0.8678 | | 0.0548 | 9.8414 | 3848 | 0.7545 | 0.6951 | 0.7545 | 0.8686 | | 0.0548 | 9.8465 | 3850 | 0.7554 | 0.6951 | 0.7554 | 0.8692 | | 0.0548 | 9.8517 | 3852 | 0.7568 | 0.6951 | 0.7568 | 0.8700 | | 0.0548 | 9.8568 | 3854 | 0.7577 | 0.6951 | 0.7577 | 0.8705 | | 0.0548 | 9.8619 | 3856 | 0.7591 | 0.6951 | 0.7591 | 0.8712 | | 0.0548 | 9.8670 | 3858 | 0.7596 | 0.6951 | 0.7596 | 0.8716 | | 0.0548 | 9.8721 | 3860 | 0.7604 | 0.6951 | 0.7604 | 0.8720 | | 0.0548 | 9.8772 | 3862 | 0.7614 | 0.6951 | 0.7614 | 0.8726 | | 0.0548 | 9.8824 | 3864 | 0.7619 | 0.6951 | 0.7619 | 0.8729 | | 0.0548 | 9.8875 | 3866 | 0.7631 | 0.6951 | 0.7631 | 0.8735 | | 0.0548 | 9.8926 | 3868 | 0.7641 | 0.6951 | 0.7641 | 0.8741 | | 0.0548 | 9.8977 | 3870 | 0.7637 | 0.6951 | 0.7637 | 0.8739 | | 0.0548 | 9.9028 | 3872 | 0.7637 | 0.6951 | 0.7637 | 0.8739 | | 0.0548 | 9.9079 | 3874 | 0.7646 | 0.6951 | 0.7646 | 0.8744 | | 0.0548 | 9.9130 | 3876 | 0.7660 | 0.6951 | 0.7660 | 0.8752 | | 0.0548 | 9.9182 | 3878 | 0.7676 | 0.6951 | 0.7676 | 0.8761 | | 0.0548 | 9.9233 | 3880 | 0.7684 | 0.6951 | 0.7684 | 0.8766 | | 0.0548 | 9.9284 | 3882 | 0.7692 | 0.6951 | 0.7692 | 0.8770 | | 0.0548 | 9.9335 | 3884 | 0.7693 | 0.6951 | 0.7693 | 0.8771 | | 0.0548 | 9.9386 | 3886 | 0.7696 | 0.6951 | 0.7696 | 0.8773 | | 0.0548 | 9.9437 | 3888 | 0.7700 | 0.6951 | 0.7700 | 0.8775 | | 0.0548 | 9.9488 | 3890 | 0.7707 | 0.6951 | 0.7707 | 0.8779 | | 0.0548 | 9.9540 | 3892 | 0.7718 | 0.6951 | 0.7718 | 0.8785 | | 0.0548 | 9.9591 | 3894 | 0.7723 | 0.6951 | 0.7723 | 0.8788 | | 0.0548 | 9.9642 | 3896 | 0.7727 | 0.6951 | 0.7727 | 0.8790 | | 0.0548 | 9.9693 | 3898 | 0.7729 | 0.6951 | 0.7729 | 0.8791 | | 0.0548 | 9.9744 | 3900 | 0.7731 | 0.6951 | 0.7731 | 0.8793 | | 0.0548 | 9.9795 | 3902 | 0.7731 | 0.6951 | 0.7731 | 0.8793 | | 0.0548 | 9.9847 | 3904 | 0.7731 | 0.6951 | 0.7731 | 0.8792 | | 0.0548 | 9.9898 | 3906 | 0.7730 | 0.6951 | 0.7730 | 0.8792 | | 0.0548 | 9.9949 | 3908 | 0.7729 | 0.6951 | 0.7729 | 0.8792 | | 0.0548 | 10.0 | 3910 | 0.7729 | 0.6951 | 0.7729 | 0.8792 |

### Framework versions

- Transformers 4.44.2
- Pytorch 2.4.0+cu118
- Datasets 2.21.0
- Tokenizers 0.19.1
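In the logged rows above, the validation-loss column and the penultimate numeric column are identical, and the final column is their square root, which is consistent with an MSE/RMSE pair. A quick sanity check on the final logged row (the column meanings are an assumption here, since the table header sits above this excerpt):

```python
import math

mse = 0.7729   # penultimate column of the final row (assumed to be MSE)
rmse = 0.8792  # final column of the same row (assumed to be RMSE)

# RMSE should equal sqrt(MSE) up to the table's rounding.
assert math.isclose(math.sqrt(mse), rmse, abs_tol=5e-4)
print(math.sqrt(mse))  # ~0.8791
```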
GrupoUtec/ModeloTraductor
GrupoUtec
2024-11-14T04:18:13Z
159
0
transformers
[ "transformers", "tensorboard", "safetensors", "marian", "text2text-generation", "translation", "es", "en", "base_model:Helsinki-NLP/opus-mt-en-es", "base_model:finetune:Helsinki-NLP/opus-mt-en-es", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:us" ]
translation
2024-11-09T16:29:54Z
---
license: apache-2.0
language:
- es
- en
base_model:
- Helsinki-NLP/opus-mt-es-en
- Helsinki-NLP/opus-mt-en-es
pipeline_tag: translation
library_name: transformers
---
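The card itself is metadata only. A minimal usage sketch, assuming the fine-tuned checkpoint keeps the standard Marian translation interface of its `Helsinki-NLP/opus-mt-en-es` base (the example sentence is illustrative):

```python
from transformers import pipeline

# Load the fine-tuned translator from the Hub via the generic translation task.
translator = pipeline("translation", model="GrupoUtec/ModeloTraductor")

result = translator("Machine translation is improving quickly.")
print(result[0]["translation_text"])
```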
5CD-AI/ColVintern-1B-v1
5CD-AI
2024-11-14T04:13:26Z
136
6
transformers
[ "transformers", "safetensors", "internvl_chat", "feature-extraction", "colpali", "custom_code", "vi", "en", "dataset:vidore/colpali_train_set", "dataset:5CD-AI/Viet-Doc-VQA", "dataset:5CD-AI/Viet-OCR-VQA", "dataset:5CD-AI/Viet-Doc-VQA-II", "base_model:5CD-AI/Vintern-1B-v2", "base_model:finetune:5CD-AI/Vintern-1B-v2", "region:us" ]
feature-extraction
2024-11-13T09:39:36Z
---
library_name: transformers
language:
- vi
- en
base_model:
- 5CD-AI/Vintern-1B-v2
datasets:
- vidore/colpali_train_set
- 5CD-AI/Viet-Doc-VQA
- 5CD-AI/Viet-OCR-VQA
- 5CD-AI/Viet-Doc-VQA-II
tags:
- colpali
---
<div align="center">
  <img src="colvintern.png" width="400"/>
</div>

## ColVintern-1B-v1 🇻🇳 ❄️ - ColPali version for Vietnamese.

**What's new in ColVintern-1B-v1!**

- We implemented and successfully trained the **ColPali pipeline** for **Vintern-1B-v2**. The model supports RAG by extracting embedding vectors for queries and for images containing related information.
- This is the first experimental version, trained on the [**ColPali dataset**](https://huggingface.co/datasets/vidore/colpali_train_set) for English plus **5%** of the image-based question-answer pairs we have for Vietnamese.
- The model achieves results nearly equivalent to ColPali v1 while offering strong support for Vietnamese text, with only 1 billion parameters compared to current 2B-3B ColPali models.

## ColPali Benchmarks

We evaluated on the [**ViDoRe benchmark**](https://huggingface.co/collections/vidore/vidore-benchmark-667173f98e70a1c0fa4db00d) from the ColPali paper. The **TabF** and **Shift** test sets were excluded because they are in French; we plan to expand to multiple languages in the near future.

|                                | ArxivQ   | DocQ   | InfoQ   | TATQ   | AI     | Energy   | Gov.    | Health.   | Avg.     |
|:------------------------------:|:--------:|:------:|:-------:|:------:|:------:|:--------:|:-------:|:---------:|:--------:|
| **Unstructured** Text only     |          |        |         |        |        |          |         |           |          |
| - BM25                         | -        | 34.1   | -       | 44.0   | 90.4   | 78.3     | 78.8    | 82.6      | -        |
| - BGE-M3                       | -        | 28.4   | -       | 36.1   | 88.4   | 76.8     | 77.7    | 84.6      | -        |
| **Unstructured** + OCR         |          |        |         |        |        |          |         |           |          |
| - BM25                         | 31.6     | 36.8   | 62.9    | 62.7   | 92.8   | 85.9     | 83.9    | 87.2      | 68.0     |
| - BGE-M3                       | 31.4     | 25.7   | 60.1    | 50.5   | 90.2   | 83.6     | 84.9    | 91.1      | 64.7     |
| **Unstructured** + Captioning  |          |        |         |        |        |          |         |           |          |
| - BM25                         | 40.1     | 38.4   | 70.0    | 61.5   | 88.0   | 84.7     | 82.7    | 89.2      | 69.3     |
| - BGE-M3                       | 35.7     | 32.9   | 71.9    | 43.8   | 88.8   | 83.3     | 80.4    | 91.3      | 66.0     |
| **Contrastive VLMs**           |          |        |         |        |        |          |         |           |          |
| - Jina-CLIP                    | 25.4     | 11.9   | 35.5    | 3.3    | 15.2   | 19.7     | 21.4    | 20.8      | 19.2     |
| - Nomic-vision                 | 17.1     | 10.7   | 30.1    | 2.7    | 12.9   | 10.9     | 11.4    | 15.7      | 13.9     |
| - SigLIP (Vanilla)             | 43.2     | 30.3   | 64.1    | 26.2   | 62.5   | 65.7     | 66.1    | 79.1      | 54.7     |
| **ColPali**                    |          |        |         |        |        |          |         |           |          |
| - SigLIP (Vanilla)             | 43.2     | 30.3   | 64.1    | 26.2   | 62.5   | 65.7     | 66.1    | 79.1      | 54.7     |
| - BiSigLIP (+fine-tuning)      | 58.5     | 32.9   | 70.5    | 30.5   | 74.3   | 73.7     | 74.2    | 82.3      | 62.1     |
| - BiPali (+LLM)                | 56.5     | 30.0   | 67.4    | 33.4   | 71.2   | 61.9     | 73.8    | 73.6      | 58.5     |
| - ColPali (+Late Inter.)       | **79.1** | **54.4** | 81.8  | **65.8** | **96.2** | **91.0** | **92.7** | 94.4   | **81.3** |
| **Ours**                       |          |        |         |        |        |          |         |           |          |
| - ColVintern-1B (+Late Inter.) | 71.6     | 48.3   | **84.6** | 59.6  | 92.9   | 88.7     | 89.4    | **95.2**  | 78.8     |

We are expanding the training dataset for upcoming versions (adding hard-negative mining, increasing GPU VRAM, etc.) to achieve better results.

## Examples

Input Images:

<div style="display: flex; gap: 20px;">
  <img src="ex1.jpg" width="300"/>
  <img src="ex2.jpg" width="300"/>
</div>

Input Queries:
```
queries = ["Cảng Hải Phòng thông báo gì ?","Phí giao hàng bao nhiêu ?"]
```

Output Scores:

| Query | Image 1 Score | Image 2 Score |
|--------------------------------------|---------------|---------------|
| Cảng Hải Phòng thông báo gì ? | 62.4333 | 59.9523 |
| Phí giao hàng bao nhiêu ? | 60.7748 | 62.8654 |

## Quickstart

Colab: https://colab.research.google.com/drive/1-y8HLiyS0oCj7Vpy4i7FsJ1A6kU7ROca?usp=sharing

```python
import torch
from PIL import Image
from transformers import AutoModel, AutoTokenizer, AutoProcessor
import matplotlib.pyplot as plt

model_name = "5CD-AI/ColVintern-1B-v1"

processor = AutoProcessor.from_pretrained(
    model_name,
    trust_remote_code=True
)

model = AutoModel.from_pretrained(
    model_name,
    torch_dtype=torch.bfloat16,
    low_cpu_mem_usage=True,
    trust_remote_code=True,
).eval().cuda()

# Download the example images:
#!wget https://huggingface.co/5CD-AI/ColVintern-1B-v1/resolve/main/ex1.jpg
#!wget https://huggingface.co/5CD-AI/ColVintern-1B-v1/resolve/main/ex2.jpg
images = [Image.open("ex1.jpg"), Image.open("ex2.jpg")]
batch_images = processor.process_images(images)

queries = [
    "Cảng Hải Phòng thông báo gì ?",
    "Phí giao hàng bao nhiêu ?",
]
batch_queries = processor.process_queries(queries)

# Move inputs to the GPU and match the model's bfloat16 dtype.
batch_images["pixel_values"] = batch_images["pixel_values"].cuda().bfloat16()
batch_images["input_ids"] = batch_images["input_ids"].cuda()
batch_images["attention_mask"] = batch_images["attention_mask"].cuda().bfloat16()
batch_queries["input_ids"] = batch_queries["input_ids"].cuda()
batch_queries["attention_mask"] = batch_queries["attention_mask"].cuda().bfloat16()

# Embed both modalities with the same model.
with torch.no_grad():
    image_embeddings = model(**batch_images)
    query_embeddings = model(**batch_queries)

# Late-interaction (MaxSim) scoring between query and image multi-vectors;
# see the sketch below for what this computes.
scores = processor.score_multi_vector(query_embeddings, image_embeddings)

max_scores, max_indices = torch.max(scores, dim=1)

# Print the best-matching image and its score for each query.
for i, query in enumerate(queries):
    print(f"Query: '{query}'")
    print(f"Score: {max_scores[i].item()}\n")
    plt.figure(figsize=(5, 5))
    plt.imshow(images[max_indices[i]])
    plt.show()
```

## Citation

```
```
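For reference, `score_multi_vector` computes a ColBERT-style late-interaction (MaxSim) score: for every query token it takes the maximum similarity over all image-patch embeddings, then sums over query tokens. A minimal sketch, assuming each item is a `[num_tokens, dim]` tensor of L2-normalized embeddings (the released processor's exact normalization and batching may differ):

```python
import torch

def maxsim_score(query_emb: torch.Tensor, image_emb: torch.Tensor) -> torch.Tensor:
    """Late-interaction (MaxSim) score between one query and one image.

    query_emb: [n_query_tokens, dim], image_emb: [n_image_tokens, dim];
    both are assumed L2-normalized, so dot products are cosine similarities.
    """
    sim = query_emb @ image_emb.T        # [n_query_tokens, n_image_tokens]
    return sim.max(dim=1).values.sum()   # best image token per query token, summed

def score_all(queries: list, images: list) -> torch.Tensor:
    """Reproduces the [n_queries, n_images] score matrix, up to implementation details."""
    return torch.stack([
        torch.stack([maxsim_score(q, d) for d in images]) for q in queries
    ])
```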
emarro/axial_caducues_1200
emarro
2024-11-14T04:09:53Z
101
0
transformers
[ "transformers", "pytorch", "axial_caduceus", "fill-mask", "custom_code", "arxiv:1910.09700", "autotrain_compatible", "region:us" ]
fill-mask
2024-11-14T03:53:04Z
--- library_name: transformers tags: [] --- # Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
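The card above is still the auto-generated placeholder, but the repository metadata (a `fill-mask` pipeline with `custom_code`) suggests a loading path for the "How to Get Started with the Model" section. A minimal sketch under those assumptions; whether the repo bundles a tokenizer, and the exact head class, are not confirmed by the card:

```python
from transformers import AutoModelForMaskedLM, AutoTokenizer

# trust_remote_code=True is needed because "axial_caduceus" is a custom
# architecture registered by code shipped in the repository, not in transformers.
model_id = "emarro/axial_caducues_1200"
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForMaskedLM.from_pretrained(model_id, trust_remote_code=True)
```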
HappyAIUser/AtmasiddhiGPTv9-gguf
HappyAIUser
2024-11-14T04:06:11Z
9
1
null
[ "gguf", "atmasiddhi-shastra", "spirituality", "jainism", "philosophy", "conversational-ai", "religious-text", "self-realization", "text-generation", "en", "license:apache-2.0", "endpoints_compatible", "region:us", "conversational" ]
text-generation
2024-11-14T02:59:06Z
--- language: en tags: - atmasiddhi-shastra - spirituality - jainism - philosophy - conversational-ai - religious-text - self-realization license: apache-2.0 pipeline_tag: text-generation --- # AtmasiddhiGPTv9-gguf **AtmasiddhiGPTv9-gguf** is a fine-tuned version of the **LLaMA 3.2 3B Instruct** model, designed to provide conversational insights and spiritual guidance based on the teachings of the *Atmasiddhi Shastra*, a revered Jain text by Shrimad Rajchandra. This model is specially aligned with contemporary interpretations by Shri Gurudevshri Rakeshbhai, making it a valuable tool for exploring the themes of self-realization, soul, and liberation in Jain philosophy. ## Model Details - **Model Name**: AtmasiddhiGPTv9-gguf - **Base Model**: LLaMA 3.2 3B Instruct (Meta) - **Model Type**: Language Model (GGUF format) - **Language**: English - **Intended Use**: Spiritual guidance, philosophical inquiry, Jain studies, self-reflection - **Alignment**: Based on the recent commentaries and teachings of Shri Gurudevshri Rakeshbhai on the *Atmasiddhi Shastra* - **Recommended Platforms**: LM Studio, Jan (support GGUF models) - **License**: Apache 2.0 - **Framework**: GGUF-compatible ## Model Scope and Purpose **AtmasiddhiGPTv9-gguf** is designed to serve as an interactive tool for individuals seeking a deeper understanding of Jain spiritual concepts, guided by the most recent teachings of Shri Gurudevshri Rakeshbhai. This model uses the philosophical foundation of the *Atmasiddhi Shastra* while adopting the conversational style of the LLaMA 3.2 3B Instruct model, ensuring responses are both spiritually aligned and easily understandable. ### Key Philosophical Themes The model focuses on interpreting key themes of the *Atmasiddhi Shastra*, particularly as presented in Shri Gurudevshri Rakeshbhai’s teachings. These include: 1. **The Nature of the Soul (Atma)**: Exploring the soul's inherent qualities, permanence, and its distinction from physical existence. 2. **Path to Liberation (Moksha)**: Insights into the steps and virtues needed to achieve liberation from the cycle of birth and death. 3. **Karma and Its Impact**: Explanations of karmic law, the effects of accumulated karma, and how it shapes the soul’s journey. 4. **Self-Realization**: Encouraging self-inquiry to unveil true self-identity and transcend ego-driven life. 5. **Discernment and Detachment (Vairagya)**: Offering practical advice on embracing detachment, renouncing material attachments, and cultivating spiritual insight. The model seeks to convey these themes with the depth and clarity characteristic of Shri Gurudevshri’s teachings, while maintaining the conversational ease provided by the LLaMA 3.2 3B Instruct model architecture. ## Recommended Platforms: LM Studio and Jan AtmasiddhiGPTv9-gguf is optimized for use with GGUF-compatible applications like LM Studio and Jan, which allow local, offline interactions with the model. ### [LM Studio](https://lmstudio.ai/download) LM Studio is a free application supporting GGUF-formatted models, ideal for downloading and running large language models offline. **How to Use AtmasiddhiGPTv9-gguf with LM Studio**: 1. **Download LM Studio**: Visit the [LM Studio download page](https://lmstudio.ai/download) and choose your operating system. 2. **Install and Launch**: Follow the installation instructions provided. 3. **Load the Model**: - Search for "AtmasiddhiGPTv9-gguf" in the model catalog, or import it manually if previously downloaded. 
   - Interact with the model via LM Studio’s chat interface or set up a local API server for integration into applications.

For additional guidance, refer to the [LM Studio Documentation](https://lmstudio.ai/docs).

### [Jan](https://jan.ai/download)

Jan is an open-source application that supports GGUF models, allowing users to interact with models entirely offline.

**How to Use AtmasiddhiGPTv9-gguf with Jan**:

1. **Download Jan**: Access the [Jan download page](https://jan.ai/download).
2. **Install and Launch Jan**: Follow the setup instructions.
3. **Import the Model**:
   - Use Jan’s model management section to add the AtmasiddhiGPTv9-gguf model.
   - Engage with the model via Jan’s conversational interface.

Refer to the [Jan Documentation](https://jan.ai/docs) for more details.

## Example Code for Local Use

Recent versions of the `transformers` library can read GGUF checkpoints by passing a `gguf_file` argument; the filename below is illustrative, so check the repository's file listing for the actual quantized file:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Model path; the GGUF filename is hypothetical, replace it with the
# actual file shipped in the repository.
model_name = "HappyAIUser/AtmasiddhiGPTv9-gguf"
gguf_file = "AtmasiddhiGPTv9.gguf"

tokenizer = AutoTokenizer.from_pretrained(model_name, gguf_file=gguf_file)
model = AutoModelForCausalLM.from_pretrained(model_name, gguf_file=gguf_file)

# Sample input
input_text = "What insights does Atmasiddhi offer about liberation?"
inputs = tokenizer(input_text, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)  # cap generated tokens, not total length
response = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(response)
```
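Because the repository ships GGUF weights, a GGUF-native runtime is often the most reliable path outside LM Studio and Jan. A minimal sketch using `llama-cpp-python`; the filename glob and context size are assumptions, so match them to the file actually in the repo:

```python
# pip install llama-cpp-python huggingface-hub
from llama_cpp import Llama

# Downloads a GGUF file from the Hub; the filename pattern is an assumption.
llm = Llama.from_pretrained(
    repo_id="HappyAIUser/AtmasiddhiGPTv9-gguf",
    filename="*.gguf",   # hypothetical: narrow this to the quant you want
    n_ctx=2048,          # context window; adjust as needed
)

out = llm.create_chat_completion(
    messages=[{
        "role": "user",
        "content": "What insights does Atmasiddhi offer about liberation?",
    }],
    max_tokens=128,
)
print(out["choices"][0]["message"]["content"])
```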