Dataset preview schema: `modelId` string (5-139 chars) | `author` string (2-42 chars) | `last_modified` timestamp[us, tz=UTC] (2020-02-15 11:33:14 to 2025-08-29 00:38:39) | `downloads` int64 (0 to 223M) | `likes` int64 (0 to 11.7k) | `library_name` string (525 classes) | `tags` list (1 to 4.05k items) | `pipeline_tag` string (55 classes) | `createdAt` timestamp[us, tz=UTC] (2022-03-02 23:29:04 to 2025-08-29 00:38:28) | `card` string (11 chars to 1.01M chars)
dcaustin33/llama_friends | dcaustin33 | 2023-12-23T04:51:14Z | 2 | 0 | peft |
[
"peft",
"safetensors",
"arxiv:1910.09700",
"base_model:meta-llama/Llama-2-7b-hf",
"base_model:adapter:meta-llama/Llama-2-7b-hf",
"region:us"
] | null | 2023-12-23T02:37:28Z |
---
library_name: peft
base_model: meta-llama/Llama-2-7b-hf
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
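The card leaves this section empty; the following is a minimal, untested sketch based only on the metadata above (`library_name: peft`, `base_model: meta-llama/Llama-2-7b-hf`). Imports are deferred inside the function so nothing downloads until it is called; note the Llama-2 base weights are gated and require access approval.

```python
def load_llama_friends(base_id="meta-llama/Llama-2-7b-hf",
                       adapter_id="dcaustin33/llama_friends"):
    """Load the base model and apply this repo's PEFT adapter.

    Imports are kept inside the function so the sketch can be defined
    without transformers/peft installed; calling it downloads the
    (gated) Llama-2 base weights plus the adapter.
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer
    from peft import PeftModel

    tokenizer = AutoTokenizer.from_pretrained(base_id)
    model = AutoModelForCausalLM.from_pretrained(base_id)
    model = PeftModel.from_pretrained(model, adapter_id)
    return tokenizer, model
```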
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
### Framework versions
- PEFT 0.7.1
|
jebcarter/Psyonic-Rose-20B-GGUF | jebcarter | 2023-12-23T04:50:39Z | 21 | 5 | null |
[
"gguf",
"license:other",
"endpoints_compatible",
"region:us"
] | null | 2023-12-23T02:35:30Z |
---
license: other
license_name: microsoft-research-license
license_link: LICENSE
---
|
DaRkSpyro/JewelTheMacaw | DaRkSpyro | 2023-12-23T04:41:24Z | 0 | 0 | flair |
[
"flair",
"music",
"en",
"dataset:HuggingFaceH4/no_robots",
"license:apache-2.0",
"region:us"
] | null | 2023-12-23T04:25:09Z |
---
license: apache-2.0
language:
- en
metrics:
- accuracy
tags:
- music
datasets:
- HuggingFaceH4/no_robots
library_name: flair
---
|
Maxx0/mistral_instruct_generation | Maxx0 | 2023-12-23T04:21:07Z | 1 | 0 | peft |
[
"peft",
"tensorboard",
"safetensors",
"trl",
"sft",
"generated_from_trainer",
"dataset:generator",
"base_model:mistralai/Mistral-7B-Instruct-v0.1",
"base_model:adapter:mistralai/Mistral-7B-Instruct-v0.1",
"license:apache-2.0",
"region:us"
] | null | 2023-12-23T04:20:58Z |
---
license: apache-2.0
library_name: peft
tags:
- trl
- sft
- generated_from_trainer
datasets:
- generator
base_model: mistralai/Mistral-7B-Instruct-v0.1
model-index:
- name: mistral_instruct_generation
  results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# mistral_instruct_generation
This model is a fine-tuned version of [mistralai/Mistral-7B-Instruct-v0.1](https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.1) on the generator dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0138
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 4
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: constant
- lr_scheduler_warmup_steps: 0.03
- training_steps: 100
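For intuition, one update of the listed optimizer (Adam with betas=(0.9, 0.999), epsilon=1e-08, learning rate 0.0002) can be written in plain Python; this is a pedagogical sketch, not the trainer's actual code:

```python
import math

def adam_step(param, grad, m, v, t, lr=2e-4,
              beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update with bias correction (Kingma & Ba)."""
    m = beta1 * m + (1 - beta1) * grad              # first-moment EMA
    v = beta2 * v + (1 - beta2) * grad * grad       # second-moment EMA
    m_hat = m / (1 - beta1 ** t)                    # bias correction
    v_hat = v / (1 - beta2 ** t)
    param -= lr * m_hat / (math.sqrt(v_hat) + eps)
    return param, m, v

# The first step with gradient 1.0 moves the weight by about one learning rate.
p, m, v = adam_step(0.0, 1.0, 0.0, 0.0, t=1)
print(round(p, 6))  # -0.0002
```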
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 0.3071 | 20.0 | 20 | 0.0966 |
| 0.0239 | 40.0 | 40 | 0.0214 |
| 0.0192 | 60.0 | 60 | 0.0189 |
| 0.0179 | 80.0 | 80 | 0.0173 |
| 0.0149 | 100.0 | 100 | 0.0138 |
### Framework versions
- PEFT 0.7.1
- Transformers 4.36.2
- Pytorch 2.1.2+cu121
- Datasets 2.16.0
- Tokenizers 0.15.0
|
dthseemsbttr/gpt2-finetuned-wikitext2-copy | dthseemsbttr | 2023-12-23T04:13:03Z | 5 | 0 | transformers |
[
"transformers",
"tf",
"gpt2",
"text-generation",
"generated_from_keras_callback",
"base_model:openai-community/gpt2",
"base_model:finetune:openai-community/gpt2",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-generation | 2023-12-22T17:33:10Z |
---
license: mit
base_model: gpt2
tags:
- generated_from_keras_callback
model-index:
- name: gpt2-finetuned-wikitext2
  results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# gpt2-finetuned-wikitext2
This model is a fine-tuned version of [gpt2](https://huggingface.co/gpt2) on an unknown dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 2e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
### Training results
### Framework versions
- Transformers 4.36.2
- TensorFlow 2.13.0
- Datasets 2.16.0
- Tokenizers 0.15.0
|
adandu/dreambooth_output | adandu | 2023-12-23T04:02:14Z | 0 | 0 | diffusers |
[
"diffusers",
"tensorboard",
"safetensors",
"stable-diffusion",
"stable-diffusion-diffusers",
"text-to-image",
"dreambooth",
"base_model:runwayml/stable-diffusion-v1-5",
"base_model:finetune:runwayml/stable-diffusion-v1-5",
"license:creativeml-openrail-m",
"autotrain_compatible",
"endpoints_compatible",
"diffusers:StableDiffusionPipeline",
"region:us"
] | text-to-image | 2023-12-23T02:03:01Z |
---
license: creativeml-openrail-m
base_model: runwayml/stable-diffusion-v1-5
instance_prompt: a photo of AESARNAV person
tags:
- stable-diffusion
- stable-diffusion-diffusers
- text-to-image
- diffusers
- dreambooth
inference: true
---
# DreamBooth - adandu/dreambooth_output
This is a DreamBooth model derived from runwayml/stable-diffusion-v1-5. The weights were trained on the instance prompt "a photo of AESARNAV person" using [DreamBooth](https://dreambooth.github.io/).
You can find some example images below.
DreamBooth training for the text encoder was enabled: True.
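The card ships no usage snippet; here is a lazy-loading sketch with diffusers (the repo id and instance prompt come from the metadata above; everything else is a standard `StableDiffusionPipeline` call and should be treated as an assumption):

```python
def generate_dreambooth(prompt="a photo of AESARNAV person",
                        repo_id="adandu/dreambooth_output"):
    """Load the DreamBooth weights and render one image.

    The import is deferred so defining this sketch needs no GPU or
    downloads; calling it fetches the full pipeline from the Hub.
    """
    from diffusers import StableDiffusionPipeline

    pipe = StableDiffusionPipeline.from_pretrained(repo_id)
    return pipe(prompt).images[0]
```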
|
Gummybear05/whisper-small-ko-E30_Y_freq_speed | Gummybear05 | 2023-12-23T03:59:40Z | 5 | 0 | transformers |
[
"transformers",
"tensorboard",
"safetensors",
"whisper",
"automatic-speech-recognition",
"hf-asr-leaderboard",
"generated_from_trainer",
"hi",
"dataset:aihub_elder",
"base_model:openai/whisper-small",
"base_model:finetune:openai/whisper-small",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | automatic-speech-recognition | 2023-12-23T01:53:20Z |
---
language:
- hi
license: apache-2.0
base_model: openai/whisper-small
tags:
- hf-asr-leaderboard
- generated_from_trainer
datasets:
- aihub_elder
model-index:
- name: whisper-small-ko-E30_Y_freq_speed
  results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# whisper-small-ko-E30_Y_freq_speed
This model is a fine-tuned version of [openai/whisper-small](https://huggingface.co/openai/whisper-small) on the aihub Y dialogue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1876
- Cer: 5.2573
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 50
- num_epochs: 2
- mixed_precision_training: Native AMP
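The linear scheduler with 50 warmup steps listed above ramps the learning rate up, then decays it linearly to zero. A sketch of that shape (mirroring the behavior of transformers' `get_linear_schedule_with_warmup`; the 1500 total steps are taken from the results table below and stated as an assumption):

```python
def linear_lr(step, base_lr=1e-5, warmup=50, total=1500):
    """Linear warmup followed by linear decay to zero."""
    if step < warmup:
        return base_lr * step / max(1, warmup)
    return base_lr * max(0.0, (total - step) / max(1, total - warmup))

print(linear_lr(0))     # 0.0 at the first step
print(linear_lr(50))    # peak base_lr right after warmup
print(linear_lr(1500))  # decayed to 0.0 at the final step
```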
### Training results
| Training Loss | Epoch | Step | Validation Loss | Cer |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 0.4514 | 0.13 | 100 | 0.2782 | 6.3910 |
| 0.2636 | 0.26 | 200 | 0.2298 | 6.1913 |
| 0.2355 | 0.39 | 300 | 0.2313 | 6.5789 |
| 0.2075 | 0.52 | 400 | 0.2121 | 6.1149 |
| 0.1899 | 0.64 | 500 | 0.2107 | 5.9622 |
| 0.1746 | 0.77 | 600 | 0.2040 | 5.8212 |
| 0.1791 | 0.9 | 700 | 0.1974 | 5.6685 |
| 0.0826 | 1.03 | 800 | 0.1924 | 5.4335 |
| 0.0725 | 1.16 | 900 | 0.1959 | 5.4570 |
| 0.072 | 1.29 | 1000 | 0.1942 | 5.2749 |
| 0.0658 | 1.42 | 1100 | 0.1935 | 5.4746 |
| 0.0639 | 1.55 | 1200 | 0.1894 | 5.2867 |
| 0.0658 | 1.68 | 1300 | 0.1891 | 5.3043 |
| 0.0606 | 1.81 | 1400 | 0.1876 | 5.1985 |
| 0.0648 | 1.93 | 1500 | 0.1876 | 5.2573 |
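The Cer column above is the character error rate: the Levenshtein edit distance between hypothesis and reference transcript, divided by the reference length, reported as a percentage. A minimal stdlib sketch:

```python
def cer(reference: str, hypothesis: str) -> float:
    """Character error rate = edit_distance / len(reference) * 100."""
    # Classic dynamic-programming Levenshtein distance.
    prev = list(range(len(hypothesis) + 1))
    for i, r in enumerate(reference, 1):
        curr = [i]
        for j, h in enumerate(hypothesis, 1):
            cost = 0 if r == h else 1
            curr.append(min(prev[j] + 1,          # deletion
                            curr[j - 1] + 1,      # insertion
                            prev[j - 1] + cost))  # substitution
        prev = curr
    return 100.0 * prev[-1] / len(reference)

print(round(cer("abcd", "abxd"), 2))  # 25.0 (one substitution in four chars)
```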
### Framework versions
- Transformers 4.37.0.dev0
- Pytorch 2.1.0+cu121
- Datasets 2.16.0
- Tokenizers 0.15.0
|
CocoyGames9/JBrown | CocoyGames9 | 2023-12-23T03:50:37Z | 0 | 0 | null |
[
"license:other",
"region:us"
] | null | 2023-12-23T03:47:15Z |
---
license: other
license_name: icescream4
license_link: LICENSE
---
|
gibhug/llama2-7b-chicks_v2-0 | gibhug | 2023-12-23T03:33:22Z | 5 | 0 | transformers |
[
"transformers",
"safetensors",
"llama",
"text-generation",
"autotrain",
"conversational",
"license:other",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2023-12-23T02:33:39Z |
---
tags:
- autotrain
- text-generation
widget:
- text: "I love AutoTrain because "
license: other
---
# Model Trained Using AutoTrain
This model was trained using AutoTrain. For more information, please visit [AutoTrain](https://hf.co/docs/autotrain).
# Usage
```python
from transformers import AutoModelForCausalLM, AutoTokenizer
model_path = "PATH_TO_THIS_REPO"
tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForCausalLM.from_pretrained(
model_path,
device_map="auto",
torch_dtype='auto'
).eval()
# Prompt content: "hi"
messages = [
{"role": "user", "content": "hi"}
]
input_ids = tokenizer.apply_chat_template(conversation=messages, tokenize=True, add_generation_prompt=True, return_tensors='pt')
output_ids = model.generate(input_ids.to(model.device))  # follow the model's device, whether device_map placed it on GPU or CPU
response = tokenizer.decode(output_ids[0][input_ids.shape[1]:], skip_special_tokens=True)
# Model response: "Hello! How can I assist you today?"
print(response)
```
|
liyoo/IntegratedModel_PairClassification | liyoo | 2023-12-23T03:33:19Z | 0 | 0 | null |
[
"code",
"text-classification",
"zh",
"region:us"
] | text-classification | 2023-12-23T03:30:08Z |
---
language:
- zh
pipeline_tag: text-classification
tags:
- code
---
|
iamandrewliao/lunarlanding-ppo | iamandrewliao | 2023-12-23T03:17:05Z | 0 | 0 | stable-baselines3 |
[
"stable-baselines3",
"LunarLander-v2",
"deep-reinforcement-learning",
"reinforcement-learning",
"model-index",
"region:us"
] | reinforcement-learning | 2023-12-23T03:16:45Z |
---
library_name: stable-baselines3
tags:
- LunarLander-v2
- deep-reinforcement-learning
- reinforcement-learning
- stable-baselines3
model-index:
- name: PPO
  results:
  - task:
      type: reinforcement-learning
      name: reinforcement-learning
    dataset:
      name: LunarLander-v2
      type: LunarLander-v2
    metrics:
    - type: mean_reward
      value: 260.85 +/- 21.43
      name: mean_reward
      verified: false
---
# **PPO** Agent playing **LunarLander-v2**
This is a trained model of a **PPO** agent playing **LunarLander-v2**
using the [stable-baselines3 library](https://github.com/DLR-RM/stable-baselines3).
## Usage (with Stable-baselines3)
The card left this snippet as a template; a minimal sketch (the checkpoint filename is repo-specific, so check the repo's Files & versions tab before running):

```python
from stable_baselines3 import PPO
from huggingface_sb3 import load_from_hub

# Replace <checkpoint> with the actual .zip name from the Files & versions tab.
checkpoint = load_from_hub(repo_id="iamandrewliao/lunarlanding-ppo", filename="<checkpoint>.zip")
model = PPO.load(checkpoint)
```
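For reference, the `mean_reward` reported in this card's metadata (260.85 +/- 21.43) is the mean and population standard deviation of total reward across evaluation episodes; a stdlib sketch of that summary:

```python
import statistics

def summarize_returns(episode_returns):
    """Mean and population std of per-episode total rewards,
    the format shown as `mean +/- std` on the Hub."""
    mean = statistics.fmean(episode_returns)
    std = statistics.pstdev(episode_returns)  # population std, matching numpy's default
    return mean, std

mean, std = summarize_returns([250.0, 270.0, 260.0, 280.0, 240.0])
print(f"{mean:.2f} +/- {std:.2f}")  # 260.00 +/- 14.14
```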
|
apuku0723/AIALMWork01 | apuku0723 | 2023-12-23T03:16:55Z | 5 | 0 | transformers |
[
"transformers",
"tensorboard",
"safetensors",
"distilbert",
"text-classification",
"generated_from_trainer",
"base_model:distilbert/distilbert-base-uncased",
"base_model:finetune:distilbert/distilbert-base-uncased",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2023-12-23T01:35:46Z |
---
license: apache-2.0
base_model: distilbert-base-uncased
tags:
- generated_from_trainer
metrics:
- matthews_correlation
model-index:
- name: AIALMWork01
  results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# AIALMWork01
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5073
- Matthews Correlation: 0.5102
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3.6339690156551746e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 29
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.5221 | 1.0 | 535 | 0.5117 | 0.4060 |
| 0.3027 | 2.0 | 1070 | 0.5073 | 0.5102 |
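Matthews correlation, the metric reported throughout these CoLA-style fine-tunes, is computed from the binary confusion matrix; a stdlib sketch:

```python
import math

def matthews_corrcoef(tp, fp, fn, tn):
    """MCC = (TP*TN - FP*FN) / sqrt((TP+FP)(TP+FN)(TN+FP)(TN+FN)).

    Ranges from -1 (total disagreement) through 0 (chance) to +1
    (perfect prediction); returns 0.0 when any margin is empty.
    """
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return (tp * tn - fp * fn) / denom if denom else 0.0

print(round(matthews_corrcoef(90, 10, 15, 85), 4))  # 0.7509
```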
### Framework versions
- Transformers 4.36.2
- Pytorch 2.1.0+cu121
- Datasets 2.16.0
- Tokenizers 0.15.0
|
pingstudio07/distilbert-base-uncased-finetuned-cola | pingstudio07 | 2023-12-23T03:16:39Z | 5 | 0 | transformers |
[
"transformers",
"tensorboard",
"safetensors",
"distilbert",
"text-classification",
"generated_from_trainer",
"base_model:distilbert/distilbert-base-uncased",
"base_model:finetune:distilbert/distilbert-base-uncased",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2023-12-23T01:37:58Z |
---
license: apache-2.0
base_model: distilbert-base-uncased
tags:
- generated_from_trainer
metrics:
- matthews_correlation
model-index:
- name: distilbert-base-uncased-finetuned-cola
  results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased-finetuned-cola
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7613
- Matthews Correlation: 0.5218
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.5174 | 1.0 | 535 | 0.4562 | 0.4364 |
| 0.3426 | 2.0 | 1070 | 0.4706 | 0.5147 |
| 0.2363 | 3.0 | 1605 | 0.6783 | 0.5016 |
| 0.1593 | 4.0 | 2140 | 0.7613 | 0.5218 |
| 0.1249 | 5.0 | 2675 | 0.8566 | 0.5139 |
### Framework versions
- Transformers 4.36.2
- Pytorch 2.1.0+cu121
- Datasets 2.16.0
- Tokenizers 0.15.0
|
dapa93/q-taxi-v3 | dapa93 | 2023-12-23T03:16:26Z | 0 | 0 | null |
[
"Taxi-v3",
"q-learning",
"reinforcement-learning",
"custom-implementation",
"model-index",
"region:us"
] | reinforcement-learning | 2023-12-23T03:16:23Z |
---
tags:
- Taxi-v3
- q-learning
- reinforcement-learning
- custom-implementation
model-index:
- name: q-taxi-v3
  results:
  - task:
      type: reinforcement-learning
      name: reinforcement-learning
    dataset:
      name: Taxi-v3
      type: Taxi-v3
    metrics:
    - type: mean_reward
      value: 7.56 +/- 2.71
      name: mean_reward
      verified: false
---
# **Q-Learning** Agent playing **Taxi-v3**
This is a trained model of a **Q-Learning** agent playing **Taxi-v3**.
## Usage
```python
# `load_from_hub` is the download helper defined in the Hugging Face Deep RL course notebook
model = load_from_hub(repo_id="dapa93/q-taxi-v3", filename="q-learning.pkl")

# Don't forget to check if you need to add additional attributes (is_slippery=False etc.)
env = gym.make(model["env_id"])  # requires `import gymnasium as gym` (or classic gym)
```
|
tunyu/HW1223_01 | tunyu | 2023-12-23T03:05:39Z | 10 | 0 | transformers |
[
"transformers",
"tensorboard",
"safetensors",
"distilbert",
"text-classification",
"generated_from_trainer",
"base_model:distilbert/distilbert-base-uncased",
"base_model:finetune:distilbert/distilbert-base-uncased",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2023-12-23T01:37:59Z |
---
license: apache-2.0
base_model: distilbert-base-uncased
tags:
- generated_from_trainer
metrics:
- matthews_correlation
model-index:
- name: HW1223_01
  results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# HW1223_01
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7204
- Matthews Correlation: 0.5641
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.5161 | 1.0 | 535 | 0.4596 | 0.4449 |
| 0.3391 | 2.0 | 1070 | 0.4532 | 0.5385 |
| 0.2312 | 3.0 | 1605 | 0.6441 | 0.5136 |
| 0.1591 | 4.0 | 2140 | 0.7204 | 0.5641 |
| 0.1205 | 5.0 | 2675 | 0.8336 | 0.5444 |
### Framework versions
- Transformers 4.36.2
- Pytorch 2.1.0+cu121
- Datasets 2.16.0
- Tokenizers 0.15.0
|
GordonMcGregor/stable-diffusion-xl-base-1.0-lora-TOK-Gordon2 | GordonMcGregor | 2023-12-23T03:03:24Z | 1 | 0 | diffusers |
[
"diffusers",
"stable-diffusion-xl",
"stable-diffusion-xl-diffusers",
"text-to-image",
"lora",
"template:sd-lora",
"base_model:stabilityai/stable-diffusion-xl-base-1.0",
"base_model:adapter:stabilityai/stable-diffusion-xl-base-1.0",
"license:openrail++",
"region:us"
] | text-to-image | 2023-12-22T22:52:22Z |
---
tags:
- stable-diffusion-xl
- stable-diffusion-xl-diffusers
- text-to-image
- diffusers
- lora
- template:sd-lora
widget:
- text: 'A photo of TOK man in the rain'
  output:
    url: image_0.png
- text: 'A photo of TOK man in the rain'
  output:
    url: image_1.png
- text: 'A photo of TOK man in the rain'
  output:
    url: image_2.png
- text: 'A photo of TOK man in the rain'
  output:
    url: image_3.png
base_model: stabilityai/stable-diffusion-xl-base-1.0
instance_prompt: A photo of TOK man
license: openrail++
---
# SDXL LoRA DreamBooth - GordonMcGregor/stable-diffusion-xl-base-1.0-lora-TOK-Gordon2
<Gallery />
## Model description
These are GordonMcGregor/stable-diffusion-xl-base-1.0-lora-TOK-Gordon2 LoRA adaptation weights for stabilityai/stable-diffusion-xl-base-1.0.
The weights were trained using [DreamBooth](https://dreambooth.github.io/).
LoRA for the text encoder was enabled: False.
Special VAE used for training: madebyollin/sdxl-vae-fp16-fix.
## Trigger words
You should use `A photo of TOK man` to trigger the image generation.
## Download model
Weights for this model are available in Safetensors format.
[Download](https://huggingface.co/GordonMcGregor/stable-diffusion-xl-base-1.0-lora-TOK-Gordon2/tree/main) them in the Files & versions tab.
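The card lists no usage snippet; the following lazy-loading sketch applies the LoRA with diffusers (`load_lora_weights` is the standard diffusers entry point for LoRA checkpoints; the repo id and trigger prompt come from the card, the rest is an assumption):

```python
def generate_sdxl_lora(prompt="A photo of TOK man in the rain",
                       base_id="stabilityai/stable-diffusion-xl-base-1.0",
                       lora_id="GordonMcGregor/stable-diffusion-xl-base-1.0-lora-TOK-Gordon2"):
    """Load SDXL base, apply this repo's LoRA weights, render one image.

    The import is deferred so defining this sketch triggers no downloads.
    """
    from diffusers import DiffusionPipeline

    pipe = DiffusionPipeline.from_pretrained(base_id)
    pipe.load_lora_weights(lora_id)
    return pipe(prompt).images[0]
```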
|
andyWuTw/homework001 | andyWuTw | 2023-12-23T02:55:29Z | 5 | 0 | transformers |
[
"transformers",
"tensorboard",
"safetensors",
"distilbert",
"text-classification",
"generated_from_trainer",
"base_model:distilbert/distilbert-base-uncased",
"base_model:finetune:distilbert/distilbert-base-uncased",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2023-12-23T01:38:23Z |
---
license: apache-2.0
base_model: distilbert-base-uncased
tags:
- generated_from_trainer
metrics:
- matthews_correlation
model-index:
- name: homework001
  results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# homework001
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.0680
- Matthews Correlation: 0.5309
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3.5801518396562525e-05
- train_batch_size: 8
- eval_batch_size: 16
- seed: 37
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.5108 | 1.0 | 1069 | 0.4912 | 0.4402 |
| 0.3536 | 2.0 | 2138 | 0.5719 | 0.5015 |
| 0.2655 | 3.0 | 3207 | 0.7624 | 0.5285 |
| 0.1485 | 4.0 | 4276 | 1.0106 | 0.5132 |
| 0.1107 | 5.0 | 5345 | 1.0680 | 0.5309 |
### Framework versions
- Transformers 4.36.2
- Pytorch 2.1.0+cu121
- Datasets 2.16.0
- Tokenizers 0.15.0
|
chtsai2104/llmhw01 | chtsai2104 | 2023-12-23T02:53:10Z | 7 | 0 | transformers |
[
"transformers",
"tensorboard",
"safetensors",
"distilbert",
"text-classification",
"generated_from_trainer",
"base_model:distilbert/distilbert-base-uncased",
"base_model:finetune:distilbert/distilbert-base-uncased",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2023-12-23T01:35:56Z |
---
license: apache-2.0
base_model: distilbert-base-uncased
tags:
- generated_from_trainer
metrics:
- matthews_correlation
model-index:
- name: llmhw01
  results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# llmhw01
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6108
- Matthews Correlation: 0.5102
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2.2536324688169738e-05
- train_batch_size: 32
- eval_batch_size: 16
- seed: 23
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 4
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| No log | 1.0 | 268 | 0.4708 | 0.4553 |
| 0.4459 | 2.0 | 536 | 0.4836 | 0.4888 |
| 0.4459 | 3.0 | 804 | 0.5368 | 0.5123 |
| 0.2266 | 4.0 | 1072 | 0.6108 | 0.5102 |
### Framework versions
- Transformers 4.36.2
- Pytorch 2.1.0+cu121
- Datasets 2.16.0
- Tokenizers 0.15.0
|
youwei1208/HW01 | youwei1208 | 2023-12-23T02:52:46Z | 11 | 0 | transformers |
[
"transformers",
"tensorboard",
"safetensors",
"distilbert",
"text-classification",
"generated_from_trainer",
"base_model:distilbert/distilbert-base-uncased",
"base_model:finetune:distilbert/distilbert-base-uncased",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2023-12-23T01:36:19Z |
---
license: apache-2.0
base_model: distilbert-base-uncased
tags:
- generated_from_trainer
metrics:
- matthews_correlation
model-index:
- name: HW01
  results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# HW01
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7769
- Matthews Correlation: 0.5215
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1.5351461883308228e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 7
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.5235 | 1.0 | 535 | 0.4588 | 0.4891 |
| 0.3664 | 2.0 | 1070 | 0.4700 | 0.5054 |
| 0.2665 | 3.0 | 1605 | 0.5404 | 0.5358 |
| 0.1984 | 4.0 | 2140 | 0.7282 | 0.5097 |
| 0.147 | 5.0 | 2675 | 0.7769 | 0.5215 |
### Framework versions
- Transformers 4.36.2
- Pytorch 2.1.0+cu121
- Datasets 2.16.0
- Tokenizers 0.15.0
|
jkloip/cm124057-01 | jkloip | 2023-12-23T02:40:00Z | 5 | 0 | transformers |
[
"transformers",
"tensorboard",
"safetensors",
"distilbert",
"text-classification",
"generated_from_trainer",
"base_model:distilbert/distilbert-base-uncased",
"base_model:finetune:distilbert/distilbert-base-uncased",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2023-12-23T01:37:48Z |
---
license: apache-2.0
base_model: distilbert-base-uncased
tags:
- generated_from_trainer
metrics:
- matthews_correlation
model-index:
- name: cm124057-01
  results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# cm124057-01
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.8311
- Matthews Correlation: 0.5373
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.5197 | 1.0 | 535 | 0.4535 | 0.4636 |
| 0.3446 | 2.0 | 1070 | 0.4631 | 0.5118 |
| 0.2344 | 3.0 | 1605 | 0.6146 | 0.5314 |
| 0.1653 | 4.0 | 2140 | 0.7437 | 0.5000 |
| 0.1263 | 5.0 | 2675 | 0.8311 | 0.5373 |
### Framework versions
- Transformers 4.36.2
- Pytorch 2.1.0+cu121
- Datasets 2.16.0
- Tokenizers 0.15.0
|
lgl630720/HW_01
|
lgl630720
| 2023-12-23T02:39:00Z | 5 | 0 |
transformers
|
[
"transformers",
"tensorboard",
"safetensors",
"distilbert",
"text-classification",
"generated_from_trainer",
"base_model:distilbert/distilbert-base-uncased",
"base_model:finetune:distilbert/distilbert-base-uncased",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-classification
| 2023-12-23T01:36:03Z |
---
license: apache-2.0
base_model: distilbert-base-uncased
tags:
- generated_from_trainer
metrics:
- matthews_correlation
model-index:
- name: HW_01
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# HW_01
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7295
- Matthews Correlation: 0.5412
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.5202 | 1.0 | 535 | 0.4524 | 0.4598 |
| 0.3431 | 2.0 | 1070 | 0.4669 | 0.5277 |
| 0.2373 | 3.0 | 1605 | 0.6410 | 0.5118 |
| 0.1621 | 4.0 | 2140 | 0.7295 | 0.5412 |
| 0.1209 | 5.0 | 2675 | 0.8518 | 0.5264 |
### Framework versions
- Transformers 4.36.2
- Pytorch 2.1.0+cu121
- Datasets 2.16.0
- Tokenizers 0.15.0
|
Willngo/HW01
|
Willngo
| 2023-12-23T02:37:16Z | 3 | 0 |
transformers
|
[
"transformers",
"tensorboard",
"safetensors",
"distilbert",
"text-classification",
"generated_from_trainer",
"base_model:distilbert/distilbert-base-uncased",
"base_model:finetune:distilbert/distilbert-base-uncased",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-classification
| 2023-12-23T01:35:44Z |
---
license: apache-2.0
base_model: distilbert-base-uncased
tags:
- generated_from_trainer
metrics:
- matthews_correlation
model-index:
- name: HW01
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# HW01
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.9512
- Matthews Correlation: 0.4832
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.5668 | 1.0 | 535 | 0.5497 | 0.3499 |
| 0.3911 | 2.0 | 1070 | 0.5230 | 0.4592 |
| 0.2503 | 3.0 | 1605 | 0.7398 | 0.4542 |
| 0.162 | 4.0 | 2140 | 0.9512 | 0.4832 |
| 0.0968 | 5.0 | 2675 | 1.1938 | 0.4770 |
### Framework versions
- Transformers 4.36.2
- Pytorch 2.1.0+cu121
- Datasets 2.16.0
- Tokenizers 0.15.0
|
RandyTsai/HW001
|
RandyTsai
| 2023-12-23T02:36:02Z | 6 | 0 |
transformers
|
[
"transformers",
"tensorboard",
"safetensors",
"distilbert",
"text-classification",
"generated_from_trainer",
"base_model:distilbert/distilbert-base-uncased",
"base_model:finetune:distilbert/distilbert-base-uncased",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-classification
| 2023-12-23T01:40:12Z |
---
license: apache-2.0
base_model: distilbert-base-uncased
tags:
- generated_from_trainer
metrics:
- matthews_correlation
model-index:
- name: HW001
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# HW001
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.9586
- Matthews Correlation: 0.5403
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.1355 | 1.0 | 535 | 0.7101 | 0.5191 |
| 0.099 | 2.0 | 1070 | 0.9586 | 0.5403 |
| 0.0766 | 3.0 | 1605 | 1.1402 | 0.5198 |
| 0.053 | 4.0 | 2140 | 1.2587 | 0.5321 |
| 0.041 | 5.0 | 2675 | 1.2867 | 0.5286 |
### Framework versions
- Transformers 4.36.2
- Pytorch 2.1.0+cu121
- Datasets 2.16.0
- Tokenizers 0.15.0
|
Rick1999/AI_School_HW01
|
Rick1999
| 2023-12-23T02:25:30Z | 5 | 0 |
transformers
|
[
"transformers",
"tensorboard",
"safetensors",
"distilbert",
"text-classification",
"generated_from_trainer",
"base_model:distilbert/distilbert-base-uncased",
"base_model:finetune:distilbert/distilbert-base-uncased",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-classification
| 2023-12-23T01:35:57Z |
---
license: apache-2.0
base_model: distilbert-base-uncased
tags:
- generated_from_trainer
metrics:
- matthews_correlation
model-index:
- name: AI_School_HW01
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# AI_School_HW01
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.8255
- Matthews Correlation: 0.5317
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.518 | 1.0 | 535 | 0.4502 | 0.4464 |
| 0.3421 | 2.0 | 1070 | 0.4666 | 0.5168 |
| 0.2373 | 3.0 | 1605 | 0.6423 | 0.5076 |
| 0.1646 | 4.0 | 2140 | 0.7371 | 0.5224 |
| 0.123 | 5.0 | 2675 | 0.8255 | 0.5317 |
### Framework versions
- Transformers 4.36.2
- Pytorch 2.1.0+cu121
- Datasets 2.16.0
- Tokenizers 0.15.0
|
ThuyNT03/KLTN_COQE_viT5_total_POASL_v2
|
ThuyNT03
| 2023-12-23T02:22:17Z | 5 | 0 |
transformers
|
[
"transformers",
"tensorboard",
"safetensors",
"t5",
"text2text-generation",
"generated_from_trainer",
"base_model:VietAI/vit5-large",
"base_model:finetune:VietAI/vit5-large",
"license:mit",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text2text-generation
| 2023-12-23T01:36:48Z |
---
license: mit
base_model: VietAI/vit5-large
tags:
- generated_from_trainer
model-index:
- name: KLTN_COQE_viT5_total_POASL_v2
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# KLTN_COQE_viT5_total_POASL_v2
This model is a fine-tuned version of [VietAI/vit5-large](https://huggingface.co/VietAI/vit5-large) on the None dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 20
- mixed_precision_training: Native AMP
### Training results
### Framework versions
- Transformers 4.36.0
- Pytorch 2.0.0
- Datasets 2.1.0
- Tokenizers 0.15.0
|
lostck/HW001
|
lostck
| 2023-12-23T02:21:24Z | 5 | 0 |
transformers
|
[
"transformers",
"tensorboard",
"safetensors",
"distilbert",
"text-classification",
"generated_from_trainer",
"base_model:distilbert/distilbert-base-uncased",
"base_model:finetune:distilbert/distilbert-base-uncased",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-classification
| 2023-12-23T01:36:50Z |
---
license: apache-2.0
base_model: distilbert-base-uncased
tags:
- generated_from_trainer
metrics:
- matthews_correlation
model-index:
- name: HW001
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# HW001
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7440
- Matthews Correlation: 0.5395
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.5288 | 1.0 | 535 | 0.4658 | 0.4101 |
| 0.3503 | 2.0 | 1070 | 0.4681 | 0.5057 |
| 0.241 | 3.0 | 1605 | 0.6724 | 0.4879 |
| 0.1648 | 4.0 | 2140 | 0.7440 | 0.5395 |
| 0.1279 | 5.0 | 2675 | 0.8706 | 0.5321 |
### Framework versions
- Transformers 4.36.2
- Pytorch 2.1.0+cu121
- Datasets 2.16.0
- Tokenizers 0.15.0
|
dyu056/ppo-LunarLander-v2
|
dyu056
| 2023-12-23T02:21:22Z | 1 | 0 |
stable-baselines3
|
[
"stable-baselines3",
"LunarLander-v2",
"deep-reinforcement-learning",
"reinforcement-learning",
"model-index",
"region:us"
] |
reinforcement-learning
| 2023-12-22T23:05:08Z |
---
library_name: stable-baselines3
tags:
- LunarLander-v2
- deep-reinforcement-learning
- reinforcement-learning
- stable-baselines3
model-index:
- name: PPO
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: LunarLander-v2
type: LunarLander-v2
metrics:
- type: mean_reward
value: -91.06 +/- 19.81
name: mean_reward
verified: false
---
# **PPO** Agent playing **LunarLander-v2**
This is a trained model of a **PPO** agent playing **LunarLander-v2**
using the [stable-baselines3 library](https://github.com/DLR-RM/stable-baselines3).
## Usage (with Stable-baselines3)
A minimal usage sketch; the checkpoint filename is an assumption, so check the repo's Files tab for the actual name:
```python
from stable_baselines3 import PPO
from huggingface_sb3 import load_from_hub

# Download the checkpoint from the Hub (filename assumed; see the Files tab)
checkpoint = load_from_hub("dyu056/ppo-LunarLander-v2", "ppo-LunarLander-v2.zip")
model = PPO.load(checkpoint)
```
|
PritK99/ppo-Huggy
|
PritK99
| 2023-12-23T02:21:02Z | 0 | 0 |
ml-agents
|
[
"ml-agents",
"tensorboard",
"onnx",
"Huggy",
"deep-reinforcement-learning",
"reinforcement-learning",
"ML-Agents-Huggy",
"region:us"
] |
reinforcement-learning
| 2023-12-23T02:20:54Z |
---
library_name: ml-agents
tags:
- Huggy
- deep-reinforcement-learning
- reinforcement-learning
- ML-Agents-Huggy
---
# **ppo** Agent playing **Huggy**
This is a trained model of a **ppo** agent playing **Huggy**
using the [Unity ML-Agents Library](https://github.com/Unity-Technologies/ml-agents).
## Usage (with ML-Agents)
The Documentation: https://unity-technologies.github.io/ml-agents/ML-Agents-Toolkit-Documentation/
We wrote a complete tutorial to learn to train your first agent using ML-Agents and publish it to the Hub:
- A *short tutorial* where you teach Huggy the Dog 🐶 to fetch the stick and then play with him directly in your
browser: https://huggingface.co/learn/deep-rl-course/unitbonus1/introduction
- A *longer tutorial* to understand how ML-Agents works:
https://huggingface.co/learn/deep-rl-course/unit5/introduction
### Resume the training
```bash
mlagents-learn <your_configuration_file_path.yaml> --run-id=<run_id> --resume
```
### Watch your Agent play
You can watch your agent **playing directly in your browser**:
1. If the environment is part of ML-Agents official environments, go to https://huggingface.co/unity
2. Find your model_id: PritK99/ppo-Huggy
3. Select your *.nn or *.onnx file
4. Click on Watch the agent play 👀
|
namtran/LLaMA-7b-AWQ-GGUF
|
namtran
| 2023-12-23T02:20:53Z | 14 | 1 |
transformers
|
[
"transformers",
"gguf",
"llama",
"license:other",
"region:us"
] | null | 2023-12-22T10:41:22Z |
---
inference: false
license: other
model_type: llama
---
# Meta's LLaMA 7B - AWQ GGUF
These files are in GGUF format.
- Model creator: [Meta](https://huggingface.co/none)
- Original model: [LLaMA 7B](https://ai.meta.com/blog/large-language-model-llama-meta-ai)
The model was converted using [llama.cpp](https://github.com/ggerganov/llama.cpp) together with the [AWQ](https://github.com/mit-han-lab/llm-awq) quantization method.
## How to use models in `llama.cpp`
```
./main -m ggml-model-q4_0-awq.gguf -n 128 --prompt "Once upon a time"
```
Please refer to the instructions at the [PR](https://github.com/ggerganov/llama.cpp/pull/4593)
|
lovejog99/AIA-HW01
|
lovejog99
| 2023-12-23T02:19:37Z | 5 | 0 |
transformers
|
[
"transformers",
"tensorboard",
"safetensors",
"distilbert",
"text-classification",
"generated_from_trainer",
"base_model:distilbert/distilbert-base-uncased",
"base_model:finetune:distilbert/distilbert-base-uncased",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-classification
| 2023-12-23T01:36:03Z |
---
license: apache-2.0
base_model: distilbert-base-uncased
tags:
- generated_from_trainer
metrics:
- matthews_correlation
model-index:
- name: AIA-HW01
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# AIA-HW01
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7654
- Matthews Correlation: 0.5470
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.5211 | 1.0 | 535 | 0.4557 | 0.4529 |
| 0.3478 | 2.0 | 1070 | 0.4807 | 0.5198 |
| 0.2342 | 3.0 | 1605 | 0.6712 | 0.5043 |
| 0.1669 | 4.0 | 2140 | 0.7654 | 0.5470 |
| 0.1273 | 5.0 | 2675 | 0.8689 | 0.5271 |
### Framework versions
- Transformers 4.36.2
- Pytorch 2.1.0+cu121
- Datasets 2.16.0
- Tokenizers 0.15.0
|
AcEzKeViNz/HW01
|
AcEzKeViNz
| 2023-12-23T02:18:52Z | 5 | 0 |
transformers
|
[
"transformers",
"tensorboard",
"safetensors",
"distilbert",
"text-classification",
"generated_from_trainer",
"base_model:distilbert/distilbert-base-uncased",
"base_model:finetune:distilbert/distilbert-base-uncased",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-classification
| 2023-12-23T01:35:44Z |
---
license: apache-2.0
base_model: distilbert-base-uncased
tags:
- generated_from_trainer
metrics:
- matthews_correlation
model-index:
- name: HW01
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# HW01
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7639
- Matthews Correlation: 0.5142
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.5222 | 1.0 | 535 | 0.4592 | 0.4547 |
| 0.3491 | 2.0 | 1070 | 0.4676 | 0.5035 |
| 0.2404 | 3.0 | 1605 | 0.6595 | 0.5033 |
| 0.1643 | 4.0 | 2140 | 0.7639 | 0.5142 |
| 0.1305 | 5.0 | 2675 | 0.8609 | 0.5089 |
### Framework versions
- Transformers 4.36.2
- Pytorch 2.1.0+cu121
- Datasets 2.16.0
- Tokenizers 0.15.0
|
albertttt/HW01
|
albertttt
| 2023-12-23T02:18:25Z | 5 | 0 |
transformers
|
[
"transformers",
"tensorboard",
"safetensors",
"distilbert",
"text-classification",
"generated_from_trainer",
"base_model:distilbert/distilbert-base-uncased",
"base_model:finetune:distilbert/distilbert-base-uncased",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-classification
| 2023-12-23T01:38:42Z |
---
license: apache-2.0
base_model: distilbert-base-uncased
tags:
- generated_from_trainer
metrics:
- matthews_correlation
model-index:
- name: HW01
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# HW01
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4621
- Matthews Correlation: 0.5272
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.5138 | 1.0 | 535 | 0.4520 | 0.4487 |
| 0.3393 | 2.0 | 1070 | 0.4621 | 0.5272 |
| 0.2334 | 3.0 | 1605 | 0.6378 | 0.5112 |
| 0.159 | 4.0 | 2140 | 0.7787 | 0.5212 |
| 0.1199 | 5.0 | 2675 | 0.8694 | 0.5245 |
### Framework versions
- Transformers 4.36.2
- Pytorch 2.1.0+cu121
- Datasets 2.16.0
- Tokenizers 0.15.0
|
eatim/HW01
|
eatim
| 2023-12-23T02:17:41Z | 5 | 0 |
transformers
|
[
"transformers",
"tensorboard",
"safetensors",
"distilbert",
"text-classification",
"generated_from_trainer",
"base_model:distilbert/distilbert-base-uncased",
"base_model:finetune:distilbert/distilbert-base-uncased",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-classification
| 2023-12-23T01:36:23Z |
---
license: apache-2.0
base_model: distilbert-base-uncased
tags:
- generated_from_trainer
metrics:
- matthews_correlation
model-index:
- name: HW01
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# HW01
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7483
- Matthews Correlation: 0.5294
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.5288 | 1.0 | 535 | 0.4686 | 0.4343 |
| 0.3583 | 2.0 | 1070 | 0.4728 | 0.5166 |
| 0.2516 | 3.0 | 1605 | 0.6470 | 0.5050 |
| 0.1798 | 4.0 | 2140 | 0.7483 | 0.5294 |
| 0.1381 | 5.0 | 2675 | 0.8334 | 0.5224 |
### Framework versions
- Transformers 4.36.2
- Pytorch 2.1.0+cu121
- Datasets 2.16.0
- Tokenizers 0.15.0
|
ecyor/hw1
|
ecyor
| 2023-12-23T01:59:03Z | 5 | 0 |
transformers
|
[
"transformers",
"tensorboard",
"safetensors",
"distilbert",
"text-classification",
"generated_from_trainer",
"base_model:distilbert/distilbert-base-uncased",
"base_model:finetune:distilbert/distilbert-base-uncased",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-classification
| 2023-12-23T01:36:15Z |
---
license: apache-2.0
base_model: distilbert-base-uncased
tags:
- generated_from_trainer
metrics:
- matthews_correlation
model-index:
- name: hw1
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# hw1
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7215
- Matthews Correlation: 0.5423
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.5241 | 1.0 | 535 | 0.4491 | 0.4708 |
| 0.353 | 2.0 | 1070 | 0.4616 | 0.5311 |
| 0.244 | 3.0 | 1605 | 0.6495 | 0.4986 |
| 0.171 | 4.0 | 2140 | 0.7215 | 0.5423 |
| 0.132 | 5.0 | 2675 | 0.8294 | 0.5199 |
### Framework versions
- Transformers 4.36.2
- Pytorch 2.1.0+cu121
- Datasets 2.16.0
- Tokenizers 0.15.0
|
lilyftyunjin/nmixx1
|
lilyftyunjin
| 2023-12-23T01:54:48Z | 0 | 1 |
diffusers
|
[
"diffusers",
"text-to-image",
"stable-diffusion",
"lora",
"template:sd-lora",
"base_model:latent-consistency/lcm-lora-sdxl",
"base_model:adapter:latent-consistency/lcm-lora-sdxl",
"license:unknown",
"region:us"
] |
text-to-image
| 2023-12-23T01:53:15Z |
---
tags:
- text-to-image
- stable-diffusion
- lora
- diffusers
- template:sd-lora
widget:
- text: '-'
output:
url: images/IMG_8503.jpeg
base_model: latent-consistency/lcm-lora-sdxl
instance_prompt: null
license: unknown
---
# nmixx
<Gallery />
## Download model
[Download](/lilyftyunjin/nmixx1/tree/main) them in the Files & versions tab.
|
ThuyNT03/KLTN_COQE_viT5_total_SPAOL_v2
|
ThuyNT03
| 2023-12-23T01:47:01Z | 3 | 0 |
transformers
|
[
"transformers",
"tensorboard",
"safetensors",
"t5",
"text2text-generation",
"generated_from_trainer",
"base_model:VietAI/vit5-large",
"base_model:finetune:VietAI/vit5-large",
"license:mit",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text2text-generation
| 2023-12-23T00:58:50Z |
---
license: mit
base_model: VietAI/vit5-large
tags:
- generated_from_trainer
model-index:
- name: KLTN_COQE_viT5_total_SPAOL_v2
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# KLTN_COQE_viT5_total_SPAOL_v2
This model is a fine-tuned version of [VietAI/vit5-large](https://huggingface.co/VietAI/vit5-large) on the None dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 20
- mixed_precision_training: Native AMP
### Training results
### Framework versions
- Transformers 4.35.2
- Pytorch 2.0.0
- Datasets 2.1.0
- Tokenizers 0.15.0
|
Ethan615/try
|
Ethan615
| 2023-12-23T01:45:46Z | 6 | 0 |
transformers
|
[
"transformers",
"tensorboard",
"safetensors",
"distilbert",
"text-classification",
"generated_from_trainer",
"base_model:distilbert/distilbert-base-uncased",
"base_model:finetune:distilbert/distilbert-base-uncased",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-classification
| 2023-12-22T06:37:57Z |
---
license: apache-2.0
base_model: distilbert-base-uncased
tags:
- generated_from_trainer
metrics:
- matthews_correlation
model-index:
- name: try
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# try
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5544
- Matthews Correlation: 0.5009
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1.546889870762945e-05
- train_batch_size: 32
- eval_batch_size: 16
- seed: 6
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 4
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| No log | 1.0 | 268 | 0.4784 | 0.4982 |
| 0.4708 | 2.0 | 536 | 0.4544 | 0.5011 |
| 0.4708 | 3.0 | 804 | 0.5128 | 0.5070 |
| 0.2823 | 4.0 | 1072 | 0.5544 | 0.5009 |
### Framework versions
- Transformers 4.36.2
- Pytorch 2.1.0+cu121
- Datasets 2.16.0
- Tokenizers 0.15.0
|
AhmedTaha012/pargraphs_titlesV1.0
|
AhmedTaha012
| 2023-12-23T01:37:25Z | 4 | 0 |
transformers
|
[
"transformers",
"tensorboard",
"safetensors",
"t5",
"text2text-generation",
"generated_from_trainer",
"base_model:google-t5/t5-base",
"base_model:finetune:google-t5/t5-base",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text2text-generation
| 2023-12-23T01:37:01Z |
---
license: apache-2.0
base_model: t5-base
tags:
- generated_from_trainer
metrics:
- rouge
model-index:
- name: pargraphs_titlesV1.0
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# pargraphs_titlesV1.0
This model is a fine-tuned version of [t5-base](https://huggingface.co/t5-base) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2697
- Rouge1: 68.705
- Rouge2: 54.5204
- Rougel: 67.7709
- Rougelsum: 67.7942
- Gen Len: 1401169535.5
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 4e-05
- train_batch_size: 4
- eval_batch_size: 2
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 8
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
- mixed_precision_training: Native AMP
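The `total_train_batch_size` above is derived rather than set directly: it is the per-device batch size multiplied by the number of gradient accumulation steps. A quick sanity check of the values listed:

```python
train_batch_size = 4
gradient_accumulation_steps = 2

# Effective batch size seen by the optimizer per update step
total_train_batch_size = train_batch_size * gradient_accumulation_steps
print(total_train_batch_size)  # 8, matching the value listed above
```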
### Training results
| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum | Gen Len |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:-------:|:-------:|:---------:|:------------:|
| 0.347 | 0.44 | 100 | 0.2634 | 65.1158 | 48.282 | 63.708 | 63.7424 | 1401169536.0 |
| 0.2412 | 0.88 | 200 | 0.3167 | 66.0958 | 50.4705 | 65.1041 | 65.1412 | 1401169536.0 |
| 0.2069 | 1.32 | 300 | 0.2357 | 68.6707 | 53.5945 | 67.3654 | 67.371 | 1401169536.0 |
| 0.1825 | 1.76 | 400 | 0.3932 | 65.7022 | 51.08 | 64.9927 | 65.0322 | 1401169536.0 |
| 0.1643 | 2.2 | 500 | 0.2223 | 69.132 | 54.5176 | 67.881 | 67.8987 | 1401169535.0 |
| 0.1715 | 2.64 | 600 | 0.2227 | 69.2258 | 54.2845 | 68.0181 | 68.0404 | 1401169535.5 |
| 0.1571 | 3.08 | 700 | 0.2707 | 68.9908 | 54.7777 | 68.1279 | 68.151 | 1401169536.0 |
| 0.1584 | 3.52 | 800 | 0.2193 | 70.9126 | 56.4866 | 69.6718 | 69.6687 | 1401169535.5 |
| 0.1565 | 3.96 | 900 | 0.3482 | 68.6691 | 54.8446 | 67.796 | 67.8541 | 1401169536.0 |
| 0.155 | 4.4 | 1000 | 0.2694 | 69.1457 | 55.1123 | 68.2207 | 68.2543 | 1401169536.0 |
| 0.1586 | 4.84 | 1100 | 0.2697 | 68.705 | 54.5204 | 67.7709 | 67.7942 | 1401169535.5 |
### Framework versions
- Transformers 4.36.0
- Pytorch 2.0.0
- Datasets 2.1.0
- Tokenizers 0.15.0
|
kevinmcmahon/corgy_dog_LoRA
|
kevinmcmahon
| 2023-12-23T01:34:35Z | 1 | 1 |
diffusers
|
[
"diffusers",
"tensorboard",
"stable-diffusion-xl",
"stable-diffusion-xl-diffusers",
"text-to-image",
"lora",
"template:sd-lora",
"base_model:stabilityai/stable-diffusion-xl-base-1.0",
"base_model:adapter:stabilityai/stable-diffusion-xl-base-1.0",
"license:openrail++",
"region:us"
] |
text-to-image
| 2023-12-22T22:38:06Z |
---
tags:
- stable-diffusion-xl
- stable-diffusion-xl-diffusers
- text-to-image
- diffusers
- lora
- template:sd-lora
base_model: stabilityai/stable-diffusion-xl-base-1.0
instance_prompt: a photo of TOK dog
license: openrail++
---
# SDXL LoRA DreamBooth - kevinmcmahon/corgy_dog_LoRA
<Gallery />
## Model description
These are kevinmcmahon/corgy_dog_LoRA LoRA adaptation weights for stabilityai/stable-diffusion-xl-base-1.0.
The weights were trained using [DreamBooth](https://dreambooth.github.io/).
LoRA for the text encoder was enabled: False.
Special VAE used for training: madebyollin/sdxl-vae-fp16-fix.
## Trigger words
You should use `a photo of TOK dog` to trigger the image generation.
## Download model
Weights for this model are available in Safetensors format.
[Download](kevinmcmahon/corgy_dog_LoRA/tree/main) them in the Files & versions tab.
|
CVL-Heidelberg/ControlNet-XS
|
CVL-Heidelberg
| 2023-12-23T01:28:08Z | 0 | 45 | null |
[
"license:openrail",
"region:us"
] | null | 2023-09-22T10:19:35Z |
---
license: openrail
---
# ControlNet-XS


These are ControlNet-XS weights trained on [stabilityai/stable-diffusion-xl-base-1.0](https://huggingface.co/stabilityai/stable-diffusion-xl-base-1.0) and [stabilityai/stable-diffusion-2-1](https://huggingface.co/stabilityai/stable-diffusion-2-1) on edge and depthmap conditioning respectively. You can find more details and further visual examples on the project page [ControlNet-XS](https://vislearn.github.io/ControlNet-XS/).
## The codebase
The code is based on the StableDiffusion frameworks. To use ControlNet-XS, you need separate access to the weights of the StableDiffusion version that you want to control.
We provide weights with both depth and edge control for StableDiffusion2.1 and StableDiffusion-XL.
After obtaining the weights, you need to replace the paths to the StableDiffusion and ControlNet-XS weights in the config files.
## Usage
Example for StableDiffusion-XL with Canny Edges
```python
import scripts.control_utils as cu
import torch
from PIL import Image
path_to_config = 'ControlNet-XS-main/configs/inference/sdxl/sdxl_encD_canny_48m.yaml'
model = cu.create_model(path_to_config).to('cuda')
image_path = 'PATH/TO/IMAGES/Shoe.png'

# Canny hysteresis thresholds and output resolution
canny_high_th = 250
canny_low_th = 100
size = 768
num_samples = 2

image = cu.get_image(image_path, size=size)
edges = cu.get_canny_edges(image, low_th=canny_low_th, high_th=canny_high_th)
samples, controls = cu.get_sdxl_sample(
guidance=edges,
ddim_steps=10,
num_samples=num_samples,
model=model,
shape=[4, size // 8, size // 8],
control_scale=0.95,
prompt='cinematic, shoe in the streets, made from meat, photorealistic shoe, highly detailed',
n_prompt='lowres, bad anatomy, worst quality, low quality',
)
Image.fromarray(cu.create_image_grid(samples)).save('SDXL_MyShoe.png')
```

Example for StableDiffusion2.1 with depth maps
```python
import scripts.control_utils as cu
import torch
from PIL import Image
path_to_config = 'PATH/TO/CONFIG/sd21_encD_depth_14m.yaml'
model = cu.create_model(path_to_config).to('cuda')
size = 768
image_path = 'PATH/TO/IMAGES/Shoe.png'
image = cu.get_image(image_path, size=size)
depth = cu.get_midas_depth(image, max_resolution=size)
num_samples = 2
samples, controls = cu.get_sd_sample(
guidance=depth,
ddim_steps=10,
num_samples=num_samples,
model=model,
shape=[4, size // 8, size // 8],
control_scale=0.95,
prompt='cinematic, advertising shot, shoe in a city street, photorealistic shoe, colourful, highly detailed',
n_prompt='low quality, bad quality, sketches'
)
Image.fromarray(cu.create_image_grid(samples)).save('SD_MyShoe.png')
```

|
raoel/mt5-small-openai-summarize_from_feedback
|
raoel
| 2023-12-23T01:10:12Z | 5 | 0 |
transformers
|
[
"transformers",
"tf",
"mt5",
"text2text-generation",
"generated_from_keras_callback",
"base_model:google/mt5-small",
"base_model:finetune:google/mt5-small",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text2text-generation
| 2023-12-22T23:45:59Z |
---
license: apache-2.0
base_model: google/mt5-small
tags:
- generated_from_keras_callback
model-index:
- name: raoel/mt5-small-openai-summarize_from_feedback
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# raoel/mt5-small-openai-summarize_from_feedback
This model is a fine-tuned version of [google/mt5-small](https://huggingface.co/google/mt5-small) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 2.8354
- Validation Loss: 1.9661
- Epoch: 6
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': {'module': 'keras.optimizers.schedules', 'class_name': 'PolynomialDecay', 'config': {'initial_learning_rate': 5.6e-05, 'decay_steps': 5032, 'end_learning_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}, 'registered_name': None}, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: mixed_float16
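The `PolynomialDecay` config above, with `power=1.0` and `end_learning_rate=0.0`, is simply a linear decay from 5.6e-05 to 0 over 5032 steps. A pure-Python sketch of the schedule (an illustration, not the Keras implementation):

```python
# Sketch of Keras PolynomialDecay with the values listed above:
# (initial - end) * (1 - step/decay_steps)**power + end, clamped at the end
# because cycle=False.
def polynomial_decay(step, initial_lr=5.6e-05, decay_steps=5032,
                     end_lr=0.0, power=1.0):
    step = min(step, decay_steps)   # cycle=False: hold end_lr past decay_steps
    frac = 1 - step / decay_steps
    return (initial_lr - end_lr) * frac ** power + end_lr

print(polynomial_decay(0))      # 5.6e-05
print(polynomial_decay(5032))   # 0.0
print(polynomial_decay(2516))   # ~2.8e-05 halfway through
```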
### Training results
| Train Loss | Validation Loss | Epoch |
|:----------:|:---------------:|:-----:|
| 2.8327 | 1.9661 | 0 |
| 2.8194 | 1.9661 | 1 |
| 2.8246 | 1.9661 | 2 |
| 2.8298 | 1.9661 | 3 |
| 2.8326 | 1.9661 | 4 |
| 2.8265 | 1.9661 | 5 |
| 2.8354 | 1.9661 | 6 |
### Framework versions
- Transformers 4.36.0
- TensorFlow 2.13.0
- Datasets 2.15.0
- Tokenizers 0.15.0
|
LarryAIDraw/shushu
|
LarryAIDraw
| 2023-12-23T01:08:56Z | 0 | 0 | null |
[
"license:creativeml-openrail-m",
"region:us"
] | null | 2023-12-23T01:06:56Z |
---
license: creativeml-openrail-m
---
https://civitai.com/models/239447/shuraina-west-or-i-dont-want-to-be-an-ojakgyo-or-manhwa
|
LarryAIDraw/lucyheartfilia-055555
|
LarryAIDraw
| 2023-12-23T01:04:06Z | 0 | 0 | null |
[
"license:creativeml-openrail-m",
"region:us"
] | null | 2023-12-23T01:02:17Z |
---
license: creativeml-openrail-m
---
https://civitai.com/models/239818/lucy-heartfilia-fairy-tail
|
LarryAIDraw/mythra-xb-richy-v1
|
LarryAIDraw
| 2023-12-23T01:03:42Z | 0 | 0 | null |
[
"license:creativeml-openrail-m",
"region:us"
] | null | 2023-12-23T01:01:31Z |
---
license: creativeml-openrail-m
---
https://civitai.com/models/240311/mythrahikari-xenoblade-chronicles-2-lora-or-3-outfits-swimsuit-massive-melee-and-default
|
LarryAIDraw/pyra-xb-richy-v1
|
LarryAIDraw
| 2023-12-23T01:03:31Z | 0 | 0 | null |
[
"license:creativeml-openrail-m",
"region:us"
] | null | 2023-12-23T01:01:08Z |
---
license: creativeml-openrail-m
---
https://civitai.com/models/240310/pyrahomura-xenoblade-chronicles-2-lora-or-2-outfits-swimsuit-and-default
|
LarryAIDraw/Shuna
|
LarryAIDraw
| 2023-12-23T00:59:27Z | 0 | 0 | null |
[
"license:creativeml-openrail-m",
"region:us"
] | null | 2023-12-23T00:56:36Z |
---
license: creativeml-openrail-m
---
https://civitai.com/models/240229/shuna-that-time-i-got-reincarnated-as-a-slime
|
LarryAIDraw/hoshimiyamukuro_scarxzys
|
LarryAIDraw
| 2023-12-23T00:58:55Z | 0 | 0 | null |
[
"license:creativeml-openrail-m",
"region:us"
] | null | 2023-12-23T00:55:05Z |
---
license: creativeml-openrail-m
---
https://civitai.com/models/240803/mukuro-hoshimiya-or-date-a-live
|
LarryAIDraw/ashuna-virginrd-01
|
LarryAIDraw
| 2023-12-23T00:58:43Z | 0 | 0 | null |
[
"license:creativeml-openrail-m",
"region:us"
] | null | 2023-12-23T00:54:42Z |
---
license: creativeml-openrail-m
---
https://civitai.com/models/240655/ashuna-shokei-shoujo-no-virgin-road
|
LarryAIDraw/menou-virginrd-01
|
LarryAIDraw
| 2023-12-23T00:58:25Z | 0 | 0 | null |
[
"license:creativeml-openrail-m",
"region:us"
] | null | 2023-12-23T00:54:08Z |
---
license: creativeml-openrail-m
---
https://civitai.com/models/240650/menou-shokei-shoujo-no-virgin-road
|
naninya/mt5-small-finetuned-amazon-en-es
|
naninya
| 2023-12-23T00:53:44Z | 10 | 0 |
transformers
|
[
"transformers",
"tensorboard",
"safetensors",
"mt5",
"text2text-generation",
"summarization",
"generated_from_trainer",
"base_model:google/mt5-small",
"base_model:finetune:google/mt5-small",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
summarization
| 2023-12-23T00:42:25Z |
---
license: apache-2.0
base_model: google/mt5-small
tags:
- summarization
- generated_from_trainer
metrics:
- rouge
model-index:
- name: mt5-small-finetuned-amazon-en-es
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# mt5-small-finetuned-amazon-en-es
This model is a fine-tuned version of [google/mt5-small](https://huggingface.co/google/mt5-small) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 2.4213
- Rouge1: 31.833
- Rouge2: 11.5704
- Rougel: 28.3537
- Rougelsum: 29.7517
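As a reminder of what the Rouge1 score above measures, here is a toy unigram-overlap F1 in pure Python. Real evaluations typically use the `rouge_score` package (via `evaluate`), which also applies stemming and computes Rouge2/RougeL differently; this sketch covers only the ROUGE-1 idea:

```python
# Toy ROUGE-1 F1: harmonic mean of unigram precision and recall between a
# generated summary and a reference. No stemming or tokenizer niceties.
from collections import Counter

def rouge1_f1(prediction, reference):
    pred = Counter(prediction.lower().split())
    ref = Counter(reference.lower().split())
    overlap = sum((pred & ref).values())   # clipped unigram matches
    if overlap == 0:
        return 0.0
    precision = overlap / sum(pred.values())
    recall = overlap / sum(ref.values())
    return 2 * precision * recall / (precision + recall)

print(round(rouge1_f1("the cat sat", "the cat sat down"), 4))  # 0.8571
```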
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5.6e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 8
### Training results
| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:-------:|:-------:|:---------:|
| 12.421 | 1.0 | 185 | 4.1761 | 8.5214 | 1.3412 | 8.1063 | 8.2251 |
| 4.6683 | 2.0 | 370 | 2.8343 | 20.6485 | 7.1917 | 18.6741 | 19.4706 |
| 3.6666 | 3.0 | 555 | 2.5616 | 20.3673 | 6.1998 | 18.2531 | 19.0305 |
| 3.3157 | 4.0 | 740 | 2.5002 | 28.4326 | 11.0801 | 25.391 | 26.4882 |
| 3.1834 | 5.0 | 925 | 2.4586 | 29.0975 | 11.3058 | 26.0004 | 27.5342 |
| 3.0983 | 6.0 | 1110 | 2.4191 | 31.5865 | 11.3633 | 27.6063 | 29.6726 |
| 3.0338 | 7.0 | 1295 | 2.4258 | 31.845 | 11.9743 | 28.3534 | 29.8196 |
| 2.9805 | 8.0 | 1480 | 2.4213 | 31.833 | 11.5704 | 28.3537 | 29.7517 |
### Framework versions
- Transformers 4.35.2
- Pytorch 2.1.0+cu121
- Datasets 2.16.0
- Tokenizers 0.15.0
|
powerpuf-bot/mdeberta-v3-th-wiki-qa_hyp-params
|
powerpuf-bot
| 2023-12-23T00:42:39Z | 7 | 0 |
transformers
|
[
"transformers",
"tensorboard",
"safetensors",
"deberta-v2",
"question-answering",
"generated_from_trainer",
"base_model:timpal0l/mdeberta-v3-base-squad2",
"base_model:finetune:timpal0l/mdeberta-v3-base-squad2",
"license:mit",
"endpoints_compatible",
"region:us"
] |
question-answering
| 2023-12-09T08:38:35Z |
---
license: mit
base_model: timpal0l/mdeberta-v3-base-squad2
tags:
- generated_from_trainer
model-index:
- name: model1
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# model1
This model is a fine-tuned version of [timpal0l/mdeberta-v3-base-squad2](https://huggingface.co/timpal0l/mdeberta-v3-base-squad2) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0164
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1.2922909480977358e-06
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-06
- lr_scheduler_type: linear
- num_epochs: 20
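To show what the `betas` and `epsilon` values above do, here is a toy scalar Adam step (an illustration only; the Trainer uses the full vectorized AdamW implementation):

```python
# Toy scalar Adam update with the hyperparameters listed above.
def adam_step(param, grad, m, v, t, lr=1.2922909480977358e-06,
              beta1=0.9, beta2=0.999, eps=1e-06):
    m = beta1 * m + (1 - beta1) * grad        # first-moment (mean) EMA
    v = beta2 * v + (1 - beta2) * grad ** 2   # second-moment (variance) EMA
    m_hat = m / (1 - beta1 ** t)              # bias correction for step t
    v_hat = v / (1 - beta2 ** t)
    param = param - lr * m_hat / (v_hat ** 0.5 + eps)
    return param, m, v

p, m, v = 1.0, 0.0, 0.0
p, m, v = adam_step(p, grad=0.5, m=m, v=v, t=1)
print(p)  # slightly below 1.0: the first step moves the parameter by ~lr
```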
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:-----:|:---------------:|
| 4.375 | 0.0 | 10 | 4.1404 |
| 4.1912 | 0.0 | 20 | 3.9428 |
| 4.092 | 0.01 | 30 | 3.7984 |
| 3.8669 | 0.01 | 40 | 3.7072 |
| 3.686 | 0.01 | 50 | 3.6632 |
| 3.8802 | 0.01 | 60 | 3.6311 |
| 3.7809 | 0.02 | 70 | 3.5933 |
| 3.7638 | 0.02 | 80 | 3.5530 |
| 3.7452 | 0.02 | 90 | 3.5179 |
| 3.5223 | 0.02 | 100 | 3.4762 |
| 3.5319 | 0.03 | 110 | 3.4259 |
| 3.5082 | 0.03 | 120 | 3.3582 |
| 3.4774 | 0.03 | 130 | 3.2531 |
| 3.5161 | 0.03 | 140 | 3.1888 |
| 3.505 | 0.04 | 150 | 3.1303 |
| 3.2086 | 0.04 | 160 | 3.0741 |
| 3.2909 | 0.04 | 170 | 3.0192 |
| 3.3154 | 0.04 | 180 | 2.9615 |
| 3.185 | 0.04 | 190 | 2.8924 |
| 3.1328 | 0.05 | 200 | 2.8199 |
| 3.1915 | 0.05 | 210 | 2.7362 |
| 3.2172 | 0.05 | 220 | 2.6635 |
| 2.862 | 0.05 | 230 | 2.5907 |
| 3.083 | 0.06 | 240 | 2.5439 |
| 2.9065 | 0.06 | 250 | 2.4994 |
| 2.9812 | 0.06 | 260 | 2.4005 |
| 2.6782 | 0.06 | 270 | 2.3395 |
| 2.7705 | 0.07 | 280 | 2.3121 |
| 2.7435 | 0.07 | 290 | 2.2823 |
| 2.8056 | 0.07 | 300 | 2.3203 |
| 2.7294 | 0.07 | 310 | 2.2172 |
| 2.3808 | 0.08 | 320 | 2.1658 |
| 2.5029 | 0.08 | 330 | 2.1111 |
| 2.6777 | 0.08 | 340 | 2.0256 |
| 2.7282 | 0.08 | 350 | 1.9819 |
| 2.57 | 0.08 | 360 | 1.9812 |
| 2.3672 | 0.09 | 370 | 2.0118 |
| 2.6299 | 0.09 | 380 | 1.9472 |
| 2.3196 | 0.09 | 390 | 1.8996 |
| 2.215 | 0.09 | 400 | 1.9120 |
| 2.4375 | 0.1 | 410 | 1.9036 |
| 2.1814 | 0.1 | 420 | 1.8292 |
| 2.1152 | 0.1 | 430 | 1.8553 |
| 2.0335 | 0.1 | 440 | 1.8034 |
| 2.187 | 0.11 | 450 | 1.8023 |
| 2.4437 | 0.11 | 460 | 1.7244 |
| 1.7916 | 0.11 | 470 | 1.6887 |
| 2.1537 | 0.11 | 480 | 1.6579 |
| 1.8474 | 0.12 | 490 | 1.6313 |
| 1.9795 | 0.12 | 500 | 1.5955 |
| 1.8623 | 0.12 | 510 | 1.5810 |
| 2.1003 | 0.12 | 520 | 1.5760 |
| 1.8952 | 0.12 | 530 | 1.5297 |
| 2.0518 | 0.13 | 540 | 1.5184 |
| 1.8124 | 0.13 | 550 | 1.5123 |
| 1.8168 | 0.13 | 560 | 1.5117 |
| 1.9548 | 0.13 | 570 | 1.4487 |
| 1.9592 | 0.14 | 580 | 1.4095 |
| 1.9761 | 0.14 | 590 | 1.3941 |
| 1.8741 | 0.14 | 600 | 1.4356 |
| 1.6443 | 0.14 | 610 | 1.3777 |
| 1.7726 | 0.15 | 620 | 1.3423 |
| 1.552 | 0.15 | 630 | 1.3788 |
| 1.4915 | 0.15 | 640 | 1.4771 |
| 1.6819 | 0.15 | 650 | 1.3560 |
| 1.5504 | 0.16 | 660 | 1.3146 |
| 1.7773 | 0.16 | 670 | 1.2680 |
| 1.5283 | 0.16 | 680 | 1.2236 |
| 1.595 | 0.16 | 690 | 1.2083 |
| 1.5692 | 0.16 | 700 | 1.2230 |
| 1.6049 | 0.17 | 710 | 1.1877 |
| 1.6463 | 0.17 | 720 | 1.1822 |
| 1.6622 | 0.17 | 730 | 1.1653 |
| 1.7127 | 0.17 | 740 | 1.1454 |
| 1.5633 | 0.18 | 750 | 1.1122 |
| 1.6619 | 0.18 | 760 | 1.1077 |
| 1.67 | 0.18 | 770 | 1.0997 |
| 1.3763 | 0.18 | 780 | 1.0825 |
| 1.5286 | 0.19 | 790 | 1.0683 |
| 1.5665 | 0.19 | 800 | 1.0711 |
| 1.3382 | 0.19 | 810 | 1.0924 |
| 1.2342 | 0.19 | 820 | 1.0977 |
| 1.2506 | 0.2 | 830 | 1.0506 |
| 1.5802 | 0.2 | 840 | 1.0330 |
| 1.3535 | 0.2 | 850 | 1.0072 |
| 1.4384 | 0.2 | 860 | 1.0093 |
| 1.3955 | 0.2 | 870 | 0.9962 |
| 1.3652 | 0.21 | 880 | 0.9996 |
| 1.382 | 0.21 | 890 | 0.9898 |
| 1.4646 | 0.21 | 900 | 1.0083 |
| 1.2399 | 0.21 | 910 | 0.9693 |
| 1.3242 | 0.22 | 920 | 0.9397 |
| 1.1119 | 0.22 | 930 | 0.9344 |
| 1.3281 | 0.22 | 940 | 0.9227 |
| 1.2404 | 0.22 | 950 | 0.9122 |
| 1.3808 | 0.23 | 960 | 0.9193 |
| 1.7082 | 0.23 | 970 | 0.8976 |
| 1.1682 | 0.23 | 980 | 0.8860 |
| 1.4836 | 0.23 | 990 | 0.8793 |
| 1.1542 | 0.24 | 1000 | 0.8774 |
| 1.4316 | 0.24 | 1010 | 0.8797 |
| 0.929 | 0.24 | 1020 | 0.8823 |
| 1.3463 | 0.24 | 1030 | 0.8618 |
| 1.2789 | 0.24 | 1040 | 0.8517 |
| 1.7579 | 0.25 | 1050 | 0.8431 |
| 1.4194 | 0.25 | 1060 | 0.8322 |
| 1.1674 | 0.25 | 1070 | 0.8391 |
| 1.4809 | 0.25 | 1080 | 0.8619 |
| 1.209 | 0.26 | 1090 | 0.8467 |
| 1.2726 | 0.26 | 1100 | 0.8282 |
| 1.2159 | 0.26 | 1110 | 0.7985 |
| 1.1183 | 0.26 | 1120 | 0.7767 |
| 1.3627 | 0.27 | 1130 | 0.7703 |
| 1.2547 | 0.27 | 1140 | 0.7534 |
| 0.9355 | 0.27 | 1150 | 0.7594 |
| 1.1281 | 0.27 | 1160 | 0.7596 |
| 1.1924 | 0.28 | 1170 | 0.7487 |
| 1.1475 | 0.28 | 1180 | 0.7516 |
| 1.0629 | 0.28 | 1190 | 0.7326 |
| 0.9613 | 0.28 | 1200 | 0.7236 |
| 1.2041 | 0.28 | 1210 | 0.7155 |
| 1.2971 | 0.29 | 1220 | 0.7267 |
| 1.0594 | 0.29 | 1230 | 0.7210 |
| 0.9503 | 0.29 | 1240 | 0.7344 |
| 1.0725 | 0.29 | 1250 | 0.7183 |
| 1.0849 | 0.3 | 1260 | 0.7087 |
| 1.3877 | 0.3 | 1270 | 0.6950 |
| 1.0516 | 0.3 | 1280 | 0.6946 |
| 1.1895 | 0.3 | 1290 | 0.6859 |
| 1.0827 | 0.31 | 1300 | 0.7029 |
| 1.0178 | 0.31 | 1310 | 0.7280 |
| 0.9494 | 0.31 | 1320 | 0.7525 |
| 1.0853 | 0.31 | 1330 | 0.7235 |
| 1.2498 | 0.32 | 1340 | 0.7018 |
| 1.0128 | 0.32 | 1350 | 0.6845 |
| 1.0692 | 0.32 | 1360 | 0.6666 |
| 1.2221 | 0.32 | 1370 | 0.6454 |
| 1.3963 | 0.32 | 1380 | 0.6265 |
| 0.9985 | 0.33 | 1390 | 0.6321 |
| 0.885 | 0.33 | 1400 | 0.6368 |
| 0.9406 | 0.33 | 1410 | 0.6185 |
| 1.6654 | 0.33 | 1420 | 0.5753 |
| 1.2005 | 0.34 | 1430 | 0.5558 |
| 0.7428 | 0.34 | 1440 | 0.5553 |
| 0.7904 | 0.34 | 1450 | 0.5687 |
| 1.068 | 0.34 | 1460 | 0.5762 |
| 0.9732 | 0.35 | 1470 | 0.5811 |
| 1.1683 | 0.35 | 1480 | 0.5703 |
| 1.1428 | 0.35 | 1490 | 0.5568 |
| 1.3573 | 0.35 | 1500 | 0.5689 |
| 1.1233 | 0.36 | 1510 | 0.5711 |
| 0.8034 | 0.36 | 1520 | 0.5669 |
| 1.0412 | 0.36 | 1530 | 0.5755 |
| 1.0574 | 0.36 | 1540 | 0.5719 |
| 0.7889 | 0.36 | 1550 | 0.5856 |
| 0.8737 | 0.37 | 1560 | 0.5856 |
| 1.0901 | 0.37 | 1570 | 0.5701 |
| 0.8094 | 0.37 | 1580 | 0.5373 |
| 0.9114 | 0.37 | 1590 | 0.5207 |
| 0.896 | 0.38 | 1600 | 0.5279 |
| 0.7269 | 0.38 | 1610 | 0.5340 |
| 1.0278 | 0.38 | 1620 | 0.5303 |
| 0.9435 | 0.38 | 1630 | 0.5206 |
| 0.8486 | 0.39 | 1640 | 0.5193 |
| 0.8943 | 0.39 | 1650 | 0.5247 |
| 1.3937 | 0.39 | 1660 | 0.5298 |
| 1.1081 | 0.39 | 1670 | 0.5268 |
| 0.9757 | 0.4 | 1680 | 0.5390 |
| 0.7825 | 0.4 | 1690 | 0.5355 |
| 1.1096 | 0.4 | 1700 | 0.5305 |
| 1.2618 | 0.4 | 1710 | 0.5071 |
| 0.9897 | 0.4 | 1720 | 0.5090 |
| 1.1452 | 0.41 | 1730 | 0.5180 |
| 0.9241 | 0.41 | 1740 | 0.5092 |
| 1.0909 | 0.41 | 1750 | 0.5135 |
| 1.2112 | 0.41 | 1760 | 0.5140 |
| 1.2592 | 0.42 | 1770 | 0.5066 |
| 1.2503 | 0.42 | 1780 | 0.4883 |
| 1.0507 | 0.42 | 1790 | 0.5065 |
| 0.576 | 0.42 | 1800 | 0.5477 |
| 1.0647 | 0.43 | 1810 | 0.5238 |
| 0.7856 | 0.43 | 1820 | 0.4910 |
| 0.6518 | 0.43 | 1830 | 0.4811 |
| 0.8611 | 0.43 | 1840 | 0.4627 |
| 0.7594 | 0.44 | 1850 | 0.4449 |
| 1.1667 | 0.44 | 1860 | 0.4647 |
| 0.8421 | 0.44 | 1870 | 0.4623 |
| 0.6977 | 0.44 | 1880 | 0.4698 |
| 0.777 | 0.44 | 1890 | 0.4965 |
| 1.2923 | 0.45 | 1900 | 0.4945 |
| 0.5802 | 0.45 | 1910 | 0.4951 |
| 1.1421 | 0.45 | 1920 | 0.5059 |
| 0.769 | 0.45 | 1930 | 0.4708 |
| 0.962 | 0.46 | 1940 | 0.4418 |
| 0.8201 | 0.46 | 1950 | 0.4082 |
| 0.7339 | 0.46 | 1960 | 0.4170 |
| 1.1713 | 0.46 | 1970 | 0.4262 |
| 0.7221 | 0.47 | 1980 | 0.4020 |
| 0.7731 | 0.47 | 1990 | 0.3957 |
| 0.8662 | 0.47 | 2000 | 0.4062 |
| 0.7804 | 0.47 | 2010 | 0.4201 |
| 0.8776 | 0.48 | 2020 | 0.4355 |
| 1.1361 | 0.48 | 2030 | 0.3997 |
| 0.8406 | 0.48 | 2040 | 0.4000 |
| 0.6875 | 0.48 | 2050 | 0.4196 |
| 1.1029 | 0.48 | 2060 | 0.4189 |
| 0.9125 | 0.49 | 2070 | 0.3967 |
| 1.042 | 0.49 | 2080 | 0.3798 |
| 0.9562 | 0.49 | 2090 | 0.3794 |
| 0.7689 | 0.49 | 2100 | 0.3927 |
| 0.5273 | 0.5 | 2110 | 0.4200 |
| 0.6665 | 0.5 | 2120 | 0.4480 |
| 1.4543 | 0.5 | 2130 | 0.4194 |
| 1.0902 | 0.5 | 2140 | 0.3985 |
| 0.5227 | 0.51 | 2150 | 0.3910 |
| 0.9229 | 0.51 | 2160 | 0.3935 |
| 1.2764 | 0.51 | 2170 | 0.3958 |
| 0.6033 | 0.51 | 2180 | 0.4082 |
| 0.4971 | 0.52 | 2190 | 0.4010 |
| 0.949 | 0.52 | 2200 | 0.4018 |
| 0.6899 | 0.52 | 2210 | 0.3925 |
| 0.6232 | 0.52 | 2220 | 0.3964 |
| 0.6495 | 0.52 | 2230 | 0.3807 |
| 0.5465 | 0.53 | 2240 | 0.3845 |
| 0.6768 | 0.53 | 2250 | 0.3868 |
| 0.6013 | 0.53 | 2260 | 0.3930 |
| 0.429 | 0.53 | 2270 | 0.3514 |
| 0.9703 | 0.54 | 2280 | 0.3413 |
| 1.0727 | 0.54 | 2290 | 0.3510 |
| 0.9429 | 0.54 | 2300 | 0.3689 |
| 0.4055 | 0.54 | 2310 | 0.3876 |
| 0.8104 | 0.55 | 2320 | 0.3604 |
| 0.7378 | 0.55 | 2330 | 0.3471 |
| 0.8352 | 0.55 | 2340 | 0.3516 |
| 1.1039 | 0.55 | 2350 | 0.3435 |
| 0.5216 | 0.56 | 2360 | 0.3055 |
| 1.0551 | 0.56 | 2370 | 0.3188 |
| 0.8982 | 0.56 | 2380 | 0.3241 |
| 0.6169 | 0.56 | 2390 | 0.3414 |
| 0.4689 | 0.56 | 2400 | 0.3356 |
| 0.6489 | 0.57 | 2410 | 0.3142 |
| 0.8051 | 0.57 | 2420 | 0.3106 |
| 0.7659 | 0.57 | 2430 | 0.2985 |
| 0.6141 | 0.57 | 2440 | 0.2968 |
| 0.8785 | 0.58 | 2450 | 0.3063 |
| 0.7279 | 0.58 | 2460 | 0.3035 |
| 0.5264 | 0.58 | 2470 | 0.3159 |
| 0.4273 | 0.58 | 2480 | 0.3115 |
| 0.6012 | 0.59 | 2490 | 0.3088 |
| 0.5439 | 0.59 | 2500 | 0.3158 |
| 0.5593 | 0.59 | 2510 | 0.2958 |
| 0.6242 | 0.59 | 2520 | 0.2569 |
| 0.3424 | 0.6 | 2530 | 0.2506 |
| 0.5503 | 0.6 | 2540 | 0.2600 |
| 0.6439 | 0.6 | 2550 | 0.2525 |
| 0.78 | 0.6 | 2560 | 0.2522 |
| 1.0034 | 0.6 | 2570 | 0.2654 |
| 0.6776 | 0.61 | 2580 | 0.2616 |
| 0.4825 | 0.61 | 2590 | 0.2695 |
| 0.3983 | 0.61 | 2600 | 0.2540 |
| 0.6563 | 0.61 | 2610 | 0.2448 |
| 0.9949 | 0.62 | 2620 | 0.2342 |
| 0.5665 | 0.62 | 2630 | 0.2504 |
| 0.8271 | 0.62 | 2640 | 0.2875 |
| 0.5201 | 0.62 | 2650 | 0.2782 |
| 0.6666 | 0.63 | 2660 | 0.2533 |
| 0.7844 | 0.63 | 2670 | 0.2530 |
| 0.7381 | 0.63 | 2680 | 0.2460 |
| 0.6387 | 0.63 | 2690 | 0.2302 |
| 0.3566 | 0.64 | 2700 | 0.2291 |
| 0.6419 | 0.64 | 2710 | 0.2463 |
| 0.6215 | 0.64 | 2720 | 0.2651 |
| 0.4889 | 0.64 | 2730 | 0.2716 |
| 0.3527 | 0.64 | 2740 | 0.2603 |
| 0.3562 | 0.65 | 2750 | 0.2527 |
| 0.7771 | 0.65 | 2760 | 0.2407 |
| 0.5634 | 0.65 | 2770 | 0.2244 |
| 0.3462 | 0.65 | 2780 | 0.2503 |
| 0.8867 | 0.66 | 2790 | 0.2806 |
| 0.7455 | 0.66 | 2800 | 0.3184 |
| 0.5771 | 0.66 | 2810 | 0.3167 |
| 0.4235 | 0.66 | 2820 | 0.2873 |
| 0.7555 | 0.67 | 2830 | 0.2547 |
| 0.6507 | 0.67 | 2840 | 0.2516 |
| 0.4571 | 0.67 | 2850 | 0.2488 |
| 0.4541 | 0.67 | 2860 | 0.2301 |
| 0.4262 | 0.68 | 2870 | 0.2216 |
| 0.3153 | 0.68 | 2880 | 0.2197 |
| 0.4931 | 0.68 | 2890 | 0.2374 |
| 0.4231 | 0.68 | 2900 | 0.2574 |
| 0.7015 | 0.68 | 2910 | 0.2652 |
| 0.3229 | 0.69 | 2920 | 0.2518 |
| 0.7307 | 0.69 | 2930 | 0.2342 |
| 0.3726 | 0.69 | 2940 | 0.2259 |
| 0.3265 | 0.69 | 2950 | 0.2205 |
| 0.4566 | 0.7 | 2960 | 0.2385 |
| 0.2977 | 0.7 | 2970 | 0.2686 |
| 0.8794 | 0.7 | 2980 | 0.2951 |
| 0.3673 | 0.7 | 2990 | 0.2772 |
| 0.2593 | 0.71 | 3000 | 0.2600 |
| 0.5541 | 0.71 | 3010 | 0.2483 |
| 0.7229 | 0.71 | 3020 | 0.2410 |
| 0.7328 | 0.71 | 3030 | 0.2352 |
| 0.5662 | 0.72 | 3040 | 0.2267 |
| 0.3607 | 0.72 | 3050 | 0.2296 |
| 0.7631 | 0.72 | 3060 | 0.2361 |
| 0.2357 | 0.72 | 3070 | 0.2352 |
| 0.3763 | 0.72 | 3080 | 0.2337 |
| 0.4553 | 0.73 | 3090 | 0.2065 |
| 0.5716 | 0.73 | 3100 | 0.1958 |
| 0.5131 | 0.73 | 3110 | 0.1937 |
| 0.5152 | 0.73 | 3120 | 0.2139 |
| 0.4131 | 0.74 | 3130 | 0.2437 |
| 0.8475 | 0.74 | 3140 | 0.2335 |
| 0.6262 | 0.74 | 3150 | 0.2227 |
| 0.684 | 0.74 | 3160 | 0.2159 |
| 0.3416 | 0.75 | 3170 | 0.2194 |
| 0.3636 | 0.75 | 3180 | 0.2030 |
| 0.3231 | 0.75 | 3190 | 0.2113 |
| 0.8976 | 0.75 | 3200 | 0.2275 |
| 0.4114 | 0.76 | 3210 | 0.2305 |
| 0.386 | 0.76 | 3220 | 0.2114 |
| 0.2951 | 0.76 | 3230 | 0.2099 |
| 0.8021 | 0.76 | 3240 | 0.2239 |
| 0.2723 | 0.76 | 3250 | 0.2289 |
| 0.9072 | 0.77 | 3260 | 0.2236 |
| 0.3085 | 0.77 | 3270 | 0.2220 |
| 0.2964 | 0.77 | 3280 | 0.2205 |
| 0.5862 | 0.77 | 3290 | 0.2210 |
| 0.315 | 0.78 | 3300 | 0.2246 |
| 0.3576 | 0.78 | 3310 | 0.2198 |
| 0.3658 | 0.78 | 3320 | 0.2154 |
| 0.3488 | 0.78 | 3330 | 0.2179 |
| 0.2596 | 0.79 | 3340 | 0.2193 |
| 0.5901 | 0.79 | 3350 | 0.2338 |
| 0.6697 | 0.79 | 3360 | 0.2290 |
| 0.4801 | 0.79 | 3370 | 0.2106 |
| 0.8202 | 0.8 | 3380 | 0.2141 |
| 0.3524 | 0.8 | 3390 | 0.2413 |
| 0.7825 | 0.8 | 3400 | 0.2359 |
| 0.4712 | 0.8 | 3410 | 0.2309 |
| 0.3536 | 0.8 | 3420 | 0.2314 |
| 0.9339 | 0.81 | 3430 | 0.2093 |
| 0.2812 | 0.81 | 3440 | 0.2114 |
| 0.5668 | 0.81 | 3450 | 0.2139 |
| 0.4397 | 0.81 | 3460 | 0.2127 |
| 0.4346 | 0.82 | 3470 | 0.2022 |
| 0.4335 | 0.82 | 3480 | 0.1997 |
| 0.6056 | 0.82 | 3490 | 0.1987 |
| 0.6434 | 0.82 | 3500 | 0.1922 |
| 0.4806 | 0.83 | 3510 | 0.1872 |
| 0.9088 | 0.83 | 3520 | 0.2073 |
| 0.1835 | 0.83 | 3530 | 0.2112 |
| 0.3963 | 0.83 | 3540 | 0.1991 |
| 0.2977 | 0.84 | 3550 | 0.1846 |
| 0.3423 | 0.84 | 3560 | 0.1864 |
| 0.7232 | 0.84 | 3570 | 0.1879 |
| 0.3509 | 0.84 | 3580 | 0.1856 |
| 0.3437 | 0.84 | 3590 | 0.1872 |
| 0.1969 | 0.85 | 3600 | 0.1884 |
| 0.611 | 0.85 | 3610 | 0.1966 |
| 0.3163 | 0.85 | 3620 | 0.1912 |
| 0.6549 | 0.85 | 3630 | 0.1897 |
| 0.6047 | 0.86 | 3640 | 0.1781 |
| 0.3034 | 0.86 | 3650 | 0.1745 |
| 0.2898 | 0.86 | 3660 | 0.1759 |
| 0.3211 | 0.86 | 3670 | 0.1794 |
| 0.4586 | 0.87 | 3680 | 0.1893 |
| 0.5102 | 0.87 | 3690 | 0.1904 |
| 0.4509 | 0.87 | 3700 | 0.1833 |
| 0.2828 | 0.87 | 3710 | 0.1831 |
| 0.3065 | 0.88 | 3720 | 0.1918 |
| 0.4697 | 0.88 | 3730 | 0.1989 |
| 0.4728 | 0.88 | 3740 | 0.2110 |
| 0.259 | 0.88 | 3750 | 0.2080 |
| 0.2843 | 0.88 | 3760 | 0.2013 |
| 0.3572 | 0.89 | 3770 | 0.1998 |
| 0.2861 | 0.89 | 3780 | 0.1997 |
| 0.3513 | 0.89 | 3790 | 0.2028 |
| 0.3063 | 0.89 | 3800 | 0.1974 |
| 0.4274 | 0.9 | 3810 | 0.1900 |
| 0.3639 | 0.9 | 3820 | 0.1868 |
| 0.2065 | 0.9 | 3830 | 0.1895 |
| 0.359 | 0.9 | 3840 | 0.1983 |
| 0.1902 | 0.91 | 3850 | 0.2105 |
| 0.1821 | 0.91 | 3860 | 0.2362 |
| 0.3117 | 0.91 | 3870 | 0.2455 |
| 0.7311 | 0.91 | 3880 | 0.2330 |
| 0.5677 | 0.92 | 3890 | 0.2363 |
| 0.2652 | 0.92 | 3900 | 0.2441 |
| 0.3911 | 0.92 | 3910 | 0.2360 |
| 0.3992 | 0.92 | 3920 | 0.2423 |
| 0.3325 | 0.92 | 3930 | 0.2387 |
| 0.321 | 0.93 | 3940 | 0.2428 |
| 0.7887 | 0.93 | 3950 | 0.2566 |
| 0.2373 | 0.93 | 3960 | 0.2374 |
| 0.6399 | 0.93 | 3970 | 0.2232 |
| 0.3938 | 0.94 | 3980 | 0.2115 |
| 0.3884 | 0.94 | 3990 | 0.2143 |
| 0.5061 | 0.94 | 4000 | 0.2246 |
| 0.3537 | 0.94 | 4010 | 0.2144 |
| 0.4257 | 0.95 | 4020 | 0.2082 |
| 0.4944 | 0.95 | 4030 | 0.2024 |
| 0.3119 | 0.95 | 4040 | 0.1987 |
| 0.5462 | 0.95 | 4050 | 0.1977 |
| 0.2478 | 0.96 | 4060 | 0.1980 |
| 0.253 | 0.96 | 4070 | 0.1979 |
| 0.3265 | 0.96 | 4080 | 0.2146 |
| 0.2407 | 0.96 | 4090 | 0.2303 |
| 0.2392 | 0.96 | 4100 | 0.2308 |
| 0.2047 | 0.97 | 4110 | 0.2521 |
| 0.1751 | 0.97 | 4120 | 0.2499 |
| 0.4894 | 0.97 | 4130 | 0.2351 |
| 0.3661 | 0.97 | 4140 | 0.2407 |
| 0.1099 | 0.98 | 4150 | 0.2375 |
| 0.6644 | 0.98 | 4160 | 0.2145 |
| 0.1317 | 0.98 | 4170 | 0.2289 |
| 0.6674 | 0.98 | 4180 | 0.2295 |
| 0.314 | 0.99 | 4190 | 0.2115 |
| 0.2731 | 0.99 | 4200 | 0.1864 |
| 0.2326 | 0.99 | 4210 | 0.1799 |
| 0.5015 | 0.99 | 4220 | 0.1760 |
| 0.2522 | 1.0 | 4230 | 0.1910 |
| 0.2634 | 1.0 | 4240 | 0.1969 |
| 0.3029 | 1.0 | 4250 | 0.1918 |
| 0.2226 | 1.0 | 4260 | 0.2049 |
| 0.6243 | 1.0 | 4270 | 0.2303 |
| 0.4633 | 1.01 | 4280 | 0.2305 |
| 0.4918 | 1.01 | 4290 | 0.2070 |
| 0.3314 | 1.01 | 4300 | 0.1970 |
| 0.2396 | 1.01 | 4310 | 0.2013 |
| 0.2283 | 1.02 | 4320 | 0.1931 |
| 0.4913 | 1.02 | 4330 | 0.2155 |
| 0.2601 | 1.02 | 4340 | 0.2543 |
| 0.2108 | 1.02 | 4350 | 0.2603 |
| 0.2433 | 1.03 | 4360 | 0.2395 |
| 0.1047 | 1.03 | 4370 | 0.2256 |
| 0.7346 | 1.03 | 4380 | 0.2442 |
| 0.3427 | 1.03 | 4390 | 0.2472 |
| 0.1887 | 1.04 | 4400 | 0.2260 |
| 0.4357 | 1.04 | 4410 | 0.1903 |
| 0.2175 | 1.04 | 4420 | 0.1907 |
| 0.5015 | 1.04 | 4430 | 0.2136 |
| 0.4701 | 1.04 | 4440 | 0.2123 |
| 0.3904 | 1.05 | 4450 | 0.2184 |
| 0.2129 | 1.05 | 4460 | 0.2216 |
| 0.3764 | 1.05 | 4470 | 0.2209 |
| 0.2599 | 1.05 | 4480 | 0.2037 |
| 0.5468 | 1.06 | 4490 | 0.1937 |
| 0.1625 | 1.06 | 4500 | 0.1960 |
| 0.184 | 1.06 | 4510 | 0.2081 |
| 0.334 | 1.06 | 4520 | 0.2118 |
| 0.0759 | 1.07 | 4530 | 0.2144 |
| 0.7927 | 1.07 | 4540 | 0.2005 |
| 0.1276 | 1.07 | 4550 | 0.1930 |
| 0.4991 | 1.07 | 4560 | 0.1887 |
| 0.238 | 1.08 | 4570 | 0.1914 |
| 0.2346 | 1.08 | 4580 | 0.2133 |
| 0.3646 | 1.08 | 4590 | 0.2318 |
| 0.1138 | 1.08 | 4600 | 0.2506 |
| 0.3171 | 1.08 | 4610 | 0.2494 |
| 0.596 | 1.09 | 4620 | 0.2275 |
| 0.2667 | 1.09 | 4630 | 0.2061 |
| 0.4161 | 1.09 | 4640 | 0.2252 |
| 0.3474 | 1.09 | 4650 | 0.2248 |
| 0.2747 | 1.1 | 4660 | 0.2312 |
| 0.5098 | 1.1 | 4670 | 0.1986 |
| 0.4898 | 1.1 | 4680 | 0.1781 |
| 0.2471 | 1.1 | 4690 | 0.1744 |
| 0.1659 | 1.11 | 4700 | 0.1823 |
| 0.4195 | 1.11 | 4710 | 0.1926 |
| 0.2737 | 1.11 | 4720 | 0.2300 |
| 0.2188 | 1.11 | 4730 | 0.2747 |
| 0.474 | 1.12 | 4740 | 0.2673 |
| 0.2487 | 1.12 | 4750 | 0.2377 |
| 0.2762 | 1.12 | 4760 | 0.2294 |
| 0.2084 | 1.12 | 4770 | 0.2277 |
| 0.4751 | 1.12 | 4780 | 0.2547 |
| 0.5042 | 1.13 | 4790 | 0.2726 |
| 0.1024 | 1.13 | 4800 | 0.2545 |
| 0.2493 | 1.13 | 4810 | 0.2255 |
| 0.4035 | 1.13 | 4820 | 0.2148 |
| 0.6532 | 1.14 | 4830 | 0.1993 |
| 0.19 | 1.14 | 4840 | 0.1978 |
| 0.0986 | 1.14 | 4850 | 0.1928 |
| 0.245 | 1.14 | 4860 | 0.2142 |
| 0.664 | 1.15 | 4870 | 0.2237 |
| 0.3992 | 1.15 | 4880 | 0.2025 |
| 0.108 | 1.15 | 4890 | 0.1816 |
| 0.885 | 1.15 | 4900 | 0.1812 |
| 0.5707 | 1.16 | 4910 | 0.1916 |
| 0.6256 | 1.16 | 4920 | 0.2030 |
| 0.3692 | 1.16 | 4930 | 0.2286 |
| 0.1598 | 1.16 | 4940 | 0.2112 |
| 0.7809 | 1.16 | 4950 | 0.2034 |
| 0.5057 | 1.17 | 4960 | 0.1747 |
| 0.2616 | 1.17 | 4970 | 0.1613 |
| 0.5148 | 1.17 | 4980 | 0.1660 |
| 0.4847 | 1.17 | 4990 | 0.1618 |
| 0.4978 | 1.18 | 5000 | 0.1567 |
| 0.3466 | 1.18 | 5010 | 0.1530 |
| 0.2511 | 1.18 | 5020 | 0.1576 |
| 0.1818 | 1.18 | 5030 | 0.1718 |
| 0.8491 | 1.19 | 5040 | 0.1930 |
| 0.1051 | 1.19 | 5050 | 0.1948 |
| 0.3309 | 1.19 | 5060 | 0.1999 |
| 0.4323 | 1.19 | 5070 | 0.1992 |
| 0.3315 | 1.2 | 5080 | 0.1951 |
| 0.131 | 1.2 | 5090 | 0.1822 |
| 0.0892 | 1.2 | 5100 | 0.1642 |
| 0.0156 | 1.2 | 5110 | 0.1557 |
| 0.4684 | 1.2 | 5120 | 0.1672 |
| 0.461 | 1.21 | 5130 | 0.1672 |
| 0.1355 | 1.21 | 5140 | 0.1675 |
| 0.1447 | 1.21 | 5150 | 0.1785 |
| 0.3596 | 1.21 | 5160 | 0.1792 |
| 0.0398 | 1.22 | 5170 | 0.1788 |
| 0.0961 | 1.22 | 5180 | 0.1772 |
| 0.2546 | 1.22 | 5190 | 0.1644 |
| 0.2119 | 1.22 | 5200 | 0.1606 |
| 0.567 | 1.23 | 5210 | 0.1623 |
| 0.1466 | 1.23 | 5220 | 0.1711 |
| 0.6517 | 1.23 | 5230 | 0.1779 |
| 0.0671 | 1.23 | 5240 | 0.1735 |
| 0.123 | 1.24 | 5250 | 0.1761 |
| 0.411 | 1.24 | 5260 | 0.1814 |
| 0.136 | 1.24 | 5270 | 0.1791 |
| 0.1735 | 1.24 | 5280 | 0.1867 |
| 0.2877 | 1.24 | 5290 | 0.1842 |
| 0.2175 | 1.25 | 5300 | 0.1899 |
| 0.2625 | 1.25 | 5310 | 0.1847 |
| 0.0795 | 1.25 | 5320 | 0.1848 |
| 0.1323 | 1.25 | 5330 | 0.1935 |
| 0.3752 | 1.26 | 5340 | 0.1998 |
| 0.1254 | 1.26 | 5350 | 0.1821 |
| 0.1431 | 1.26 | 5360 | 0.1744 |
| 0.5895 | 1.26 | 5370 | 0.1714 |
| 0.1235 | 1.27 | 5380 | 0.1854 |
| 0.2146 | 1.27 | 5390 | 0.2049 |
| 0.1444 | 1.27 | 5400 | 0.2226 |
| 0.4016 | 1.27 | 5410 | 0.2454 |
| 0.6784 | 1.28 | 5420 | 0.2622 |
| 0.4311 | 1.28 | 5430 | 0.2405 |
| 0.3039 | 1.28 | 5440 | 0.1928 |
| 0.2956 | 1.28 | 5450 | 0.1790 |
| 0.2004 | 1.28 | 5460 | 0.1785 |
| 0.2443 | 1.29 | 5470 | 0.1768 |
| 0.0428 | 1.29 | 5480 | 0.1852 |
| 0.2472 | 1.29 | 5490 | 0.1982 |
| 0.5345 | 1.29 | 5500 | 0.2071 |
| 0.136 | 1.3 | 5510 | 0.1858 |
| 0.1519 | 1.3 | 5520 | 0.1748 |
| 0.2248 | 1.3 | 5530 | 0.1681 |
| 0.166 | 1.3 | 5540 | 0.1698 |
| 0.4252 | 1.31 | 5550 | 0.1726 |
| 0.298 | 1.31 | 5560 | 0.1785 |
| 0.0708 | 1.31 | 5570 | 0.1797 |
| 0.282 | 1.31 | 5580 | 0.1705 |
| 1.0092 | 1.32 | 5590 | 0.1656 |
| 0.2018 | 1.32 | 5600 | 0.1700 |
| 0.0796 | 1.32 | 5610 | 0.1756 |
| 0.3687 | 1.32 | 5620 | 0.1678 |
| 0.1937 | 1.32 | 5630 | 0.1710 |
| 0.5449 | 1.33 | 5640 | 0.1887 |
| 0.0299 | 1.33 | 5650 | 0.1914 |
| 0.2386 | 1.33 | 5660 | 0.1833 |
| 0.1897 | 1.33 | 5670 | 0.1763 |
| 0.3064 | 1.34 | 5680 | 0.1631 |
| 0.2216 | 1.34 | 5690 | 0.1683 |
| 0.1249 | 1.34 | 5700 | 0.1777 |
| 0.0087 | 1.34 | 5710 | 0.1903 |
| 0.5886 | 1.35 | 5720 | 0.2098 |
| 0.0827 | 1.35 | 5730 | 0.2167 |
| 0.2194 | 1.35 | 5740 | 0.2041 |
| 0.4651 | 1.35 | 5750 | 0.1988 |
| 0.9891 | 1.36 | 5760 | 0.2044 |
| 0.2852 | 1.36 | 5770 | 0.2025 |
| 0.4693 | 1.36 | 5780 | 0.1761 |
| 0.1653 | 1.36 | 5790 | 0.1613 |
| 0.1625 | 1.36 | 5800 | 0.1745 |
| 0.0841 | 1.37 | 5810 | 0.1774 |
| 0.0317 | 1.37 | 5820 | 0.1748 |
| 0.1221 | 1.37 | 5830 | 0.1794 |
| 0.0837 | 1.37 | 5840 | 0.1900 |
| 0.1755 | 1.38 | 5850 | 0.1839 |
| 0.1719 | 1.38 | 5860 | 0.1731 |
| 0.2215 | 1.38 | 5870 | 0.1689 |
| 0.6336 | 1.38 | 5880 | 0.1699 |
| 0.4576 | 1.39 | 5890 | 0.1782 |
| 0.2106 | 1.39 | 5900 | 0.1888 |
| 0.4506 | 1.39 | 5910 | 0.1814 |
| 0.473 | 1.39 | 5920 | 0.1921 |
| 0.4673 | 1.4 | 5930 | 0.1953 |
| 0.1775 | 1.4 | 5940 | 0.2061 |
| 0.0852 | 1.4 | 5950 | 0.2124 |
| 0.3053 | 1.4 | 5960 | 0.2168 |
| 0.1072 | 1.4 | 5970 | 0.2273 |
| 0.1184 | 1.41 | 5980 | 0.2337 |
| 0.3458 | 1.41 | 5990 | 0.2123 |
| 0.1604 | 1.41 | 6000 | 0.1948 |
| 0.2199 | 1.41 | 6010 | 0.1702 |
| 0.2073 | 1.42 | 6020 | 0.1564 |
| 0.1866 | 1.42 | 6030 | 0.1602 |
| 0.4008 | 1.42 | 6040 | 0.1699 |
| 0.2635 | 1.42 | 6050 | 0.1848 |
| 0.0945 | 1.43 | 6060 | 0.2150 |
| 0.1214 | 1.43 | 6070 | 0.2391 |
| 0.3696 | 1.43 | 6080 | 0.2344 |
| 0.3044 | 1.43 | 6090 | 0.2090 |
| 0.3169 | 1.44 | 6100 | 0.1828 |
| 0.1032 | 1.44 | 6110 | 0.1772 |
| 0.1463 | 1.44 | 6120 | 0.1858 |
| 0.1899 | 1.44 | 6130 | 0.2030 |
| 0.5784 | 1.44 | 6140 | 0.2006 |
| 0.1119 | 1.45 | 6150 | 0.1916 |
| 0.0689 | 1.45 | 6160 | 0.1827 |
| 0.7187 | 1.45 | 6170 | 0.1806 |
| 0.1309 | 1.45 | 6180 | 0.1827 |
| 0.1597 | 1.46 | 6190 | 0.1755 |
| 0.5006 | 1.46 | 6200 | 0.1712 |
| 0.4701 | 1.46 | 6210 | 0.1864 |
| 0.2247 | 1.46 | 6220 | 0.2031 |
| 0.3197 | 1.47 | 6230 | 0.2372 |
| 0.2209 | 1.47 | 6240 | 0.2396 |
| 1.1995 | 1.47 | 6250 | 0.2138 |
| 0.9593 | 1.47 | 6260 | 0.1889 |
| 0.1071 | 1.48 | 6270 | 0.1810 |
| 0.0673 | 1.48 | 6280 | 0.1790 |
| 0.2732 | 1.48 | 6290 | 0.1794 |
| 0.0297 | 1.48 | 6300 | 0.1974 |
| 0.3706 | 1.48 | 6310 | 0.2122 |
| 0.4177 | 1.49 | 6320 | 0.2091 |
| 0.337 | 1.49 | 6330 | 0.1711 |
| 0.0592 | 1.49 | 6340 | 0.1548 |
| 0.2698 | 1.49 | 6350 | 0.1486 |
| 0.1118 | 1.5 | 6360 | 0.1542 |
| 0.1072 | 1.5 | 6370 | 0.1664 |
| 0.0892 | 1.5 | 6380 | 0.1644 |
| 0.4692 | 1.5 | 6390 | 0.1729 |
| 0.1136 | 1.51 | 6400 | 0.1746 |
| 0.1738 | 1.51 | 6410 | 0.1647 |
| 0.0302 | 1.51 | 6420 | 0.1569 |
| 0.1535 | 1.51 | 6430 | 0.1544 |
| 0.0871 | 1.52 | 6440 | 0.1571 |
| 0.6717 | 1.52 | 6450 | 0.1673 |
| 0.3538 | 1.52 | 6460 | 0.1904 |
| 0.1934 | 1.52 | 6470 | 0.1932 |
| 0.0568 | 1.52 | 6480 | 0.1837 |
| 0.2197 | 1.53 | 6490 | 0.1655 |
| 0.0899 | 1.53 | 6500 | 0.1566 |
| 0.3554 | 1.53 | 6510 | 0.1622 |
| 0.055 | 1.53 | 6520 | 0.1628 |
| 0.1583 | 1.54 | 6530 | 0.1748 |
| 0.1499 | 1.54 | 6540 | 0.1753 |
| 0.0158 | 1.54 | 6550 | 0.1726 |
| 0.5779 | 1.54 | 6560 | 0.1723 |
| 0.6185 | 1.55 | 6570 | 0.1645 |
| 0.1141 | 1.55 | 6580 | 0.1689 |
| 0.4059 | 1.55 | 6590 | 0.1778 |
| 0.2643 | 1.55 | 6600 | 0.1864 |
| 0.2214 | 1.56 | 6610 | 0.1815 |
| 0.0758 | 1.56 | 6620 | 0.1697 |
| 0.0379 | 1.56 | 6630 | 0.1619 |
| 0.0788 | 1.56 | 6640 | 0.1608 |
| 0.0818 | 1.56 | 6650 | 0.1685 |
| 0.0191 | 1.57 | 6660 | 0.1773 |
| 0.1959 | 1.57 | 6670 | 0.1720 |
| 0.1683 | 1.57 | 6680 | 0.1659 |
| 0.4874 | 1.57 | 6690 | 0.1704 |
| 0.4201 | 1.58 | 6700 | 0.1730 |
| 0.147 | 1.58 | 6710 | 0.1725 |
| 0.1274 | 1.58 | 6720 | 0.1636 |
| 1.248 | 1.58 | 6730 | 0.1678 |
| 0.0943 | 1.59 | 6740 | 0.1805 |
| 0.4071 | 1.59 | 6750 | 0.1851 |
| 0.0548 | 1.59 | 6760 | 0.1849 |
| 0.8717 | 1.59 | 6770 | 0.1789 |
| 0.0549 | 1.6 | 6780 | 0.1695 |
| 0.5486 | 1.6 | 6790 | 0.1708 |
| 0.0128 | 1.6 | 6800 | 0.1770 |
| 0.4382 | 1.6 | 6810 | 0.1738 |
| 0.0319 | 1.6 | 6820 | 0.1631 |
| 0.1554 | 1.61 | 6830 | 0.1604 |
| 0.2509 | 1.61 | 6840 | 0.1631 |
| 0.0055 | 1.61 | 6850 | 0.1686 |
| 0.5865 | 1.61 | 6860 | 0.1786 |
| 0.5643 | 1.62 | 6870 | 0.1779 |
| 0.0066 | 1.62 | 6880 | 0.1765 |
| 0.2172 | 1.62 | 6890 | 0.1600 |
| 0.18 | 1.62 | 6900 | 0.1497 |
| 0.4333 | 1.63 | 6910 | 0.1613 |
| 0.1564 | 1.63 | 6920 | 0.1875 |
| 0.1368 | 1.63 | 6930 | 0.2082 |
| 0.3151 | 1.63 | 6940 | 0.2140 |
| 0.0735 | 1.64 | 6950 | 0.2008 |
| 0.4799 | 1.64 | 6960 | 0.1830 |
| 0.0717 | 1.64 | 6970 | 0.1661 |
| 0.224 | 1.64 | 6980 | 0.1554 |
| 0.1441 | 1.64 | 6990 | 0.1492 |
| 0.5679 | 1.65 | 7000 | 0.1466 |
| 0.1268 | 1.65 | 7010 | 0.1561 |
| 0.1617 | 1.65 | 7020 | 0.1644 |
| 0.0751 | 1.65 | 7030 | 0.1549 |
| 0.0191 | 1.66 | 7040 | 0.1531 |
| 0.0068 | 1.66 | 7050 | 0.1635 |
| 0.6952 | 1.66 | 7060 | 0.1826 |
| 0.3756 | 1.66 | 7070 | 0.2015 |
| 0.1053 | 1.67 | 7080 | 0.1897 |
| 0.1043 | 1.67 | 7090 | 0.1660 |
| 0.0683 | 1.67 | 7100 | 0.1547 |
| 0.0926 | 1.67 | 7110 | 0.1500 |
| 0.794 | 1.68 | 7120 | 0.1532 |
| 0.1025 | 1.68 | 7130 | 0.1569 |
| 0.0148 | 1.68 | 7140 | 0.1755 |
| 0.2857 | 1.68 | 7150 | 0.1742 |
| 0.0457 | 1.68 | 7160 | 0.1794 |
| 0.419 | 1.69 | 7170 | 0.1922 |
| 0.1454 | 1.69 | 7180 | 0.1850 |
| 0.0851 | 1.69 | 7190 | 0.1811 |
| 0.1369 | 1.69 | 7200 | 0.1857 |
| 0.2758 | 1.7 | 7210 | 0.1901 |
| 0.1367 | 1.7 | 7220 | 0.1811 |
| 0.0281 | 1.7 | 7230 | 0.1716 |
| 0.0849 | 1.7 | 7240 | 0.1771 |
| 0.2501 | 1.71 | 7250 | 0.1857 |
| 0.1177 | 1.71 | 7260 | 0.1890 |
| 0.4418 | 1.71 | 7270 | 0.1810 |
| 0.3707 | 1.71 | 7280 | 0.1865 |
| 0.0036 | 1.72 | 7290 | 0.1925 |
| 0.104 | 1.72 | 7300 | 0.1853 |
| 0.3512 | 1.72 | 7310 | 0.1854 |
| 0.0357 | 1.72 | 7320 | 0.1738 |
| 0.0689 | 1.72 | 7330 | 0.1663 |
| 0.0974 | 1.73 | 7340 | 0.1579 |
| 0.2649 | 1.73 | 7350 | 0.1604 |
| 0.1477 | 1.73 | 7360 | 0.1575 |
| 0.1052 | 1.73 | 7370 | 0.1578 |
| 0.2056 | 1.74 | 7380 | 0.1748 |
| 0.4071 | 1.74 | 7390 | 0.1752 |
| 0.0626 | 1.74 | 7400 | 0.1589 |
| 0.274 | 1.74 | 7410 | 0.1543 |
| 0.8482 | 1.75 | 7420 | 0.1543 |
| 0.329 | 1.75 | 7430 | 0.1644 |
| 0.4106 | 1.75 | 7440 | 0.1763 |
| 0.0922 | 1.75 | 7450 | 0.1850 |
| 0.6622 | 1.76 | 7460 | 0.1802 |
| 0.1798 | 1.76 | 7470 | 0.1619 |
| 0.2405 | 1.76 | 7480 | 0.1579 |
| 0.3783 | 1.76 | 7490 | 0.1581 |
| 0.7209 | 1.76 | 7500 | 0.1520 |
| 0.2051 | 1.77 | 7510 | 0.1584 |
| 0.0596 | 1.77 | 7520 | 0.1670 |
| 0.0141 | 1.77 | 7530 | 0.1758 |
| 0.4289 | 1.77 | 7540 | 0.1789 |
| 0.3653 | 1.78 | 7550 | 0.1752 |
| 0.139 | 1.78 | 7560 | 0.1624 |
| 0.2607 | 1.78 | 7570 | 0.1636 |
| 0.593 | 1.78 | 7580 | 0.1553 |
| 0.6612 | 1.79 | 7590 | 0.1518 |
| 0.0378 | 1.79 | 7600 | 0.1462 |
| 0.208 | 1.79 | 7610 | 0.1421 |
| 0.2581 | 1.79 | 7620 | 0.1377 |
| 0.5168 | 1.8 | 7630 | 0.1434 |
| 0.0119 | 1.8 | 7640 | 0.1462 |
| 0.0079 | 1.8 | 7650 | 0.1506 |
| 0.087 | 1.8 | 7660 | 0.1561 |
| 0.1204 | 1.8 | 7670 | 0.1619 |
| 0.1078 | 1.81 | 7680 | 0.1634 |
| 0.0036 | 1.81 | 7690 | 0.1593 |
| 0.0157 | 1.81 | 7700 | 0.1634 |
| 0.0298 | 1.81 | 7710 | 0.1605 |
| 0.0267 | 1.82 | 7720 | 0.1519 |
| 0.0359 | 1.82 | 7730 | 0.1496 |
| 0.059 | 1.82 | 7740 | 0.1567 |
| 0.0578 | 1.82 | 7750 | 0.1638 |
| 0.0016 | 1.83 | 7760 | 0.1654 |
| 0.4018 | 1.83 | 7770 | 0.1496 |
| 0.0102 | 1.83 | 7780 | 0.1489 |
| 0.1148 | 1.83 | 7790 | 0.1522 |
| 0.0584 | 1.84 | 7800 | 0.1505 |
| 0.3587 | 1.84 | 7810 | 0.1539 |
| 0.3523 | 1.84 | 7820 | 0.1604 |
| 0.6866 | 1.84 | 7830 | 0.1743 |
| 0.1511 | 1.84 | 7840 | 0.1899 |
| 0.1533 | 1.85 | 7850 | 0.1874 |
| 0.4831 | 1.85 | 7860 | 0.1917 |
| 0.9589 | 1.85 | 7870 | 0.1818 |
| 0.2117 | 1.85 | 7880 | 0.1631 |
| 0.012 | 1.86 | 7890 | 0.1541 |
| 0.1356 | 1.86 | 7900 | 0.1499 |
| 0.0645 | 1.86 | 7910 | 0.1578 |
| 0.0372 | 1.86 | 7920 | 0.1504 |
| 0.1199 | 1.87 | 7930 | 0.1562 |
| 0.0636 | 1.87 | 7940 | 0.1595 |
| 0.0147 | 1.87 | 7950 | 0.1661 |
| 0.0036 | 1.87 | 7960 | 0.1710 |
| 0.0987 | 1.88 | 7970 | 0.1665 |
| 0.2832 | 1.88 | 7980 | 0.1566 |
| 0.3404 | 1.88 | 7990 | 0.1475 |
| 0.05 | 1.88 | 8000 | 0.1523 |
| 0.0166 | 1.88 | 8010 | 0.1588 |
| 0.3845 | 1.89 | 8020 | 0.1583 |
| 0.5199 | 1.89 | 8030 | 0.1398 |
| 0.4537 | 1.89 | 8040 | 0.1328 |
| 0.8766 | 1.89 | 8050 | 0.1369 |
| 0.0008 | 1.9 | 8060 | 0.1466 |
| 0.8728 | 1.9 | 8070 | 0.1527 |
| 0.0505 | 1.9 | 8080 | 0.1549 |
| 0.0897 | 1.9 | 8090 | 0.1468 |
| 0.0948 | 1.91 | 8100 | 0.1425 |
| 0.0777 | 1.91 | 8110 | 0.1481 |
| 0.1637 | 1.91 | 8120 | 0.1567 |
| 0.1959 | 1.91 | 8130 | 0.1579 |
| 0.3097 | 1.92 | 8140 | 0.1532 |
| 0.1499 | 1.92 | 8150 | 0.1437 |
| 0.9859 | 1.92 | 8160 | 0.1347 |
| 0.0276 | 1.92 | 8170 | 0.1345 |
| 0.2713 | 1.92 | 8180 | 0.1346 |
| 0.0029 | 1.93 | 8190 | 0.1390 |
| 0.2852 | 1.93 | 8200 | 0.1428 |
| 0.1095 | 1.93 | 8210 | 0.1470 |
| 0.105 | 1.93 | 8220 | 0.1504 |
| 0.4423 | 1.94 | 8230 | 0.1527 |
| 0.1453 | 1.94 | 8240 | 0.1495 |
| 0.3866 | 1.94 | 8250 | 0.1464 |
| 0.9174 | 1.94 | 8260 | 0.1474 |
| 0.0715 | 1.95 | 8270 | 0.1555 |
| 0.118 | 1.95 | 8280 | 0.1598 |
| 0.1343 | 1.95 | 8290 | 0.1701 |
| 0.0534 | 1.95 | 8300 | 0.1616 |
| 0.0922 | 1.96 | 8310 | 0.1493 |
| 0.137 | 1.96 | 8320 | 0.1437 |
| 0.6153 | 1.96 | 8330 | 0.1429 |
| 0.3538 | 1.96 | 8340 | 0.1416 |
| 0.5799 | 1.96 | 8350 | 0.1475 |
| 0.0163 | 1.97 | 8360 | 0.1510 |
| 0.0078 | 1.97 | 8370 | 0.1537 |
| 0.3576 | 1.97 | 8380 | 0.1616 |
| 0.6845 | 1.97 | 8390 | 0.1562 |
| 0.202 | 1.98 | 8400 | 0.1587 |
| 0.0088 | 1.98 | 8410 | 0.1569 |
| 0.3641 | 1.98 | 8420 | 0.1460 |
| 0.0073 | 1.98 | 8430 | 0.1361 |
| 0.0811 | 1.99 | 8440 | 0.1381 |
| 0.0231 | 1.99 | 8450 | 0.1373 |
| 0.3656 | 1.99 | 8460 | 0.1395 |
| 0.3725 | 1.99 | 8470 | 0.1436 |
| 0.0711 | 2.0 | 8480 | 0.1417 |
| 0.0142 | 2.0 | 8490 | 0.1305 |
| 0.3093 | 2.0 | 8500 | 0.1228 |
| 0.0199 | 2.0 | 8510 | 0.1247 |
| 0.004 | 2.0 | 8520 | 0.1330 |
| 0.325 | 2.01 | 8530 | 0.1362 |
| 0.1648 | 2.01 | 8540 | 0.1374 |
| 0.501 | 2.01 | 8550 | 0.1469 |
| 0.3436 | 2.01 | 8560 | 0.1582 |
| 0.3076 | 2.02 | 8570 | 0.1586 |
| 0.2154 | 2.02 | 8580 | 0.1499 |
| 0.3914 | 2.02 | 8590 | 0.1454 |
| 0.2452 | 2.02 | 8600 | 0.1395 |
| 0.0219 | 2.03 | 8610 | 0.1375 |
| 0.0036 | 2.03 | 8620 | 0.1388 |
| 0.2508 | 2.03 | 8630 | 0.1369 |
| 0.2267 | 2.03 | 8640 | 0.1307 |
| 0.007 | 2.04 | 8650 | 0.1286 |
| 0.1006 | 2.04 | 8660 | 0.1332 |
| 0.104 | 2.04 | 8670 | 0.1331 |
| 0.4938 | 2.04 | 8680 | 0.1297 |
| 0.379 | 2.04 | 8690 | 0.1263 |
| 0.0825 | 2.05 | 8700 | 0.1231 |
| 0.4663 | 2.05 | 8710 | 0.1209 |
| 0.0669 | 2.05 | 8720 | 0.1213 |
| 0.0327 | 2.05 | 8730 | 0.1222 |
| 0.3799 | 2.06 | 8740 | 0.1248 |
| 0.7058 | 2.06 | 8750 | 0.1327 |
| 0.2095 | 2.06 | 8760 | 0.1368 |
| 0.8127 | 2.06 | 8770 | 0.1348 |
| 0.0168 | 2.07 | 8780 | 0.1325 |
| 0.1456 | 2.07 | 8790 | 0.1271 |
| 0.0194 | 2.07 | 8800 | 0.1259 |
| 0.3847 | 2.07 | 8810 | 0.1320 |
| 0.1016 | 2.08 | 8820 | 0.1254 |
| 0.3418 | 2.08 | 8830 | 0.1244 |
| 0.3378 | 2.08 | 8840 | 0.1291 |
| 0.5554 | 2.08 | 8850 | 0.1277 |
| 0.0026 | 2.08 | 8860 | 0.1209 |
| 0.0062 | 2.09 | 8870 | 0.1183 |
| 0.343 | 2.09 | 8880 | 0.1140 |
| 0.0031 | 2.09 | 8890 | 0.1194 |
| 0.6685 | 2.09 | 8900 | 0.1243 |
| 0.1048 | 2.1 | 8910 | 0.1258 |
| 0.0054 | 2.1 | 8920 | 0.1245 |
| 0.1464 | 2.1 | 8930 | 0.1234 |
| 0.3645 | 2.1 | 8940 | 0.1270 |
| 0.1195 | 2.11 | 8950 | 0.1275 |
| 0.1394 | 2.11 | 8960 | 0.1243 |
| 0.3173 | 2.11 | 8970 | 0.1207 |
| 0.288 | 2.11 | 8980 | 0.1179 |
| 0.6389 | 2.12 | 8990 | 0.1196 |
| 0.1785 | 2.12 | 9000 | 0.1189 |
| 0.1437 | 2.12 | 9010 | 0.1232 |
| 0.3789 | 2.12 | 9020 | 0.1284 |
| 0.0772 | 2.12 | 9030 | 0.1331 |
| 0.075 | 2.13 | 9040 | 0.1404 |
| 0.4147 | 2.13 | 9050 | 0.1434 |
| 0.2397 | 2.13 | 9060 | 0.1486 |
| 0.1797 | 2.13 | 9070 | 0.1362 |
| 0.2987 | 2.14 | 9080 | 0.1263 |
| 0.003 | 2.14 | 9090 | 0.1233 |
| 0.0301 | 2.14 | 9100 | 0.1243 |
| 0.1034 | 2.14 | 9110 | 0.1295 |
| 1.1049 | 2.15 | 9120 | 0.1234 |
| 0.016 | 2.15 | 9130 | 0.1228 |
| 0.2769 | 2.15 | 9140 | 0.1258 |
| 0.003 | 2.15 | 9150 | 0.1269 |
| 0.0043 | 2.16 | 9160 | 0.1238 |
| 0.0287 | 2.16 | 9170 | 0.1230 |
| 0.1919 | 2.16 | 9180 | 0.1214 |
| 0.0439 | 2.16 | 9190 | 0.1219 |
| 0.1591 | 2.16 | 9200 | 0.1250 |
| 0.1009 | 2.17 | 9210 | 0.1314 |
| 0.4279 | 2.17 | 9220 | 0.1295 |
| 0.1096 | 2.17 | 9230 | 0.1327 |
| 0.5508 | 2.17 | 9240 | 0.1382 |
| 0.0007 | 2.18 | 9250 | 0.1433 |
| 0.3901 | 2.18 | 9260 | 0.1477 |
| 0.3904 | 2.18 | 9270 | 0.1453 |
| 0.6607 | 2.18 | 9280 | 0.1459 |
| 0.043 | 2.19 | 9290 | 0.1337 |
| 0.0857 | 2.19 | 9300 | 0.1284 |
| 0.0929 | 2.19 | 9310 | 0.1387 |
| 0.4403 | 2.19 | 9320 | 0.1420 |
| 0.3784 | 2.2 | 9330 | 0.1368 |
| 0.1277 | 2.2 | 9340 | 0.1322 |
| 0.0014 | 2.2 | 9350 | 0.1302 |
| 0.1071 | 2.2 | 9360 | 0.1267 |
| 0.3073 | 2.2 | 9370 | 0.1256 |
| 0.2623 | 2.21 | 9380 | 0.1188 |
| 0.0024 | 2.21 | 9390 | 0.1162 |
| 0.0022 | 2.21 | 9400 | 0.1173 |
| 0.0022 | 2.21 | 9410 | 0.1148 |
| 0.2141 | 2.22 | 9420 | 0.1173 |
| 0.0541 | 2.22 | 9430 | 0.1172 |
| 0.1819 | 2.22 | 9440 | 0.1148 |
| 0.1091 | 2.22 | 9450 | 0.1159 |
| 0.0939 | 2.23 | 9460 | 0.1166 |
| 0.0571 | 2.23 | 9470 | 0.1253 |
| 0.0249 | 2.23 | 9480 | 0.1310 |
| 0.0012 | 2.23 | 9490 | 0.1329 |
| 0.141 | 2.24 | 9500 | 0.1303 |
| 0.0219 | 2.24 | 9510 | 0.1321 |
| 0.0032 | 2.24 | 9520 | 0.1241 |
| 0.0376 | 2.24 | 9530 | 0.1228 |
| 0.3005 | 2.24 | 9540 | 0.1226 |
| 0.5186 | 2.25 | 9550 | 0.1186 |
| 0.7346 | 2.25 | 9560 | 0.1210 |
| 0.6254 | 2.25 | 9570 | 0.1266 |
| 0.0005 | 2.25 | 9580 | 0.1290 |
| 0.0058 | 2.26 | 9590 | 0.1288 |
| 0.086 | 2.26 | 9600 | 0.1265 |
| 0.1606 | 2.26 | 9610 | 0.1280 |
| 0.3294 | 2.26 | 9620 | 0.1299 |
| 0.0144 | 2.27 | 9630 | 0.1352 |
| 0.0511 | 2.27 | 9640 | 0.1461 |
| 0.048 | 2.27 | 9650 | 0.1502 |
| 0.1315 | 2.27 | 9660 | 0.1444 |
| 0.5199 | 2.28 | 9670 | 0.1357 |
| 0.001 | 2.28 | 9680 | 0.1309 |
| 0.1123 | 2.28 | 9690 | 0.1309 |
| 0.0246 | 2.28 | 9700 | 0.1279 |
| 0.5187 | 2.28 | 9710 | 0.1251 |
| 0.149 | 2.29 | 9720 | 0.1256 |
| 0.3862 | 2.29 | 9730 | 0.1237 |
| 0.081 | 2.29 | 9740 | 0.1217 |
| 0.2072 | 2.29 | 9750 | 0.1179 |
| 0.5972 | 2.3 | 9760 | 0.1158 |
| 0.4282 | 2.3 | 9770 | 0.1180 |
| 0.5381 | 2.3 | 9780 | 0.1175 |
| 0.3266 | 2.3 | 9790 | 0.1174 |
| 0.022 | 2.31 | 9800 | 0.1205 |
| 0.0618 | 2.31 | 9810 | 0.1289 |
| 0.0426 | 2.31 | 9820 | 0.1323 |
| 0.0454 | 2.31 | 9830 | 0.1254 |
| 0.0082 | 2.32 | 9840 | 0.1257 |
| 0.0436 | 2.32 | 9850 | 0.1226 |
| 0.0917 | 2.32 | 9860 | 0.1227 |
| 0.1662 | 2.32 | 9870 | 0.1227 |
| 0.4208 | 2.32 | 9880 | 0.1242 |
| 0.3316 | 2.33 | 9890 | 0.1250 |
| 0.0585 | 2.33 | 9900 | 0.1286 |
| 0.0012 | 2.33 | 9910 | 0.1287 |
| 0.0819 | 2.33 | 9920 | 0.1285 |
| 0.1023 | 2.34 | 9930 | 0.1277 |
| 0.2498 | 2.34 | 9940 | 0.1187 |
| 0.0227 | 2.34 | 9950 | 0.1118 |
| 0.3881 | 2.34 | 9960 | 0.1093 |
| 0.0005 | 2.35 | 9970 | 0.1107 |
| 0.2209 | 2.35 | 9980 | 0.1136 |
| 0.3133 | 2.35 | 9990 | 0.1168 |
| 0.0019 | 2.35 | 10000 | 0.1172 |
| 0.005 | 2.36 | 10010 | 0.1184 |
| 0.0007 | 2.36 | 10020 | 0.1206 |
| 0.3865 | 2.36 | 10030 | 0.1275 |
| 0.0257 | 2.36 | 10040 | 0.1290 |
| 0.1657 | 2.36 | 10050 | 0.1240 |
| 0.0005 | 2.37 | 10060 | 0.1213 |
| 0.2322 | 2.37 | 10070 | 0.1197 |
| 0.4951 | 2.37 | 10080 | 0.1179 |
| 0.022 | 2.37 | 10090 | 0.1192 |
| 0.0021 | 2.38 | 10100 | 0.1186 |
| 0.0141 | 2.38 | 10110 | 0.1180 |
| 0.2456 | 2.38 | 10120 | 0.1230 |
| 0.0858 | 2.38 | 10130 | 0.1271 |
| 0.2066 | 2.39 | 10140 | 0.1286 |
| 0.0486 | 2.39 | 10150 | 0.1280 |
| 0.1796 | 2.39 | 10160 | 0.1256 |
| 0.3334 | 2.39 | 10170 | 0.1219 |
| 0.3476 | 2.4 | 10180 | 0.1219 |
| 0.0013 | 2.4 | 10190 | 0.1203 |
| 0.3598 | 2.4 | 10200 | 0.1181 |
| 0.0203 | 2.4 | 10210 | 0.1160 |
| 0.0261 | 2.4 | 10220 | 0.1163 |
| 0.4687 | 2.41 | 10230 | 0.1169 |
| 0.0018 | 2.41 | 10240 | 0.1180 |
| 0.0267 | 2.41 | 10250 | 0.1190 |
| 0.0809 | 2.41 | 10260 | 0.1157 |
| 0.3853 | 2.42 | 10270 | 0.1096 |
| 0.3098 | 2.42 | 10280 | 0.1097 |
| 0.3664 | 2.42 | 10290 | 0.1112 |
| 0.4342 | 2.42 | 10300 | 0.1100 |
| 0.0544 | 2.43 | 10310 | 0.1140 |
| 0.0292 | 2.43 | 10320 | 0.1194 |
| 0.0662 | 2.43 | 10330 | 0.1151 |
| 0.0006 | 2.43 | 10340 | 0.1130 |
| 0.0049 | 2.44 | 10350 | 0.1127 |
| 0.0868 | 2.44 | 10360 | 0.1158 |
| 0.0021 | 2.44 | 10370 | 0.1158 |
| 0.6543 | 2.44 | 10380 | 0.1173 |
| 0.324 | 2.44 | 10390 | 0.1175 |
| 0.4314 | 2.45 | 10400 | 0.1175 |
| 0.6206 | 2.45 | 10410 | 0.1166 |
| 0.0314 | 2.45 | 10420 | 0.1167 |
| 0.1325 | 2.45 | 10430 | 0.1163 |
| 0.3137 | 2.46 | 10440 | 0.1167 |
| 0.5047 | 2.46 | 10450 | 0.1207 |
| 0.0739 | 2.46 | 10460 | 0.1241 |
| 0.2103 | 2.46 | 10470 | 0.1249 |
| 0.1068 | 2.47 | 10480 | 0.1264 |
| 0.0011 | 2.47 | 10490 | 0.1312 |
| 0.0089 | 2.47 | 10500 | 0.1357 |
| 0.4653 | 2.47 | 10510 | 0.1398 |
| 0.2758 | 2.48 | 10520 | 0.1397 |
| 0.0005 | 2.48 | 10530 | 0.1368 |
| 0.0115 | 2.48 | 10540 | 0.1334 |
| 0.1624 | 2.48 | 10550 | 0.1235 |
| 0.0143 | 2.48 | 10560 | 0.1158 |
| 0.1054 | 2.49 | 10570 | 0.1100 |
| 0.6404 | 2.49 | 10580 | 0.1096 |
| 0.0023 | 2.49 | 10590 | 0.1112 |
| 0.0054 | 2.49 | 10600 | 0.1121 |
| 0.106 | 2.5 | 10610 | 0.1074 |
| 0.0006 | 2.5 | 10620 | 0.1076 |
| 0.3312 | 2.5 | 10630 | 0.1091 |
| 0.1955 | 2.5 | 10640 | 0.1162 |
| 0.0806 | 2.51 | 10650 | 0.1230 |
| 0.0052 | 2.51 | 10660 | 0.1227 |
| 0.0288 | 2.51 | 10670 | 0.1154 |
| 0.0214 | 2.51 | 10680 | 0.1104 |
| 0.0034 | 2.52 | 10690 | 0.1073 |
| 0.0599 | 2.52 | 10700 | 0.1064 |
| 0.3673 | 2.52 | 10710 | 0.1069 |
| 0.0157 | 2.52 | 10720 | 0.1105 |
| 0.2701 | 2.52 | 10730 | 0.1154 |
| 0.0131 | 2.53 | 10740 | 0.1139 |
| 0.2673 | 2.53 | 10750 | 0.1159 |
| 0.0019 | 2.53 | 10760 | 0.1147 |
| 0.634 | 2.53 | 10770 | 0.1138 |
| 0.0584 | 2.54 | 10780 | 0.1166 |
| 0.0214 | 2.54 | 10790 | 0.1212 |
| 0.0009 | 2.54 | 10800 | 0.1224 |
| 0.4264 | 2.54 | 10810 | 0.1184 |
| 0.081 | 2.55 | 10820 | 0.1167 |
| 0.0153 | 2.55 | 10830 | 0.1161 |
| 0.419 | 2.55 | 10840 | 0.1106 |
| 0.0006 | 2.55 | 10850 | 0.1064 |
| 0.3354 | 2.56 | 10860 | 0.1071 |
| 0.6373 | 2.56 | 10870 | 0.1058 |
| 0.1011 | 2.56 | 10880 | 0.1064 |
| 0.0037 | 2.56 | 10890 | 0.1070 |
| 0.1149 | 2.56 | 10900 | 0.1067 |
| 0.0237 | 2.57 | 10910 | 0.1061 |
| 0.0306 | 2.57 | 10920 | 0.1058 |
| 0.3209 | 2.57 | 10930 | 0.1059 |
| 0.2004 | 2.57 | 10940 | 0.1054 |
| 0.0819 | 2.58 | 10950 | 0.1059 |
| 0.001 | 2.58 | 10960 | 0.1103 |
| 0.0162 | 2.58 | 10970 | 0.1147 |
| 0.0007 | 2.58 | 10980 | 0.1164 |
| 0.0391 | 2.59 | 10990 | 0.1133 |
| 0.0113 | 2.59 | 11000 | 0.1119 |
| 0.3553 | 2.59 | 11010 | 0.1079 |
| 0.004 | 2.59 | 11020 | 0.1069 |
| 0.0079 | 2.6 | 11030 | 0.1074 |
| 0.0071 | 2.6 | 11040 | 0.1069 |
| 0.2595 | 2.6 | 11050 | 0.1072 |
| 0.7005 | 2.6 | 11060 | 0.1091 |
| 0.2458 | 2.6 | 11070 | 0.1119 |
| 0.0411 | 2.61 | 11080 | 0.1168 |
| 0.0154 | 2.61 | 11090 | 0.1130 |
| 0.0022 | 2.61 | 11100 | 0.1092 |
| 0.6174 | 2.61 | 11110 | 0.1043 |
| 0.3189 | 2.62 | 11120 | 0.1015 |
| 0.3563 | 2.62 | 11130 | 0.1020 |
| 0.3818 | 2.62 | 11140 | 0.1037 |
| 0.3164 | 2.62 | 11150 | 0.1084 |
| 0.058 | 2.63 | 11160 | 0.1137 |
| 0.5556 | 2.63 | 11170 | 0.1168 |
| 0.3385 | 2.63 | 11180 | 0.1157 |
| 0.0741 | 2.63 | 11190 | 0.1160 |
| 0.074 | 2.64 | 11200 | 0.1176 |
| 0.0006 | 2.64 | 11210 | 0.1178 |
| 0.63 | 2.64 | 11220 | 0.1114 |
| 0.2031 | 2.64 | 11230 | 0.1090 |
| 0.1105 | 2.64 | 11240 | 0.1044 |
| 0.3006 | 2.65 | 11250 | 0.1004 |
| 0.0016 | 2.65 | 11260 | 0.0971 |
| 0.0225 | 2.65 | 11270 | 0.0967 |
| 0.7929 | 2.65 | 11280 | 0.0973 |
| 0.001 | 2.66 | 11290 | 0.0998 |
| 0.0054 | 2.66 | 11300 | 0.1024 |
| 0.4051 | 2.66 | 11310 | 0.1047 |
| 0.0003 | 2.66 | 11320 | 0.1073 |
| 0.0005 | 2.67 | 11330 | 0.1081 |
| 0.0038 | 2.67 | 11340 | 0.1067 |
| 0.3265 | 2.67 | 11350 | 0.1064 |
| 0.0423 | 2.67 | 11360 | 0.1058 |
| 0.0127 | 2.68 | 11370 | 0.1047 |
| 0.0171 | 2.68 | 11380 | 0.1039 |
| 0.3343 | 2.68 | 11390 | 0.1000 |
| 0.3079 | 2.68 | 11400 | 0.0959 |
| 0.0012 | 2.68 | 11410 | 0.0970 |
| 0.6415 | 2.69 | 11420 | 0.0970 |
| 0.0894 | 2.69 | 11430 | 0.0959 |
| 0.0041 | 2.69 | 11440 | 0.0967 |
| 0.331 | 2.69 | 11450 | 0.0939 |
| 0.0015 | 2.7 | 11460 | 0.0939 |
| 0.4499 | 2.7 | 11470 | 0.0953 |
| 1.0666 | 2.7 | 11480 | 0.0964 |
| 0.2882 | 2.7 | 11490 | 0.0981 |
| 0.0904 | 2.71 | 11500 | 0.1013 |
| 0.3304 | 2.71 | 11510 | 0.1033 |
| 0.0006 | 2.71 | 11520 | 0.1019 |
| 0.0685 | 2.71 | 11530 | 0.1037 |
| 0.0016 | 2.72 | 11540 | 0.1050 |
| 0.2952 | 2.72 | 11550 | 0.1063 |
| 0.2906 | 2.72 | 11560 | 0.1057 |
| 0.0339 | 2.72 | 11570 | 0.1020 |
| 0.0668 | 2.72 | 11580 | 0.1000 |
| 0.0289 | 2.73 | 11590 | 0.1002 |
| 0.0026 | 2.73 | 11600 | 0.1020 |
| 0.0007 | 2.73 | 11610 | 0.1027 |
| 0.2901 | 2.73 | 11620 | 0.1048 |
| 0.0003 | 2.74 | 11630 | 0.1070 |
| 0.289 | 2.74 | 11640 | 0.1085 |
| 0.0007 | 2.74 | 11650 | 0.1077 |
| 0.0071 | 2.74 | 11660 | 0.1079 |
| 0.3559 | 2.75 | 11670 | 0.1065 |
| 0.2206 | 2.75 | 11680 | 0.1071 |
| 0.2716 | 2.75 | 11690 | 0.1089 |
| 0.1768 | 2.75 | 11700 | 0.1069 |
| 0.0802 | 2.76 | 11710 | 0.1042 |
| 0.2158 | 2.76 | 11720 | 0.1046 |
| 0.0086 | 2.76 | 11730 | 0.1065 |
| 0.3224 | 2.76 | 11740 | 0.1078 |
| 0.4421 | 2.76 | 11750 | 0.1063 |
| 0.1481 | 2.77 | 11760 | 0.1051 |
| 0.6156 | 2.77 | 11770 | 0.1031 |
| 0.0731 | 2.77 | 11780 | 0.1017 |
| 0.0016 | 2.77 | 11790 | 0.1065 |
| 0.002 | 2.78 | 11800 | 0.1135 |
| 0.3179 | 2.78 | 11810 | 0.1135 |
| 0.1241 | 2.78 | 11820 | 0.1130 |
| 0.0004 | 2.78 | 11830 | 0.1134 |
| 0.053 | 2.79 | 11840 | 0.1160 |
| 0.0236 | 2.79 | 11850 | 0.1213 |
| 0.0016 | 2.79 | 11860 | 0.1170 |
| 0.305 | 2.79 | 11870 | 0.1141 |
| 0.1295 | 2.8 | 11880 | 0.1146 |
| 0.018 | 2.8 | 11890 | 0.1145 |
| 0.2933 | 2.8 | 11900 | 0.1121 |
| 0.0158 | 2.8 | 11910 | 0.1079 |
| 0.0764 | 2.8 | 11920 | 0.1061 |
| 0.0108 | 2.81 | 11930 | 0.1063 |
| 0.0003 | 2.81 | 11940 | 0.1073 |
| 0.1585 | 2.81 | 11950 | 0.1060 |
| 0.2861 | 2.81 | 11960 | 0.1036 |
| 0.0651 | 2.82 | 11970 | 0.0996 |
| 0.4748 | 2.82 | 11980 | 0.0974 |
| 0.2047 | 2.82 | 11990 | 0.0946 |
| 0.0016 | 2.82 | 12000 | 0.0951 |
| 0.0015 | 2.83 | 12010 | 0.0968 |
| 0.0363 | 2.83 | 12020 | 0.0965 |
| 0.1253 | 2.83 | 12030 | 0.0959 |
| 0.1584 | 2.83 | 12040 | 0.0963 |
| 0.2961 | 2.84 | 12050 | 0.0964 |
| 0.01 | 2.84 | 12060 | 0.0969 |
| 0.0036 | 2.84 | 12070 | 0.0975 |
| 0.2705 | 2.84 | 12080 | 0.0966 |
| 0.0176 | 2.84 | 12090 | 0.0959 |
| 0.1436 | 2.85 | 12100 | 0.0952 |
| 0.1119 | 2.85 | 12110 | 0.0966 |
| 0.071 | 2.85 | 12120 | 0.0969 |
| 0.0005 | 2.85 | 12130 | 0.0953 |
| 0.0002 | 2.86 | 12140 | 0.0949 |
| 0.4459 | 2.86 | 12150 | 0.0940 |
| 0.0015 | 2.86 | 12160 | 0.0943 |
| 0.4285 | 2.86 | 12170 | 0.0961 |
| 0.3299 | 2.87 | 12180 | 0.0967 |
| 0.3955 | 2.87 | 12190 | 0.0957 |
| 0.0331 | 2.87 | 12200 | 0.0961 |
| 0.0852 | 2.87 | 12210 | 0.0955 |
| 0.4534 | 2.88 | 12220 | 0.0946 |
| 0.0367 | 2.88 | 12230 | 0.0969 |
| 0.0063 | 2.88 | 12240 | 0.0998 |
| 0.0064 | 2.88 | 12250 | 0.1029 |
| 0.402 | 2.88 | 12260 | 0.1009 |
| 0.1158 | 2.89 | 12270 | 0.0995 |
| 0.0003 | 2.89 | 12280 | 0.0998 |
| 0.3028 | 2.89 | 12290 | 0.0994 |
| 0.1137 | 2.89 | 12300 | 0.1011 |
| 0.0694 | 2.9 | 12310 | 0.1054 |
| 0.4868 | 2.9 | 12320 | 0.1086 |
| 0.143 | 2.9 | 12330 | 0.1117 |
| 0.0234 | 2.9 | 12340 | 0.1151 |
| 0.197 | 2.91 | 12350 | 0.1133 |
| 0.6447 | 2.91 | 12360 | 0.1100 |
| 0.1801 | 2.91 | 12370 | 0.1084 |
| 0.0426 | 2.91 | 12380 | 0.1067 |
| 0.0003 | 2.92 | 12390 | 0.1038 |
| 0.0846 | 2.92 | 12400 | 0.1016 |
| 0.3846 | 2.92 | 12410 | 0.0962 |
| 0.0009 | 2.92 | 12420 | 0.0955 |
| 0.001 | 2.92 | 12430 | 0.0971 |
| 0.7066 | 2.93 | 12440 | 0.0979 |
| 0.5897 | 2.93 | 12450 | 0.0968 |
| 0.0006 | 2.93 | 12460 | 0.1014 |
| 0.0805 | 2.93 | 12470 | 0.1021 |
| 0.0043 | 2.94 | 12480 | 0.1007 |
| 0.1133 | 2.94 | 12490 | 0.0970 |
| 0.0209 | 2.94 | 12500 | 0.0969 |
| 0.087 | 2.94 | 12510 | 0.0992 |
| 0.3138 | 2.95 | 12520 | 0.1038 |
| 0.3103 | 2.95 | 12530 | 0.1025 |
| 0.0294 | 2.95 | 12540 | 0.0991 |
| 0.1009 | 2.95 | 12550 | 0.0959 |
| 0.6163 | 2.96 | 12560 | 0.0915 |
| 0.0363 | 2.96 | 12570 | 0.0906 |
| 0.0005 | 2.96 | 12580 | 0.0910 |
| 0.0221 | 2.96 | 12590 | 0.0919 |
| 0.0054 | 2.96 | 12600 | 0.0946 |
| 0.2647 | 2.97 | 12610 | 0.0957 |
| 0.1708 | 2.97 | 12620 | 0.0989 |
| 0.0012 | 2.97 | 12630 | 0.1033 |
| 0.0351 | 2.97 | 12640 | 0.1015 |
| 0.0003 | 2.98 | 12650 | 0.1005 |
| 0.0172 | 2.98 | 12660 | 0.0986 |
| 0.3075 | 2.98 | 12670 | 0.0988 |
| 0.0029 | 2.98 | 12680 | 0.0999 |
| 0.0214 | 2.99 | 12690 | 0.1008 |
| 0.16 | 2.99 | 12700 | 0.1036 |
| 0.0224 | 2.99 | 12710 | 0.1049 |
| 0.2915 | 2.99 | 12720 | 0.1019 |
| 0.012 | 3.0 | 12730 | 0.0964 |
| 0.0346 | 3.0 | 12740 | 0.0947 |
| 0.0005 | 3.0 | 12750 | 0.0944 |
| 0.2925 | 3.0 | 12760 | 0.0949 |
| 0.1026 | 3.0 | 12770 | 0.0946 |
| 0.0003 | 3.01 | 12780 | 0.0952 |
| 0.0017 | 3.01 | 12790 | 0.0971 |
| 0.0003 | 3.01 | 12800 | 0.0985 |
| 0.2315 | 3.01 | 12810 | 0.0961 |
| 0.1807 | 3.02 | 12820 | 0.0957 |
| 0.1075 | 3.02 | 12830 | 0.0971 |
| 0.0003 | 3.02 | 12840 | 0.0991 |
| 0.0305 | 3.02 | 12850 | 0.0997 |
| 0.0265 | 3.03 | 12860 | 0.1016 |
| 0.0459 | 3.03 | 12870 | 0.1009 |
| 0.0007 | 3.03 | 12880 | 0.1004 |
| 0.0014 | 3.03 | 12890 | 0.1004 |
| 0.0392 | 3.04 | 12900 | 0.1006 |
| 0.1092 | 3.04 | 12910 | 0.1025 |
| 0.2561 | 3.04 | 12920 | 0.1075 |
| 0.0015 | 3.04 | 12930 | 0.1162 |
| 0.0008 | 3.04 | 12940 | 0.1176 |
| 0.3145 | 3.05 | 12950 | 0.1153 |
| 0.3311 | 3.05 | 12960 | 0.1093 |
| 0.0018 | 3.05 | 12970 | 0.1039 |
| 0.2986 | 3.05 | 12980 | 0.1010 |
| 0.1393 | 3.06 | 12990 | 0.0961 |
| 0.2974 | 3.06 | 13000 | 0.0925 |
| 0.0228 | 3.06 | 13010 | 0.0920 |
| 0.2654 | 3.06 | 13020 | 0.0919 |
| 0.0159 | 3.07 | 13030 | 0.0927 |
| 0.0007 | 3.07 | 13040 | 0.0963 |
| 0.0038 | 3.07 | 13050 | 0.0992 |
| 0.0048 | 3.07 | 13060 | 0.0994 |
| 0.0129 | 3.08 | 13070 | 0.0991 |
| 0.183 | 3.08 | 13080 | 0.0996 |
| 0.0003 | 3.08 | 13090 | 0.1001 |
| 0.1187 | 3.08 | 13100 | 0.0988 |
| 0.161 | 3.08 | 13110 | 0.0992 |
| 0.0055 | 3.09 | 13120 | 0.1012 |
| 0.0018 | 3.09 | 13130 | 0.1027 |
| 0.267 | 3.09 | 13140 | 0.1026 |
| 0.0003 | 3.09 | 13150 | 0.1022 |
| 0.0384 | 3.1 | 13160 | 0.0996 |
| 0.0005 | 3.1 | 13170 | 0.0969 |
| 0.102 | 3.1 | 13180 | 0.0952 |
| 0.4256 | 3.1 | 13190 | 0.0952 |
| 0.0003 | 3.11 | 13200 | 0.0956 |
| 0.3139 | 3.11 | 13210 | 0.0971 |
| 0.0102 | 3.11 | 13220 | 0.0993 |
| 0.6281 | 3.11 | 13230 | 0.0997 |
| 0.0004 | 3.12 | 13240 | 0.1004 |
| 0.0003 | 3.12 | 13250 | 0.1007 |
| 0.012 | 3.12 | 13260 | 0.0988 |
| 0.3418 | 3.12 | 13270 | 0.0960 |
| 0.0003 | 3.12 | 13280 | 0.0943 |
| 0.6213 | 3.13 | 13290 | 0.0970 |
| 0.007 | 3.13 | 13300 | 0.0992 |
| 0.3648 | 3.13 | 13310 | 0.0996 |
| 0.0283 | 3.13 | 13320 | 0.0982 |
| 0.4129 | 3.14 | 13330 | 0.0970 |
| 0.3083 | 3.14 | 13340 | 0.0955 |
| 0.0007 | 3.14 | 13350 | 0.0947 |
| 0.8131 | 3.14 | 13360 | 0.0939 |
| 0.0003 | 3.15 | 13370 | 0.0919 |
| 0.0401 | 3.15 | 13380 | 0.0919 |
| 0.0062 | 3.15 | 13390 | 0.0940 |
| 0.0021 | 3.15 | 13400 | 0.0947 |
| 0.0003 | 3.16 | 13410 | 0.0962 |
| 0.2925 | 3.16 | 13420 | 0.0932 |
| 0.3347 | 3.16 | 13430 | 0.0899 |
| 0.2672 | 3.16 | 13440 | 0.0873 |
| 0.3489 | 3.16 | 13450 | 0.0843 |
| 0.2996 | 3.17 | 13460 | 0.0799 |
| 0.0253 | 3.17 | 13470 | 0.0808 |
| 0.0012 | 3.17 | 13480 | 0.0822 |
| 0.0132 | 3.17 | 13490 | 0.0842 |
| 0.0134 | 3.18 | 13500 | 0.0853 |
| 0.0043 | 3.18 | 13510 | 0.0862 |
| 0.2966 | 3.18 | 13520 | 0.0870 |
| 0.3338 | 3.18 | 13530 | 0.0882 |
| 0.1493 | 3.19 | 13540 | 0.0900 |
| 0.4678 | 3.19 | 13550 | 0.0891 |
| 0.0265 | 3.19 | 13560 | 0.0884 |
| 0.0396 | 3.19 | 13570 | 0.0877 |
| 0.0026 | 3.2 | 13580 | 0.0899 |
| 0.0003 | 3.2 | 13590 | 0.0917 |
| 0.2074 | 3.2 | 13600 | 0.0914 |
| 0.0018 | 3.2 | 13610 | 0.0918 |
| 0.008 | 3.2 | 13620 | 0.0942 |
| 0.0004 | 3.21 | 13630 | 0.0966 |
| 0.0218 | 3.21 | 13640 | 0.1003 |
| 0.0003 | 3.21 | 13650 | 0.1013 |
| 0.0014 | 3.21 | 13660 | 0.1009 |
| 0.0158 | 3.22 | 13670 | 0.1020 |
| 0.302 | 3.22 | 13680 | 0.0990 |
| 0.3981 | 3.22 | 13690 | 0.0954 |
| 0.0032 | 3.22 | 13700 | 0.0915 |
| 0.0004 | 3.23 | 13710 | 0.0904 |
| 0.3167 | 3.23 | 13720 | 0.0895 |
| 0.0003 | 3.23 | 13730 | 0.0885 |
| 0.3852 | 3.23 | 13740 | 0.0864 |
| 0.0006 | 3.24 | 13750 | 0.0858 |
| 0.1922 | 3.24 | 13760 | 0.0858 |
| 0.0747 | 3.24 | 13770 | 0.0870 |
| 0.0146 | 3.24 | 13780 | 0.0896 |
| 0.079 | 3.24 | 13790 | 0.0906 |
| 0.0042 | 3.25 | 13800 | 0.0903 |
| 0.0002 | 3.25 | 13810 | 0.0896 |
| 0.0009 | 3.25 | 13820 | 0.0897 |
| 0.0844 | 3.25 | 13830 | 0.0899 |
| 0.0012 | 3.26 | 13840 | 0.0898 |
| 0.0006 | 3.26 | 13850 | 0.0902 |
| 0.3199 | 3.26 | 13860 | 0.0902 |
| 0.0079 | 3.26 | 13870 | 0.0910 |
| 0.093 | 3.27 | 13880 | 0.0935 |
| 0.2652 | 3.27 | 13890 | 0.0944 |
| 0.0003 | 3.27 | 13900 | 0.0953 |
| 0.0141 | 3.27 | 13910 | 0.0948 |
| 0.001 | 3.28 | 13920 | 0.0935 |
| 0.0005 | 3.28 | 13930 | 0.0921 |
| 0.3881 | 3.28 | 13940 | 0.0932 |
| 0.0013 | 3.28 | 13950 | 0.0935 |
| 0.0054 | 3.28 | 13960 | 0.0936 |
| 0.2995 | 3.29 | 13970 | 0.0926 |
| 0.0013 | 3.29 | 13980 | 0.0927 |
| 0.0056 | 3.29 | 13990 | 0.0912 |
| 0.1778 | 3.29 | 14000 | 0.0914 |
| 0.5867 | 3.3 | 14010 | 0.0874 |
| 0.3062 | 3.3 | 14020 | 0.0852 |
| 0.3423 | 3.3 | 14030 | 0.0836 |
| 0.1672 | 3.3 | 14040 | 0.0869 |
| 0.1182 | 3.31 | 14050 | 0.0876 |
| 0.0008 | 3.31 | 14060 | 0.0907 |
| 0.6932 | 3.31 | 14070 | 0.0922 |
| 0.0604 | 3.31 | 14080 | 0.0912 |
| 0.0026 | 3.32 | 14090 | 0.0899 |
| 0.1247 | 3.32 | 14100 | 0.0887 |
| 0.0053 | 3.32 | 14110 | 0.0872 |
| 0.4837 | 3.32 | 14120 | 0.0863 |
| 0.0009 | 3.32 | 14130 | 0.0849 |
| 0.3786 | 3.33 | 14140 | 0.0838 |
| 0.0202 | 3.33 | 14150 | 0.0836 |
| 0.2748 | 3.33 | 14160 | 0.0835 |
| 0.0151 | 3.33 | 14170 | 0.0829 |
| 0.1156 | 3.34 | 14180 | 0.0837 |
| 0.1487 | 3.34 | 14190 | 0.0854 |
| 0.0393 | 3.34 | 14200 | 0.0854 |
| 0.013 | 3.34 | 14210 | 0.0852 |
| 0.2669 | 3.35 | 14220 | 0.0850 |
| 0.1747 | 3.35 | 14230 | 0.0854 |
| 0.1316 | 3.35 | 14240 | 0.0860 |
| 0.0799 | 3.35 | 14250 | 0.0858 |
| 0.0003 | 3.36 | 14260 | 0.0853 |
| 0.1815 | 3.36 | 14270 | 0.0856 |
| 0.5691 | 3.36 | 14280 | 0.0848 |
| 0.3647 | 3.36 | 14290 | 0.0850 |
| 0.0007 | 3.36 | 14300 | 0.0868 |
| 0.1298 | 3.37 | 14310 | 0.0871 |
| 0.0004 | 3.37 | 14320 | 0.0868 |
| 0.2891 | 3.37 | 14330 | 0.0860 |
| 0.2785 | 3.37 | 14340 | 0.0844 |
| 0.0004 | 3.38 | 14350 | 0.0836 |
| 0.377 | 3.38 | 14360 | 0.0826 |
| 0.2148 | 3.38 | 14370 | 0.0821 |
| 0.0009 | 3.38 | 14380 | 0.0827 |
| 0.0171 | 3.39 | 14390 | 0.0836 |
| 0.0005 | 3.39 | 14400 | 0.0839 |
| 0.3408 | 3.39 | 14410 | 0.0833 |
| 0.7255 | 3.39 | 14420 | 0.0821 |
| 0.0008 | 3.4 | 14430 | 0.0813 |
| 0.2816 | 3.4 | 14440 | 0.0818 |
| 0.336 | 3.4 | 14450 | 0.0811 |
| 0.0007 | 3.4 | 14460 | 0.0821 |
| 0.0005 | 3.4 | 14470 | 0.0829 |
| 0.2838 | 3.41 | 14480 | 0.0829 |
| 0.0601 | 3.41 | 14490 | 0.0829 |
| 0.2786 | 3.41 | 14500 | 0.0830 |
| 0.0006 | 3.41 | 14510 | 0.0849 |
| 0.2785 | 3.42 | 14520 | 0.0854 |
| 0.4389 | 3.42 | 14530 | 0.0837 |
| 0.3658 | 3.42 | 14540 | 0.0823 |
| 0.333 | 3.42 | 14550 | 0.0816 |
| 0.0021 | 3.43 | 14560 | 0.0821 |
| 0.0009 | 3.43 | 14570 | 0.0824 |
| 0.307 | 3.43 | 14580 | 0.0812 |
| 0.5699 | 3.43 | 14590 | 0.0791 |
| 0.0006 | 3.44 | 14600 | 0.0794 |
| 0.2658 | 3.44 | 14610 | 0.0801 |
| 0.144 | 3.44 | 14620 | 0.0795 |
| 0.2578 | 3.44 | 14630 | 0.0816 |
| 0.0021 | 3.44 | 14640 | 0.0836 |
| 0.3426 | 3.45 | 14650 | 0.0833 |
| 0.2029 | 3.45 | 14660 | 0.0842 |
| 0.0004 | 3.45 | 14670 | 0.0847 |
| 0.0231 | 3.45 | 14680 | 0.0838 |
| 0.001 | 3.46 | 14690 | 0.0848 |
| 0.0006 | 3.46 | 14700 | 0.0863 |
| 0.5858 | 3.46 | 14710 | 0.0871 |
| 0.3987 | 3.46 | 14720 | 0.0844 |
| 0.5542 | 3.47 | 14730 | 0.0802 |
| 0.0017 | 3.47 | 14740 | 0.0785 |
| 0.5167 | 3.47 | 14750 | 0.0780 |
| 0.0806 | 3.47 | 14760 | 0.0783 |
| 0.0087 | 3.48 | 14770 | 0.0785 |
| 0.0195 | 3.48 | 14780 | 0.0770 |
| 0.0895 | 3.48 | 14790 | 0.0769 |
| 0.0133 | 3.48 | 14800 | 0.0778 |
| 0.0004 | 3.48 | 14810 | 0.0792 |
| 0.0381 | 3.49 | 14820 | 0.0796 |
| 0.8936 | 3.49 | 14830 | 0.0778 |
| 0.4447 | 3.49 | 14840 | 0.0774 |
| 0.0006 | 3.49 | 14850 | 0.0771 |
| 0.0092 | 3.5 | 14860 | 0.0772 |
| 0.0004 | 3.5 | 14870 | 0.0771 |
| 0.0162 | 3.5 | 14880 | 0.0776 |
| 0.0004 | 3.5 | 14890 | 0.0786 |
| 0.4623 | 3.51 | 14900 | 0.0793 |
| 0.3204 | 3.51 | 14910 | 0.0786 |
| 0.3386 | 3.51 | 14920 | 0.0775 |
| 0.032 | 3.51 | 14930 | 0.0758 |
| 0.0223 | 3.52 | 14940 | 0.0762 |
| 0.4976 | 3.52 | 14950 | 0.0751 |
| 0.0004 | 3.52 | 14960 | 0.0752 |
| 0.8159 | 3.52 | 14970 | 0.0731 |
| 0.0736 | 3.52 | 14980 | 0.0730 |
| 0.0009 | 3.53 | 14990 | 0.0739 |
| 0.0046 | 3.53 | 15000 | 0.0745 |
| 0.3379 | 3.53 | 15010 | 0.0743 |
| 0.2567 | 3.53 | 15020 | 0.0759 |
| 0.2976 | 3.54 | 15030 | 0.0763 |
| 0.2896 | 3.54 | 15040 | 0.0760 |
| 0.0122 | 3.54 | 15050 | 0.0759 |
| 0.2924 | 3.54 | 15060 | 0.0754 |
| 0.086 | 3.55 | 15070 | 0.0757 |
| 0.2857 | 3.55 | 15080 | 0.0755 |
| 0.0025 | 3.55 | 15090 | 0.0760 |
| 0.0007 | 3.55 | 15100 | 0.0766 |
| 0.0024 | 3.56 | 15110 | 0.0773 |
| 0.2411 | 3.56 | 15120 | 0.0777 |
| 0.0157 | 3.56 | 15130 | 0.0777 |
| 0.2931 | 3.56 | 15140 | 0.0778 |
| 0.0033 | 3.56 | 15150 | 0.0783 |
| 0.0005 | 3.57 | 15160 | 0.0792 |
| 0.1324 | 3.57 | 15170 | 0.0798 |
| 0.0005 | 3.57 | 15180 | 0.0806 |
| 0.3594 | 3.57 | 15190 | 0.0804 |
| 0.0157 | 3.58 | 15200 | 0.0796 |
| 0.2647 | 3.58 | 15210 | 0.0786 |
| 0.2307 | 3.58 | 15220 | 0.0787 |
| 0.025 | 3.58 | 15230 | 0.0802 |
| 0.2231 | 3.59 | 15240 | 0.0818 |
| 0.0004 | 3.59 | 15250 | 0.0823 |
| 0.0003 | 3.59 | 15260 | 0.0829 |
| 0.0027 | 3.59 | 15270 | 0.0828 |
| 0.0004 | 3.6 | 15280 | 0.0826 |
| 0.2923 | 3.6 | 15290 | 0.0809 |
| 0.2984 | 3.6 | 15300 | 0.0790 |
| 0.0029 | 3.6 | 15310 | 0.0780 |
| 0.1036 | 3.6 | 15320 | 0.0780 |
| 0.0005 | 3.61 | 15330 | 0.0786 |
| 0.3424 | 3.61 | 15340 | 0.0788 |
| 0.0002 | 3.61 | 15350 | 0.0794 |
| 0.0091 | 3.61 | 15360 | 0.0802 |
| 0.1083 | 3.62 | 15370 | 0.0805 |
| 0.0003 | 3.62 | 15380 | 0.0809 |
| 0.0005 | 3.62 | 15390 | 0.0804 |
| 0.2791 | 3.62 | 15400 | 0.0795 |
| 0.0003 | 3.63 | 15410 | 0.0791 |
| 0.0004 | 3.63 | 15420 | 0.0799 |
| 0.5202 | 3.63 | 15430 | 0.0807 |
| 0.2363 | 3.63 | 15440 | 0.0802 |
| 0.2902 | 3.64 | 15450 | 0.0798 |
| 0.0025 | 3.64 | 15460 | 0.0796 |
| 0.2602 | 3.64 | 15470 | 0.0802 |
| 0.0364 | 3.64 | 15480 | 0.0813 |
| 0.0005 | 3.64 | 15490 | 0.0809 |
| 0.0003 | 3.65 | 15500 | 0.0813 |
| 0.0006 | 3.65 | 15510 | 0.0814 |
| 0.2759 | 3.65 | 15520 | 0.0801 |
| 0.0104 | 3.65 | 15530 | 0.0802 |
| 0.001 | 3.66 | 15540 | 0.0808 |
| 0.0004 | 3.66 | 15550 | 0.0823 |
| 0.2792 | 3.66 | 15560 | 0.0816 |
| 0.0017 | 3.66 | 15570 | 0.0816 |
| 0.0004 | 3.67 | 15580 | 0.0810 |
| 0.0044 | 3.67 | 15590 | 0.0807 |
| 0.0008 | 3.67 | 15600 | 0.0822 |
| 0.2675 | 3.67 | 15610 | 0.0833 |
| 0.0118 | 3.68 | 15620 | 0.0840 |
| 0.1065 | 3.68 | 15630 | 0.0859 |
| 0.007 | 3.68 | 15640 | 0.0870 |
| 0.0007 | 3.68 | 15650 | 0.0846 |
| 0.2866 | 3.68 | 15660 | 0.0839 |
| 0.001 | 3.69 | 15670 | 0.0846 |
| 0.0004 | 3.69 | 15680 | 0.0854 |
| 0.0002 | 3.69 | 15690 | 0.0862 |
| 0.3541 | 3.69 | 15700 | 0.0866 |
| 0.0011 | 3.7 | 15710 | 0.0868 |
| 0.048 | 3.7 | 15720 | 0.0859 |
| 0.277 | 3.7 | 15730 | 0.0846 |
| 0.0002 | 3.7 | 15740 | 0.0842 |
| 0.0004 | 3.71 | 15750 | 0.0853 |
| 0.0189 | 3.71 | 15760 | 0.0853 |
| 0.0779 | 3.71 | 15770 | 0.0849 |
| 0.3086 | 3.71 | 15780 | 0.0833 |
| 0.0004 | 3.72 | 15790 | 0.0835 |
| 0.0004 | 3.72 | 15800 | 0.0842 |
| 0.0011 | 3.72 | 15810 | 0.0853 |
| 0.0014 | 3.72 | 15820 | 0.0851 |
| 0.0062 | 3.72 | 15830 | 0.0852 |
| 0.2123 | 3.73 | 15840 | 0.0857 |
| 0.0037 | 3.73 | 15850 | 0.0856 |
| 0.0003 | 3.73 | 15860 | 0.0859 |
| 0.0465 | 3.73 | 15870 | 0.0867 |
| 0.3638 | 3.74 | 15880 | 0.0863 |
| 0.0107 | 3.74 | 15890 | 0.0857 |
| 0.0025 | 3.74 | 15900 | 0.0860 |
| 0.0905 | 3.74 | 15910 | 0.0851 |
| 0.142 | 3.75 | 15920 | 0.0845 |
| 0.0002 | 3.75 | 15930 | 0.0843 |
| 0.3104 | 3.75 | 15940 | 0.0851 |
| 0.0571 | 3.75 | 15950 | 0.0864 |
| 0.0171 | 3.76 | 15960 | 0.0853 |
| 0.0002 | 3.76 | 15970 | 0.0851 |
| 0.1669 | 3.76 | 15980 | 0.0863 |
| 0.1901 | 3.76 | 15990 | 0.0874 |
| 0.0312 | 3.76 | 16000 | 0.0870 |
| 0.0002 | 3.77 | 16010 | 0.0873 |
| 0.0271 | 3.77 | 16020 | 0.0858 |
| 0.0002 | 3.77 | 16030 | 0.0861 |
| 0.0094 | 3.77 | 16040 | 0.0879 |
| 0.2099 | 3.78 | 16050 | 0.0890 |
| 0.0002 | 3.78 | 16060 | 0.0894 |
| 0.6885 | 3.78 | 16070 | 0.0862 |
| 0.0002 | 3.78 | 16080 | 0.0848 |
| 0.3497 | 3.79 | 16090 | 0.0842 |
| 0.0871 | 3.79 | 16100 | 0.0851 |
| 0.0002 | 3.79 | 16110 | 0.0862 |
| 0.3508 | 3.79 | 16120 | 0.0870 |
| 0.0135 | 3.8 | 16130 | 0.0848 |
| 0.0033 | 3.8 | 16140 | 0.0826 |
| 0.0003 | 3.8 | 16150 | 0.0823 |
| 0.2433 | 3.8 | 16160 | 0.0827 |
| 0.3401 | 3.8 | 16170 | 0.0818 |
| 0.0234 | 3.81 | 16180 | 0.0798 |
| 0.3376 | 3.81 | 16190 | 0.0788 |
| 0.1368 | 3.81 | 16200 | 0.0795 |
| 0.0002 | 3.81 | 16210 | 0.0804 |
| 0.3597 | 3.82 | 16220 | 0.0801 |
| 0.0402 | 3.82 | 16230 | 0.0775 |
| 0.2917 | 3.82 | 16240 | 0.0756 |
| 0.0521 | 3.82 | 16250 | 0.0759 |
| 0.0003 | 3.83 | 16260 | 0.0763 |
| 0.0029 | 3.83 | 16270 | 0.0781 |
| 0.3 | 3.83 | 16280 | 0.0793 |
| 0.0003 | 3.83 | 16290 | 0.0784 |
| 0.0626 | 3.84 | 16300 | 0.0782 |
| 0.0033 | 3.84 | 16310 | 0.0776 |
| 0.0002 | 3.84 | 16320 | 0.0772 |
| 0.0327 | 3.84 | 16330 | 0.0771 |
| 0.0003 | 3.84 | 16340 | 0.0772 |
| 0.0463 | 3.85 | 16350 | 0.0776 |
| 0.0027 | 3.85 | 16360 | 0.0774 |
| 0.0002 | 3.85 | 16370 | 0.0776 |
| 0.0002 | 3.85 | 16380 | 0.0778 |
| 0.0026 | 3.86 | 16390 | 0.0781 |
| 0.0023 | 3.86 | 16400 | 0.0782 |
| 0.298 | 3.86 | 16410 | 0.0762 |
| 0.147 | 3.86 | 16420 | 0.0755 |
| 0.3189 | 3.87 | 16430 | 0.0746 |
| 0.005 | 3.87 | 16440 | 0.0755 |
| 0.2814 | 3.87 | 16450 | 0.0760 |
| 0.1032 | 3.87 | 16460 | 0.0755 |
| 0.1406 | 3.88 | 16470 | 0.0749 |
| 0.2432 | 3.88 | 16480 | 0.0753 |
| 0.0004 | 3.88 | 16490 | 0.0763 |
| 0.2769 | 3.88 | 16500 | 0.0758 |
| 0.1861 | 3.88 | 16510 | 0.0766 |
| 0.0246 | 3.89 | 16520 | 0.0769 |
| 0.3596 | 3.89 | 16530 | 0.0757 |
| 0.5246 | 3.89 | 16540 | 0.0735 |
| 0.3344 | 3.89 | 16550 | 0.0732 |
| 0.1046 | 3.9 | 16560 | 0.0722 |
| 0.5492 | 3.9 | 16570 | 0.0704 |
| 0.0004 | 3.9 | 16580 | 0.0691 |
| 0.0009 | 3.9 | 16590 | 0.0705 |
| 0.3495 | 3.91 | 16600 | 0.0708 |
| 0.2679 | 3.91 | 16610 | 0.0683 |
| 0.0003 | 3.91 | 16620 | 0.0685 |
| 0.0243 | 3.91 | 16630 | 0.0698 |
| 0.2918 | 3.92 | 16640 | 0.0697 |
| 0.0006 | 3.92 | 16650 | 0.0707 |
| 0.13 | 3.92 | 16660 | 0.0724 |
| 0.0007 | 3.92 | 16670 | 0.0736 |
| 0.105 | 3.92 | 16680 | 0.0755 |
| 0.2643 | 3.93 | 16690 | 0.0764 |
| 0.0003 | 3.93 | 16700 | 0.0769 |
| 0.2083 | 3.93 | 16710 | 0.0765 |
| 0.1837 | 3.93 | 16720 | 0.0758 |
| 0.0012 | 3.94 | 16730 | 0.0752 |
| 0.3447 | 3.94 | 16740 | 0.0750 |
| 0.0007 | 3.94 | 16750 | 0.0773 |
| 0.3358 | 3.94 | 16760 | 0.0774 |
| 0.0003 | 3.95 | 16770 | 0.0777 |
| 0.3103 | 3.95 | 16780 | 0.0759 |
| 0.0002 | 3.95 | 16790 | 0.0749 |
| 0.0002 | 3.95 | 16800 | 0.0749 |
| 0.2631 | 3.96 | 16810 | 0.0737 |
| 0.3875 | 3.96 | 16820 | 0.0730 |
| 0.3485 | 3.96 | 16830 | 0.0727 |
| 0.0009 | 3.96 | 16840 | 0.0735 |
| 0.2721 | 3.96 | 16850 | 0.0755 |
| 0.6234 | 3.97 | 16860 | 0.0750 |
| 0.0037 | 3.97 | 16870 | 0.0746 |
| 0.0549 | 3.97 | 16880 | 0.0754 |
| 0.0005 | 3.97 | 16890 | 0.0776 |
| 0.0007 | 3.98 | 16900 | 0.0773 |
| 0.4736 | 3.98 | 16910 | 0.0748 |
| 0.0414 | 3.98 | 16920 | 0.0740 |
| 0.2808 | 3.98 | 16930 | 0.0747 |
| 0.2404 | 3.99 | 16940 | 0.0747 |
| 0.0003 | 3.99 | 16950 | 0.0747 |
| 0.0004 | 3.99 | 16960 | 0.0756 |
| 0.4638 | 3.99 | 16970 | 0.0755 |
| 0.0009 | 4.0 | 16980 | 0.0743 |
| 0.0031 | 4.0 | 16990 | 0.0729 |
| 0.0064 | 4.0 | 17000 | 0.0731 |
| 0.0017 | 4.0 | 17010 | 0.0743 |
| 0.0006 | 4.0 | 17020 | 0.0744 |
| 0.0002 | 4.01 | 17030 | 0.0745 |
| 0.1042 | 4.01 | 17040 | 0.0748 |
| 0.3912 | 4.01 | 17050 | 0.0746 |
| 0.0952 | 4.01 | 17060 | 0.0745 |
| 0.0019 | 4.02 | 17070 | 0.0752 |
| 0.0382 | 4.02 | 17080 | 0.0760 |
| 0.0002 | 4.02 | 17090 | 0.0767 |
| 0.2508 | 4.02 | 17100 | 0.0751 |
| 0.0002 | 4.03 | 17110 | 0.0744 |
| 0.001 | 4.03 | 17120 | 0.0741 |
| 0.2056 | 4.03 | 17130 | 0.0741 |
| 0.0002 | 4.03 | 17140 | 0.0743 |
| 0.3199 | 4.04 | 17150 | 0.0739 |
| 0.0005 | 4.04 | 17160 | 0.0738 |
| 0.1215 | 4.04 | 17170 | 0.0736 |
| 0.0003 | 4.04 | 17180 | 0.0738 |
| 0.3389 | 4.04 | 17190 | 0.0729 |
| 0.0006 | 4.05 | 17200 | 0.0723 |
| 0.0017 | 4.05 | 17210 | 0.0731 |
| 0.0004 | 4.05 | 17220 | 0.0743 |
| 0.0782 | 4.05 | 17230 | 0.0755 |
| 0.2879 | 4.06 | 17240 | 0.0758 |
| 0.2835 | 4.06 | 17250 | 0.0752 |
| 0.0138 | 4.06 | 17260 | 0.0749 |
| 0.0006 | 4.06 | 17270 | 0.0755 |
| 0.5847 | 4.07 | 17280 | 0.0752 |
| 0.0002 | 4.07 | 17290 | 0.0748 |
| 0.1155 | 4.07 | 17300 | 0.0749 |
| 0.0002 | 4.07 | 17310 | 0.0752 |
| 0.0004 | 4.08 | 17320 | 0.0761 |
| 0.6313 | 4.08 | 17330 | 0.0745 |
| 0.0005 | 4.08 | 17340 | 0.0735 |
| 0.0032 | 4.08 | 17350 | 0.0736 |
| 0.0004 | 4.08 | 17360 | 0.0743 |
| 0.0091 | 4.09 | 17370 | 0.0751 |
| 0.0712 | 4.09 | 17380 | 0.0759 |
| 0.0022 | 4.09 | 17390 | 0.0762 |
| 0.0004 | 4.09 | 17400 | 0.0769 |
| 0.0001 | 4.1 | 17410 | 0.0773 |
| 0.0001 | 4.1 | 17420 | 0.0776 |
| 0.0002 | 4.1 | 17430 | 0.0779 |
| 0.585 | 4.1 | 17440 | 0.0771 |
| 0.0016 | 4.11 | 17450 | 0.0759 |
| 0.0004 | 4.11 | 17460 | 0.0758 |
| 0.0079 | 4.11 | 17470 | 0.0759 |
| 0.1015 | 4.11 | 17480 | 0.0770 |
| 0.0672 | 4.12 | 17490 | 0.0776 |
| 0.0007 | 4.12 | 17500 | 0.0780 |
| 0.6565 | 4.12 | 17510 | 0.0768 |
| 0.6093 | 4.12 | 17520 | 0.0739 |
| 0.032 | 4.12 | 17530 | 0.0722 |
| 0.0005 | 4.13 | 17540 | 0.0718 |
| 0.0003 | 4.13 | 17550 | 0.0722 |
| 0.0007 | 4.13 | 17560 | 0.0730 |
| 0.0002 | 4.13 | 17570 | 0.0735 |
| 0.0971 | 4.14 | 17580 | 0.0734 |
| 0.1604 | 4.14 | 17590 | 0.0738 |
| 0.2961 | 4.14 | 17600 | 0.0731 |
| 0.0116 | 4.14 | 17610 | 0.0732 |
| 0.0003 | 4.15 | 17620 | 0.0738 |
| 0.0007 | 4.15 | 17630 | 0.0743 |
| 0.0007 | 4.15 | 17640 | 0.0746 |
| 0.1136 | 4.15 | 17650 | 0.0750 |
| 0.0004 | 4.16 | 17660 | 0.0754 |
| 0.0002 | 4.16 | 17670 | 0.0757 |
| 0.3341 | 4.16 | 17680 | 0.0755 |
| 0.001 | 4.16 | 17690 | 0.0749 |
| 0.1725 | 4.16 | 17700 | 0.0748 |
| 0.0037 | 4.17 | 17710 | 0.0748 |
| 0.3721 | 4.17 | 17720 | 0.0749 |
| 0.2821 | 4.17 | 17730 | 0.0752 |
| 0.0932 | 4.17 | 17740 | 0.0753 |
| 0.3053 | 4.18 | 17750 | 0.0762 |
| 0.0002 | 4.18 | 17760 | 0.0767 |
| 0.0002 | 4.18 | 17770 | 0.0770 |
| 0.0004 | 4.18 | 17780 | 0.0768 |
| 0.0915 | 4.19 | 17790 | 0.0762 |
| 0.0004 | 4.19 | 17800 | 0.0759 |
| 0.0002 | 4.19 | 17810 | 0.0759 |
| 0.0732 | 4.19 | 17820 | 0.0761 |
| 0.0002 | 4.2 | 17830 | 0.0763 |
| 0.0113 | 4.2 | 17840 | 0.0763 |
| 0.0261 | 4.2 | 17850 | 0.0763 |
| 0.2278 | 4.2 | 17860 | 0.0759 |
| 0.0411 | 4.2 | 17870 | 0.0751 |
| 0.0045 | 4.21 | 17880 | 0.0747 |
| 0.4132 | 4.21 | 17890 | 0.0739 |
| 0.0004 | 4.21 | 17900 | 0.0734 |
| 0.0003 | 4.21 | 17910 | 0.0734 |
| 0.283 | 4.22 | 17920 | 0.0733 |
| 0.1526 | 4.22 | 17930 | 0.0721 |
| 0.0359 | 4.22 | 17940 | 0.0718 |
| 0.425 | 4.22 | 17950 | 0.0713 |
| 0.0035 | 4.23 | 17960 | 0.0706 |
| 0.0006 | 4.23 | 17970 | 0.0715 |
| 0.0006 | 4.23 | 17980 | 0.0732 |
| 0.0002 | 4.23 | 17990 | 0.0743 |
| 0.054 | 4.24 | 18000 | 0.0743 |
| 0.0213 | 4.24 | 18010 | 0.0750 |
| 0.0027 | 4.24 | 18020 | 0.0749 |
| 0.5611 | 4.24 | 18030 | 0.0738 |
| 0.2393 | 4.24 | 18040 | 0.0724 |
| 0.0003 | 4.25 | 18050 | 0.0724 |
| 1.2984 | 4.25 | 18060 | 0.0709 |
| 0.0002 | 4.25 | 18070 | 0.0708 |
| 0.0271 | 4.25 | 18080 | 0.0710 |
| 0.1498 | 4.26 | 18090 | 0.0712 |
| 0.0009 | 4.26 | 18100 | 0.0704 |
| 0.0003 | 4.26 | 18110 | 0.0708 |
| 0.0308 | 4.26 | 18120 | 0.0716 |
| 0.659 | 4.27 | 18130 | 0.0694 |
| 0.2952 | 4.27 | 18140 | 0.0675 |
| 0.0038 | 4.27 | 18150 | 0.0665 |
| 0.1375 | 4.27 | 18160 | 0.0671 |
| 0.7079 | 4.28 | 18170 | 0.0644 |
| 0.0851 | 4.28 | 18180 | 0.0640 |
| 0.0013 | 4.28 | 18190 | 0.0656 |
| 0.2423 | 4.28 | 18200 | 0.0668 |
| 0.2575 | 4.28 | 18210 | 0.0675 |
| 0.5335 | 4.29 | 18220 | 0.0680 |
| 0.2825 | 4.29 | 18230 | 0.0686 |
| 0.0002 | 4.29 | 18240 | 0.0692 |
| 0.0004 | 4.29 | 18250 | 0.0704 |
| 0.0059 | 4.3 | 18260 | 0.0714 |
| 0.0478 | 4.3 | 18270 | 0.0723 |
| 0.0005 | 4.3 | 18280 | 0.0734 |
| 0.0002 | 4.3 | 18290 | 0.0742 |
| 0.0093 | 4.31 | 18300 | 0.0749 |
| 0.0009 | 4.31 | 18310 | 0.0746 |
| 0.0015 | 4.31 | 18320 | 0.0745 |
| 0.0651 | 4.31 | 18330 | 0.0748 |
| 0.1539 | 4.32 | 18340 | 0.0747 |
| 0.0002 | 4.32 | 18350 | 0.0750 |
| 0.3224 | 4.32 | 18360 | 0.0749 |
| 0.0002 | 4.32 | 18370 | 0.0748 |
| 0.0029 | 4.32 | 18380 | 0.0753 |
| 0.0007 | 4.33 | 18390 | 0.0762 |
| 0.0001 | 4.33 | 18400 | 0.0766 |
| 0.001 | 4.33 | 18410 | 0.0768 |
| 0.3448 | 4.33 | 18420 | 0.0747 |
| 0.0002 | 4.34 | 18430 | 0.0736 |
| 0.0002 | 4.34 | 18440 | 0.0735 |
| 0.0002 | 4.34 | 18450 | 0.0737 |
| 0.3644 | 4.34 | 18460 | 0.0718 |
| 0.2935 | 4.35 | 18470 | 0.0702 |
| 0.2925 | 4.35 | 18480 | 0.0686 |
| 0.0003 | 4.35 | 18490 | 0.0682 |
| 0.0389 | 4.35 | 18500 | 0.0680 |
| 0.0005 | 4.36 | 18510 | 0.0684 |
| 0.0004 | 4.36 | 18520 | 0.0702 |
| 0.3982 | 4.36 | 18530 | 0.0718 |
| 0.0004 | 4.36 | 18540 | 0.0736 |
| 0.0005 | 4.36 | 18550 | 0.0749 |
| 0.0002 | 4.37 | 18560 | 0.0759 |
| 0.0002 | 4.37 | 18570 | 0.0764 |
| 0.3715 | 4.37 | 18580 | 0.0751 |
| 0.0003 | 4.37 | 18590 | 0.0744 |
| 0.0002 | 4.38 | 18600 | 0.0745 |
| 0.4073 | 4.38 | 18610 | 0.0739 |
| 0.0003 | 4.38 | 18620 | 0.0733 |
| 0.0431 | 4.38 | 18630 | 0.0732 |
| 0.0985 | 4.39 | 18640 | 0.0733 |
| 0.0169 | 4.39 | 18650 | 0.0733 |
| 0.0002 | 4.39 | 18660 | 0.0736 |
| 0.0119 | 4.39 | 18670 | 0.0743 |
| 0.0497 | 4.4 | 18680 | 0.0748 |
| 0.1171 | 4.4 | 18690 | 0.0751 |
| 0.0021 | 4.4 | 18700 | 0.0760 |
| 0.3253 | 4.4 | 18710 | 0.0771 |
| 0.0895 | 4.4 | 18720 | 0.0780 |
| 0.0004 | 4.41 | 18730 | 0.0786 |
| 0.1197 | 4.41 | 18740 | 0.0785 |
| 0.0003 | 4.41 | 18750 | 0.0801 |
| 0.0005 | 4.41 | 18760 | 0.0803 |
| 0.0002 | 4.42 | 18770 | 0.0795 |
| 0.3121 | 4.42 | 18780 | 0.0785 |
| 0.0001 | 4.42 | 18790 | 0.0764 |
| 0.0022 | 4.42 | 18800 | 0.0763 |
| 0.0002 | 4.43 | 18810 | 0.0770 |
| 0.0002 | 4.43 | 18820 | 0.0781 |
| 0.0005 | 4.43 | 18830 | 0.0789 |
| 0.304 | 4.43 | 18840 | 0.0794 |
| 0.6571 | 4.44 | 18850 | 0.0768 |
| 0.257 | 4.44 | 18860 | 0.0738 |
| 0.1065 | 4.44 | 18870 | 0.0719 |
| 0.0003 | 4.44 | 18880 | 0.0719 |
| 0.0137 | 4.44 | 18890 | 0.0714 |
| 0.0002 | 4.45 | 18900 | 0.0711 |
| 0.0007 | 4.45 | 18910 | 0.0706 |
| 0.4023 | 4.45 | 18920 | 0.0691 |
| 0.0002 | 4.45 | 18930 | 0.0691 |
| 0.5165 | 4.46 | 18940 | 0.0686 |
| 0.0003 | 4.46 | 18950 | 0.0679 |
| 0.0485 | 4.46 | 18960 | 0.0684 |
| 0.2788 | 4.46 | 18970 | 0.0677 |
| 0.0003 | 4.47 | 18980 | 0.0677 |
| 0.0006 | 4.47 | 18990 | 0.0682 |
| 0.2749 | 4.47 | 19000 | 0.0689 |
| 0.5892 | 4.47 | 19010 | 0.0668 |
| 0.0005 | 4.48 | 19020 | 0.0662 |
| 0.0002 | 4.48 | 19030 | 0.0668 |
| 0.4823 | 4.48 | 19040 | 0.0664 |
| 0.0012 | 4.48 | 19050 | 0.0647 |
| 0.0482 | 4.48 | 19060 | 0.0650 |
| 0.1132 | 4.49 | 19070 | 0.0659 |
| 0.0003 | 4.49 | 19080 | 0.0672 |
| 0.2879 | 4.49 | 19090 | 0.0668 |
| 0.256 | 4.49 | 19100 | 0.0669 |
| 0.0216 | 4.5 | 19110 | 0.0670 |
| 0.3252 | 4.5 | 19120 | 0.0666 |
| 0.0002 | 4.5 | 19130 | 0.0665 |
| 0.0574 | 4.5 | 19140 | 0.0672 |
| 0.3554 | 4.51 | 19150 | 0.0669 |
| 0.0003 | 4.51 | 19160 | 0.0672 |
| 0.3015 | 4.51 | 19170 | 0.0675 |
| 0.0002 | 4.51 | 19180 | 0.0671 |
| 0.1565 | 4.52 | 19190 | 0.0670 |
| 0.0003 | 4.52 | 19200 | 0.0670 |
| 0.2904 | 4.52 | 19210 | 0.0661 |
| 0.0003 | 4.52 | 19220 | 0.0663 |
| 0.267 | 4.52 | 19230 | 0.0667 |
| 0.0003 | 4.53 | 19240 | 0.0663 |
| 0.5314 | 4.53 | 19250 | 0.0654 |
| 0.292 | 4.53 | 19260 | 0.0646 |
| 0.0004 | 4.53 | 19270 | 0.0653 |
| 0.0093 | 4.54 | 19280 | 0.0662 |
| 0.0003 | 4.54 | 19290 | 0.0670 |
| 0.2437 | 4.54 | 19300 | 0.0667 |
| 0.2913 | 4.54 | 19310 | 0.0665 |
| 0.2572 | 4.55 | 19320 | 0.0655 |
| 0.6526 | 4.55 | 19330 | 0.0655 |
| 0.2407 | 4.55 | 19340 | 0.0642 |
| 0.0009 | 4.55 | 19350 | 0.0645 |
| 0.0013 | 4.56 | 19360 | 0.0661 |
| 0.0004 | 4.56 | 19370 | 0.0670 |
| 0.2218 | 4.56 | 19380 | 0.0672 |
| 0.1251 | 4.56 | 19390 | 0.0672 |
| 0.1528 | 4.56 | 19400 | 0.0669 |
| 0.0003 | 4.57 | 19410 | 0.0669 |
| 0.0002 | 4.57 | 19420 | 0.0672 |
| 0.0004 | 4.57 | 19430 | 0.0679 |
| 0.0005 | 4.57 | 19440 | 0.0689 |
| 0.0033 | 4.58 | 19450 | 0.0697 |
| 0.0006 | 4.58 | 19460 | 0.0702 |
| 0.7668 | 4.58 | 19470 | 0.0688 |
| 0.0003 | 4.58 | 19480 | 0.0680 |
| 0.0003 | 4.59 | 19490 | 0.0681 |
| 0.2649 | 4.59 | 19500 | 0.0676 |
| 0.1132 | 4.59 | 19510 | 0.0678 |
| 0.0002 | 4.59 | 19520 | 0.0693 |
| 0.0985 | 4.6 | 19530 | 0.0705 |
| 0.0002 | 4.6 | 19540 | 0.0714 |
| 0.0003 | 4.6 | 19550 | 0.0721 |
| 0.1109 | 4.6 | 19560 | 0.0724 |
| 0.0008 | 4.6 | 19570 | 0.0730 |
| 0.0073 | 4.61 | 19580 | 0.0732 |
| 0.0001 | 4.61 | 19590 | 0.0734 |
| 0.0001 | 4.61 | 19600 | 0.0736 |
| 0.0049 | 4.61 | 19610 | 0.0741 |
| 0.0171 | 4.62 | 19620 | 0.0754 |
| 0.3038 | 4.62 | 19630 | 0.0749 |
| 0.237 | 4.62 | 19640 | 0.0743 |
| 0.0018 | 4.62 | 19650 | 0.0743 |
| 0.3878 | 4.63 | 19660 | 0.0737 |
| 0.277 | 4.63 | 19670 | 0.0720 |
| 0.0004 | 4.63 | 19680 | 0.0715 |
| 0.0003 | 4.63 | 19690 | 0.0719 |
| 0.0012 | 4.64 | 19700 | 0.0723 |
| 0.5104 | 4.64 | 19710 | 0.0719 |
| 0.3825 | 4.64 | 19720 | 0.0702 |
| 0.041 | 4.64 | 19730 | 0.0695 |
| 0.0003 | 4.64 | 19740 | 0.0697 |
| 0.0114 | 4.65 | 19750 | 0.0705 |
| 0.1003 | 4.65 | 19760 | 0.0710 |
| 0.0003 | 4.65 | 19770 | 0.0721 |
| 0.5726 | 4.65 | 19780 | 0.0716 |
| 0.0003 | 4.66 | 19790 | 0.0710 |
| 0.0045 | 4.66 | 19800 | 0.0709 |
| 0.0002 | 4.66 | 19810 | 0.0709 |
| 0.2612 | 4.66 | 19820 | 0.0711 |
| 0.1154 | 4.67 | 19830 | 0.0697 |
| 0.4952 | 4.67 | 19840 | 0.0694 |
| 0.0002 | 4.67 | 19850 | 0.0691 |
| 0.0004 | 4.67 | 19860 | 0.0693 |
| 0.0073 | 4.68 | 19870 | 0.0700 |
| 0.0006 | 4.68 | 19880 | 0.0704 |
| 0.0002 | 4.68 | 19890 | 0.0709 |
| 0.0096 | 4.68 | 19900 | 0.0721 |
| 0.5865 | 4.68 | 19910 | 0.0706 |
| 0.0003 | 4.69 | 19920 | 0.0698 |
| 0.0002 | 4.69 | 19930 | 0.0698 |
| 0.5109 | 4.69 | 19940 | 0.0696 |
| 0.0002 | 4.69 | 19950 | 0.0688 |
| 0.1106 | 4.7 | 19960 | 0.0689 |
| 0.0006 | 4.7 | 19970 | 0.0692 |
| 0.4659 | 4.7 | 19980 | 0.0684 |
| 0.1068 | 4.7 | 19990 | 0.0685 |
| 0.111 | 4.71 | 20000 | 0.0691 |
| 0.0004 | 4.71 | 20010 | 0.0695 |
| 0.0882 | 4.71 | 20020 | 0.0699 |
| 0.037 | 4.71 | 20030 | 0.0703 |
| 0.2227 | 4.72 | 20040 | 0.0705 |
| 0.1916 | 4.72 | 20050 | 0.0693 |
| 0.0004 | 4.72 | 20060 | 0.0695 |
| 0.0003 | 4.72 | 20070 | 0.0699 |
| 0.2509 | 4.72 | 20080 | 0.0692 |
| 0.0068 | 4.73 | 20090 | 0.0690 |
| 0.2614 | 4.73 | 20100 | 0.0688 |
| 0.0511 | 4.73 | 20110 | 0.0690 |
| 0.0002 | 4.73 | 20120 | 0.0691 |
| 0.254 | 4.74 | 20130 | 0.0685 |
| 0.5431 | 4.74 | 20140 | 0.0679 |
| 0.001 | 4.74 | 20150 | 0.0666 |
| 0.0003 | 4.74 | 20160 | 0.0667 |
| 0.0031 | 4.75 | 20170 | 0.0673 |
| 0.0094 | 4.75 | 20180 | 0.0683 |
| 0.0002 | 4.75 | 20190 | 0.0690 |
| 0.047 | 4.75 | 20200 | 0.0694 |
| 0.0006 | 4.76 | 20210 | 0.0699 |
| 0.0004 | 4.76 | 20220 | 0.0705 |
| 0.0003 | 4.76 | 20230 | 0.0710 |
| 0.0002 | 4.76 | 20240 | 0.0715 |
| 0.917 | 4.76 | 20250 | 0.0702 |
| 0.0008 | 4.77 | 20260 | 0.0687 |
| 0.001 | 4.77 | 20270 | 0.0685 |
| 0.0002 | 4.77 | 20280 | 0.0690 |
| 0.003 | 4.77 | 20290 | 0.0696 |
| 0.3469 | 4.78 | 20300 | 0.0698 |
| 0.3974 | 4.78 | 20310 | 0.0683 |
| 0.1657 | 4.78 | 20320 | 0.0669 |
| 0.7553 | 4.78 | 20330 | 0.0651 |
| 0.2077 | 4.79 | 20340 | 0.0639 |
| 0.0651 | 4.79 | 20350 | 0.0640 |
| 0.2523 | 4.79 | 20360 | 0.0643 |
| 0.0272 | 4.79 | 20370 | 0.0650 |
| 0.0005 | 4.8 | 20380 | 0.0659 |
| 0.0002 | 4.8 | 20390 | 0.0667 |
| 0.2791 | 4.8 | 20400 | 0.0664 |
| 0.0075 | 4.8 | 20410 | 0.0668 |
| 0.038 | 4.8 | 20420 | 0.0671 |
| 0.0099 | 4.81 | 20430 | 0.0682 |
| 0.0732 | 4.81 | 20440 | 0.0685 |
| 0.5496 | 4.81 | 20450 | 0.0676 |
| 0.0035 | 4.81 | 20460 | 0.0673 |
| 0.0003 | 4.82 | 20470 | 0.0674 |
| 0.0002 | 4.82 | 20480 | 0.0677 |
| 0.0002 | 4.82 | 20490 | 0.0681 |
| 0.0005 | 4.82 | 20500 | 0.0685 |
| 0.3393 | 4.83 | 20510 | 0.0679 |
| 0.0002 | 4.83 | 20520 | 0.0678 |
| 0.0002 | 4.83 | 20530 | 0.0680 |
| 0.0346 | 4.83 | 20540 | 0.0685 |
| 0.2585 | 4.84 | 20550 | 0.0689 |
| 0.0004 | 4.84 | 20560 | 0.0690 |
| 0.0003 | 4.84 | 20570 | 0.0695 |
| 0.0002 | 4.84 | 20580 | 0.0699 |
| 0.3445 | 4.84 | 20590 | 0.0695 |
| 0.221 | 4.85 | 20600 | 0.0696 |
| 0.0002 | 4.85 | 20610 | 0.0695 |
| 0.0002 | 4.85 | 20620 | 0.0696 |
| 0.0002 | 4.85 | 20630 | 0.0699 |
| 0.0002 | 4.86 | 20640 | 0.0703 |
| 0.0002 | 4.86 | 20650 | 0.0707 |
| 0.0002 | 4.86 | 20660 | 0.0711 |
| 0.2612 | 4.86 | 20670 | 0.0711 |
| 0.019 | 4.87 | 20680 | 0.0713 |
| 0.0163 | 4.87 | 20690 | 0.0720 |
| 0.0007 | 4.87 | 20700 | 0.0720 |
| 0.2625 | 4.87 | 20710 | 0.0706 |
| 0.0142 | 4.88 | 20720 | 0.0703 |
| 0.4807 | 4.88 | 20730 | 0.0687 |
| 0.0018 | 4.88 | 20740 | 0.0674 |
| 0.0002 | 4.88 | 20750 | 0.0672 |
| 0.0002 | 4.88 | 20760 | 0.0674 |
| 0.4013 | 4.89 | 20770 | 0.0652 |
| 0.6143 | 4.89 | 20780 | 0.0638 |
| 0.228 | 4.89 | 20790 | 0.0628 |
| 0.0531 | 4.89 | 20800 | 0.0634 |
| 0.2475 | 4.9 | 20810 | 0.0638 |
| 0.2775 | 4.9 | 20820 | 0.0630 |
| 0.258 | 4.9 | 20830 | 0.0617 |
| 0.0004 | 4.9 | 20840 | 0.0618 |
| 0.0007 | 4.91 | 20850 | 0.0629 |
| 0.0008 | 4.91 | 20860 | 0.0640 |
| 0.1329 | 4.91 | 20870 | 0.0648 |
| 0.3393 | 4.91 | 20880 | 0.0639 |
| 0.2696 | 4.92 | 20890 | 0.0623 |
| 0.0004 | 4.92 | 20900 | 0.0622 |
| 0.2839 | 4.92 | 20910 | 0.0622 |
| 0.2764 | 4.92 | 20920 | 0.0624 |
| 0.1733 | 4.92 | 20930 | 0.0619 |
| 0.0003 | 4.93 | 20940 | 0.0617 |
| 0.2396 | 4.93 | 20950 | 0.0613 |
| 0.2706 | 4.93 | 20960 | 0.0613 |
| 0.0184 | 4.93 | 20970 | 0.0617 |
| 0.0004 | 4.94 | 20980 | 0.0627 |
| 0.1602 | 4.94 | 20990 | 0.0635 |
| 0.0597 | 4.94 | 21000 | 0.0638 |
| 0.3885 | 4.94 | 21010 | 0.0634 |
| 0.0007 | 4.95 | 21020 | 0.0632 |
| 0.3543 | 4.95 | 21030 | 0.0634 |
| 0.0505 | 4.95 | 21040 | 0.0630 |
| 0.001 | 4.95 | 21050 | 0.0631 |
| 0.2874 | 4.96 | 21060 | 0.0623 |
| 0.316 | 4.96 | 21070 | 0.0618 |
| 0.0006 | 4.96 | 21080 | 0.0619 |
| 0.0007 | 4.96 | 21090 | 0.0627 |
| 0.423 | 4.96 | 21100 | 0.0629 |
| 0.1324 | 4.97 | 21110 | 0.0632 |
| 0.001 | 4.97 | 21120 | 0.0641 |
| 0.0004 | 4.97 | 21130 | 0.0652 |
| 0.0043 | 4.97 | 21140 | 0.0660 |
| 0.0612 | 4.98 | 21150 | 0.0664 |
| 0.1615 | 4.98 | 21160 | 0.0661 |
| 0.002 | 4.98 | 21170 | 0.0665 |
| 0.051 | 4.98 | 21180 | 0.0666 |
| 0.0404 | 4.99 | 21190 | 0.0663 |
| 0.0003 | 4.99 | 21200 | 0.0665 |
| 0.2272 | 4.99 | 21210 | 0.0668 |
| 0.0002 | 4.99 | 21220 | 0.0668 |
| 0.0002 | 5.0 | 21230 | 0.0670 |
| 0.0002 | 5.0 | 21240 | 0.0673 |
| 0.0002 | 5.0 | 21250 | 0.0675 |
| 0.0002 | 5.0 | 21260 | 0.0678 |
| 0.0002 | 5.0 | 21270 | 0.0680 |
| 0.4266 | 5.01 | 21280 | 0.0681 |
| 0.0148 | 5.01 | 21290 | 0.0684 |
| 0.3038 | 5.01 | 21300 | 0.0679 |
| 0.2869 | 5.01 | 21310 | 0.0670 |
| 0.0002 | 5.02 | 21320 | 0.0665 |
| 0.0002 | 5.02 | 21330 | 0.0664 |
| 0.2531 | 5.02 | 21340 | 0.0660 |
| 0.0002 | 5.02 | 21350 | 0.0661 |
| 0.4148 | 5.03 | 21360 | 0.0658 |
| 0.0002 | 5.03 | 21370 | 0.0649 |
| 0.0933 | 5.03 | 21380 | 0.0646 |
| 0.3689 | 5.03 | 21390 | 0.0637 |
| 0.1195 | 5.04 | 21400 | 0.0627 |
| 0.205 | 5.04 | 21410 | 0.0626 |
| 0.0002 | 5.04 | 21420 | 0.0626 |
| 0.0083 | 5.04 | 21430 | 0.0629 |
| 0.0002 | 5.04 | 21440 | 0.0634 |
| 0.0421 | 5.05 | 21450 | 0.0637 |
| 0.0002 | 5.05 | 21460 | 0.0639 |
| 0.0002 | 5.05 | 21470 | 0.0642 |
| 0.0002 | 5.05 | 21480 | 0.0645 |
| 0.0588 | 5.06 | 21490 | 0.0644 |
| 0.0003 | 5.06 | 21500 | 0.0646 |
| 0.0002 | 5.06 | 21510 | 0.0654 |
| 0.0005 | 5.06 | 21520 | 0.0653 |
| 0.0001 | 5.07 | 21530 | 0.0653 |
| 0.3309 | 5.07 | 21540 | 0.0648 |
| 0.0467 | 5.07 | 21550 | 0.0649 |
| 0.0001 | 5.07 | 21560 | 0.0652 |
| 0.0002 | 5.08 | 21570 | 0.0656 |
| 0.0003 | 5.08 | 21580 | 0.0662 |
| 0.007 | 5.08 | 21590 | 0.0667 |
| 0.2656 | 5.08 | 21600 | 0.0667 |
| 0.0001 | 5.08 | 21610 | 0.0667 |
| 0.0002 | 5.09 | 21620 | 0.0669 |
| 0.3021 | 5.09 | 21630 | 0.0672 |
| 0.0817 | 5.09 | 21640 | 0.0678 |
| 0.1222 | 5.09 | 21650 | 0.0677 |
| 0.3446 | 5.1 | 21660 | 0.0675 |
| 0.0382 | 5.1 | 21670 | 0.0670 |
| 0.262 | 5.1 | 21680 | 0.0659 |
| 0.0065 | 5.1 | 21690 | 0.0655 |
| 0.0001 | 5.11 | 21700 | 0.0655 |
| 0.0002 | 5.11 | 21710 | 0.0657 |
| 0.0002 | 5.11 | 21720 | 0.0660 |
| 0.0004 | 5.11 | 21730 | 0.0662 |
| 0.0001 | 5.12 | 21740 | 0.0665 |
| 0.2505 | 5.12 | 21750 | 0.0661 |
| 0.0636 | 5.12 | 21760 | 0.0655 |
| 0.1111 | 5.12 | 21770 | 0.0652 |
| 0.347 | 5.12 | 21780 | 0.0652 |
| 0.0003 | 5.13 | 21790 | 0.0653 |
| 0.0001 | 5.13 | 21800 | 0.0656 |
| 0.2673 | 5.13 | 21810 | 0.0652 |
| 0.1675 | 5.13 | 21820 | 0.0647 |
| 0.0584 | 5.14 | 21830 | 0.0646 |
| 0.0011 | 5.14 | 21840 | 0.0649 |
| 0.0003 | 5.14 | 21850 | 0.0653 |
| 0.0005 | 5.14 | 21860 | 0.0657 |
| 0.0002 | 5.15 | 21870 | 0.0663 |
| 0.0001 | 5.15 | 21880 | 0.0667 |
| 0.0006 | 5.15 | 21890 | 0.0670 |
| 0.0001 | 5.15 | 21900 | 0.0673 |
| 0.0001 | 5.16 | 21910 | 0.0676 |
| 0.0001 | 5.16 | 21920 | 0.0679 |
| 0.319 | 5.16 | 21930 | 0.0676 |
| 0.0031 | 5.16 | 21940 | 0.0674 |
| 0.0485 | 5.16 | 21950 | 0.0678 |
| 0.0945 | 5.17 | 21960 | 0.0684 |
| 0.2948 | 5.17 | 21970 | 0.0674 |
| 0.2156 | 5.17 | 21980 | 0.0663 |
| 0.3084 | 5.17 | 21990 | 0.0654 |
| 0.0001 | 5.18 | 22000 | 0.0650 |
| 0.2667 | 5.18 | 22010 | 0.0650 |
| 0.0004 | 5.18 | 22020 | 0.0645 |
| 0.0002 | 5.18 | 22030 | 0.0649 |
| 0.1518 | 5.19 | 22040 | 0.0651 |
| 0.5584 | 5.19 | 22050 | 0.0633 |
| 0.0003 | 5.19 | 22060 | 0.0631 |
| 0.5666 | 5.19 | 22070 | 0.0631 |
| 0.282 | 5.2 | 22080 | 0.0621 |
| 0.3556 | 5.2 | 22090 | 0.0615 |
| 0.0716 | 5.2 | 22100 | 0.0618 |
| 0.2657 | 5.2 | 22110 | 0.0621 |
| 0.265 | 5.2 | 22120 | 0.0618 |
| 0.001 | 5.21 | 22130 | 0.0620 |
| 0.0003 | 5.21 | 22140 | 0.0625 |
| 0.0005 | 5.21 | 22150 | 0.0631 |
| 0.0002 | 5.21 | 22160 | 0.0638 |
| 0.0009 | 5.22 | 22170 | 0.0642 |
| 0.0002 | 5.22 | 22180 | 0.0646 |
| 0.0002 | 5.22 | 22190 | 0.0650 |
| 0.0001 | 5.22 | 22200 | 0.0654 |
| 0.226 | 5.23 | 22210 | 0.0639 |
| 0.2145 | 5.23 | 22220 | 0.0633 |
| 0.2762 | 5.23 | 22230 | 0.0619 |
| 0.2981 | 5.23 | 22240 | 0.0612 |
| 0.213 | 5.24 | 22250 | 0.0604 |
| 0.0083 | 5.24 | 22260 | 0.0604 |
| 0.0003 | 5.24 | 22270 | 0.0606 |
| 0.0004 | 5.24 | 22280 | 0.0616 |
| 0.0002 | 5.24 | 22290 | 0.0624 |
| 0.2627 | 5.25 | 22300 | 0.0610 |
| 0.3007 | 5.25 | 22310 | 0.0606 |
| 0.0533 | 5.25 | 22320 | 0.0603 |
| 0.0504 | 5.25 | 22330 | 0.0609 |
| 0.2552 | 5.26 | 22340 | 0.0608 |
| 0.2526 | 5.26 | 22350 | 0.0605 |
| 0.0007 | 5.26 | 22360 | 0.0609 |
| 0.0002 | 5.26 | 22370 | 0.0615 |
| 0.6772 | 5.27 | 22380 | 0.0610 |
| 0.1122 | 5.27 | 22390 | 0.0601 |
| 0.2745 | 5.27 | 22400 | 0.0607 |
| 0.2085 | 5.27 | 22410 | 0.0600 |
| 0.0003 | 5.28 | 22420 | 0.0604 |
| 0.2557 | 5.28 | 22430 | 0.0600 |
| 0.0005 | 5.28 | 22440 | 0.0601 |
| 0.2332 | 5.28 | 22450 | 0.0603 |
| 0.5356 | 5.28 | 22460 | 0.0589 |
| 0.1234 | 5.29 | 22470 | 0.0578 |
| 0.0003 | 5.29 | 22480 | 0.0578 |
| 0.0008 | 5.29 | 22490 | 0.0591 |
| 0.0003 | 5.29 | 22500 | 0.0605 |
| 0.0198 | 5.3 | 22510 | 0.0612 |
| 0.1317 | 5.3 | 22520 | 0.0610 |
| 0.0009 | 5.3 | 22530 | 0.0608 |
| 0.0033 | 5.3 | 22540 | 0.0615 |
| 0.0002 | 5.31 | 22550 | 0.0622 |
| 0.2657 | 5.31 | 22560 | 0.0625 |
| 0.0004 | 5.31 | 22570 | 0.0630 |
| 0.2678 | 5.31 | 22580 | 0.0624 |
| 0.0002 | 5.32 | 22590 | 0.0617 |
| 0.0002 | 5.32 | 22600 | 0.0618 |
| 0.0086 | 5.32 | 22610 | 0.0630 |
| 0.3295 | 5.32 | 22620 | 0.0626 |
| 0.0002 | 5.32 | 22630 | 0.0624 |
| 0.0002 | 5.33 | 22640 | 0.0627 |
| 0.0083 | 5.33 | 22650 | 0.0631 |
| 0.2076 | 5.33 | 22660 | 0.0622 |
| 0.1436 | 5.33 | 22670 | 0.0621 |
| 0.0002 | 5.34 | 22680 | 0.0622 |
| 0.0008 | 5.34 | 22690 | 0.0626 |
| 0.0002 | 5.34 | 22700 | 0.0628 |
| 0.0657 | 5.34 | 22710 | 0.0633 |
| 0.0001 | 5.35 | 22720 | 0.0638 |
| 0.0073 | 5.35 | 22730 | 0.0643 |
| 0.2857 | 5.35 | 22740 | 0.0647 |
| 0.2453 | 5.35 | 22750 | 0.0639 |
| 0.0005 | 5.36 | 22760 | 0.0645 |
| 0.0001 | 5.36 | 22770 | 0.0648 |
| 0.0002 | 5.36 | 22780 | 0.0652 |
| 0.0448 | 5.36 | 22790 | 0.0655 |
| 0.4136 | 5.36 | 22800 | 0.0643 |
| 0.0002 | 5.37 | 22810 | 0.0641 |
| 0.2271 | 5.37 | 22820 | 0.0635 |
| 0.0855 | 5.37 | 22830 | 0.0626 |
| 0.1631 | 5.37 | 22840 | 0.0617 |
| 0.2331 | 5.38 | 22850 | 0.0617 |
| 0.262 | 5.38 | 22860 | 0.0601 |
| 0.0002 | 5.38 | 22870 | 0.0598 |
| 0.0086 | 5.38 | 22880 | 0.0606 |
| 0.0247 | 5.39 | 22890 | 0.0616 |
| 0.0003 | 5.39 | 22900 | 0.0625 |
| 0.323 | 5.39 | 22910 | 0.0624 |
| 0.0004 | 5.39 | 22920 | 0.0632 |
| 0.0167 | 5.4 | 22930 | 0.0637 |
| 0.0002 | 5.4 | 22940 | 0.0641 |
| 0.0001 | 5.4 | 22950 | 0.0645 |
| 0.3905 | 5.4 | 22960 | 0.0636 |
| 0.0002 | 5.4 | 22970 | 0.0629 |
| 0.3104 | 5.41 | 22980 | 0.0626 |
| 0.0002 | 5.41 | 22990 | 0.0636 |
| 0.0002 | 5.41 | 23000 | 0.0643 |
| 0.2495 | 5.41 | 23010 | 0.0648 |
| 0.0002 | 5.42 | 23020 | 0.0660 |
| 0.1325 | 5.42 | 23030 | 0.0657 |
| 0.4504 | 5.42 | 23040 | 0.0656 |
| 0.0002 | 5.42 | 23050 | 0.0647 |
| 0.268 | 5.43 | 23060 | 0.0640 |
| 0.354 | 5.43 | 23070 | 0.0621 |
| 0.0002 | 5.43 | 23080 | 0.0615 |
| 0.0002 | 5.43 | 23090 | 0.0616 |
| 0.0009 | 5.44 | 23100 | 0.0621 |
| 0.3933 | 5.44 | 23110 | 0.0626 |
| 0.1017 | 5.44 | 23120 | 0.0621 |
| 0.2107 | 5.44 | 23130 | 0.0622 |
| 0.0002 | 5.44 | 23140 | 0.0622 |
| 0.1798 | 5.45 | 23150 | 0.0616 |
| 0.2826 | 5.45 | 23160 | 0.0605 |
| 0.0003 | 5.45 | 23170 | 0.0603 |
| 0.1468 | 5.45 | 23180 | 0.0609 |
| 0.0015 | 5.46 | 23190 | 0.0614 |
| 0.0207 | 5.46 | 23200 | 0.0625 |
| 0.0002 | 5.46 | 23210 | 0.0633 |
| 0.0002 | 5.46 | 23220 | 0.0638 |
| 0.0002 | 5.47 | 23230 | 0.0643 |
| 0.0001 | 5.47 | 23240 | 0.0647 |
| 0.0001 | 5.47 | 23250 | 0.0650 |
| 0.0003 | 5.47 | 23260 | 0.0654 |
| 0.3301 | 5.48 | 23270 | 0.0661 |
| 0.0008 | 5.48 | 23280 | 0.0661 |
| 0.0003 | 5.48 | 23290 | 0.0666 |
| 0.0001 | 5.48 | 23300 | 0.0669 |
| 0.0001 | 5.48 | 23310 | 0.0672 |
| 0.0011 | 5.49 | 23320 | 0.0682 |
| 0.0001 | 5.49 | 23330 | 0.0690 |
| 0.0046 | 5.49 | 23340 | 0.0695 |
| 0.0004 | 5.49 | 23350 | 0.0698 |
| 0.0003 | 5.5 | 23360 | 0.0702 |
| 0.2797 | 5.5 | 23370 | 0.0692 |
| 0.3092 | 5.5 | 23380 | 0.0685 |
| 0.1698 | 5.5 | 23390 | 0.0682 |
| 0.4403 | 5.51 | 23400 | 0.0669 |
| 0.0714 | 5.51 | 23410 | 0.0651 |
| 0.0184 | 5.51 | 23420 | 0.0642 |
| 0.3414 | 5.51 | 23430 | 0.0642 |
| 0.0399 | 5.52 | 23440 | 0.0638 |
| 0.0002 | 5.52 | 23450 | 0.0644 |
| 0.5444 | 5.52 | 23460 | 0.0633 |
| 0.0002 | 5.52 | 23470 | 0.0630 |
| 0.0026 | 5.52 | 23480 | 0.0632 |
| 0.0009 | 5.53 | 23490 | 0.0641 |
| 0.0002 | 5.53 | 23500 | 0.0647 |
| 0.2522 | 5.53 | 23510 | 0.0643 |
| 0.0002 | 5.53 | 23520 | 0.0642 |
| 0.2738 | 5.54 | 23530 | 0.0637 |
| 0.0001 | 5.54 | 23540 | 0.0637 |
| 0.2607 | 5.54 | 23550 | 0.0634 |
| 0.0003 | 5.54 | 23560 | 0.0636 |
| 0.0283 | 5.55 | 23570 | 0.0638 |
| 0.2983 | 5.55 | 23580 | 0.0632 |
| 0.0002 | 5.55 | 23590 | 0.0629 |
| 0.0008 | 5.55 | 23600 | 0.0632 |
| 0.0002 | 5.56 | 23610 | 0.0638 |
| 0.0146 | 5.56 | 23620 | 0.0642 |
| 0.2374 | 5.56 | 23630 | 0.0640 |
| 0.5717 | 5.56 | 23640 | 0.0616 |
| 0.2747 | 5.56 | 23650 | 0.0599 |
| 0.0003 | 5.57 | 23660 | 0.0589 |
| 0.3285 | 5.57 | 23670 | 0.0588 |
| 0.0003 | 5.57 | 23680 | 0.0583 |
| 0.2673 | 5.57 | 23690 | 0.0584 |
| 0.0009 | 5.58 | 23700 | 0.0599 |
| 0.281 | 5.58 | 23710 | 0.0607 |
| 0.001 | 5.58 | 23720 | 0.0606 |
| 0.0003 | 5.58 | 23730 | 0.0612 |
| 0.0003 | 5.59 | 23740 | 0.0624 |
| 0.1509 | 5.59 | 23750 | 0.0623 |
| 0.2663 | 5.59 | 23760 | 0.0614 |
| 0.0002 | 5.59 | 23770 | 0.0613 |
| 0.0061 | 5.6 | 23780 | 0.0618 |
| 0.0002 | 5.6 | 23790 | 0.0624 |
| 0.0002 | 5.6 | 23800 | 0.0628 |
| 0.0003 | 5.6 | 23810 | 0.0632 |
| 0.0002 | 5.6 | 23820 | 0.0639 |
| 0.0002 | 5.61 | 23830 | 0.0645 |
| 0.302 | 5.61 | 23840 | 0.0637 |
| 0.0002 | 5.61 | 23850 | 0.0634 |
| 0.0002 | 5.61 | 23860 | 0.0637 |
| 0.0556 | 5.62 | 23870 | 0.0641 |
| 0.2719 | 5.62 | 23880 | 0.0650 |
| 0.2352 | 5.62 | 23890 | 0.0637 |
| 0.2769 | 5.62 | 23900 | 0.0614 |
| 0.0003 | 5.63 | 23910 | 0.0610 |
| 0.0025 | 5.63 | 23920 | 0.0614 |
| 0.0003 | 5.63 | 23930 | 0.0621 |
| 0.1148 | 5.63 | 23940 | 0.0623 |
| 0.0018 | 5.64 | 23950 | 0.0619 |
| 0.0001 | 5.64 | 23960 | 0.0620 |
| 0.2308 | 5.64 | 23970 | 0.0617 |
| 0.0001 | 5.64 | 23980 | 0.0614 |
| 0.0251 | 5.64 | 23990 | 0.0616 |
| 0.082 | 5.65 | 24000 | 0.0619 |
| 0.0001 | 5.65 | 24010 | 0.0621 |
| 0.0002 | 5.65 | 24020 | 0.0624 |
| 0.0927 | 5.65 | 24030 | 0.0628 |
| 0.0627 | 5.66 | 24040 | 0.0630 |
| 0.0001 | 5.66 | 24050 | 0.0634 |
| 0.0003 | 5.66 | 24060 | 0.0637 |
| 0.21 | 5.66 | 24070 | 0.0639 |
| 0.0001 | 5.67 | 24080 | 0.0640 |
| 0.5313 | 5.67 | 24090 | 0.0617 |
| 0.0002 | 5.67 | 24100 | 0.0611 |
| 0.2681 | 5.67 | 24110 | 0.0604 |
| 0.2902 | 5.68 | 24120 | 0.0593 |
| 0.001 | 5.68 | 24130 | 0.0600 |
| 0.2532 | 5.68 | 24140 | 0.0595 |
| 0.0004 | 5.68 | 24150 | 0.0597 |
| 0.0002 | 5.68 | 24160 | 0.0601 |
| 0.0765 | 5.69 | 24170 | 0.0609 |
| 0.0426 | 5.69 | 24180 | 0.0621 |
| 0.0001 | 5.69 | 24190 | 0.0621 |
| 0.0002 | 5.69 | 24200 | 0.0626 |
| 0.0237 | 5.7 | 24210 | 0.0637 |
| 0.0001 | 5.7 | 24220 | 0.0643 |
| 0.0001 | 5.7 | 24230 | 0.0648 |
| 0.0004 | 5.7 | 24240 | 0.0654 |
| 0.0001 | 5.71 | 24250 | 0.0658 |
| 0.0002 | 5.71 | 24260 | 0.0662 |
| 0.0001 | 5.71 | 24270 | 0.0668 |
| 0.0001 | 5.71 | 24280 | 0.0675 |
| 0.0001 | 5.72 | 24290 | 0.0679 |
| 0.0001 | 5.72 | 24300 | 0.0683 |
| 0.1787 | 5.72 | 24310 | 0.0675 |
| 0.0002 | 5.72 | 24320 | 0.0673 |
| 0.0002 | 5.72 | 24330 | 0.0672 |
| 0.0002 | 5.73 | 24340 | 0.0672 |
| 0.0005 | 5.73 | 24350 | 0.0682 |
| 0.0001 | 5.73 | 24360 | 0.0687 |
| 0.3098 | 5.73 | 24370 | 0.0681 |
| 0.1171 | 5.74 | 24380 | 0.0669 |
| 0.053 | 5.74 | 24390 | 0.0663 |
| 0.0003 | 5.74 | 24400 | 0.0659 |
| 0.3425 | 5.74 | 24410 | 0.0656 |
| 0.0079 | 5.75 | 24420 | 0.0655 |
| 0.0001 | 5.75 | 24430 | 0.0658 |
| 0.0002 | 5.75 | 24440 | 0.0663 |
| 0.0007 | 5.75 | 24450 | 0.0668 |
| 0.0026 | 5.76 | 24460 | 0.0660 |
| 0.0902 | 5.76 | 24470 | 0.0656 |
| 0.218 | 5.76 | 24480 | 0.0653 |
| 0.0001 | 5.76 | 24490 | 0.0647 |
| 0.0001 | 5.76 | 24500 | 0.0648 |
| 0.0001 | 5.77 | 24510 | 0.0650 |
| 0.0001 | 5.77 | 24520 | 0.0652 |
| 0.0536 | 5.77 | 24530 | 0.0652 |
| 0.3214 | 5.77 | 24540 | 0.0635 |
| 0.2005 | 5.78 | 24550 | 0.0626 |
| 0.0002 | 5.78 | 24560 | 0.0621 |
| 0.2983 | 5.78 | 24570 | 0.0610 |
| 0.1035 | 5.78 | 24580 | 0.0608 |
| 0.0001 | 5.79 | 24590 | 0.0610 |
| 0.0001 | 5.79 | 24600 | 0.0613 |
| 0.0548 | 5.79 | 24610 | 0.0619 |
| 0.1026 | 5.79 | 24620 | 0.0631 |
| 0.0034 | 5.8 | 24630 | 0.0646 |
| 0.0001 | 5.8 | 24640 | 0.0656 |
| 0.2627 | 5.8 | 24650 | 0.0654 |
| 0.326 | 5.8 | 24660 | 0.0634 |
| 0.0006 | 5.8 | 24670 | 0.0627 |
| 0.2504 | 5.81 | 24680 | 0.0612 |
| 0.0021 | 5.81 | 24690 | 0.0603 |
| 0.2227 | 5.81 | 24700 | 0.0585 |
| 0.0002 | 5.81 | 24710 | 0.0583 |
| 0.2374 | 5.82 | 24720 | 0.0580 |
| 0.002 | 5.82 | 24730 | 0.0578 |
| 0.2897 | 5.82 | 24740 | 0.0573 |
| 0.0004 | 5.82 | 24750 | 0.0565 |
| 0.3698 | 5.83 | 24760 | 0.0551 |
| 0.2787 | 5.83 | 24770 | 0.0541 |
| 0.0003 | 5.83 | 24780 | 0.0543 |
| 0.0006 | 5.83 | 24790 | 0.0550 |
| 0.4419 | 5.84 | 24800 | 0.0543 |
| 0.0001 | 5.84 | 24810 | 0.0538 |
| 0.0724 | 5.84 | 24820 | 0.0541 |
| 0.4743 | 5.84 | 24830 | 0.0527 |
| 0.0399 | 5.84 | 24840 | 0.0519 |
| 0.2269 | 5.85 | 24850 | 0.0522 |
| 0.0797 | 5.85 | 24860 | 0.0526 |
| 0.0584 | 5.85 | 24870 | 0.0540 |
| 0.0033 | 5.85 | 24880 | 0.0559 |
| 0.1589 | 5.86 | 24890 | 0.0563 |
| 0.0125 | 5.86 | 24900 | 0.0567 |
| 0.0005 | 5.86 | 24910 | 0.0579 |
| 0.0001 | 5.86 | 24920 | 0.0589 |
| 0.0713 | 5.87 | 24930 | 0.0595 |
| 0.3017 | 5.87 | 24940 | 0.0593 |
| 0.2625 | 5.87 | 24950 | 0.0589 |
| 0.2259 | 5.87 | 24960 | 0.0572 |
| 0.0004 | 5.88 | 24970 | 0.0567 |
| 0.3819 | 5.88 | 24980 | 0.0563 |
| 0.0002 | 5.88 | 24990 | 0.0563 |
| 0.0002 | 5.88 | 25000 | 0.0566 |
| 0.6253 | 5.88 | 25010 | 0.0564 |
| 0.3072 | 5.89 | 25020 | 0.0540 |
| 0.0005 | 5.89 | 25030 | 0.0536 |
| 0.2971 | 5.89 | 25040 | 0.0533 |
| 0.1072 | 5.89 | 25050 | 0.0532 |
| 0.1462 | 5.9 | 25060 | 0.0528 |
| 0.0406 | 5.9 | 25070 | 0.0537 |
| 0.2047 | 5.9 | 25080 | 0.0536 |
| 0.3569 | 5.9 | 25090 | 0.0532 |
| 0.0895 | 5.91 | 25100 | 0.0526 |
| 0.0006 | 5.91 | 25110 | 0.0529 |
| 0.0005 | 5.91 | 25120 | 0.0539 |
| 0.2767 | 5.91 | 25130 | 0.0541 |
| 0.2691 | 5.92 | 25140 | 0.0538 |
| 0.0007 | 5.92 | 25150 | 0.0542 |
| 0.0004 | 5.92 | 25160 | 0.0552 |
| 0.0065 | 5.92 | 25170 | 0.0569 |
| 0.526 | 5.92 | 25180 | 0.0574 |
| 0.1924 | 5.93 | 25190 | 0.0554 |
| 0.0002 | 5.93 | 25200 | 0.0550 |
| 0.0004 | 5.93 | 25210 | 0.0554 |
| 0.0991 | 5.93 | 25220 | 0.0548 |
| 0.0003 | 5.94 | 25230 | 0.0546 |
| 0.3305 | 5.94 | 25240 | 0.0541 |
| 0.2446 | 5.94 | 25250 | 0.0531 |
| 0.207 | 5.94 | 25260 | 0.0522 |
| 0.4335 | 5.95 | 25270 | 0.0521 |
| 0.0003 | 5.95 | 25280 | 0.0523 |
| 0.0002 | 5.95 | 25290 | 0.0526 |
| 0.0003 | 5.95 | 25300 | 0.0535 |
| 0.0002 | 5.96 | 25310 | 0.0542 |
| 0.263 | 5.96 | 25320 | 0.0538 |
| 0.0003 | 5.96 | 25330 | 0.0538 |
| 0.0002 | 5.96 | 25340 | 0.0541 |
| 0.2646 | 5.96 | 25350 | 0.0540 |
| 0.1972 | 5.97 | 25360 | 0.0535 |
| 0.0003 | 5.97 | 25370 | 0.0535 |
| 0.3428 | 5.97 | 25380 | 0.0534 |
| 0.0777 | 5.97 | 25390 | 0.0548 |
| 0.014 | 5.98 | 25400 | 0.0554 |
| 0.0006 | 5.98 | 25410 | 0.0562 |
| 0.0001 | 5.98 | 25420 | 0.0569 |
| 0.0879 | 5.98 | 25430 | 0.0572 |
| 0.0004 | 5.99 | 25440 | 0.0567 |
| 0.0002 | 5.99 | 25450 | 0.0575 |
| 0.0874 | 5.99 | 25460 | 0.0580 |
| 0.3303 | 5.99 | 25470 | 0.0581 |
| 0.2652 | 6.0 | 25480 | 0.0579 |
| 0.0001 | 6.0 | 25490 | 0.0575 |
| 0.0002 | 6.0 | 25500 | 0.0575 |
| 0.0325 | 6.0 | 25510 | 0.0576 |
| 0.0001 | 6.0 | 25520 | 0.0577 |
| 0.2292 | 6.01 | 25530 | 0.0575 |
| 0.3731 | 6.01 | 25540 | 0.0565 |
| 0.0002 | 6.01 | 25550 | 0.0565 |
| 0.0002 | 6.01 | 25560 | 0.0569 |
| 0.2883 | 6.02 | 25570 | 0.0565 |
| 0.0002 | 6.02 | 25580 | 0.0568 |
| 0.0002 | 6.02 | 25590 | 0.0573 |
| 0.0032 | 6.02 | 25600 | 0.0580 |
| 0.0086 | 6.03 | 25610 | 0.0587 |
| 0.0002 | 6.03 | 25620 | 0.0592 |
| 0.0002 | 6.03 | 25630 | 0.0597 |
| 0.1354 | 6.03 | 25640 | 0.0600 |
| 0.2878 | 6.04 | 25650 | 0.0597 |
| 0.0779 | 6.04 | 25660 | 0.0598 |
| 0.0001 | 6.04 | 25670 | 0.0599 |
| 0.0001 | 6.04 | 25680 | 0.0601 |
| 0.0052 | 6.04 | 25690 | 0.0606 |
| 0.0182 | 6.05 | 25700 | 0.0602 |
| 0.2648 | 6.05 | 25710 | 0.0581 |
| 0.0002 | 6.05 | 25720 | 0.0577 |
| 0.0001 | 6.05 | 25730 | 0.0577 |
| 0.0002 | 6.06 | 25740 | 0.0582 |
| 0.2793 | 6.06 | 25750 | 0.0574 |
| 0.0005 | 6.06 | 25760 | 0.0565 |
| 0.1035 | 6.06 | 25770 | 0.0564 |
| 0.0504 | 6.07 | 25780 | 0.0564 |
| 0.0001 | 6.07 | 25790 | 0.0567 |
| 0.0001 | 6.07 | 25800 | 0.0570 |
| 0.2919 | 6.07 | 25810 | 0.0562 |
| 0.0002 | 6.08 | 25820 | 0.0559 |
| 0.0002 | 6.08 | 25830 | 0.0562 |
| 0.1047 | 6.08 | 25840 | 0.0565 |
| 0.0001 | 6.08 | 25850 | 0.0568 |
| 0.0273 | 6.08 | 25860 | 0.0570 |
| 0.0002 | 6.09 | 25870 | 0.0571 |
| 0.0006 | 6.09 | 25880 | 0.0582 |
| 0.0002 | 6.09 | 25890 | 0.0601 |
| 0.0002 | 6.09 | 25900 | 0.0618 |
| 0.0001 | 6.1 | 25910 | 0.0626 |
| 0.0001 | 6.1 | 25920 | 0.0629 |
| 0.0013 | 6.1 | 25930 | 0.0636 |
| 0.1981 | 6.1 | 25940 | 0.0625 |
| 0.3022 | 6.11 | 25950 | 0.0612 |
| 0.0002 | 6.11 | 25960 | 0.0609 |
| 0.2947 | 6.11 | 25970 | 0.0594 |
| 0.0656 | 6.11 | 25980 | 0.0581 |
| 0.0003 | 6.12 | 25990 | 0.0574 |
| 0.0802 | 6.12 | 26000 | 0.0581 |
| 0.0001 | 6.12 | 26010 | 0.0588 |
| 0.6026 | 6.12 | 26020 | 0.0587 |
| 0.0002 | 6.12 | 26030 | 0.0582 |
| 0.0076 | 6.13 | 26040 | 0.0581 |
| 0.4531 | 6.13 | 26050 | 0.0578 |
| 0.263 | 6.13 | 26060 | 0.0569 |
| 0.5859 | 6.13 | 26070 | 0.0552 |
| 0.0006 | 6.14 | 26080 | 0.0552 |
| 0.2937 | 6.14 | 26090 | 0.0550 |
| 0.2496 | 6.14 | 26100 | 0.0547 |
| 0.0003 | 6.14 | 26110 | 0.0546 |
| 0.0004 | 6.15 | 26120 | 0.0550 |
| 0.2261 | 6.15 | 26130 | 0.0552 |
| 0.0003 | 6.15 | 26140 | 0.0555 |
| 0.0002 | 6.15 | 26150 | 0.0561 |
| 0.2307 | 6.16 | 26160 | 0.0565 |
| 0.6161 | 6.16 | 26170 | 0.0547 |
| 0.0003 | 6.16 | 26180 | 0.0536 |
| 0.0012 | 6.16 | 26190 | 0.0536 |
| 0.2418 | 6.16 | 26200 | 0.0546 |
| 0.0008 | 6.17 | 26210 | 0.0551 |
| 0.5149 | 6.17 | 26220 | 0.0546 |
| 0.1216 | 6.17 | 26230 | 0.0535 |
| 0.0003 | 6.17 | 26240 | 0.0537 |
| 0.0002 | 6.18 | 26250 | 0.0541 |
| 0.1804 | 6.18 | 26260 | 0.0531 |
| 0.0118 | 6.18 | 26270 | 0.0533 |
| 0.0002 | 6.18 | 26280 | 0.0537 |
| 0.0002 | 6.19 | 26290 | 0.0540 |
| 0.231 | 6.19 | 26300 | 0.0534 |
| 0.0003 | 6.19 | 26310 | 0.0534 |
| 0.0007 | 6.19 | 26320 | 0.0541 |
| 0.0002 | 6.2 | 26330 | 0.0546 |
| 0.371 | 6.2 | 26340 | 0.0543 |
| 0.0001 | 6.2 | 26350 | 0.0544 |
| 0.0042 | 6.2 | 26360 | 0.0548 |
| 0.001 | 6.2 | 26370 | 0.0561 |
| 0.0456 | 6.21 | 26380 | 0.0567 |
| 0.1436 | 6.21 | 26390 | 0.0569 |
| 0.0001 | 6.21 | 26400 | 0.0570 |
| 0.0002 | 6.21 | 26410 | 0.0572 |
| 0.0001 | 6.22 | 26420 | 0.0575 |
| 0.0005 | 6.22 | 26430 | 0.0577 |
| 0.092 | 6.22 | 26440 | 0.0584 |
| 0.0001 | 6.22 | 26450 | 0.0587 |
| 0.0001 | 6.23 | 26460 | 0.0589 |
| 0.0001 | 6.23 | 26470 | 0.0593 |
| 0.2973 | 6.23 | 26480 | 0.0583 |
| 0.5598 | 6.23 | 26490 | 0.0572 |
| 0.2772 | 6.24 | 26500 | 0.0554 |
| 0.5205 | 6.24 | 26510 | 0.0541 |
| 0.5284 | 6.24 | 26520 | 0.0517 |
| 0.0004 | 6.24 | 26530 | 0.0515 |
| 0.0008 | 6.24 | 26540 | 0.0529 |
| 0.2449 | 6.25 | 26550 | 0.0537 |
| 0.0003 | 6.25 | 26560 | 0.0544 |
| 0.0004 | 6.25 | 26570 | 0.0552 |
| 0.0003 | 6.25 | 26580 | 0.0559 |
| 0.0002 | 6.26 | 26590 | 0.0565 |
| 0.0002 | 6.26 | 26600 | 0.0570 |
| 0.0002 | 6.26 | 26610 | 0.0577 |
| 0.1426 | 6.26 | 26620 | 0.0575 |
| 0.2664 | 6.27 | 26630 | 0.0572 |
| 0.0001 | 6.27 | 26640 | 0.0569 |
| 0.0001 | 6.27 | 26650 | 0.0570 |
| 0.0813 | 6.27 | 26660 | 0.0572 |
| 0.0001 | 6.28 | 26670 | 0.0572 |
| 0.0852 | 6.28 | 26680 | 0.0569 |
| 0.5738 | 6.28 | 26690 | 0.0558 |
| 0.008 | 6.28 | 26700 | 0.0543 |
| 0.4489 | 6.28 | 26710 | 0.0528 |
| 0.225 | 6.29 | 26720 | 0.0510 |
| 0.0004 | 6.29 | 26730 | 0.0505 |
| 0.0002 | 6.29 | 26740 | 0.0509 |
| 0.2489 | 6.29 | 26750 | 0.0510 |
| 0.2375 | 6.3 | 26760 | 0.0506 |
| 0.0005 | 6.3 | 26770 | 0.0514 |
| 0.0315 | 6.3 | 26780 | 0.0525 |
| 0.3102 | 6.3 | 26790 | 0.0529 |
| 0.0003 | 6.31 | 26800 | 0.0533 |
| 0.0009 | 6.31 | 26810 | 0.0539 |
| 0.5299 | 6.31 | 26820 | 0.0535 |
| 0.0004 | 6.31 | 26830 | 0.0532 |
| 0.0002 | 6.32 | 26840 | 0.0536 |
| 0.3144 | 6.32 | 26850 | 0.0538 |
| 0.1267 | 6.32 | 26860 | 0.0534 |
| 0.002 | 6.32 | 26870 | 0.0532 |
| 0.0001 | 6.32 | 26880 | 0.0530 |
| 0.0133 | 6.33 | 26890 | 0.0546 |
| 0.0601 | 6.33 | 26900 | 0.0557 |
| 0.1818 | 6.33 | 26910 | 0.0561 |
| 0.0003 | 6.33 | 26920 | 0.0566 |
| 0.0149 | 6.34 | 26930 | 0.0571 |
| 0.0004 | 6.34 | 26940 | 0.0581 |
| 0.0002 | 6.34 | 26950 | 0.0588 |
| 0.5245 | 6.34 | 26960 | 0.0576 |
| 0.0001 | 6.35 | 26970 | 0.0572 |
| 0.0001 | 6.35 | 26980 | 0.0572 |
| 0.0001 | 6.35 | 26990 | 0.0574 |
| 0.2969 | 6.35 | 27000 | 0.0569 |
| 0.0001 | 6.36 | 27010 | 0.0565 |
| 0.2489 | 6.36 | 27020 | 0.0564 |
| 0.0002 | 6.36 | 27030 | 0.0563 |
| 0.1583 | 6.36 | 27040 | 0.0568 |
| 0.1698 | 6.36 | 27050 | 0.0574 |
| 0.103 | 6.37 | 27060 | 0.0579 |
| 0.0001 | 6.37 | 27070 | 0.0580 |
| 0.0001 | 6.37 | 27080 | 0.0582 |
| 0.0414 | 6.37 | 27090 | 0.0585 |
| 0.0127 | 6.38 | 27100 | 0.0591 |
| 0.3489 | 6.38 | 27110 | 0.0593 |
| 0.0158 | 6.38 | 27120 | 0.0595 |
| 0.0007 | 6.38 | 27130 | 0.0594 |
| 0.3808 | 6.39 | 27140 | 0.0588 |
| 0.295 | 6.39 | 27150 | 0.0566 |
| 0.1748 | 6.39 | 27160 | 0.0547 |
| 0.0003 | 6.39 | 27170 | 0.0542 |
| 0.0003 | 6.4 | 27180 | 0.0545 |
| 0.0002 | 6.4 | 27190 | 0.0552 |
| 0.0001 | 6.4 | 27200 | 0.0557 |
| 0.0469 | 6.4 | 27210 | 0.0562 |
| 0.0002 | 6.4 | 27220 | 0.0567 |
| 0.0001 | 6.41 | 27230 | 0.0571 |
| 0.2516 | 6.41 | 27240 | 0.0563 |
| 0.0002 | 6.41 | 27250 | 0.0563 |
| 0.0002 | 6.41 | 27260 | 0.0566 |
| 0.4267 | 6.42 | 27270 | 0.0563 |
| 0.0002 | 6.42 | 27280 | 0.0554 |
| 0.0001 | 6.42 | 27290 | 0.0553 |
| 0.0002 | 6.42 | 27300 | 0.0555 |
| 0.0016 | 6.43 | 27310 | 0.0559 |
| 0.5449 | 6.43 | 27320 | 0.0550 |
| 0.2438 | 6.43 | 27330 | 0.0541 |
| 0.0818 | 6.43 | 27340 | 0.0536 |
| 0.0002 | 6.44 | 27350 | 0.0538 |
| 0.0002 | 6.44 | 27360 | 0.0542 |
| 0.0005 | 6.44 | 27370 | 0.0544 |
| 0.0003 | 6.44 | 27380 | 0.0546 |
| 0.1133 | 6.44 | 27390 | 0.0552 |
| 0.3364 | 6.45 | 27400 | 0.0559 |
| 0.2442 | 6.45 | 27410 | 0.0567 |
| 0.359 | 6.45 | 27420 | 0.0552 |
| 0.1108 | 6.45 | 27430 | 0.0552 |
| 0.2541 | 6.46 | 27440 | 0.0548 |
| 0.0036 | 6.46 | 27450 | 0.0546 |
| 0.014 | 6.46 | 27460 | 0.0539 |
| 0.1655 | 6.46 | 27470 | 0.0538 |
| 0.0002 | 6.47 | 27480 | 0.0538 |
| 0.0001 | 6.47 | 27490 | 0.0539 |
| 0.0002 | 6.47 | 27500 | 0.0545 |
| 0.2151 | 6.47 | 27510 | 0.0543 |
| 0.0001 | 6.48 | 27520 | 0.0533 |
| 0.3645 | 6.48 | 27530 | 0.0522 |
| 0.1165 | 6.48 | 27540 | 0.0526 |
| 0.1532 | 6.48 | 27550 | 0.0529 |
| 0.0001 | 6.48 | 27560 | 0.0531 |
| 0.0005 | 6.49 | 27570 | 0.0538 |
| 0.0975 | 6.49 | 27580 | 0.0548 |
| 0.0001 | 6.49 | 27590 | 0.0554 |
| 0.5578 | 6.49 | 27600 | 0.0547 |
| 0.0001 | 6.5 | 27610 | 0.0537 |
| 0.2634 | 6.5 | 27620 | 0.0534 |
| 0.209 | 6.5 | 27630 | 0.0520 |
| 0.3746 | 6.5 | 27640 | 0.0518 |
| 0.0002 | 6.51 | 27650 | 0.0524 |
| 0.028 | 6.51 | 27660 | 0.0533 |
| 0.3168 | 6.51 | 27670 | 0.0532 |
| 0.0002 | 6.51 | 27680 | 0.0535 |
| 0.2905 | 6.52 | 27690 | 0.0532 |
| 0.0827 | 6.52 | 27700 | 0.0529 |
| 0.4484 | 6.52 | 27710 | 0.0530 |
| 0.2204 | 6.52 | 27720 | 0.0521 |
| 0.0002 | 6.52 | 27730 | 0.0520 |
| 0.2748 | 6.53 | 27740 | 0.0527 |
| 0.0002 | 6.53 | 27750 | 0.0551 |
| 0.2363 | 6.53 | 27760 | 0.0546 |
| 0.0001 | 6.53 | 27770 | 0.0545 |
| 0.0002 | 6.54 | 27780 | 0.0549 |
| 0.0028 | 6.54 | 27790 | 0.0555 |
| 0.5124 | 6.54 | 27800 | 0.0528 |
| 0.2686 | 6.54 | 27810 | 0.0496 |
| 0.0005 | 6.55 | 27820 | 0.0487 |
| 0.0004 | 6.55 | 27830 | 0.0495 |
| 0.3161 | 6.55 | 27840 | 0.0499 |
| 0.0004 | 6.55 | 27850 | 0.0497 |
| 0.0176 | 6.56 | 27860 | 0.0501 |
| 0.0002 | 6.56 | 27870 | 0.0504 |
| 0.0051 | 6.56 | 27880 | 0.0509 |
| 0.0011 | 6.56 | 27890 | 0.0514 |
| 0.0001 | 6.56 | 27900 | 0.0519 |
| 0.0002 | 6.57 | 27910 | 0.0523 |
| 0.0002 | 6.57 | 27920 | 0.0527 |
| 0.5372 | 6.57 | 27930 | 0.0516 |
| 0.0003 | 6.57 | 27940 | 0.0507 |
| 0.0002 | 6.58 | 27950 | 0.0506 |
| 0.0003 | 6.58 | 27960 | 0.0513 |
| 0.2959 | 6.58 | 27970 | 0.0503 |
| 0.0164 | 6.58 | 27980 | 0.0503 |
| 0.0002 | 6.59 | 27990 | 0.0505 |
| 0.0002 | 6.59 | 28000 | 0.0510 |
| 0.0003 | 6.59 | 28010 | 0.0516 |
| 0.0001 | 6.59 | 28020 | 0.0526 |
| 0.2698 | 6.6 | 28030 | 0.0524 |
| 0.0002 | 6.6 | 28040 | 0.0523 |
| 0.1328 | 6.6 | 28050 | 0.0515 |
| 0.0002 | 6.6 | 28060 | 0.0511 |
| 0.0001 | 6.6 | 28070 | 0.0512 |
| 0.0876 | 6.61 | 28080 | 0.0512 |
| 0.0001 | 6.61 | 28090 | 0.0513 |
| 0.1229 | 6.61 | 28100 | 0.0513 |
| 0.0425 | 6.61 | 28110 | 0.0524 |
| 0.2585 | 6.62 | 28120 | 0.0527 |
| 0.0001 | 6.62 | 28130 | 0.0518 |
| 0.2753 | 6.62 | 28140 | 0.0513 |
| 0.2384 | 6.62 | 28150 | 0.0508 |
| 0.2402 | 6.63 | 28160 | 0.0498 |
| 0.0005 | 6.63 | 28170 | 0.0491 |
| 0.0001 | 6.63 | 28180 | 0.0489 |
| 0.0026 | 6.63 | 28190 | 0.0490 |
| 0.0003 | 6.64 | 28200 | 0.0497 |
| 0.0002 | 6.64 | 28210 | 0.0503 |
| 0.0002 | 6.64 | 28220 | 0.0508 |
| 0.0001 | 6.64 | 28230 | 0.0512 |
| 0.0952 | 6.64 | 28240 | 0.0514 |
| 0.0002 | 6.65 | 28250 | 0.0519 |
| 0.004 | 6.65 | 28260 | 0.0526 |
| 0.0003 | 6.65 | 28270 | 0.0531 |
| 0.0001 | 6.65 | 28280 | 0.0536 |
| 0.0002 | 6.66 | 28290 | 0.0540 |
| 0.0761 | 6.66 | 28300 | 0.0555 |
| 0.0002 | 6.66 | 28310 | 0.0563 |
| 0.0035 | 6.66 | 28320 | 0.0565 |
| 0.0001 | 6.67 | 28330 | 0.0569 |
| 0.0002 | 6.67 | 28340 | 0.0572 |
| 0.0001 | 6.67 | 28350 | 0.0575 |
| 0.0001 | 6.67 | 28360 | 0.0578 |
| 0.0003 | 6.68 | 28370 | 0.0581 |
| 0.0002 | 6.68 | 28380 | 0.0581 |
| 0.4609 | 6.68 | 28390 | 0.0560 |
| 0.2856 | 6.68 | 28400 | 0.0532 |
| 0.0001 | 6.68 | 28410 | 0.0520 |
| 0.0001 | 6.69 | 28420 | 0.0518 |
| 0.0933 | 6.69 | 28430 | 0.0521 |
| 0.0055 | 6.69 | 28440 | 0.0538 |
| 0.0003 | 6.69 | 28450 | 0.0547 |
| 0.6048 | 6.7 | 28460 | 0.0524 |
| 0.0746 | 6.7 | 28470 | 0.0506 |
| 0.2634 | 6.7 | 28480 | 0.0500 |
| 0.2551 | 6.7 | 28490 | 0.0495 |
| 0.0002 | 6.71 | 28500 | 0.0486 |
| 0.3419 | 6.71 | 28510 | 0.0485 |
| 0.0003 | 6.71 | 28520 | 0.0487 |
| 0.2398 | 6.71 | 28530 | 0.0486 |
| 0.2256 | 6.72 | 28540 | 0.0479 |
| 0.0004 | 6.72 | 28550 | 0.0480 |
| 0.0003 | 6.72 | 28560 | 0.0484 |
| 0.0002 | 6.72 | 28570 | 0.0489 |
| 0.0003 | 6.72 | 28580 | 0.0494 |
| 0.0001 | 6.73 | 28590 | 0.0499 |
| 0.2475 | 6.73 | 28600 | 0.0497 |
| 0.0002 | 6.73 | 28610 | 0.0491 |
| 0.0004 | 6.73 | 28620 | 0.0501 |
| 0.0002 | 6.74 | 28630 | 0.0510 |
| 0.1054 | 6.74 | 28640 | 0.0517 |
| 0.3321 | 6.74 | 28650 | 0.0512 |
| 0.0002 | 6.74 | 28660 | 0.0512 |
| 0.0002 | 6.75 | 28670 | 0.0515 |
| 0.0134 | 6.75 | 28680 | 0.0518 |
| 0.0002 | 6.75 | 28690 | 0.0522 |
| 0.0573 | 6.75 | 28700 | 0.0528 |
| 0.0002 | 6.76 | 28710 | 0.0529 |
| 0.2568 | 6.76 | 28720 | 0.0522 |
| 0.0002 | 6.76 | 28730 | 0.0521 |
| 0.0521 | 6.76 | 28740 | 0.0534 |
| 0.3443 | 6.76 | 28750 | 0.0533 |
| 0.0048 | 6.77 | 28760 | 0.0541 |
| 0.0426 | 6.77 | 28770 | 0.0551 |
| 0.0001 | 6.77 | 28780 | 0.0555 |
| 0.0001 | 6.77 | 28790 | 0.0558 |
| 0.214 | 6.78 | 28800 | 0.0554 |
| 0.3315 | 6.78 | 28810 | 0.0543 |
| 0.0006 | 6.78 | 28820 | 0.0541 |
| 0.0001 | 6.78 | 28830 | 0.0543 |
| 0.2181 | 6.79 | 28840 | 0.0527 |
| 0.1237 | 6.79 | 28850 | 0.0519 |
| 0.0001 | 6.79 | 28860 | 0.0514 |
| 0.0001 | 6.79 | 28870 | 0.0514 |
| 0.0908 | 6.8 | 28880 | 0.0512 |
| 0.2669 | 6.8 | 28890 | 0.0500 |
| 0.0566 | 6.8 | 28900 | 0.0500 |
| 0.2615 | 6.8 | 28910 | 0.0496 |
| 0.0485 | 6.8 | 28920 | 0.0494 |
| 0.0002 | 6.81 | 28930 | 0.0496 |
| 0.024 | 6.81 | 28940 | 0.0506 |
| 0.2629 | 6.81 | 28950 | 0.0516 |
| 0.0001 | 6.81 | 28960 | 0.0512 |
| 0.0002 | 6.82 | 28970 | 0.0513 |
| 0.0002 | 6.82 | 28980 | 0.0516 |
| 0.0002 | 6.82 | 28990 | 0.0519 |
| 0.3004 | 6.82 | 29000 | 0.0508 |
| 0.0002 | 6.83 | 29010 | 0.0508 |
| 0.0002 | 6.83 | 29020 | 0.0512 |
| 0.2702 | 6.83 | 29030 | 0.0505 |
| 0.0644 | 6.83 | 29040 | 0.0499 |
| 0.0001 | 6.84 | 29050 | 0.0496 |
| 0.0002 | 6.84 | 29060 | 0.0499 |
| 0.0001 | 6.84 | 29070 | 0.0505 |
| 0.0001 | 6.84 | 29080 | 0.0509 |
| 0.0001 | 6.84 | 29090 | 0.0512 |
| 0.0003 | 6.85 | 29100 | 0.0515 |
| 0.0001 | 6.85 | 29110 | 0.0518 |
| 0.0553 | 6.85 | 29120 | 0.0513 |
| 0.0001 | 6.85 | 29130 | 0.0510 |
| 0.2735 | 6.86 | 29140 | 0.0505 |
| 0.1126 | 6.86 | 29150 | 0.0498 |
| 0.0001 | 6.86 | 29160 | 0.0498 |
| 0.0004 | 6.86 | 29170 | 0.0496 |
| 0.0001 | 6.87 | 29180 | 0.0497 |
| 0.0696 | 6.87 | 29190 | 0.0496 |
| 0.0001 | 6.87 | 29200 | 0.0497 |
| 0.0001 | 6.87 | 29210 | 0.0499 |
| 0.0001 | 6.88 | 29220 | 0.0502 |
| 0.0002 | 6.88 | 29230 | 0.0510 |
| 0.0393 | 6.88 | 29240 | 0.0512 |
| 0.0004 | 6.88 | 29250 | 0.0519 |
| 0.0001 | 6.88 | 29260 | 0.0519 |
| 0.1645 | 6.89 | 29270 | 0.0513 |
| 0.0004 | 6.89 | 29280 | 0.0515 |
| 0.0001 | 6.89 | 29290 | 0.0517 |
| 0.0002 | 6.89 | 29300 | 0.0521 |
| 0.2275 | 6.9 | 29310 | 0.0520 |
| 0.0001 | 6.9 | 29320 | 0.0513 |
| 0.0612 | 6.9 | 29330 | 0.0511 |
| 0.2963 | 6.9 | 29340 | 0.0505 |
| 0.0373 | 6.91 | 29350 | 0.0504 |
| 0.3092 | 6.91 | 29360 | 0.0503 |
| 0.2425 | 6.91 | 29370 | 0.0496 |
| 0.0002 | 6.91 | 29380 | 0.0494 |
| 0.0001 | 6.92 | 29390 | 0.0496 |
| 0.0003 | 6.92 | 29400 | 0.0503 |
| 0.0003 | 6.92 | 29410 | 0.0515 |
| 0.2243 | 6.92 | 29420 | 0.0522 |
| 0.3104 | 6.92 | 29430 | 0.0518 |
| 0.0002 | 6.93 | 29440 | 0.0517 |
| 0.0002 | 6.93 | 29450 | 0.0520 |
| 0.0002 | 6.93 | 29460 | 0.0525 |
| 0.2674 | 6.93 | 29470 | 0.0527 |
| 0.0002 | 6.94 | 29480 | 0.0521 |
| 0.0001 | 6.94 | 29490 | 0.0521 |
| 0.0288 | 6.94 | 29500 | 0.0525 |
| 0.2743 | 6.94 | 29510 | 0.0527 |
| 0.2418 | 6.95 | 29520 | 0.0518 |
| 0.3013 | 6.95 | 29530 | 0.0508 |
| 0.0002 | 6.95 | 29540 | 0.0501 |
| 0.0007 | 6.95 | 29550 | 0.0508 |
| 0.0001 | 6.96 | 29560 | 0.0522 |
| 0.0533 | 6.96 | 29570 | 0.0527 |
| 0.3493 | 6.96 | 29580 | 0.0519 |
| 0.002 | 6.96 | 29590 | 0.0506 |
| 0.0001 | 6.96 | 29600 | 0.0503 |
| 0.0001 | 6.97 | 29610 | 0.0504 |
| 0.2681 | 6.97 | 29620 | 0.0496 |
| 0.0002 | 6.97 | 29630 | 0.0497 |
| 0.0002 | 6.97 | 29640 | 0.0500 |
| 0.0003 | 6.98 | 29650 | 0.0505 |
| 0.0001 | 6.98 | 29660 | 0.0510 |
| 0.2751 | 6.98 | 29670 | 0.0502 |
| 0.2986 | 6.98 | 29680 | 0.0496 |
| 0.0001 | 6.99 | 29690 | 0.0490 |
| 0.0001 | 6.99 | 29700 | 0.0490 |
| 0.2549 | 6.99 | 29710 | 0.0480 |
| 0.231 | 6.99 | 29720 | 0.0462 |
| 0.0002 | 7.0 | 29730 | 0.0459 |
| 0.2292 | 7.0 | 29740 | 0.0454 |
| 0.3028 | 7.0 | 29750 | 0.0456 |
| 0.0003 | 7.0 | 29760 | 0.0454 |
| 0.0002 | 7.0 | 29770 | 0.0455 |
| 0.0003 | 7.01 | 29780 | 0.0460 |
| 0.2819 | 7.01 | 29790 | 0.0454 |
| 0.0005 | 7.01 | 29800 | 0.0455 |
| 0.0004 | 7.01 | 29810 | 0.0459 |
| 0.0002 | 7.02 | 29820 | 0.0465 |
| 0.0001 | 7.02 | 29830 | 0.0470 |
| 0.0007 | 7.02 | 29840 | 0.0474 |
| 0.1971 | 7.02 | 29850 | 0.0477 |
| 0.2648 | 7.03 | 29860 | 0.0471 |
| 0.2423 | 7.03 | 29870 | 0.0460 |
| 0.0002 | 7.03 | 29880 | 0.0452 |
| 0.2633 | 7.03 | 29890 | 0.0457 |
| 0.2683 | 7.04 | 29900 | 0.0454 |
| 0.2202 | 7.04 | 29910 | 0.0447 |
| 0.0253 | 7.04 | 29920 | 0.0449 |
| 0.018 | 7.04 | 29930 | 0.0453 |
| 0.2205 | 7.04 | 29940 | 0.0456 |
| 0.0002 | 7.05 | 29950 | 0.0451 |
| 0.0002 | 7.05 | 29960 | 0.0451 |
| 0.0001 | 7.05 | 29970 | 0.0454 |
| 0.0001 | 7.05 | 29980 | 0.0456 |
| 0.0007 | 7.06 | 29990 | 0.0464 |
| 0.323 | 7.06 | 30000 | 0.0471 |
| 0.052 | 7.06 | 30010 | 0.0466 |
| 0.0002 | 7.06 | 30020 | 0.0467 |
| 0.193 | 7.07 | 30030 | 0.0462 |
| 0.0001 | 7.07 | 30040 | 0.0460 |
| 0.0002 | 7.07 | 30050 | 0.0461 |
| 0.0001 | 7.07 | 30060 | 0.0462 |
| 0.2232 | 7.08 | 30070 | 0.0456 |
| 0.0001 | 7.08 | 30080 | 0.0452 |
| 0.0002 | 7.08 | 30090 | 0.0454 |
| 0.2974 | 7.08 | 30100 | 0.0452 |
| 0.2136 | 7.08 | 30110 | 0.0456 |
| 0.0002 | 7.09 | 30120 | 0.0455 |
| 0.1061 | 7.09 | 30130 | 0.0460 |
| 0.0002 | 7.09 | 30140 | 0.0469 |
| 0.0009 | 7.09 | 30150 | 0.0479 |
| 0.061 | 7.1 | 30160 | 0.0495 |
| 0.3126 | 7.1 | 30170 | 0.0492 |
| 0.1001 | 7.1 | 30180 | 0.0485 |
| 0.0003 | 7.1 | 30190 | 0.0487 |
| 0.0002 | 7.11 | 30200 | 0.0490 |
| 0.0002 | 7.11 | 30210 | 0.0494 |
| 0.2986 | 7.11 | 30220 | 0.0492 |
| 0.2385 | 7.11 | 30230 | 0.0479 |
| 0.298 | 7.12 | 30240 | 0.0474 |
| 0.0488 | 7.12 | 30250 | 0.0472 |
| 0.0203 | 7.12 | 30260 | 0.0476 |
| 0.3876 | 7.12 | 30270 | 0.0485 |
| 0.2579 | 7.12 | 30280 | 0.0492 |
| 0.2431 | 7.13 | 30290 | 0.0488 |
| 0.0003 | 7.13 | 30300 | 0.0487 |
| 0.0004 | 7.13 | 30310 | 0.0491 |
| 0.248 | 7.13 | 30320 | 0.0482 |
| 0.0003 | 7.14 | 30330 | 0.0484 |
| 0.0084 | 7.14 | 30340 | 0.0490 |
| 0.0002 | 7.14 | 30350 | 0.0499 |
| 0.3566 | 7.14 | 30360 | 0.0497 |
| 0.4205 | 7.15 | 30370 | 0.0486 |
| 0.2262 | 7.15 | 30380 | 0.0470 |
| 0.0003 | 7.15 | 30390 | 0.0465 |
| 0.0003 | 7.15 | 30400 | 0.0469 |
| 0.0029 | 7.16 | 30410 | 0.0476 |
| 0.0002 | 7.16 | 30420 | 0.0480 |
| 0.0002 | 7.16 | 30430 | 0.0485 |
| 0.0013 | 7.16 | 30440 | 0.0493 |
| 0.585 | 7.16 | 30450 | 0.0500 |
| 0.2115 | 7.17 | 30460 | 0.0496 |
| 0.0002 | 7.17 | 30470 | 0.0493 |
| 0.0016 | 7.17 | 30480 | 0.0496 |
| 0.0002 | 7.17 | 30490 | 0.0502 |
| 0.0002 | 7.18 | 30500 | 0.0506 |
| 0.0002 | 7.18 | 30510 | 0.0511 |
| 0.0002 | 7.18 | 30520 | 0.0515 |
| 0.2773 | 7.18 | 30530 | 0.0514 |
| 0.0001 | 7.19 | 30540 | 0.0511 |
| 0.008 | 7.19 | 30550 | 0.0512 |
| 0.0001 | 7.19 | 30560 | 0.0516 |
| 0.6677 | 7.19 | 30570 | 0.0500 |
| 0.2472 | 7.2 | 30580 | 0.0481 |
| 0.0002 | 7.2 | 30590 | 0.0477 |
| 0.0003 | 7.2 | 30600 | 0.0480 |
| 0.0002 | 7.2 | 30610 | 0.0484 |
| 0.0004 | 7.2 | 30620 | 0.0490 |
| 0.3218 | 7.21 | 30630 | 0.0493 |
| 0.0002 | 7.21 | 30640 | 0.0496 |
| 0.0002 | 7.21 | 30650 | 0.0500 |
| 0.0002 | 7.21 | 30660 | 0.0505 |
| 0.0002 | 7.22 | 30670 | 0.0509 |
| 0.0002 | 7.22 | 30680 | 0.0514 |
| 0.0007 | 7.22 | 30690 | 0.0519 |
| 0.0901 | 7.22 | 30700 | 0.0514 |
| 0.0723 | 7.23 | 30710 | 0.0514 |
| 0.0001 | 7.23 | 30720 | 0.0516 |
| 0.1819 | 7.23 | 30730 | 0.0509 |
| 0.0002 | 7.23 | 30740 | 0.0509 |
| 0.0001 | 7.24 | 30750 | 0.0513 |
| 0.0001 | 7.24 | 30760 | 0.0517 |
| 0.0001 | 7.24 | 30770 | 0.0520 |
| 0.2541 | 7.24 | 30780 | 0.0516 |
| 0.0001 | 7.24 | 30790 | 0.0515 |
| 0.0001 | 7.25 | 30800 | 0.0516 |
| 0.0001 | 7.25 | 30810 | 0.0518 |
| 0.0001 | 7.25 | 30820 | 0.0520 |
| 0.0001 | 7.25 | 30830 | 0.0523 |
| 0.4595 | 7.26 | 30840 | 0.0518 |
| 0.0001 | 7.26 | 30850 | 0.0514 |
| 0.2998 | 7.26 | 30860 | 0.0495 |
| 0.0001 | 7.26 | 30870 | 0.0489 |
| 0.2635 | 7.27 | 30880 | 0.0481 |
| 0.3001 | 7.27 | 30890 | 0.0470 |
| 0.0007 | 7.27 | 30900 | 0.0476 |
| 0.0002 | 7.27 | 30910 | 0.0486 |
| 0.0001 | 7.28 | 30920 | 0.0491 |
| 0.0002 | 7.28 | 30930 | 0.0498 |
| 0.0001 | 7.28 | 30940 | 0.0503 |
| 0.3306 | 7.28 | 30950 | 0.0498 |
| 0.0002 | 7.28 | 30960 | 0.0495 |
| 0.2171 | 7.29 | 30970 | 0.0496 |
| 0.0001 | 7.29 | 30980 | 0.0489 |
| 0.0062 | 7.29 | 30990 | 0.0489 |
| 0.0001 | 7.29 | 31000 | 0.0493 |
| 0.0001 | 7.3 | 31010 | 0.0495 |
| 0.0463 | 7.3 | 31020 | 0.0497 |
| 0.2668 | 7.3 | 31030 | 0.0492 |
| 0.0001 | 7.3 | 31040 | 0.0490 |
| 0.0002 | 7.31 | 31050 | 0.0492 |
| 0.0001 | 7.31 | 31060 | 0.0495 |
| 0.0031 | 7.31 | 31070 | 0.0498 |
| 0.0002 | 7.31 | 31080 | 0.0503 |
| 0.1161 | 7.32 | 31090 | 0.0509 |
| 0.0001 | 7.32 | 31100 | 0.0513 |
| 0.2328 | 7.32 | 31110 | 0.0514 |
| 0.0001 | 7.32 | 31120 | 0.0512 |
| 0.0002 | 7.32 | 31130 | 0.0514 |
| 0.2661 | 7.33 | 31140 | 0.0508 |
| 0.018 | 7.33 | 31150 | 0.0504 |
| 0.0001 | 7.33 | 31160 | 0.0504 |
| 0.5262 | 7.33 | 31170 | 0.0487 |
| 0.336 | 7.34 | 31180 | 0.0468 |
| 0.1469 | 7.34 | 31190 | 0.0461 |
| 0.2787 | 7.34 | 31200 | 0.0453 |
| 0.0002 | 7.34 | 31210 | 0.0451 |
| 0.067 | 7.35 | 31220 | 0.0450 |
| 0.0017 | 7.35 | 31230 | 0.0455 |
| 0.0002 | 7.35 | 31240 | 0.0460 |
| 0.0003 | 7.35 | 31250 | 0.0464 |
| 0.0008 | 7.36 | 31260 | 0.0472 |
| 0.0013 | 7.36 | 31270 | 0.0479 |
| 0.0001 | 7.36 | 31280 | 0.0483 |
| 0.2726 | 7.36 | 31290 | 0.0481 |
| 0.1627 | 7.36 | 31300 | 0.0475 |
| 0.0001 | 7.37 | 31310 | 0.0475 |
| 0.0002 | 7.37 | 31320 | 0.0477 |
| 0.0001 | 7.37 | 31330 | 0.0479 |
| 0.0002 | 7.37 | 31340 | 0.0483 |
| 0.203 | 7.38 | 31350 | 0.0486 |
| 0.0001 | 7.38 | 31360 | 0.0485 |
| 0.5793 | 7.38 | 31370 | 0.0474 |
| 0.264 | 7.38 | 31380 | 0.0467 |
| 0.0003 | 7.39 | 31390 | 0.0461 |
| 0.0001 | 7.39 | 31400 | 0.0461 |
| 0.0028 | 7.39 | 31410 | 0.0464 |
| 0.0001 | 7.39 | 31420 | 0.0468 |
| 0.0009 | 7.4 | 31430 | 0.0472 |
| 0.4456 | 7.4 | 31440 | 0.0466 |
| 0.0001 | 7.4 | 31450 | 0.0463 |
| 0.0003 | 7.4 | 31460 | 0.0464 |
| 0.0013 | 7.4 | 31470 | 0.0468 |
| 0.0002 | 7.41 | 31480 | 0.0471 |
| 0.2432 | 7.41 | 31490 | 0.0472 |
| 0.0002 | 7.41 | 31500 | 0.0465 |
| 0.0002 | 7.41 | 31510 | 0.0464 |
| 0.0658 | 7.42 | 31520 | 0.0468 |
| 0.0003 | 7.42 | 31530 | 0.0469 |
| 0.5884 | 7.42 | 31540 | 0.0462 |
| 0.0008 | 7.42 | 31550 | 0.0452 |
| 0.0002 | 7.43 | 31560 | 0.0451 |
| 0.0002 | 7.43 | 31570 | 0.0453 |
| 0.0001 | 7.43 | 31580 | 0.0455 |
| 0.0002 | 7.43 | 31590 | 0.0460 |
| 0.0001 | 7.44 | 31600 | 0.0463 |
| 0.2575 | 7.44 | 31610 | 0.0468 |
| 0.3009 | 7.44 | 31620 | 0.0463 |
| 0.0002 | 7.44 | 31630 | 0.0460 |
| 0.2894 | 7.44 | 31640 | 0.0454 |
| 0.0002 | 7.45 | 31650 | 0.0453 |
| 0.0001 | 7.45 | 31660 | 0.0455 |
| 0.0001 | 7.45 | 31670 | 0.0457 |
| 0.0566 | 7.45 | 31680 | 0.0458 |
| 0.0001 | 7.46 | 31690 | 0.0454 |
| 0.0938 | 7.46 | 31700 | 0.0455 |
| 0.2831 | 7.46 | 31710 | 0.0451 |
| 0.0003 | 7.46 | 31720 | 0.0451 |
| 0.0001 | 7.47 | 31730 | 0.0454 |
| 0.0745 | 7.47 | 31740 | 0.0456 |
| 0.0002 | 7.47 | 31750 | 0.0458 |
| 0.1638 | 7.47 | 31760 | 0.0453 |
| 0.0037 | 7.48 | 31770 | 0.0451 |
| 0.0002 | 7.48 | 31780 | 0.0460 |
| 0.5366 | 7.48 | 31790 | 0.0455 |
| 0.2073 | 7.48 | 31800 | 0.0440 |
| 0.3786 | 7.48 | 31810 | 0.0428 |
| 0.0002 | 7.49 | 31820 | 0.0421 |
| 0.0542 | 7.49 | 31830 | 0.0417 |
| 0.0003 | 7.49 | 31840 | 0.0419 |
| 0.0003 | 7.49 | 31850 | 0.0422 |
| 0.1393 | 7.5 | 31860 | 0.0424 |
| 0.0002 | 7.5 | 31870 | 0.0424 |
| 0.0002 | 7.5 | 31880 | 0.0428 |
| 0.0002 | 7.5 | 31890 | 0.0431 |
| 0.0001 | 7.51 | 31900 | 0.0436 |
| 0.0001 | 7.51 | 31910 | 0.0440 |
| 0.0006 | 7.51 | 31920 | 0.0444 |
| 0.0002 | 7.51 | 31930 | 0.0449 |
| 0.6469 | 7.52 | 31940 | 0.0442 |
| 0.0001 | 7.52 | 31950 | 0.0436 |
| 0.0022 | 7.52 | 31960 | 0.0437 |
| 0.0001 | 7.52 | 31970 | 0.0440 |
| 0.2347 | 7.52 | 31980 | 0.0438 |
| 0.0001 | 7.53 | 31990 | 0.0434 |
| 0.5261 | 7.53 | 32000 | 0.0427 |
| 0.2682 | 7.53 | 32010 | 0.0413 |
| 0.0002 | 7.53 | 32020 | 0.0410 |
| 0.001 | 7.54 | 32030 | 0.0417 |
| 0.2112 | 7.54 | 32040 | 0.0418 |
| 0.0148 | 7.54 | 32050 | 0.0423 |
| 0.0002 | 7.54 | 32060 | 0.0427 |
| 0.0003 | 7.55 | 32070 | 0.0437 |
| 0.0003 | 7.55 | 32080 | 0.0450 |
| 0.2589 | 7.55 | 32090 | 0.0455 |
| 0.1033 | 7.55 | 32100 | 0.0445 |
| 0.0002 | 7.56 | 32110 | 0.0442 |
| 0.3051 | 7.56 | 32120 | 0.0439 |
| 0.0003 | 7.56 | 32130 | 0.0449 |
| 0.0454 | 7.56 | 32140 | 0.0448 |
| 0.0001 | 7.56 | 32150 | 0.0446 |
| 0.0001 | 7.57 | 32160 | 0.0447 |
| 0.0001 | 7.57 | 32170 | 0.0448 |
| 0.2985 | 7.57 | 32180 | 0.0451 |
| 0.0001 | 7.57 | 32190 | 0.0452 |
| 0.0001 | 7.58 | 32200 | 0.0454 |
| 0.0913 | 7.58 | 32210 | 0.0441 |
| 0.1468 | 7.58 | 32220 | 0.0433 |
| 0.0005 | 7.58 | 32230 | 0.0431 |
| 0.0869 | 7.59 | 32240 | 0.0430 |
| 0.182 | 7.59 | 32250 | 0.0423 |
| 0.0001 | 7.59 | 32260 | 0.0423 |
| 0.0001 | 7.59 | 32270 | 0.0426 |
| 0.0001 | 7.6 | 32280 | 0.0428 |
| 0.5049 | 7.6 | 32290 | 0.0434 |
| 0.0001 | 7.6 | 32300 | 0.0432 |
| 0.3267 | 7.6 | 32310 | 0.0432 |
| 0.2928 | 7.6 | 32320 | 0.0426 |
| 0.27 | 7.61 | 32330 | 0.0422 |
| 0.0836 | 7.61 | 32340 | 0.0420 |
| 0.3084 | 7.61 | 32350 | 0.0405 |
| 0.1679 | 7.61 | 32360 | 0.0399 |
| 0.2437 | 7.62 | 32370 | 0.0399 |
| 0.0003 | 7.62 | 32380 | 0.0400 |
| 0.0002 | 7.62 | 32390 | 0.0406 |
| 0.2761 | 7.62 | 32400 | 0.0407 |
| 0.0003 | 7.63 | 32410 | 0.0412 |
| 0.0002 | 7.63 | 32420 | 0.0421 |
| 0.0002 | 7.63 | 32430 | 0.0427 |
| 0.0001 | 7.63 | 32440 | 0.0431 |
| 0.0001 | 7.64 | 32450 | 0.0433 |
| 0.2903 | 7.64 | 32460 | 0.0435 |
| 0.0001 | 7.64 | 32470 | 0.0436 |
| 0.0001 | 7.64 | 32480 | 0.0438 |
| 0.0001 | 7.64 | 32490 | 0.0440 |
| 0.0015 | 7.65 | 32500 | 0.0444 |
| 0.0922 | 7.65 | 32510 | 0.0446 |
| 0.0001 | 7.65 | 32520 | 0.0443 |
| 0.0001 | 7.65 | 32530 | 0.0443 |
| 0.0001 | 7.66 | 32540 | 0.0444 |
| 0.0002 | 7.66 | 32550 | 0.0448 |
| 0.0001 | 7.66 | 32560 | 0.0452 |
| 0.0002 | 7.66 | 32570 | 0.0453 |
| 0.0001 | 7.67 | 32580 | 0.0451 |
| 0.0001 | 7.67 | 32590 | 0.0452 |
| 0.2912 | 7.67 | 32600 | 0.0454 |
| 0.0001 | 7.67 | 32610 | 0.0456 |
| 0.0001 | 7.68 | 32620 | 0.0458 |
| 0.0001 | 7.68 | 32630 | 0.0461 |
| 0.2536 | 7.68 | 32640 | 0.0457 |
| 0.2862 | 7.68 | 32650 | 0.0451 |
| 0.231 | 7.68 | 32660 | 0.0434 |
| 0.3068 | 7.69 | 32670 | 0.0427 |
| 0.1166 | 7.69 | 32680 | 0.0421 |
| 0.0002 | 7.69 | 32690 | 0.0419 |
| 0.0003 | 7.69 | 32700 | 0.0419 |
| 0.0001 | 7.7 | 32710 | 0.0423 |
| 0.08 | 7.7 | 32720 | 0.0423 |
| 0.0927 | 7.7 | 32730 | 0.0419 |
| 0.0001 | 7.7 | 32740 | 0.0420 |
| 0.3026 | 7.71 | 32750 | 0.0423 |
| 0.0607 | 7.71 | 32760 | 0.0415 |
| 0.0706 | 7.71 | 32770 | 0.0416 |
| 0.2612 | 7.71 | 32780 | 0.0420 |
| 0.0002 | 7.72 | 32790 | 0.0423 |
| 0.0112 | 7.72 | 32800 | 0.0427 |
| 0.0001 | 7.72 | 32810 | 0.0430 |
| 0.2266 | 7.72 | 32820 | 0.0426 |
| 0.0778 | 7.72 | 32830 | 0.0424 |
| 0.0002 | 7.73 | 32840 | 0.0426 |
| 0.2184 | 7.73 | 32850 | 0.0428 |
| 0.0426 | 7.73 | 32860 | 0.0423 |
| 0.1817 | 7.73 | 32870 | 0.0422 |
| 0.0001 | 7.74 | 32880 | 0.0423 |
| 0.0326 | 7.74 | 32890 | 0.0423 |
| 0.0001 | 7.74 | 32900 | 0.0425 |
| 0.1868 | 7.74 | 32910 | 0.0429 |
| 0.055 | 7.75 | 32920 | 0.0432 |
| 0.3645 | 7.75 | 32930 | 0.0428 |
| 0.0001 | 7.75 | 32940 | 0.0428 |
| 0.0361 | 7.75 | 32950 | 0.0437 |
| 0.0001 | 7.76 | 32960 | 0.0444 |
| 0.0001 | 7.76 | 32970 | 0.0449 |
| 0.0601 | 7.76 | 32980 | 0.0451 |
| 0.0032 | 7.76 | 32990 | 0.0453 |
| 0.2311 | 7.76 | 33000 | 0.0446 |
| 0.0001 | 7.77 | 33010 | 0.0443 |
| 0.2988 | 7.77 | 33020 | 0.0430 |
| 0.0001 | 7.77 | 33030 | 0.0425 |
| 0.0002 | 7.77 | 33040 | 0.0426 |
| 0.0001 | 7.78 | 33050 | 0.0428 |
| 0.0001 | 7.78 | 33060 | 0.0431 |
| 0.0007 | 7.78 | 33070 | 0.0435 |
| 0.0096 | 7.78 | 33080 | 0.0453 |
| 0.0153 | 7.79 | 33090 | 0.0471 |
| 0.0001 | 7.79 | 33100 | 0.0479 |
| 0.0001 | 7.79 | 33110 | 0.0483 |
| 0.0001 | 7.79 | 33120 | 0.0486 |
| 0.0001 | 7.8 | 33130 | 0.0488 |
| 0.3521 | 7.8 | 33140 | 0.0474 |
| 0.0001 | 7.8 | 33150 | 0.0468 |
| 0.3021 | 7.8 | 33160 | 0.0456 |
| 0.0001 | 7.8 | 33170 | 0.0452 |
| 0.0004 | 7.81 | 33180 | 0.0455 |
| 0.0001 | 7.81 | 33190 | 0.0460 |
| 0.0002 | 7.81 | 33200 | 0.0463 |
| 0.0681 | 7.81 | 33210 | 0.0465 |
| 0.0001 | 7.82 | 33220 | 0.0467 |
| 0.2901 | 7.82 | 33230 | 0.0463 |
| 0.0001 | 7.82 | 33240 | 0.0460 |
| 0.0001 | 7.82 | 33250 | 0.0460 |
| 0.0001 | 7.83 | 33260 | 0.0461 |
| 0.0024 | 7.83 | 33270 | 0.0464 |
| 0.2698 | 7.83 | 33280 | 0.0465 |
| 0.1013 | 7.83 | 33290 | 0.0453 |
| 0.0947 | 7.84 | 33300 | 0.0443 |
| 0.0002 | 7.84 | 33310 | 0.0438 |
| 0.0001 | 7.84 | 33320 | 0.0437 |
| 0.0001 | 7.84 | 33330 | 0.0439 |
| 0.2219 | 7.84 | 33340 | 0.0432 |
| 0.0001 | 7.85 | 33350 | 0.0431 |
| 0.0001 | 7.85 | 33360 | 0.0431 |
| 0.1106 | 7.85 | 33370 | 0.0427 |
| 0.0001 | 7.85 | 33380 | 0.0426 |
| 0.0927 | 7.86 | 33390 | 0.0428 |
| 0.0001 | 7.86 | 33400 | 0.0434 |
| 0.0001 | 7.86 | 33410 | 0.0441 |
| 0.2002 | 7.86 | 33420 | 0.0440 |
| 0.2883 | 7.87 | 33430 | 0.0432 |
| 0.0001 | 7.87 | 33440 | 0.0425 |
| 0.4161 | 7.87 | 33450 | 0.0419 |
| 0.0002 | 7.87 | 33460 | 0.0419 |
| 0.0034 | 7.88 | 33470 | 0.0419 |
| 0.0261 | 7.88 | 33480 | 0.0428 |
| 0.0336 | 7.88 | 33490 | 0.0434 |
| 0.0001 | 7.88 | 33500 | 0.0439 |
| 0.2654 | 7.88 | 33510 | 0.0438 |
| 0.1588 | 7.89 | 33520 | 0.0437 |
| 0.1833 | 7.89 | 33530 | 0.0434 |
| 0.0009 | 7.89 | 33540 | 0.0447 |
| 0.0093 | 7.89 | 33550 | 0.0464 |
| 0.0001 | 7.9 | 33560 | 0.0468 |
| 0.0451 | 7.9 | 33570 | 0.0466 |
| 0.3119 | 7.9 | 33580 | 0.0474 |
| 0.0001 | 7.9 | 33590 | 0.0476 |
| 0.0001 | 7.91 | 33600 | 0.0478 |
| 0.4698 | 7.91 | 33610 | 0.0474 |
| 0.0001 | 7.91 | 33620 | 0.0465 |
| 0.0004 | 7.91 | 33630 | 0.0463 |
| 0.0408 | 7.92 | 33640 | 0.0463 |
| 0.4168 | 7.92 | 33650 | 0.0459 |
| 0.0111 | 7.92 | 33660 | 0.0458 |
| 0.1738 | 7.92 | 33670 | 0.0451 |
| 0.0001 | 7.92 | 33680 | 0.0449 |
| 0.0002 | 7.93 | 33690 | 0.0452 |
| 0.6619 | 7.93 | 33700 | 0.0436 |
| 0.0002 | 7.93 | 33710 | 0.0432 |
| 0.021 | 7.93 | 33720 | 0.0439 |
| 0.0233 | 7.94 | 33730 | 0.0449 |
| 0.0002 | 7.94 | 33740 | 0.0458 |
| 0.0002 | 7.94 | 33750 | 0.0464 |
| 0.0302 | 7.94 | 33760 | 0.0464 |
| 0.0808 | 7.95 | 33770 | 0.0455 |
| 0.0108 | 7.95 | 33780 | 0.0447 |
| 0.2602 | 7.95 | 33790 | 0.0442 |
| 0.2344 | 7.95 | 33800 | 0.0426 |
| 0.0009 | 7.96 | 33810 | 0.0424 |
| 0.0002 | 7.96 | 33820 | 0.0425 |
| 0.2807 | 7.96 | 33830 | 0.0419 |
| 0.0003 | 7.96 | 33840 | 0.0416 |
| 0.2693 | 7.96 | 33850 | 0.0417 |
| 0.0002 | 7.97 | 33860 | 0.0412 |
| 0.0572 | 7.97 | 33870 | 0.0411 |
| 0.1377 | 7.97 | 33880 | 0.0416 |
| 0.0003 | 7.97 | 33890 | 0.0418 |
| 0.0002 | 7.98 | 33900 | 0.0421 |
| 0.2992 | 7.98 | 33910 | 0.0419 |
| 0.0002 | 7.98 | 33920 | 0.0420 |
| 0.0005 | 7.98 | 33930 | 0.0421 |
| 0.0001 | 7.99 | 33940 | 0.0423 |
| 0.3439 | 7.99 | 33950 | 0.0423 |
| 0.0001 | 7.99 | 33960 | 0.0424 |
| 0.0001 | 7.99 | 33970 | 0.0426 |
| 0.0001 | 8.0 | 33980 | 0.0428 |
| 0.0002 | 8.0 | 33990 | 0.0430 |
| 0.2585 | 8.0 | 34000 | 0.0429 |
| 0.0001 | 8.0 | 34010 | 0.0419 |
| 0.2006 | 8.0 | 34020 | 0.0408 |
| 0.0001 | 8.01 | 34030 | 0.0404 |
| 0.2378 | 8.01 | 34040 | 0.0398 |
| 0.0006 | 8.01 | 34050 | 0.0399 |
| 0.0001 | 8.01 | 34060 | 0.0404 |
| 0.0001 | 8.02 | 34070 | 0.0408 |
| 0.269 | 8.02 | 34080 | 0.0403 |
| 0.0001 | 8.02 | 34090 | 0.0402 |
| 0.0001 | 8.02 | 34100 | 0.0402 |
| 0.0642 | 8.03 | 34110 | 0.0399 |
| 0.0001 | 8.03 | 34120 | 0.0399 |
| 0.0001 | 8.03 | 34130 | 0.0400 |
| 0.0002 | 8.03 | 34140 | 0.0403 |
| 0.0001 | 8.04 | 34150 | 0.0405 |
| 0.0008 | 8.04 | 34160 | 0.0407 |
| 0.0001 | 8.04 | 34170 | 0.0413 |
| 0.0001 | 8.04 | 34180 | 0.0418 |
| 0.2442 | 8.04 | 34190 | 0.0436 |
| 0.2383 | 8.05 | 34200 | 0.0443 |
| 0.2858 | 8.05 | 34210 | 0.0441 |
| 0.005 | 8.05 | 34220 | 0.0437 |
| 0.2621 | 8.05 | 34230 | 0.0438 |
| 0.0001 | 8.06 | 34240 | 0.0437 |
| 0.2453 | 8.06 | 34250 | 0.0425 |
| 0.2803 | 8.06 | 34260 | 0.0419 |
| 0.0002 | 8.06 | 34270 | 0.0417 |
| 0.1479 | 8.07 | 34280 | 0.0416 |
| 0.0002 | 8.07 | 34290 | 0.0414 |
| 0.2963 | 8.07 | 34300 | 0.0410 |
| 0.3575 | 8.07 | 34310 | 0.0399 |
| 0.0002 | 8.08 | 34320 | 0.0398 |
| 0.0019 | 8.08 | 34330 | 0.0403 |
| 0.3872 | 8.08 | 34340 | 0.0399 |
| 0.3183 | 8.08 | 34350 | 0.0391 |
| 0.0089 | 8.08 | 34360 | 0.0385 |
| 0.0002 | 8.09 | 34370 | 0.0386 |
| 0.0002 | 8.09 | 34380 | 0.0389 |
| 0.0948 | 8.09 | 34390 | 0.0391 |
| 0.0002 | 8.09 | 34400 | 0.0394 |
| 0.0425 | 8.1 | 34410 | 0.0398 |
| 0.2926 | 8.1 | 34420 | 0.0396 |
| 0.2485 | 8.1 | 34430 | 0.0386 |
| 0.3238 | 8.1 | 34440 | 0.0379 |
| 0.0001 | 8.11 | 34450 | 0.0375 |
| 0.0003 | 8.11 | 34460 | 0.0380 |
| 0.0001 | 8.11 | 34470 | 0.0386 |
| 0.2885 | 8.11 | 34480 | 0.0389 |
| 0.1646 | 8.12 | 34490 | 0.0387 |
| 0.2171 | 8.12 | 34500 | 0.0379 |
| 0.0756 | 8.12 | 34510 | 0.0376 |
| 0.0002 | 8.12 | 34520 | 0.0376 |
| 0.2224 | 8.12 | 34530 | 0.0372 |
| 0.0002 | 8.13 | 34540 | 0.0373 |
| 0.0005 | 8.13 | 34550 | 0.0379 |
| 0.1942 | 8.13 | 34560 | 0.0383 |
| 0.0611 | 8.13 | 34570 | 0.0395 |
| 0.0003 | 8.14 | 34580 | 0.0406 |
| 0.2569 | 8.14 | 34590 | 0.0406 |
| 0.2877 | 8.14 | 34600 | 0.0398 |
| 0.0008 | 8.14 | 34610 | 0.0397 |
| 0.0004 | 8.15 | 34620 | 0.0400 |
| 0.3037 | 8.15 | 34630 | 0.0403 |
| 0.0002 | 8.15 | 34640 | 0.0405 |
| 0.1908 | 8.15 | 34650 | 0.0401 |
| 0.0002 | 8.16 | 34660 | 0.0399 |
| 0.0001 | 8.16 | 34670 | 0.0401 |
| 0.0001 | 8.16 | 34680 | 0.0403 |
| 0.0002 | 8.16 | 34690 | 0.0406 |
| 0.0001 | 8.16 | 34700 | 0.0410 |
| 0.001 | 8.17 | 34710 | 0.0414 |
| 0.0002 | 8.17 | 34720 | 0.0425 |
| 0.0004 | 8.17 | 34730 | 0.0443 |
| 0.3249 | 8.17 | 34740 | 0.0448 |
| 0.3086 | 8.18 | 34750 | 0.0438 |
| 0.0001 | 8.18 | 34760 | 0.0432 |
| 0.2629 | 8.18 | 34770 | 0.0422 |
| 0.0006 | 8.18 | 34780 | 0.0416 |
| 0.2704 | 8.19 | 34790 | 0.0404 |
| 0.0806 | 8.19 | 34800 | 0.0401 |
| 0.0002 | 8.19 | 34810 | 0.0402 |
| 0.0002 | 8.19 | 34820 | 0.0405 |
| 0.0003 | 8.2 | 34830 | 0.0416 |
| 0.0002 | 8.2 | 34840 | 0.0426 |
| 0.0001 | 8.2 | 34850 | 0.0431 |
| 0.3502 | 8.2 | 34860 | 0.0409 |
| 0.0015 | 8.2 | 34870 | 0.0399 |
| 0.0001 | 8.21 | 34880 | 0.0401 |
| 0.0929 | 8.21 | 34890 | 0.0398 |
| 0.0004 | 8.21 | 34900 | 0.0396 |
| 0.0001 | 8.21 | 34910 | 0.0397 |
| 0.0715 | 8.22 | 34920 | 0.0400 |
| 0.0006 | 8.22 | 34930 | 0.0403 |
| 0.0639 | 8.22 | 34940 | 0.0408 |
| 0.0001 | 8.22 | 34950 | 0.0412 |
| 0.0623 | 8.23 | 34960 | 0.0416 |
| 0.2742 | 8.23 | 34970 | 0.0421 |
| 0.0893 | 8.23 | 34980 | 0.0427 |
| 0.0016 | 8.23 | 34990 | 0.0432 |
| 0.336 | 8.24 | 35000 | 0.0433 |
| 0.2583 | 8.24 | 35010 | 0.0424 |
| 0.0912 | 8.24 | 35020 | 0.0425 |
| 0.2893 | 8.24 | 35030 | 0.0431 |
| 0.3202 | 8.24 | 35040 | 0.0405 |
| 0.0533 | 8.25 | 35050 | 0.0396 |
| 0.0001 | 8.25 | 35060 | 0.0395 |
| 0.0001 | 8.25 | 35070 | 0.0395 |
| 0.0017 | 8.25 | 35080 | 0.0400 |
| 0.0013 | 8.26 | 35090 | 0.0407 |
| 0.2515 | 8.26 | 35100 | 0.0409 |
| 0.0924 | 8.26 | 35110 | 0.0409 |
| 0.1972 | 8.26 | 35120 | 0.0406 |
| 0.2751 | 8.27 | 35130 | 0.0393 |
| 0.0001 | 8.27 | 35140 | 0.0387 |
| 0.0293 | 8.27 | 35150 | 0.0387 |
| 0.007 | 8.27 | 35160 | 0.0392 |
| 0.0002 | 8.28 | 35170 | 0.0396 |
| 0.0002 | 8.28 | 35180 | 0.0401 |
| 0.0015 | 8.28 | 35190 | 0.0408 |
| 0.0001 | 8.28 | 35200 | 0.0412 |
| 0.0343 | 8.28 | 35210 | 0.0416 |
| 0.0001 | 8.29 | 35220 | 0.0419 |
| 0.0001 | 8.29 | 35230 | 0.0422 |
| 0.0002 | 8.29 | 35240 | 0.0422 |
| 0.2514 | 8.29 | 35250 | 0.0418 |
| 0.1138 | 8.3 | 35260 | 0.0411 |
| 0.0001 | 8.3 | 35270 | 0.0404 |
| 0.0001 | 8.3 | 35280 | 0.0404 |
| 0.4437 | 8.3 | 35290 | 0.0383 |
| 0.0002 | 8.31 | 35300 | 0.0375 |
| 0.1598 | 8.31 | 35310 | 0.0369 |
| 0.0001 | 8.31 | 35320 | 0.0366 |
| 0.0833 | 8.31 | 35330 | 0.0366 |
| 0.0001 | 8.32 | 35340 | 0.0360 |
| 0.0089 | 8.32 | 35350 | 0.0360 |
| 0.0086 | 8.32 | 35360 | 0.0367 |
| 0.0001 | 8.32 | 35370 | 0.0371 |
| 0.0001 | 8.32 | 35380 | 0.0375 |
| 0.0001 | 8.33 | 35390 | 0.0378 |
| 0.02 | 8.33 | 35400 | 0.0377 |
| 0.079 | 8.33 | 35410 | 0.0382 |
| 0.0007 | 8.33 | 35420 | 0.0390 |
| 0.0001 | 8.34 | 35430 | 0.0398 |
| 0.7161 | 8.34 | 35440 | 0.0392 |
| 0.0001 | 8.34 | 35450 | 0.0384 |
| 0.0005 | 8.34 | 35460 | 0.0390 |
| 0.0006 | 8.35 | 35470 | 0.0399 |
| 0.0001 | 8.35 | 35480 | 0.0404 |
| 0.0001 | 8.35 | 35490 | 0.0407 |
| 0.2055 | 8.35 | 35500 | 0.0401 |
| 0.018 | 8.36 | 35510 | 0.0408 |
| 0.0001 | 8.36 | 35520 | 0.0419 |
| 0.0001 | 8.36 | 35530 | 0.0423 |
| 0.0001 | 8.36 | 35540 | 0.0426 |
| 0.0002 | 8.36 | 35550 | 0.0427 |
| 0.1918 | 8.37 | 35560 | 0.0418 |
| 0.3615 | 8.37 | 35570 | 0.0399 |
| 0.0904 | 8.37 | 35580 | 0.0387 |
| 0.0001 | 8.37 | 35590 | 0.0384 |
| 0.3267 | 8.38 | 35600 | 0.0382 |
| 0.3739 | 8.38 | 35610 | 0.0374 |
| 0.0007 | 8.38 | 35620 | 0.0366 |
| 0.0003 | 8.38 | 35630 | 0.0365 |
| 0.0002 | 8.39 | 35640 | 0.0367 |
| 0.0004 | 8.39 | 35650 | 0.0373 |
| 0.0001 | 8.39 | 35660 | 0.0378 |
| 0.0574 | 8.39 | 35670 | 0.0394 |
| 0.0001 | 8.4 | 35680 | 0.0402 |
| 0.0001 | 8.4 | 35690 | 0.0406 |
| 0.1351 | 8.4 | 35700 | 0.0405 |
| 0.252 | 8.4 | 35710 | 0.0398 |
| 0.1947 | 8.4 | 35720 | 0.0379 |
| 0.2186 | 8.41 | 35730 | 0.0366 |
| 0.0001 | 8.41 | 35740 | 0.0368 |
| 0.0001 | 8.41 | 35750 | 0.0371 |
| 0.0001 | 8.41 | 35760 | 0.0375 |
| 0.0163 | 8.42 | 35770 | 0.0384 |
| 0.1622 | 8.42 | 35780 | 0.0394 |
| 0.0002 | 8.42 | 35790 | 0.0400 |
| 0.3319 | 8.42 | 35800 | 0.0403 |
| 0.0001 | 8.43 | 35810 | 0.0400 |
| 0.1684 | 8.43 | 35820 | 0.0397 |
| 0.0003 | 8.43 | 35830 | 0.0405 |
| 0.5612 | 8.43 | 35840 | 0.0406 |
| 0.0002 | 8.44 | 35850 | 0.0404 |
| 0.0004 | 8.44 | 35860 | 0.0404 |
| 0.2143 | 8.44 | 35870 | 0.0392 |
| 0.0001 | 8.44 | 35880 | 0.0386 |
| 0.0002 | 8.44 | 35890 | 0.0387 |
| 0.0012 | 8.45 | 35900 | 0.0407 |
| 0.0547 | 8.45 | 35910 | 0.0418 |
| 0.1575 | 8.45 | 35920 | 0.0412 |
| 0.0001 | 8.45 | 35930 | 0.0410 |
| 0.0032 | 8.46 | 35940 | 0.0426 |
| 0.0001 | 8.46 | 35950 | 0.0436 |
| 0.0008 | 8.46 | 35960 | 0.0457 |
| 0.2955 | 8.46 | 35970 | 0.0462 |
| 0.0001 | 8.47 | 35980 | 0.0466 |
| 0.0385 | 8.47 | 35990 | 0.0472 |
| 0.5301 | 8.47 | 36000 | 0.0459 |
| 0.1991 | 8.47 | 36010 | 0.0444 |
| 0.1959 | 8.48 | 36020 | 0.0438 |
| 0.0002 | 8.48 | 36030 | 0.0435 |
| 0.0004 | 8.48 | 36040 | 0.0443 |
| 0.2344 | 8.48 | 36050 | 0.0441 |
| 0.3417 | 8.48 | 36060 | 0.0425 |
| 0.0004 | 8.49 | 36070 | 0.0416 |
| 0.0001 | 8.49 | 36080 | 0.0419 |
| 0.0001 | 8.49 | 36090 | 0.0421 |
| 0.002 | 8.49 | 36100 | 0.0425 |
| 0.0001 | 8.5 | 36110 | 0.0430 |
| 0.0001 | 8.5 | 36120 | 0.0433 |
| 0.0001 | 8.5 | 36130 | 0.0435 |
| 0.0001 | 8.5 | 36140 | 0.0437 |
| 0.0176 | 8.51 | 36150 | 0.0441 |
| 0.0001 | 8.51 | 36160 | 0.0444 |
| 0.0001 | 8.51 | 36170 | 0.0446 |
| 0.2364 | 8.51 | 36180 | 0.0445 |
| 0.0001 | 8.52 | 36190 | 0.0438 |
| 0.0002 | 8.52 | 36200 | 0.0439 |
| 0.0001 | 8.52 | 36210 | 0.0442 |
| 0.0411 | 8.52 | 36220 | 0.0444 |
| 0.206 | 8.52 | 36230 | 0.0438 |
| 0.3168 | 8.53 | 36240 | 0.0436 |
| 0.0001 | 8.53 | 36250 | 0.0433 |
| 0.0001 | 8.53 | 36260 | 0.0434 |
| 0.0001 | 8.53 | 36270 | 0.0435 |
| 0.0001 | 8.54 | 36280 | 0.0437 |
| 0.2149 | 8.54 | 36290 | 0.0432 |
| 0.0001 | 8.54 | 36300 | 0.0421 |
| 0.5274 | 8.54 | 36310 | 0.0402 |
| 0.0004 | 8.55 | 36320 | 0.0389 |
| 0.0001 | 8.55 | 36330 | 0.0390 |
| 0.0001 | 8.55 | 36340 | 0.0393 |
| 0.0001 | 8.55 | 36350 | 0.0396 |
| 0.0157 | 8.56 | 36360 | 0.0399 |
| 0.0001 | 8.56 | 36370 | 0.0398 |
| 0.0001 | 8.56 | 36380 | 0.0399 |
| 0.0002 | 8.56 | 36390 | 0.0401 |
| 0.2385 | 8.56 | 36400 | 0.0396 |
| 0.2908 | 8.57 | 36410 | 0.0386 |
| 0.0001 | 8.57 | 36420 | 0.0384 |
| 0.0001 | 8.57 | 36430 | 0.0384 |
| 0.5473 | 8.57 | 36440 | 0.0376 |
| 0.0012 | 8.58 | 36450 | 0.0381 |
| 0.0003 | 8.58 | 36460 | 0.0389 |
| 0.0201 | 8.58 | 36470 | 0.0397 |
| 0.0107 | 8.58 | 36480 | 0.0413 |
| 0.0001 | 8.59 | 36490 | 0.0422 |
| 0.0001 | 8.59 | 36500 | 0.0428 |
| 0.0001 | 8.59 | 36510 | 0.0432 |
| 0.3476 | 8.59 | 36520 | 0.0427 |
| 0.4204 | 8.6 | 36530 | 0.0404 |
| 0.0003 | 8.6 | 36540 | 0.0391 |
| 0.0001 | 8.6 | 36550 | 0.0383 |
| 0.0001 | 8.6 | 36560 | 0.0383 |
| 0.0001 | 8.6 | 36570 | 0.0384 |
| 0.0072 | 8.61 | 36580 | 0.0387 |
| 0.6604 | 8.61 | 36590 | 0.0377 |
| 0.037 | 8.61 | 36600 | 0.0372 |
| 0.0433 | 8.61 | 36610 | 0.0376 |
| 0.0014 | 8.62 | 36620 | 0.0379 |
| 0.0054 | 8.62 | 36630 | 0.0383 |
| 0.0001 | 8.62 | 36640 | 0.0383 |
| 0.1613 | 8.62 | 36650 | 0.0379 |
| 0.0001 | 8.63 | 36660 | 0.0377 |
| 0.2557 | 8.63 | 36670 | 0.0372 |
| 0.2808 | 8.63 | 36680 | 0.0367 |
| 0.4026 | 8.63 | 36690 | 0.0357 |
| 0.0042 | 8.64 | 36700 | 0.0354 |
| 0.2645 | 8.64 | 36710 | 0.0353 |
| 0.0005 | 8.64 | 36720 | 0.0354 |
| 0.0088 | 8.64 | 36730 | 0.0364 |
| 0.0034 | 8.64 | 36740 | 0.0370 |
| 0.0003 | 8.65 | 36750 | 0.0377 |
| 0.2634 | 8.65 | 36760 | 0.0383 |
| 0.0001 | 8.65 | 36770 | 0.0382 |
| 0.0001 | 8.65 | 36780 | 0.0383 |
| 0.0001 | 8.66 | 36790 | 0.0384 |
| 0.2511 | 8.66 | 36800 | 0.0388 |
| 0.0015 | 8.66 | 36810 | 0.0387 |
| 0.0001 | 8.66 | 36820 | 0.0393 |
| 0.0001 | 8.67 | 36830 | 0.0396 |
| 0.0001 | 8.67 | 36840 | 0.0398 |
| 0.3592 | 8.67 | 36850 | 0.0388 |
| 0.0001 | 8.67 | 36860 | 0.0382 |
| 0.0001 | 8.68 | 36870 | 0.0382 |
| 0.0004 | 8.68 | 36880 | 0.0389 |
| 0.2756 | 8.68 | 36890 | 0.0388 |
| 0.0001 | 8.68 | 36900 | 0.0386 |
| 0.0005 | 8.68 | 36910 | 0.0396 |
| 0.0001 | 8.69 | 36920 | 0.0402 |
| 0.0001 | 8.69 | 36930 | 0.0405 |
| 0.0001 | 8.69 | 36940 | 0.0407 |
| 0.0001 | 8.69 | 36950 | 0.0409 |
| 0.0002 | 8.7 | 36960 | 0.0413 |
| 0.0001 | 8.7 | 36970 | 0.0417 |
| 0.0406 | 8.7 | 36980 | 0.0419 |
| 0.0002 | 8.7 | 36990 | 0.0429 |
| 0.107 | 8.71 | 37000 | 0.0439 |
| 0.0001 | 8.71 | 37010 | 0.0438 |
| 0.0001 | 8.71 | 37020 | 0.0439 |
| 0.2828 | 8.71 | 37030 | 0.0433 |
| 0.0052 | 8.72 | 37040 | 0.0427 |
| 0.0002 | 8.72 | 37050 | 0.0426 |
| 0.037 | 8.72 | 37060 | 0.0431 |
| 0.0076 | 8.72 | 37070 | 0.0442 |
| 0.2992 | 8.72 | 37080 | 0.0447 |
| 0.0001 | 8.73 | 37090 | 0.0448 |
| 0.1432 | 8.73 | 37100 | 0.0445 |
| 0.2799 | 8.73 | 37110 | 0.0452 |
| 0.2694 | 8.73 | 37120 | 0.0451 |
| 0.6125 | 8.74 | 37130 | 0.0436 |
| 0.3732 | 8.74 | 37140 | 0.0424 |
| 0.5751 | 8.74 | 37150 | 0.0390 |
| 0.5177 | 8.74 | 37160 | 0.0366 |
| 0.0003 | 8.75 | 37170 | 0.0353 |
| 0.0007 | 8.75 | 37180 | 0.0352 |
| 0.0012 | 8.75 | 37190 | 0.0360 |
| 0.0014 | 8.75 | 37200 | 0.0380 |
| 0.2907 | 8.76 | 37210 | 0.0393 |
| 0.0002 | 8.76 | 37220 | 0.0400 |
| 0.2382 | 8.76 | 37230 | 0.0401 |
| 0.0002 | 8.76 | 37240 | 0.0394 |
| 0.0002 | 8.76 | 37250 | 0.0395 |
| 0.0424 | 8.77 | 37260 | 0.0391 |
| 0.0001 | 8.77 | 37270 | 0.0391 |
| 0.0001 | 8.77 | 37280 | 0.0392 |
| 0.0001 | 8.77 | 37290 | 0.0394 |
| 0.0002 | 8.78 | 37300 | 0.0394 |
| 0.2049 | 8.78 | 37310 | 0.0386 |
| 0.1225 | 8.78 | 37320 | 0.0377 |
| 0.3037 | 8.78 | 37330 | 0.0371 |
| 0.2534 | 8.79 | 37340 | 0.0370 |
| 0.0925 | 8.79 | 37350 | 0.0366 |
| 0.0005 | 8.79 | 37360 | 0.0362 |
| 0.0001 | 8.79 | 37370 | 0.0362 |
| 0.0003 | 8.8 | 37380 | 0.0366 |
| 0.0001 | 8.8 | 37390 | 0.0371 |
| 0.2672 | 8.8 | 37400 | 0.0366 |
| 0.0002 | 8.8 | 37410 | 0.0367 |
| 0.0005 | 8.8 | 37420 | 0.0372 |
| 0.1205 | 8.81 | 37430 | 0.0367 |
| 0.0002 | 8.81 | 37440 | 0.0367 |
| 0.0002 | 8.81 | 37450 | 0.0371 |
| 0.0003 | 8.81 | 37460 | 0.0375 |
| 0.033 | 8.82 | 37470 | 0.0378 |
| 0.0001 | 8.82 | 37480 | 0.0388 |
| 0.0001 | 8.82 | 37490 | 0.0393 |
| 0.0001 | 8.82 | 37500 | 0.0397 |
| 0.0001 | 8.83 | 37510 | 0.0400 |
| 0.6351 | 8.83 | 37520 | 0.0396 |
| 0.2934 | 8.83 | 37530 | 0.0380 |
| 0.0002 | 8.83 | 37540 | 0.0375 |
| 0.0001 | 8.84 | 37550 | 0.0375 |
| 0.0001 | 8.84 | 37560 | 0.0377 |
| 0.4863 | 8.84 | 37570 | 0.0372 |
| 0.0002 | 8.84 | 37580 | 0.0362 |
| 0.0002 | 8.84 | 37590 | 0.0360 |
| 0.0002 | 8.85 | 37600 | 0.0361 |
| 0.2383 | 8.85 | 37610 | 0.0357 |
| 0.0953 | 8.85 | 37620 | 0.0351 |
| 0.0897 | 8.85 | 37630 | 0.0351 |
| 0.2677 | 8.86 | 37640 | 0.0351 |
| 0.0002 | 8.86 | 37650 | 0.0352 |
| 0.0338 | 8.86 | 37660 | 0.0352 |
| 0.0009 | 8.86 | 37670 | 0.0349 |
| 0.0586 | 8.87 | 37680 | 0.0350 |
| 0.0115 | 8.87 | 37690 | 0.0349 |
| 0.0001 | 8.87 | 37700 | 0.0350 |
| 0.0013 | 8.87 | 37710 | 0.0351 |
| 0.0002 | 8.88 | 37720 | 0.0353 |
| 0.0003 | 8.88 | 37730 | 0.0355 |
| 0.184 | 8.88 | 37740 | 0.0355 |
| 0.0001 | 8.88 | 37750 | 0.0350 |
| 0.0004 | 8.88 | 37760 | 0.0353 |
| 0.0001 | 8.89 | 37770 | 0.0357 |
| 0.0001 | 8.89 | 37780 | 0.0360 |
| 0.0001 | 8.89 | 37790 | 0.0362 |
| 0.0 | 8.89 | 37800 | 0.0363 |
| 0.0001 | 8.9 | 37810 | 0.0365 |
| 0.2459 | 8.9 | 37820 | 0.0363 |
| 0.6861 | 8.9 | 37830 | 0.0348 |
| 0.0639 | 8.9 | 37840 | 0.0341 |
| 0.2465 | 8.91 | 37850 | 0.0333 |
| 0.0603 | 8.91 | 37860 | 0.0330 |
| 0.0002 | 8.91 | 37870 | 0.0331 |
| 0.0004 | 8.91 | 37880 | 0.0333 |
| 0.0004 | 8.92 | 37890 | 0.0335 |
| 0.0002 | 8.92 | 37900 | 0.0340 |
| 0.0001 | 8.92 | 37910 | 0.0343 |
| 0.0194 | 8.92 | 37920 | 0.0343 |
| 0.0001 | 8.92 | 37930 | 0.0344 |
| 0.0001 | 8.93 | 37940 | 0.0345 |
| 0.2706 | 8.93 | 37950 | 0.0341 |
| 0.0001 | 8.93 | 37960 | 0.0340 |
| 0.0001 | 8.93 | 37970 | 0.0341 |
| 0.2992 | 8.94 | 37980 | 0.0338 |
| 0.0002 | 8.94 | 37990 | 0.0338 |
| 0.0603 | 8.94 | 38000 | 0.0340 |
| 0.0493 | 8.94 | 38010 | 0.0345 |
| 0.0001 | 8.95 | 38020 | 0.0349 |
| 0.0001 | 8.95 | 38030 | 0.0351 |
| 0.0001 | 8.95 | 38040 | 0.0353 |
| 0.261 | 8.95 | 38050 | 0.0351 |
| 0.0001 | 8.96 | 38060 | 0.0351 |
| 0.176 | 8.96 | 38070 | 0.0342 |
| 0.0009 | 8.96 | 38080 | 0.0341 |
| 0.0001 | 8.96 | 38090 | 0.0342 |
| 0.0001 | 8.96 | 38100 | 0.0343 |
| 0.0001 | 8.97 | 38110 | 0.0345 |
| 0.0002 | 8.97 | 38120 | 0.0348 |
| 0.0002 | 8.97 | 38130 | 0.0354 |
| 0.0001 | 8.97 | 38140 | 0.0358 |
| 0.2291 | 8.98 | 38150 | 0.0354 |
| 0.0001 | 8.98 | 38160 | 0.0349 |
| 0.0001 | 8.98 | 38170 | 0.0348 |
| 0.006 | 8.98 | 38180 | 0.0351 |
| 0.0011 | 8.99 | 38190 | 0.0359 |
| 0.0001 | 8.99 | 38200 | 0.0365 |
| 0.0002 | 8.99 | 38210 | 0.0366 |
| 0.3022 | 8.99 | 38220 | 0.0364 |
| 0.0001 | 9.0 | 38230 | 0.0365 |
| 0.0001 | 9.0 | 38240 | 0.0367 |
| 0.0001 | 9.0 | 38250 | 0.0369 |
| 0.0001 | 9.0 | 38260 | 0.0370 |
| 0.0001 | 9.0 | 38270 | 0.0372 |
| 0.0001 | 9.01 | 38280 | 0.0374 |
| 0.339 | 9.01 | 38290 | 0.0377 |
| 0.0013 | 9.01 | 38300 | 0.0385 |
| 0.0005 | 9.01 | 38310 | 0.0394 |
| 0.0012 | 9.02 | 38320 | 0.0397 |
| 0.0001 | 9.02 | 38330 | 0.0401 |
| 0.3049 | 9.02 | 38340 | 0.0410 |
| 0.227 | 9.02 | 38350 | 0.0411 |
| 0.008 | 9.03 | 38360 | 0.0407 |
| 0.2536 | 9.03 | 38370 | 0.0401 |
| 0.0049 | 9.03 | 38380 | 0.0395 |
| 0.1872 | 9.03 | 38390 | 0.0387 |
| 0.2345 | 9.04 | 38400 | 0.0364 |
| 0.0114 | 9.04 | 38410 | 0.0355 |
| 0.2417 | 9.04 | 38420 | 0.0352 |
| 0.0039 | 9.04 | 38430 | 0.0351 |
| 0.2881 | 9.04 | 38440 | 0.0351 |
| 0.0002 | 9.05 | 38450 | 0.0353 |
| 0.0001 | 9.05 | 38460 | 0.0358 |
| 0.0002 | 9.05 | 38470 | 0.0363 |
| 0.2365 | 9.05 | 38480 | 0.0365 |
| 0.0001 | 9.06 | 38490 | 0.0365 |
| 0.0001 | 9.06 | 38500 | 0.0366 |
| 0.2932 | 9.06 | 38510 | 0.0361 |
| 0.0001 | 9.06 | 38520 | 0.0359 |
| 0.0002 | 9.07 | 38530 | 0.0360 |
| 0.2921 | 9.07 | 38540 | 0.0352 |
| 0.0002 | 9.07 | 38550 | 0.0350 |
| 0.1025 | 9.07 | 38560 | 0.0347 |
| 0.2787 | 9.08 | 38570 | 0.0342 |
| 0.0019 | 9.08 | 38580 | 0.0344 |
| 0.2518 | 9.08 | 38590 | 0.0349 |
| 0.0002 | 9.08 | 38600 | 0.0354 |
| 0.0008 | 9.08 | 38610 | 0.0360 |
| 0.2349 | 9.09 | 38620 | 0.0361 |
| 0.3121 | 9.09 | 38630 | 0.0373 |
| 0.0002 | 9.09 | 38640 | 0.0377 |
| 0.0002 | 9.09 | 38650 | 0.0382 |
| 0.0002 | 9.1 | 38660 | 0.0386 |
| 0.1559 | 9.1 | 38670 | 0.0378 |
| 0.0001 | 9.1 | 38680 | 0.0375 |
| 0.0002 | 9.1 | 38690 | 0.0374 |
| 0.0872 | 9.11 | 38700 | 0.0375 |
| 0.0002 | 9.11 | 38710 | 0.0379 |
| 0.0001 | 9.11 | 38720 | 0.0382 |
| 0.0001 | 9.11 | 38730 | 0.0384 |
| 0.0001 | 9.12 | 38740 | 0.0388 |
| 0.0005 | 9.12 | 38750 | 0.0400 |
| 0.0007 | 9.12 | 38760 | 0.0405 |
| 0.0001 | 9.12 | 38770 | 0.0407 |
| 0.0755 | 9.12 | 38780 | 0.0418 |
| 0.0001 | 9.13 | 38790 | 0.0425 |
| 0.0001 | 9.13 | 38800 | 0.0428 |
| 0.4704 | 9.13 | 38810 | 0.0423 |
| 0.0001 | 9.13 | 38820 | 0.0402 |
| 0.287 | 9.14 | 38830 | 0.0394 |
| 0.0001 | 9.14 | 38840 | 0.0390 |
| 0.01 | 9.14 | 38850 | 0.0394 |
| 0.0595 | 9.14 | 38860 | 0.0387 |
| 0.2624 | 9.15 | 38870 | 0.0379 |
| 0.0207 | 9.15 | 38880 | 0.0384 |
| 0.0001 | 9.15 | 38890 | 0.0386 |
| 0.0002 | 9.15 | 38900 | 0.0388 |
| 0.3221 | 9.16 | 38910 | 0.0385 |
| 0.0881 | 9.16 | 38920 | 0.0380 |
| 0.0001 | 9.16 | 38930 | 0.0372 |
| 0.0001 | 9.16 | 38940 | 0.0371 |
| 0.5093 | 9.16 | 38950 | 0.0358 |
| 0.0002 | 9.17 | 38960 | 0.0351 |
| 0.0002 | 9.17 | 38970 | 0.0351 |
| 0.0453 | 9.17 | 38980 | 0.0351 |
| 0.1831 | 9.17 | 38990 | 0.0350 |
| 0.0001 | 9.18 | 39000 | 0.0352 |
| 0.0001 | 9.18 | 39010 | 0.0354 |
| 0.0001 | 9.18 | 39020 | 0.0355 |
| 0.2629 | 9.18 | 39030 | 0.0358 |
| 0.0001 | 9.19 | 39040 | 0.0360 |
| 0.0003 | 9.19 | 39050 | 0.0368 |
| 0.0001 | 9.19 | 39060 | 0.0372 |
| 0.0012 | 9.19 | 39070 | 0.0374 |
| 0.0546 | 9.2 | 39080 | 0.0376 |
| 0.0001 | 9.2 | 39090 | 0.0378 |
| 0.0001 | 9.2 | 39100 | 0.0380 |
| 0.5577 | 9.2 | 39110 | 0.0362 |
| 0.0001 | 9.2 | 39120 | 0.0354 |
| 0.1331 | 9.21 | 39130 | 0.0350 |
| 0.0001 | 9.21 | 39140 | 0.0342 |
| 0.0001 | 9.21 | 39150 | 0.0341 |
| 0.0002 | 9.21 | 39160 | 0.0343 |
| 0.2047 | 9.22 | 39170 | 0.0341 |
| 0.0001 | 9.22 | 39180 | 0.0338 |
| 0.0076 | 9.22 | 39190 | 0.0340 |
| 0.0355 | 9.22 | 39200 | 0.0341 |
| 0.0255 | 9.23 | 39210 | 0.0344 |
| 0.0667 | 9.23 | 39220 | 0.0341 |
| 0.0004 | 9.23 | 39230 | 0.0344 |
| 0.0001 | 9.23 | 39240 | 0.0347 |
| 0.0001 | 9.24 | 39250 | 0.0350 |
| 0.0194 | 9.24 | 39260 | 0.0353 |
| 0.0001 | 9.24 | 39270 | 0.0356 |
| 0.3416 | 9.24 | 39280 | 0.0368 |
| 0.0001 | 9.24 | 39290 | 0.0372 |
| 0.0001 | 9.25 | 39300 | 0.0375 |
| 0.0001 | 9.25 | 39310 | 0.0377 |
| 0.0785 | 9.25 | 39320 | 0.0374 |
| 0.058 | 9.25 | 39330 | 0.0366 |
| 0.0001 | 9.26 | 39340 | 0.0368 |
| 0.0004 | 9.26 | 39350 | 0.0370 |
| 0.0001 | 9.26 | 39360 | 0.0374 |
| 0.3088 | 9.26 | 39370 | 0.0373 |
| 0.0001 | 9.27 | 39380 | 0.0380 |
| 0.0882 | 9.27 | 39390 | 0.0382 |
| 0.0001 | 9.27 | 39400 | 0.0384 |
| 0.0001 | 9.27 | 39410 | 0.0386 |
| 0.2494 | 9.28 | 39420 | 0.0376 |
| 0.2841 | 9.28 | 39430 | 0.0363 |
| 0.0476 | 9.28 | 39440 | 0.0360 |
| 0.2629 | 9.28 | 39450 | 0.0359 |
| 0.0001 | 9.28 | 39460 | 0.0356 |
| 0.0507 | 9.29 | 39470 | 0.0357 |
| 0.1201 | 9.29 | 39480 | 0.0344 |
| 0.0001 | 9.29 | 39490 | 0.0341 |
| 0.0001 | 9.29 | 39500 | 0.0340 |
| 0.0002 | 9.3 | 39510 | 0.0343 |
| 0.0008 | 9.3 | 39520 | 0.0343 |
| 0.0001 | 9.3 | 39530 | 0.0344 |
| 0.0001 | 9.3 | 39540 | 0.0345 |
| 0.0001 | 9.31 | 39550 | 0.0346 |
| 0.0001 | 9.31 | 39560 | 0.0347 |
| 0.2396 | 9.31 | 39570 | 0.0341 |
| 0.0721 | 9.31 | 39580 | 0.0340 |
| 0.2521 | 9.32 | 39590 | 0.0333 |
| 0.0001 | 9.32 | 39600 | 0.0331 |
| 0.001 | 9.32 | 39610 | 0.0331 |
| 0.2652 | 9.32 | 39620 | 0.0330 |
| 0.0001 | 9.32 | 39630 | 0.0330 |
| 0.0001 | 9.33 | 39640 | 0.0331 |
| 0.0002 | 9.33 | 39650 | 0.0333 |
| 0.0001 | 9.33 | 39660 | 0.0335 |
| 0.0001 | 9.33 | 39670 | 0.0337 |
| 0.0001 | 9.34 | 39680 | 0.0338 |
| 0.2362 | 9.34 | 39690 | 0.0338 |
| 0.3455 | 9.34 | 39700 | 0.0330 |
| 0.0034 | 9.34 | 39710 | 0.0329 |
| 0.0001 | 9.35 | 39720 | 0.0329 |
| 0.0026 | 9.35 | 39730 | 0.0331 |
| 0.3595 | 9.35 | 39740 | 0.0329 |
| 0.1785 | 9.35 | 39750 | 0.0325 |
| 0.0001 | 9.36 | 39760 | 0.0321 |
| 0.0001 | 9.36 | 39770 | 0.0321 |
| 0.0001 | 9.36 | 39780 | 0.0323 |
| 0.0001 | 9.36 | 39790 | 0.0324 |
| 0.0001 | 9.36 | 39800 | 0.0325 |
| 0.0325 | 9.37 | 39810 | 0.0326 |
| 0.0472 | 9.37 | 39820 | 0.0329 |
| 0.0004 | 9.37 | 39830 | 0.0335 |
| 0.4212 | 9.37 | 39840 | 0.0334 |
| 0.0001 | 9.38 | 39850 | 0.0332 |
| 0.0001 | 9.38 | 39860 | 0.0332 |
| 0.0004 | 9.38 | 39870 | 0.0341 |
| 0.0001 | 9.38 | 39880 | 0.0347 |
| 0.0001 | 9.39 | 39890 | 0.0350 |
| 0.3144 | 9.39 | 39900 | 0.0346 |
| 0.1176 | 9.39 | 39910 | 0.0345 |
| 0.0003 | 9.39 | 39920 | 0.0348 |
| 0.0018 | 9.4 | 39930 | 0.0354 |
| 0.3078 | 9.4 | 39940 | 0.0367 |
| 0.0001 | 9.4 | 39950 | 0.0379 |
| 0.0019 | 9.4 | 39960 | 0.0372 |
| 0.4076 | 9.4 | 39970 | 0.0360 |
| 0.0002 | 9.41 | 39980 | 0.0351 |
| 0.3351 | 9.41 | 39990 | 0.0345 |
| 0.3399 | 9.41 | 40000 | 0.0336 |
| 0.0001 | 9.41 | 40010 | 0.0332 |
| 0.0002 | 9.42 | 40020 | 0.0332 |
| 0.2663 | 9.42 | 40030 | 0.0329 |
| 0.1171 | 9.42 | 40040 | 0.0330 |
| 0.0003 | 9.42 | 40050 | 0.0336 |
| 0.0001 | 9.43 | 40060 | 0.0339 |
| 0.0001 | 9.43 | 40070 | 0.0341 |
| 0.0599 | 9.43 | 40080 | 0.0343 |
| 0.1872 | 9.43 | 40090 | 0.0337 |
| 0.1017 | 9.44 | 40100 | 0.0333 |
| 0.1385 | 9.44 | 40110 | 0.0326 |
| 0.1868 | 9.44 | 40120 | 0.0319 |
| 0.0001 | 9.44 | 40130 | 0.0313 |
| 0.0001 | 9.44 | 40140 | 0.0312 |
| 0.2506 | 9.45 | 40150 | 0.0309 |
| 0.1983 | 9.45 | 40160 | 0.0304 |
| 0.0001 | 9.45 | 40170 | 0.0302 |
| 0.0002 | 9.45 | 40180 | 0.0303 |
| 0.0765 | 9.46 | 40190 | 0.0305 |
| 0.0916 | 9.46 | 40200 | 0.0306 |
| 0.0134 | 9.46 | 40210 | 0.0312 |
| 0.0001 | 9.46 | 40220 | 0.0316 |
| 0.0423 | 9.47 | 40230 | 0.0320 |
| 0.0001 | 9.47 | 40240 | 0.0320 |
| 0.0042 | 9.47 | 40250 | 0.0323 |
| 0.0001 | 9.47 | 40260 | 0.0330 |
| 0.1582 | 9.48 | 40270 | 0.0330 |
| 0.0002 | 9.48 | 40280 | 0.0329 |
| 0.0001 | 9.48 | 40290 | 0.0330 |
| 0.0458 | 9.48 | 40300 | 0.0332 |
| 0.065 | 9.48 | 40310 | 0.0337 |
| 0.046 | 9.49 | 40320 | 0.0338 |
| 0.0001 | 9.49 | 40330 | 0.0338 |
| 0.0001 | 9.49 | 40340 | 0.0341 |
| 0.1942 | 9.49 | 40350 | 0.0338 |
| 0.0001 | 9.5 | 40360 | 0.0335 |
| 0.0006 | 9.5 | 40370 | 0.0338 |
| 0.1139 | 9.5 | 40380 | 0.0339 |
| 0.0001 | 9.5 | 40390 | 0.0341 |
| 0.0509 | 9.51 | 40400 | 0.0350 |
| 0.0001 | 9.51 | 40410 | 0.0354 |
| 0.0422 | 9.51 | 40420 | 0.0356 |
| 0.0353 | 9.51 | 40430 | 0.0357 |
| 0.0251 | 9.52 | 40440 | 0.0358 |
| 0.0001 | 9.52 | 40450 | 0.0352 |
| 0.0001 | 9.52 | 40460 | 0.0351 |
| 0.0004 | 9.52 | 40470 | 0.0350 |
| 0.1855 | 9.52 | 40480 | 0.0348 |
| 0.0001 | 9.53 | 40490 | 0.0346 |
| 0.5632 | 9.53 | 40500 | 0.0343 |
| 0.0136 | 9.53 | 40510 | 0.0345 |
| 0.0001 | 9.53 | 40520 | 0.0351 |
| 0.254 | 9.54 | 40530 | 0.0352 |
| 0.0001 | 9.54 | 40540 | 0.0351 |
| 0.0001 | 9.54 | 40550 | 0.0352 |
| 0.0001 | 9.54 | 40560 | 0.0353 |
| 0.113 | 9.55 | 40570 | 0.0355 |
| 0.0006 | 9.55 | 40580 | 0.0355 |
| 0.0001 | 9.55 | 40590 | 0.0355 |
| 0.2632 | 9.55 | 40600 | 0.0345 |
| 0.0001 | 9.56 | 40610 | 0.0340 |
| 0.0001 | 9.56 | 40620 | 0.0339 |
| 0.0 | 9.56 | 40630 | 0.0339 |
| 0.0004 | 9.56 | 40640 | 0.0340 |
| 0.3002 | 9.56 | 40650 | 0.0338 |
| 0.0001 | 9.57 | 40660 | 0.0340 |
| 0.0001 | 9.57 | 40670 | 0.0341 |
| 0.0001 | 9.57 | 40680 | 0.0343 |
| 0.0001 | 9.57 | 40690 | 0.0344 |
| 0.0653 | 9.58 | 40700 | 0.0353 |
| 0.0022 | 9.58 | 40710 | 0.0369 |
| 0.0009 | 9.58 | 40720 | 0.0386 |
| 0.0001 | 9.58 | 40730 | 0.0394 |
| 0.0001 | 9.59 | 40740 | 0.0398 |
| 0.0001 | 9.59 | 40750 | 0.0400 |
| 0.0001 | 9.59 | 40760 | 0.0402 |
| 0.0001 | 9.59 | 40770 | 0.0403 |
| 0.0003 | 9.6 | 40780 | 0.0404 |
| 0.0701 | 9.6 | 40790 | 0.0406 |
| 0.0 | 9.6 | 40800 | 0.0407 |
| 0.0001 | 9.6 | 40810 | 0.0408 |
| 0.0001 | 9.6 | 40820 | 0.0409 |
| 0.0007 | 9.61 | 40830 | 0.0406 |
| 0.0053 | 9.61 | 40840 | 0.0402 |
| 0.3175 | 9.61 | 40850 | 0.0377 |
| 0.0006 | 9.61 | 40860 | 0.0353 |
| 0.1591 | 9.62 | 40870 | 0.0330 |
| 0.0031 | 9.62 | 40880 | 0.0325 |
| 0.0001 | 9.62 | 40890 | 0.0330 |
| 0.037 | 9.62 | 40900 | 0.0333 |
| 0.0028 | 9.63 | 40910 | 0.0339 |
| 0.28 | 9.63 | 40920 | 0.0340 |
| 0.6263 | 9.63 | 40930 | 0.0330 |
| 0.0921 | 9.63 | 40940 | 0.0314 |
| 0.3101 | 9.64 | 40950 | 0.0300 |
| 0.0001 | 9.64 | 40960 | 0.0295 |
| 0.0338 | 9.64 | 40970 | 0.0291 |
| 0.275 | 9.64 | 40980 | 0.0275 |
| 0.2661 | 9.64 | 40990 | 0.0270 |
| 0.0002 | 9.65 | 41000 | 0.0270 |
| 0.0002 | 9.65 | 41010 | 0.0272 |
| 0.0001 | 9.65 | 41020 | 0.0276 |
| 0.0001 | 9.65 | 41030 | 0.0279 |
| 0.0001 | 9.66 | 41040 | 0.0280 |
| 0.0528 | 9.66 | 41050 | 0.0284 |
| 0.1913 | 9.66 | 41060 | 0.0290 |
| 0.0003 | 9.66 | 41070 | 0.0287 |
| 0.2429 | 9.67 | 41080 | 0.0284 |
| 0.0001 | 9.67 | 41090 | 0.0283 |
| 0.0001 | 9.67 | 41100 | 0.0284 |
| 0.0001 | 9.67 | 41110 | 0.0287 |
| 0.0569 | 9.68 | 41120 | 0.0294 |
| 0.0001 | 9.68 | 41130 | 0.0300 |
| 0.0001 | 9.68 | 41140 | 0.0303 |
| 0.2796 | 9.68 | 41150 | 0.0302 |
| 0.0001 | 9.68 | 41160 | 0.0295 |
| 0.0002 | 9.69 | 41170 | 0.0294 |
| 0.0001 | 9.69 | 41180 | 0.0295 |
| 0.0001 | 9.69 | 41190 | 0.0296 |
| 0.2329 | 9.69 | 41200 | 0.0296 |
| 0.5507 | 9.7 | 41210 | 0.0287 |
| 0.0937 | 9.7 | 41220 | 0.0286 |
| 0.1259 | 9.7 | 41230 | 0.0281 |
| 0.0002 | 9.7 | 41240 | 0.0280 |
| 0.0003 | 9.71 | 41250 | 0.0283 |
| 0.0001 | 9.71 | 41260 | 0.0290 |
| 0.0129 | 9.71 | 41270 | 0.0294 |
| 0.0423 | 9.71 | 41280 | 0.0291 |
| 0.0001 | 9.72 | 41290 | 0.0290 |
| 0.1595 | 9.72 | 41300 | 0.0290 |
| 0.0 | 9.72 | 41310 | 0.0291 |
| 0.0001 | 9.72 | 41320 | 0.0292 |
| 0.0 | 9.72 | 41330 | 0.0293 |
| 0.0001 | 9.73 | 41340 | 0.0294 |
| 0.0677 | 9.73 | 41350 | 0.0290 |
| 0.0001 | 9.73 | 41360 | 0.0291 |
| 0.0001 | 9.73 | 41370 | 0.0294 |
| 0.0002 | 9.74 | 41380 | 0.0302 |
| 0.0307 | 9.74 | 41390 | 0.0309 |
| 0.1511 | 9.74 | 41400 | 0.0311 |
| 0.349 | 9.74 | 41410 | 0.0302 |
| 0.0 | 9.75 | 41420 | 0.0300 |
| 0.0001 | 9.75 | 41430 | 0.0300 |
| 0.1408 | 9.75 | 41440 | 0.0301 |
| 0.0001 | 9.75 | 41450 | 0.0302 |
| 0.0003 | 9.76 | 41460 | 0.0305 |
| 0.0001 | 9.76 | 41470 | 0.0308 |
| 0.0001 | 9.76 | 41480 | 0.0311 |
| 0.0321 | 9.76 | 41490 | 0.0305 |
| 0.0255 | 9.76 | 41500 | 0.0306 |
| 0.0 | 9.77 | 41510 | 0.0309 |
| 0.001 | 9.77 | 41520 | 0.0315 |
| 0.0758 | 9.77 | 41530 | 0.0319 |
| 0.0001 | 9.77 | 41540 | 0.0321 |
| 0.449 | 9.78 | 41550 | 0.0316 |
| 0.1025 | 9.78 | 41560 | 0.0306 |
| 0.0001 | 9.78 | 41570 | 0.0303 |
| 0.0 | 9.78 | 41580 | 0.0303 |
| 0.0 | 9.79 | 41590 | 0.0304 |
| 0.3076 | 9.79 | 41600 | 0.0301 |
| 0.0001 | 9.79 | 41610 | 0.0294 |
| 0.2855 | 9.79 | 41620 | 0.0290 |
| 0.0132 | 9.8 | 41630 | 0.0292 |
| 0.0 | 9.8 | 41640 | 0.0294 |
| 0.0001 | 9.8 | 41650 | 0.0295 |
| 0.0073 | 9.8 | 41660 | 0.0299 |
| 0.0055 | 9.8 | 41670 | 0.0305 |
| 0.0468 | 9.81 | 41680 | 0.0313 |
| 0.1658 | 9.81 | 41690 | 0.0306 |
| 0.3809 | 9.81 | 41700 | 0.0306 |
| 0.0001 | 9.81 | 41710 | 0.0307 |
| 0.0001 | 9.82 | 41720 | 0.0308 |
| 0.0108 | 9.82 | 41730 | 0.0308 |
| 0.0001 | 9.82 | 41740 | 0.0310 |
| 0.0004 | 9.82 | 41750 | 0.0319 |
| 0.0001 | 9.83 | 41760 | 0.0326 |
| 0.0002 | 9.83 | 41770 | 0.0329 |
| 0.0033 | 9.83 | 41780 | 0.0339 |
| 0.0001 | 9.83 | 41790 | 0.0345 |
| 0.2046 | 9.84 | 41800 | 0.0338 |
| 0.2422 | 9.84 | 41810 | 0.0325 |
| 0.0008 | 9.84 | 41820 | 0.0325 |
| 0.0001 | 9.84 | 41830 | 0.0328 |
| 0.0001 | 9.84 | 41840 | 0.0329 |
| 0.0001 | 9.85 | 41850 | 0.0330 |
| 0.0001 | 9.85 | 41860 | 0.0331 |
| 0.2537 | 9.85 | 41870 | 0.0331 |
| 0.0608 | 9.85 | 41880 | 0.0328 |
| 0.0415 | 9.86 | 41890 | 0.0321 |
| 0.6392 | 9.86 | 41900 | 0.0311 |
| 0.4655 | 9.86 | 41910 | 0.0298 |
| 0.0004 | 9.86 | 41920 | 0.0292 |
| 0.216 | 9.87 | 41930 | 0.0289 |
| 0.0001 | 9.87 | 41940 | 0.0288 |
| 0.0162 | 9.87 | 41950 | 0.0292 |
| 0.0001 | 9.87 | 41960 | 0.0295 |
| 0.0002 | 9.88 | 41970 | 0.0301 |
| 0.0001 | 9.88 | 41980 | 0.0305 |
| 0.0002 | 9.88 | 41990 | 0.0308 |
| 0.2346 | 9.88 | 42000 | 0.0308 |
| 0.0213 | 9.88 | 42010 | 0.0304 |
| 0.2212 | 9.89 | 42020 | 0.0298 |
| 0.0001 | 9.89 | 42030 | 0.0296 |
| 0.0696 | 9.89 | 42040 | 0.0298 |
| 0.0057 | 9.89 | 42050 | 0.0306 |
| 0.0001 | 9.9 | 42060 | 0.0311 |
| 0.0001 | 9.9 | 42070 | 0.0315 |
| 0.048 | 9.9 | 42080 | 0.0318 |
| 0.0001 | 9.9 | 42090 | 0.0320 |
| 0.428 | 9.91 | 42100 | 0.0319 |
| 0.2101 | 9.91 | 42110 | 0.0314 |
| 0.0008 | 9.91 | 42120 | 0.0317 |
| 0.0001 | 9.91 | 42130 | 0.0319 |
| 0.6253 | 9.92 | 42140 | 0.0299 |
| 0.0002 | 9.92 | 42150 | 0.0290 |
| 0.2709 | 9.92 | 42160 | 0.0288 |
| 0.0063 | 9.92 | 42170 | 0.0293 |
| 0.0041 | 9.92 | 42180 | 0.0302 |
| 0.0001 | 9.93 | 42190 | 0.0306 |
| 0.1341 | 9.93 | 42200 | 0.0310 |
| 0.001 | 9.93 | 42210 | 0.0317 |
| 0.0001 | 9.93 | 42220 | 0.0320 |
| 0.2934 | 9.94 | 42230 | 0.0320 |
| 0.0001 | 9.94 | 42240 | 0.0318 |
| 0.0512 | 9.94 | 42250 | 0.0315 |
| 0.0009 | 9.94 | 42260 | 0.0311 |
| 0.0841 | 9.95 | 42270 | 0.0316 |
| 0.0001 | 9.95 | 42280 | 0.0321 |
| 0.0125 | 9.95 | 42290 | 0.0324 |
| 0.0027 | 9.95 | 42300 | 0.0325 |
| 0.0152 | 9.96 | 42310 | 0.0327 |
| 0.0001 | 9.96 | 42320 | 0.0329 |
| 0.0001 | 9.96 | 42330 | 0.0331 |
| 0.0002 | 9.96 | 42340 | 0.0334 |
| 0.0001 | 9.96 | 42350 | 0.0336 |
| 0.0001 | 9.97 | 42360 | 0.0338 |
| 0.1304 | 9.97 | 42370 | 0.0337 |
| 0.0001 | 9.97 | 42380 | 0.0335 |
| 0.0001 | 9.97 | 42390 | 0.0336 |
| 0.0001 | 9.98 | 42400 | 0.0338 |
| 0.1377 | 9.98 | 42410 | 0.0337 |
| 0.0001 | 9.98 | 42420 | 0.0331 |
| 0.307 | 9.98 | 42430 | 0.0322 |
| 0.0043 | 9.99 | 42440 | 0.0321 |
| 0.0001 | 9.99 | 42450 | 0.0323 |
| 0.0005 | 9.99 | 42460 | 0.0330 |
| 0.4266 | 9.99 | 42470 | 0.0327 |
| 0.0001 | 10.0 | 42480 | 0.0323 |
| 0.2868 | 10.0 | 42490 | 0.0322 |
| 0.0001 | 10.0 | 42500 | 0.0321 |
| 0.1562 | 10.0 | 42510 | 0.0327 |
| 0.0001 | 10.0 | 42520 | 0.0332 |
| 0.0001 | 10.01 | 42530 | 0.0335 |
| 0.061 | 10.01 | 42540 | 0.0328 |
| 0.1778 | 10.01 | 42550 | 0.0325 |
| 0.048 | 10.01 | 42560 | 0.0323 |
| 0.0001 | 10.02 | 42570 | 0.0322 |
| 0.0001 | 10.02 | 42580 | 0.0322 |
| 0.0844 | 10.02 | 42590 | 0.0315 |
| 0.0001 | 10.02 | 42600 | 0.0312 |
| 0.257 | 10.03 | 42610 | 0.0309 |
| 0.0001 | 10.03 | 42620 | 0.0305 |
| 0.2979 | 10.03 | 42630 | 0.0308 |
| 0.0001 | 10.03 | 42640 | 0.0312 |
| 0.743 | 10.04 | 42650 | 0.0312 |
| 0.0001 | 10.04 | 42660 | 0.0313 |
| 0.2613 | 10.04 | 42670 | 0.0305 |
| 0.0012 | 10.04 | 42680 | 0.0302 |
| 0.0003 | 10.04 | 42690 | 0.0307 |
| 0.1573 | 10.05 | 42700 | 0.0309 |
| 0.0003 | 10.05 | 42710 | 0.0310 |
| 0.0001 | 10.05 | 42720 | 0.0312 |
| 0.0001 | 10.05 | 42730 | 0.0312 |
| 0.0001 | 10.06 | 42740 | 0.0313 |
| 0.016 | 10.06 | 42750 | 0.0311 |
| 0.0001 | 10.06 | 42760 | 0.0306 |
| 0.0017 | 10.06 | 42770 | 0.0308 |
| 0.0001 | 10.07 | 42780 | 0.0311 |
| 0.2512 | 10.07 | 42790 | 0.0307 |
| 0.3392 | 10.07 | 42800 | 0.0297 |
| 0.026 | 10.07 | 42810 | 0.0301 |
| 0.0222 | 10.08 | 42820 | 0.0304 |
| 0.0001 | 10.08 | 42830 | 0.0305 |
| 0.0003 | 10.08 | 42840 | 0.0307 |
| 0.0609 | 10.08 | 42850 | 0.0316 |
| 0.0001 | 10.08 | 42860 | 0.0323 |
| 0.0002 | 10.09 | 42870 | 0.0331 |
| 0.2779 | 10.09 | 42880 | 0.0330 |
| 0.0001 | 10.09 | 42890 | 0.0330 |
| 0.5647 | 10.09 | 42900 | 0.0326 |
| 0.0002 | 10.1 | 42910 | 0.0314 |
| 0.0029 | 10.1 | 42920 | 0.0306 |
| 0.2937 | 10.1 | 42930 | 0.0302 |
| 0.0475 | 10.1 | 42940 | 0.0300 |
| 0.0001 | 10.11 | 42950 | 0.0299 |
| 0.0001 | 10.11 | 42960 | 0.0300 |
| 0.0001 | 10.11 | 42970 | 0.0301 |
| 0.0001 | 10.11 | 42980 | 0.0302 |
| 0.0001 | 10.12 | 42990 | 0.0303 |
| 0.0001 | 10.12 | 43000 | 0.0304 |
| 0.0002 | 10.12 | 43010 | 0.0306 |
| 0.0002 | 10.12 | 43020 | 0.0308 |
| 0.0001 | 10.12 | 43030 | 0.0309 |
| 0.6149 | 10.13 | 43040 | 0.0303 |
| 0.0001 | 10.13 | 43050 | 0.0298 |
| 0.0001 | 10.13 | 43060 | 0.0298 |
| 0.0001 | 10.13 | 43070 | 0.0299 |
| 0.0001 | 10.14 | 43080 | 0.0299 |
| 0.0001 | 10.14 | 43090 | 0.0300 |
| 0.0373 | 10.14 | 43100 | 0.0299 |
| 0.2956 | 10.14 | 43110 | 0.0296 |
| 0.0001 | 10.15 | 43120 | 0.0293 |
| 0.4041 | 10.15 | 43130 | 0.0284 |
| 0.0001 | 10.15 | 43140 | 0.0276 |
| 0.0001 | 10.15 | 43150 | 0.0275 |
| 0.0001 | 10.16 | 43160 | 0.0275 |
| 0.0002 | 10.16 | 43170 | 0.0276 |
| 0.0002 | 10.16 | 43180 | 0.0279 |
| 0.122 | 10.16 | 43190 | 0.0277 |
| 0.1965 | 10.16 | 43200 | 0.0278 |
| 0.0001 | 10.17 | 43210 | 0.0281 |
| 0.0001 | 10.17 | 43220 | 0.0282 |
| 0.0798 | 10.17 | 43230 | 0.0281 |
| 0.1073 | 10.17 | 43240 | 0.0269 |
| 0.0002 | 10.18 | 43250 | 0.0266 |
| 0.1609 | 10.18 | 43260 | 0.0261 |
| 0.0128 | 10.18 | 43270 | 0.0258 |
| 0.0462 | 10.18 | 43280 | 0.0267 |
| 0.0005 | 10.19 | 43290 | 0.0271 |
| 0.0001 | 10.19 | 43300 | 0.0273 |
| 0.0001 | 10.19 | 43310 | 0.0274 |
| 0.0001 | 10.19 | 43320 | 0.0274 |
| 0.2825 | 10.2 | 43330 | 0.0275 |
| 0.2899 | 10.2 | 43340 | 0.0271 |
| 0.0002 | 10.2 | 43350 | 0.0266 |
| 0.0022 | 10.2 | 43360 | 0.0273 |
| 0.0001 | 10.2 | 43370 | 0.0277 |
| 0.0001 | 10.21 | 43380 | 0.0280 |
| 0.2239 | 10.21 | 43390 | 0.0283 |
| 0.3064 | 10.21 | 43400 | 0.0283 |
| 0.0899 | 10.21 | 43410 | 0.0284 |
| 0.0009 | 10.22 | 43420 | 0.0277 |
| 0.0002 | 10.22 | 43430 | 0.0277 |
| 0.0001 | 10.22 | 43440 | 0.0278 |
| 0.2814 | 10.22 | 43450 | 0.0280 |
| 0.3057 | 10.23 | 43460 | 0.0280 |
| 0.0424 | 10.23 | 43470 | 0.0281 |
| 0.2742 | 10.23 | 43480 | 0.0278 |
| 0.0001 | 10.23 | 43490 | 0.0276 |
| 0.0275 | 10.24 | 43500 | 0.0275 |
| 0.0001 | 10.24 | 43510 | 0.0273 |
| 0.0012 | 10.24 | 43520 | 0.0272 |
| 0.0002 | 10.24 | 43530 | 0.0274 |
| 0.0001 | 10.24 | 43540 | 0.0275 |
| 0.0001 | 10.25 | 43550 | 0.0277 |
| 0.2632 | 10.25 | 43560 | 0.0276 |
| 0.0008 | 10.25 | 43570 | 0.0276 |
| 0.1944 | 10.25 | 43580 | 0.0267 |
| 0.0492 | 10.26 | 43590 | 0.0263 |
| 0.0002 | 10.26 | 43600 | 0.0256 |
| 0.0204 | 10.26 | 43610 | 0.0252 |
| 0.0001 | 10.26 | 43620 | 0.0252 |
| 0.5119 | 10.27 | 43630 | 0.0249 |
| 0.0002 | 10.27 | 43640 | 0.0249 |
| 0.2654 | 10.27 | 43650 | 0.0251 |
| 0.2059 | 10.27 | 43660 | 0.0253 |
| 0.4238 | 10.28 | 43670 | 0.0251 |
| 0.0002 | 10.28 | 43680 | 0.0252 |
| 0.0003 | 10.28 | 43690 | 0.0255 |
| 0.0004 | 10.28 | 43700 | 0.0252 |
| 0.0009 | 10.28 | 43710 | 0.0248 |
| 0.0006 | 10.29 | 43720 | 0.0259 |
| 0.2917 | 10.29 | 43730 | 0.0267 |
| 0.0003 | 10.29 | 43740 | 0.0273 |
| 0.0001 | 10.29 | 43750 | 0.0280 |
| 0.0003 | 10.3 | 43760 | 0.0285 |
| 0.1222 | 10.3 | 43770 | 0.0285 |
| 0.0001 | 10.3 | 43780 | 0.0285 |
| 0.0001 | 10.3 | 43790 | 0.0285 |
| 0.263 | 10.31 | 43800 | 0.0284 |
| 0.0001 | 10.31 | 43810 | 0.0284 |
| 0.0325 | 10.31 | 43820 | 0.0283 |
| 0.0001 | 10.31 | 43830 | 0.0282 |
| 0.0079 | 10.32 | 43840 | 0.0285 |
| 0.0001 | 10.32 | 43850 | 0.0289 |
| 0.1872 | 10.32 | 43860 | 0.0286 |
| 0.0001 | 10.32 | 43870 | 0.0281 |
| 0.0001 | 10.32 | 43880 | 0.0279 |
| 0.0002 | 10.33 | 43890 | 0.0280 |
| 0.0002 | 10.33 | 43900 | 0.0282 |
| 0.0 | 10.33 | 43910 | 0.0284 |
| 0.0001 | 10.33 | 43920 | 0.0285 |
| 0.1517 | 10.34 | 43930 | 0.0284 |
| 0.2524 | 10.34 | 43940 | 0.0276 |
| 0.1481 | 10.34 | 43950 | 0.0270 |
| 0.0422 | 10.34 | 43960 | 0.0267 |
| 0.3478 | 10.35 | 43970 | 0.0266 |
| 0.0002 | 10.35 | 43980 | 0.0267 |
| 0.0119 | 10.35 | 43990 | 0.0272 |
| 0.0001 | 10.35 | 44000 | 0.0277 |
| 0.2848 | 10.36 | 44010 | 0.0279 |
| 0.0522 | 10.36 | 44020 | 0.0274 |
| 0.0001 | 10.36 | 44030 | 0.0271 |
| 0.0001 | 10.36 | 44040 | 0.0271 |
| 0.0001 | 10.36 | 44050 | 0.0271 |
| 0.0002 | 10.37 | 44060 | 0.0272 |
| 0.0001 | 10.37 | 44070 | 0.0273 |
| 0.0001 | 10.37 | 44080 | 0.0274 |
| 0.0001 | 10.37 | 44090 | 0.0275 |
| 0.0 | 10.38 | 44100 | 0.0276 |
| 0.0001 | 10.38 | 44110 | 0.0276 |
| 0.3003 | 10.38 | 44120 | 0.0274 |
| 0.0001 | 10.38 | 44130 | 0.0269 |
| 0.1643 | 10.39 | 44140 | 0.0267 |
| 0.0001 | 10.39 | 44150 | 0.0262 |
| 0.0002 | 10.39 | 44160 | 0.0262 |
| 0.0001 | 10.39 | 44170 | 0.0263 |
| 0.2066 | 10.4 | 44180 | 0.0257 |
| 0.0001 | 10.4 | 44190 | 0.0255 |
| 0.0428 | 10.4 | 44200 | 0.0265 |
| 0.2288 | 10.4 | 44210 | 0.0265 |
| 0.2742 | 10.4 | 44220 | 0.0264 |
| 0.0003 | 10.41 | 44230 | 0.0263 |
| 0.1912 | 10.41 | 44240 | 0.0259 |
| 0.0077 | 10.41 | 44250 | 0.0255 |
| 0.0001 | 10.41 | 44260 | 0.0253 |
| 0.2424 | 10.42 | 44270 | 0.0252 |
| 0.0013 | 10.42 | 44280 | 0.0251 |
| 0.2494 | 10.42 | 44290 | 0.0256 |
| 0.0015 | 10.42 | 44300 | 0.0267 |
| 0.0627 | 10.43 | 44310 | 0.0272 |
| 0.0001 | 10.43 | 44320 | 0.0269 |
| 0.0001 | 10.43 | 44330 | 0.0269 |
| 0.0001 | 10.43 | 44340 | 0.0269 |
| 0.0001 | 10.44 | 44350 | 0.0270 |
| 0.0001 | 10.44 | 44360 | 0.0271 |
| 0.0006 | 10.44 | 44370 | 0.0273 |
| 0.0001 | 10.44 | 44380 | 0.0274 |
| 0.0002 | 10.44 | 44390 | 0.0276 |
| 0.0468 | 10.45 | 44400 | 0.0277 |
| 0.0001 | 10.45 | 44410 | 0.0278 |
| 0.0001 | 10.45 | 44420 | 0.0279 |
| 0.0176 | 10.45 | 44430 | 0.0271 |
| 0.0452 | 10.46 | 44440 | 0.0264 |
| 0.008 | 10.46 | 44450 | 0.0261 |
| 0.0001 | 10.46 | 44460 | 0.0257 |
| 0.0002 | 10.46 | 44470 | 0.0258 |
| 0.0002 | 10.47 | 44480 | 0.0264 |
| 0.0086 | 10.47 | 44490 | 0.0269 |
| 0.0 | 10.47 | 44500 | 0.0278 |
| 0.1012 | 10.47 | 44510 | 0.0280 |
| 0.0882 | 10.48 | 44520 | 0.0272 |
| 0.0 | 10.48 | 44530 | 0.0267 |
| 0.0001 | 10.48 | 44540 | 0.0265 |
| 0.0001 | 10.48 | 44550 | 0.0265 |
| 0.0001 | 10.48 | 44560 | 0.0266 |
| 0.0237 | 10.49 | 44570 | 0.0268 |
| 0.2909 | 10.49 | 44580 | 0.0271 |
| 0.0001 | 10.49 | 44590 | 0.0271 |
| 0.0001 | 10.49 | 44600 | 0.0272 |
| 0.411 | 10.5 | 44610 | 0.0270 |
| 0.3086 | 10.5 | 44620 | 0.0269 |
| 0.2137 | 10.5 | 44630 | 0.0264 |
| 0.2862 | 10.5 | 44640 | 0.0260 |
| 0.0001 | 10.51 | 44650 | 0.0258 |
| 0.0004 | 10.51 | 44660 | 0.0258 |
| 0.0001 | 10.51 | 44670 | 0.0260 |
| 0.0003 | 10.51 | 44680 | 0.0265 |
| 0.2343 | 10.52 | 44690 | 0.0262 |
| 0.5918 | 10.52 | 44700 | 0.0261 |
| 0.0001 | 10.52 | 44710 | 0.0256 |
| 0.0457 | 10.52 | 44720 | 0.0260 |
| 0.2379 | 10.52 | 44730 | 0.0255 |
| 0.0069 | 10.53 | 44740 | 0.0253 |
| 0.0962 | 10.53 | 44750 | 0.0249 |
| 0.4592 | 10.53 | 44760 | 0.0238 |
| 0.1385 | 10.53 | 44770 | 0.0233 |
| 0.0003 | 10.54 | 44780 | 0.0237 |
| 0.0001 | 10.54 | 44790 | 0.0240 |
| 0.1492 | 10.54 | 44800 | 0.0242 |
| 0.3191 | 10.54 | 44810 | 0.0236 |
| 0.0084 | 10.55 | 44820 | 0.0234 |
| 0.0001 | 10.55 | 44830 | 0.0236 |
| 0.0006 | 10.55 | 44840 | 0.0241 |
| 0.0086 | 10.55 | 44850 | 0.0244 |
| 0.0549 | 10.56 | 44860 | 0.0245 |
| 0.0 | 10.56 | 44870 | 0.0246 |
| 0.0001 | 10.56 | 44880 | 0.0247 |
| 0.115 | 10.56 | 44890 | 0.0246 |
| 0.2119 | 10.56 | 44900 | 0.0245 |
| 0.2119 | 10.57 | 44910 | 0.0238 |
| 0.0393 | 10.57 | 44920 | 0.0235 |
| 0.0001 | 10.57 | 44930 | 0.0234 |
| 0.1687 | 10.57 | 44940 | 0.0237 |
| 0.3221 | 10.58 | 44950 | 0.0240 |
| 0.2338 | 10.58 | 44960 | 0.0234 |
| 0.2145 | 10.58 | 44970 | 0.0233 |
| 0.0001 | 10.58 | 44980 | 0.0233 |
| 0.0002 | 10.59 | 44990 | 0.0236 |
| 0.0001 | 10.59 | 45000 | 0.0238 |
| 0.0002 | 10.59 | 45010 | 0.0242 |
| 0.0169 | 10.59 | 45020 | 0.0247 |
| 0.0001 | 10.6 | 45030 | 0.0250 |
| 0.0001 | 10.6 | 45040 | 0.0252 |
| 0.001 | 10.6 | 45050 | 0.0254 |
| 0.0032 | 10.6 | 45060 | 0.0259 |
| 0.0001 | 10.6 | 45070 | 0.0262 |
| 0.2298 | 10.61 | 45080 | 0.0257 |
| 0.0001 | 10.61 | 45090 | 0.0255 |
| 0.0496 | 10.61 | 45100 | 0.0254 |
| 0.0316 | 10.61 | 45110 | 0.0255 |
| 0.0001 | 10.62 | 45120 | 0.0255 |
| 0.026 | 10.62 | 45130 | 0.0255 |
| 0.0087 | 10.62 | 45140 | 0.0255 |
| 0.0001 | 10.62 | 45150 | 0.0262 |
| 0.0002 | 10.63 | 45160 | 0.0266 |
| 0.0412 | 10.63 | 45170 | 0.0267 |
| 0.0001 | 10.63 | 45180 | 0.0266 |
| 0.5676 | 10.63 | 45190 | 0.0260 |
| 0.0001 | 10.64 | 45200 | 0.0256 |
| 0.0001 | 10.64 | 45210 | 0.0256 |
| 0.0754 | 10.64 | 45220 | 0.0263 |
| 0.0005 | 10.64 | 45230 | 0.0266 |
| 0.3258 | 10.64 | 45240 | 0.0267 |
| 0.0001 | 10.65 | 45250 | 0.0264 |
| 0.0001 | 10.65 | 45260 | 0.0264 |
| 0.0002 | 10.65 | 45270 | 0.0268 |
| 0.2691 | 10.65 | 45280 | 0.0265 |
| 0.0001 | 10.66 | 45290 | 0.0266 |
| 0.1915 | 10.66 | 45300 | 0.0263 |
| 0.091 | 10.66 | 45310 | 0.0253 |
| 0.0001 | 10.66 | 45320 | 0.0251 |
| 0.0001 | 10.67 | 45330 | 0.0251 |
| 0.0018 | 10.67 | 45340 | 0.0256 |
| 0.3273 | 10.67 | 45350 | 0.0260 |
| 0.0002 | 10.67 | 45360 | 0.0264 |
| 0.0002 | 10.68 | 45370 | 0.0273 |
| 0.0001 | 10.68 | 45380 | 0.0277 |
| 0.0002 | 10.68 | 45390 | 0.0278 |
| 0.0001 | 10.68 | 45400 | 0.0279 |
| 0.0001 | 10.68 | 45410 | 0.0280 |
| 0.0001 | 10.69 | 45420 | 0.0282 |
| 0.0001 | 10.69 | 45430 | 0.0284 |
| 0.2059 | 10.69 | 45440 | 0.0278 |
| 0.0001 | 10.69 | 45450 | 0.0275 |
| 0.0001 | 10.7 | 45460 | 0.0275 |
| 0.0002 | 10.7 | 45470 | 0.0276 |
| 0.0001 | 10.7 | 45480 | 0.0279 |
| 0.0001 | 10.7 | 45490 | 0.0280 |
| 0.0001 | 10.71 | 45500 | 0.0281 |
| 0.3211 | 10.71 | 45510 | 0.0273 |
| 0.0001 | 10.71 | 45520 | 0.0272 |
| 0.2828 | 10.71 | 45530 | 0.0270 |
| 0.1628 | 10.72 | 45540 | 0.0263 |
| 0.0 | 10.72 | 45550 | 0.0258 |
| 0.0001 | 10.72 | 45560 | 0.0256 |
| 0.2963 | 10.72 | 45570 | 0.0258 |
| 0.0001 | 10.72 | 45580 | 0.0261 |
| 0.0001 | 10.73 | 45590 | 0.0262 |
| 0.0 | 10.73 | 45600 | 0.0263 |
| 0.0001 | 10.73 | 45610 | 0.0264 |
| 0.0 | 10.73 | 45620 | 0.0265 |
| 0.0001 | 10.74 | 45630 | 0.0265 |
| 0.0001 | 10.74 | 45640 | 0.0266 |
| 0.0001 | 10.74 | 45650 | 0.0267 |
| 0.0002 | 10.74 | 45660 | 0.0264 |
| 0.0 | 10.75 | 45670 | 0.0261 |
| 0.1228 | 10.75 | 45680 | 0.0258 |
| 0.0001 | 10.75 | 45690 | 0.0258 |
| 0.0001 | 10.75 | 45700 | 0.0259 |
| 0.0001 | 10.76 | 45710 | 0.0259 |
| 0.0593 | 10.76 | 45720 | 0.0260 |
| 0.0007 | 10.76 | 45730 | 0.0264 |
| 0.0009 | 10.76 | 45740 | 0.0270 |
| 0.0001 | 10.76 | 45750 | 0.0274 |
| 0.0001 | 10.77 | 45760 | 0.0276 |
| 0.0001 | 10.77 | 45770 | 0.0277 |
| 0.0001 | 10.77 | 45780 | 0.0279 |
| 0.0001 | 10.77 | 45790 | 0.0280 |
| 0.1992 | 10.78 | 45800 | 0.0271 |
| 0.2351 | 10.78 | 45810 | 0.0266 |
| 0.0178 | 10.78 | 45820 | 0.0264 |
| 0.0001 | 10.78 | 45830 | 0.0265 |
| 0.0023 | 10.79 | 45840 | 0.0269 |
| 0.0 | 10.79 | 45850 | 0.0274 |
| 0.0001 | 10.79 | 45860 | 0.0277 |
| 0.0133 | 10.79 | 45870 | 0.0281 |
| 0.1704 | 10.8 | 45880 | 0.0275 |
| 0.0008 | 10.8 | 45890 | 0.0274 |
| 0.113 | 10.8 | 45900 | 0.0269 |
| 0.0001 | 10.8 | 45910 | 0.0259 |
| 0.0 | 10.8 | 45920 | 0.0256 |
| 0.2266 | 10.81 | 45930 | 0.0254 |
| 0.0 | 10.81 | 45940 | 0.0248 |
| 0.0 | 10.81 | 45950 | 0.0246 |
| 0.3016 | 10.81 | 45960 | 0.0245 |
| 0.0039 | 10.82 | 45970 | 0.0253 |
| 0.0001 | 10.82 | 45980 | 0.0261 |
| 0.0132 | 10.82 | 45990 | 0.0265 |
| 0.2314 | 10.82 | 46000 | 0.0271 |
| 0.297 | 10.83 | 46010 | 0.0276 |
| 0.2782 | 10.83 | 46020 | 0.0267 |
| 0.0001 | 10.83 | 46030 | 0.0263 |
| 0.0001 | 10.83 | 46040 | 0.0263 |
| 0.0001 | 10.84 | 46050 | 0.0263 |
| 0.0001 | 10.84 | 46060 | 0.0266 |
| 0.0001 | 10.84 | 46070 | 0.0268 |
| 0.0 | 10.84 | 46080 | 0.0269 |
| 0.2114 | 10.84 | 46090 | 0.0264 |
| 0.0147 | 10.85 | 46100 | 0.0258 |
| 0.0001 | 10.85 | 46110 | 0.0255 |
| 0.0709 | 10.85 | 46120 | 0.0251 |
| 0.0001 | 10.85 | 46130 | 0.0246 |
| 0.0001 | 10.86 | 46140 | 0.0245 |
| 0.0001 | 10.86 | 46150 | 0.0246 |
| 0.005 | 10.86 | 46160 | 0.0244 |
| 0.316 | 10.86 | 46170 | 0.0240 |
| 0.0002 | 10.87 | 46180 | 0.0240 |
| 0.2528 | 10.87 | 46190 | 0.0240 |
| 0.0003 | 10.87 | 46200 | 0.0242 |
| 0.0001 | 10.87 | 46210 | 0.0245 |
| 0.1961 | 10.88 | 46220 | 0.0242 |
| 0.0001 | 10.88 | 46230 | 0.0245 |
| 0.0001 | 10.88 | 46240 | 0.0247 |
| 0.0001 | 10.88 | 46250 | 0.0249 |
| 0.0003 | 10.88 | 46260 | 0.0250 |
| 0.0002 | 10.89 | 46270 | 0.0257 |
| 0.242 | 10.89 | 46280 | 0.0260 |
| 0.0688 | 10.89 | 46290 | 0.0252 |
| 0.0001 | 10.89 | 46300 | 0.0250 |
| 0.0 | 10.9 | 46310 | 0.0250 |
| 0.0001 | 10.9 | 46320 | 0.0254 |
| 0.0001 | 10.9 | 46330 | 0.0256 |
| 0.0001 | 10.9 | 46340 | 0.0257 |
| 0.1457 | 10.91 | 46350 | 0.0256 |
| 0.0433 | 10.91 | 46360 | 0.0257 |
| 0.1696 | 10.91 | 46370 | 0.0251 |
| 0.1508 | 10.91 | 46380 | 0.0249 |
| 0.5887 | 10.92 | 46390 | 0.0241 |
| 0.0005 | 10.92 | 46400 | 0.0238 |
| 0.0001 | 10.92 | 46410 | 0.0242 |
| 0.2542 | 10.92 | 46420 | 0.0243 |
| 0.0001 | 10.92 | 46430 | 0.0240 |
| 0.0198 | 10.93 | 46440 | 0.0241 |
| 0.3026 | 10.93 | 46450 | 0.0235 |
| 0.0059 | 10.93 | 46460 | 0.0241 |
| 0.0633 | 10.93 | 46470 | 0.0242 |
| 0.0127 | 10.94 | 46480 | 0.0241 |
| 0.0001 | 10.94 | 46490 | 0.0241 |
| 0.625 | 10.94 | 46500 | 0.0233 |
| 0.0001 | 10.94 | 46510 | 0.0229 |
| 0.0006 | 10.95 | 46520 | 0.0230 |
| 0.013 | 10.95 | 46530 | 0.0233 |
| 0.0001 | 10.95 | 46540 | 0.0236 |
| 0.0001 | 10.95 | 46550 | 0.0238 |
| 0.0001 | 10.96 | 46560 | 0.0239 |
| 0.0001 | 10.96 | 46570 | 0.0241 |
| 0.0001 | 10.96 | 46580 | 0.0243 |
| 0.0276 | 10.96 | 46590 | 0.0250 |
| 0.0 | 10.96 | 46600 | 0.0253 |
| 0.0002 | 10.97 | 46610 | 0.0253 |
| 0.2179 | 10.97 | 46620 | 0.0249 |
| 0.023 | 10.97 | 46630 | 0.0252 |
| 0.0002 | 10.97 | 46640 | 0.0257 |
| 0.0783 | 10.98 | 46650 | 0.0261 |
| 0.1995 | 10.98 | 46660 | 0.0262 |
| 0.0001 | 10.98 | 46670 | 0.0258 |
| 0.0691 | 10.98 | 46680 | 0.0254 |
| 0.0 | 10.99 | 46690 | 0.0248 |
| 0.0001 | 10.99 | 46700 | 0.0247 |
| 0.0001 | 10.99 | 46710 | 0.0248 |
| 0.0002 | 10.99 | 46720 | 0.0249 |
| 0.0001 | 11.0 | 46730 | 0.0250 |
| 0.3213 | 11.0 | 46740 | 0.0245 |
| 0.0833 | 11.0 | 46750 | 0.0249 |
| 0.1467 | 11.0 | 46760 | 0.0250 |
| 0.0001 | 11.0 | 46770 | 0.0248 |
| 0.036 | 11.01 | 46780 | 0.0247 |
| 0.2637 | 11.01 | 46790 | 0.0245 |
| 0.0001 | 11.01 | 46800 | 0.0245 |
| 0.0001 | 11.01 | 46810 | 0.0245 |
| 0.0068 | 11.02 | 46820 | 0.0246 |
| 0.2843 | 11.02 | 46830 | 0.0244 |
| 0.0001 | 11.02 | 46840 | 0.0242 |
| 0.2883 | 11.02 | 46850 | 0.0238 |
| 0.0002 | 11.03 | 46860 | 0.0238 |
| 0.0001 | 11.03 | 46870 | 0.0240 |
| 0.0001 | 11.03 | 46880 | 0.0242 |
| 0.0001 | 11.03 | 46890 | 0.0243 |
| 0.0001 | 11.04 | 46900 | 0.0245 |
| 0.0001 | 11.04 | 46910 | 0.0246 |
| 0.0001 | 11.04 | 46920 | 0.0247 |
| 0.001 | 11.04 | 46930 | 0.0248 |
| 0.0005 | 11.04 | 46940 | 0.0254 |
| 0.0 | 11.05 | 46950 | 0.0262 |
| 0.0001 | 11.05 | 46960 | 0.0265 |
| 0.0001 | 11.05 | 46970 | 0.0266 |
| 0.0002 | 11.05 | 46980 | 0.0267 |
| 0.0441 | 11.06 | 46990 | 0.0265 |
| 0.1029 | 11.06 | 47000 | 0.0270 |
| 0.0 | 11.06 | 47010 | 0.0271 |
| 0.0398 | 11.06 | 47020 | 0.0266 |
| 0.0975 | 11.07 | 47030 | 0.0263 |
| 0.001 | 11.07 | 47040 | 0.0262 |
| 0.0 | 11.07 | 47050 | 0.0264 |
| 0.0001 | 11.07 | 47060 | 0.0265 |
| 0.0 | 11.08 | 47070 | 0.0266 |
| 0.0002 | 11.08 | 47080 | 0.0266 |
| 0.2026 | 11.08 | 47090 | 0.0264 |
| 0.0 | 11.08 | 47100 | 0.0264 |
| 0.3685 | 11.08 | 47110 | 0.0263 |
| 0.0391 | 11.09 | 47120 | 0.0256 |
| 0.032 | 11.09 | 47130 | 0.0254 |
| 0.3495 | 11.09 | 47140 | 0.0248 |
| 0.0001 | 11.09 | 47150 | 0.0246 |
| 0.0001 | 11.1 | 47160 | 0.0246 |
| 0.0001 | 11.1 | 47170 | 0.0246 |
| 0.0492 | 11.1 | 47180 | 0.0247 |
| 0.0001 | 11.1 | 47190 | 0.0246 |
| 0.0001 | 11.11 | 47200 | 0.0246 |
| 0.0001 | 11.11 | 47210 | 0.0246 |
| 0.0001 | 11.11 | 47220 | 0.0247 |
| 0.0842 | 11.11 | 47230 | 0.0248 |
| 0.2833 | 11.12 | 47240 | 0.0247 |
| 0.0 | 11.12 | 47250 | 0.0245 |
| 0.0004 | 11.12 | 47260 | 0.0245 |
| 0.0001 | 11.12 | 47270 | 0.0245 |
| 0.0001 | 11.12 | 47280 | 0.0246 |
| 0.0001 | 11.13 | 47290 | 0.0246 |
| 0.0001 | 11.13 | 47300 | 0.0248 |
| 0.2021 | 11.13 | 47310 | 0.0254 |
| 0.2638 | 11.13 | 47320 | 0.0254 |
| 0.0001 | 11.14 | 47330 | 0.0256 |
| 0.2582 | 11.14 | 47340 | 0.0253 |
| 0.0008 | 11.14 | 47350 | 0.0254 |
| 0.0321 | 11.14 | 47360 | 0.0261 |
| 0.3163 | 11.15 | 47370 | 0.0261 |
| 0.0001 | 11.15 | 47380 | 0.0260 |
| 0.0002 | 11.15 | 47390 | 0.0262 |
| 0.0588 | 11.15 | 47400 | 0.0265 |
| 0.0116 | 11.16 | 47410 | 0.0265 |
| 0.0005 | 11.16 | 47420 | 0.0264 |
| 0.0001 | 11.16 | 47430 | 0.0271 |
| 0.0671 | 11.16 | 47440 | 0.0280 |
| 0.0002 | 11.16 | 47450 | 0.0284 |
| 0.0002 | 11.17 | 47460 | 0.0283 |
| 0.0001 | 11.17 | 47470 | 0.0283 |
| 0.0001 | 11.17 | 47480 | 0.0283 |
| 0.0 | 11.17 | 47490 | 0.0283 |
| 0.0001 | 11.18 | 47500 | 0.0284 |
| 0.3382 | 11.18 | 47510 | 0.0284 |
| 0.0001 | 11.18 | 47520 | 0.0284 |
| 0.0001 | 11.18 | 47530 | 0.0285 |
| 0.0001 | 11.19 | 47540 | 0.0286 |
| 0.0026 | 11.19 | 47550 | 0.0292 |
| 0.3058 | 11.19 | 47560 | 0.0293 |
| 0.0005 | 11.19 | 47570 | 0.0289 |
| 0.1036 | 11.2 | 47580 | 0.0289 |
| 0.0001 | 11.2 | 47590 | 0.0289 |
| 0.0512 | 11.2 | 47600 | 0.0298 |
| 0.0001 | 11.2 | 47610 | 0.0302 |
| 0.0165 | 11.2 | 47620 | 0.0298 |
| 0.0001 | 11.21 | 47630 | 0.0293 |
| 0.0001 | 11.21 | 47640 | 0.0292 |
| 0.0005 | 11.21 | 47650 | 0.0290 |
| 0.0 | 11.21 | 47660 | 0.0289 |
| 0.0001 | 11.22 | 47670 | 0.0289 |
| 0.0173 | 11.22 | 47680 | 0.0288 |
| 0.0 | 11.22 | 47690 | 0.0286 |
| 0.0656 | 11.22 | 47700 | 0.0281 |
| 0.0 | 11.23 | 47710 | 0.0279 |
| 0.0009 | 11.23 | 47720 | 0.0276 |
| 0.2918 | 11.23 | 47730 | 0.0277 |
| 0.0001 | 11.23 | 47740 | 0.0278 |
| 0.0 | 11.24 | 47750 | 0.0279 |
| 0.2811 | 11.24 | 47760 | 0.0278 |
| 0.0001 | 11.24 | 47770 | 0.0282 |
| 0.0001 | 11.24 | 47780 | 0.0283 |
| 0.0519 | 11.24 | 47790 | 0.0284 |
| 0.001 | 11.25 | 47800 | 0.0285 |
| 0.0001 | 11.25 | 47810 | 0.0292 |
| 0.3012 | 11.25 | 47820 | 0.0293 |
| 0.0001 | 11.25 | 47830 | 0.0292 |
| 0.0001 | 11.26 | 47840 | 0.0292 |
| 0.5438 | 11.26 | 47850 | 0.0280 |
| 0.0001 | 11.26 | 47860 | 0.0276 |
| 0.0167 | 11.26 | 47870 | 0.0273 |
| 0.0001 | 11.27 | 47880 | 0.0268 |
| 0.0055 | 11.27 | 47890 | 0.0266 |
| 0.0001 | 11.27 | 47900 | 0.0264 |
| 0.2282 | 11.27 | 47910 | 0.0261 |
| 0.0003 | 11.28 | 47920 | 0.0265 |
| 0.0001 | 11.28 | 47930 | 0.0268 |
| 0.0001 | 11.28 | 47940 | 0.0270 |
| 0.0001 | 11.28 | 47950 | 0.0272 |
| 0.0206 | 11.28 | 47960 | 0.0277 |
| 0.3603 | 11.29 | 47970 | 0.0287 |
| 0.0327 | 11.29 | 47980 | 0.0292 |
| 0.0001 | 11.29 | 47990 | 0.0294 |
| 0.0001 | 11.29 | 48000 | 0.0295 |
| 0.0001 | 11.3 | 48010 | 0.0296 |
| 0.2808 | 11.3 | 48020 | 0.0289 |
| 0.0001 | 11.3 | 48030 | 0.0282 |
| 0.0001 | 11.3 | 48040 | 0.0280 |
| 0.1453 | 11.31 | 48050 | 0.0274 |
| 0.252 | 11.31 | 48060 | 0.0258 |
| 0.0001 | 11.31 | 48070 | 0.0253 |
| 0.0001 | 11.31 | 48080 | 0.0252 |
| 0.0001 | 11.32 | 48090 | 0.0252 |
| 0.2323 | 11.32 | 48100 | 0.0250 |
| 0.0001 | 11.32 | 48110 | 0.0250 |
| 0.6421 | 11.32 | 48120 | 0.0245 |
| 0.0001 | 11.32 | 48130 | 0.0239 |
| 0.0001 | 11.33 | 48140 | 0.0237 |
| 0.0337 | 11.33 | 48150 | 0.0239 |
| 0.0001 | 11.33 | 48160 | 0.0241 |
| 0.2702 | 11.33 | 48170 | 0.0247 |
| 0.0001 | 11.34 | 48180 | 0.0251 |
| 0.0001 | 11.34 | 48190 | 0.0253 |
| 0.2491 | 11.34 | 48200 | 0.0254 |
| 0.0001 | 11.34 | 48210 | 0.0250 |
| 0.0001 | 11.35 | 48220 | 0.0250 |
| 0.0001 | 11.35 | 48230 | 0.0251 |
| 0.0054 | 11.35 | 48240 | 0.0248 |
| 0.0001 | 11.35 | 48250 | 0.0247 |
| 0.0001 | 11.36 | 48260 | 0.0247 |
| 0.2661 | 11.36 | 48270 | 0.0241 |
| 0.2227 | 11.36 | 48280 | 0.0238 |
| 0.0001 | 11.36 | 48290 | 0.0234 |
| 0.0002 | 11.36 | 48300 | 0.0236 |
| 0.0002 | 11.37 | 48310 | 0.0239 |
| 0.0185 | 11.37 | 48320 | 0.0234 |
| 0.0001 | 11.37 | 48330 | 0.0230 |
| 0.0001 | 11.37 | 48340 | 0.0229 |
| 0.0001 | 11.38 | 48350 | 0.0230 |
| 0.0001 | 11.38 | 48360 | 0.0230 |
| 0.0001 | 11.38 | 48370 | 0.0231 |
| 0.2035 | 11.38 | 48380 | 0.0233 |
| 0.0002 | 11.39 | 48390 | 0.0232 |
| 0.0001 | 11.39 | 48400 | 0.0234 |
| 0.3452 | 11.39 | 48410 | 0.0235 |
| 0.0001 | 11.39 | 48420 | 0.0240 |
| 0.0001 | 11.4 | 48430 | 0.0243 |
| 0.0912 | 11.4 | 48440 | 0.0238 |
| 0.211 | 11.4 | 48450 | 0.0237 |
| 0.4252 | 11.4 | 48460 | 0.0234 |
| 0.2042 | 11.4 | 48470 | 0.0230 |
| 0.0179 | 11.41 | 48480 | 0.0228 |
| 0.0821 | 11.41 | 48490 | 0.0227 |
| 0.0001 | 11.41 | 48500 | 0.0233 |
| 0.0001 | 11.41 | 48510 | 0.0236 |
| 0.0818 | 11.42 | 48520 | 0.0240 |
| 0.0002 | 11.42 | 48530 | 0.0248 |
| 0.001 | 11.42 | 48540 | 0.0256 |
| 0.0001 | 11.42 | 48550 | 0.0265 |
| 0.0001 | 11.43 | 48560 | 0.0268 |
| 0.0001 | 11.43 | 48570 | 0.0270 |
| 0.271 | 11.43 | 48580 | 0.0268 |
| 0.0523 | 11.43 | 48590 | 0.0265 |
| 0.0001 | 11.44 | 48600 | 0.0267 |
| 0.4182 | 11.44 | 48610 | 0.0254 |
| 0.158 | 11.44 | 48620 | 0.0249 |
| 0.0356 | 11.44 | 48630 | 0.0240 |
| 0.0001 | 11.44 | 48640 | 0.0238 |
| 0.0001 | 11.45 | 48650 | 0.0237 |
| 0.0001 | 11.45 | 48660 | 0.0240 |
| 0.2174 | 11.45 | 48670 | 0.0241 |
| 0.0001 | 11.45 | 48680 | 0.0238 |
| 0.0001 | 11.46 | 48690 | 0.0237 |
| 0.0002 | 11.46 | 48700 | 0.0238 |
| 0.0023 | 11.46 | 48710 | 0.0243 |
| 0.0426 | 11.46 | 48720 | 0.0237 |
| 0.2115 | 11.47 | 48730 | 0.0232 |
| 0.0042 | 11.47 | 48740 | 0.0228 |
| 0.0001 | 11.47 | 48750 | 0.0228 |
| 0.1068 | 11.47 | 48760 | 0.0225 |
| 0.0001 | 11.48 | 48770 | 0.0222 |
| 0.0001 | 11.48 | 48780 | 0.0222 |
| 0.0053 | 11.48 | 48790 | 0.0224 |
| 0.0001 | 11.48 | 48800 | 0.0226 |
| 0.3552 | 11.48 | 48810 | 0.0232 |
| 0.0001 | 11.49 | 48820 | 0.0236 |
| 0.0002 | 11.49 | 48830 | 0.0238 |
| 0.0001 | 11.49 | 48840 | 0.0239 |
| 0.2776 | 11.49 | 48850 | 0.0238 |
| 0.0002 | 11.5 | 48860 | 0.0246 |
| 0.0712 | 11.5 | 48870 | 0.0245 |
| 0.0001 | 11.5 | 48880 | 0.0245 |
| 0.0134 | 11.5 | 48890 | 0.0242 |
| 0.0001 | 11.51 | 48900 | 0.0241 |
| 0.0003 | 11.51 | 48910 | 0.0248 |
| 0.0876 | 11.51 | 48920 | 0.0260 |
| 0.5687 | 11.51 | 48930 | 0.0263 |
| 0.0001 | 11.52 | 48940 | 0.0264 |
| 0.1322 | 11.52 | 48950 | 0.0259 |
| 0.0001 | 11.52 | 48960 | 0.0255 |
| 0.0001 | 11.52 | 48970 | 0.0255 |
| 0.0001 | 11.52 | 48980 | 0.0255 |
| 0.0149 | 11.53 | 48990 | 0.0260 |
| 0.2402 | 11.53 | 49000 | 0.0261 |
| 0.0001 | 11.53 | 49010 | 0.0259 |
| 0.112 | 11.53 | 49020 | 0.0268 |
| 0.3141 | 11.54 | 49030 | 0.0279 |
| 0.0003 | 11.54 | 49040 | 0.0284 |
| 0.0001 | 11.54 | 49050 | 0.0288 |
| 0.6594 | 11.54 | 49060 | 0.0286 |
| 0.0114 | 11.55 | 49070 | 0.0280 |
| 0.0001 | 11.55 | 49080 | 0.0278 |
| 0.0007 | 11.55 | 49090 | 0.0278 |
| 0.0001 | 11.55 | 49100 | 0.0279 |
| 0.1053 | 11.56 | 49110 | 0.0273 |
| 0.5597 | 11.56 | 49120 | 0.0261 |
| 0.0001 | 11.56 | 49130 | 0.0256 |
| 0.0001 | 11.56 | 49140 | 0.0254 |
| 0.1291 | 11.56 | 49150 | 0.0249 |
| 0.0001 | 11.57 | 49160 | 0.0245 |
| 0.4801 | 11.57 | 49170 | 0.0238 |
| 0.0008 | 11.57 | 49180 | 0.0237 |
| 0.0116 | 11.57 | 49190 | 0.0238 |
| 0.0001 | 11.58 | 49200 | 0.0243 |
| 0.1873 | 11.58 | 49210 | 0.0244 |
| 0.0001 | 11.58 | 49220 | 0.0243 |
| 0.0302 | 11.58 | 49230 | 0.0250 |
| 0.0001 | 11.59 | 49240 | 0.0254 |
| 0.0026 | 11.59 | 49250 | 0.0261 |
| 0.2548 | 11.59 | 49260 | 0.0265 |
| 0.0002 | 11.59 | 49270 | 0.0268 |
| 0.0002 | 11.6 | 49280 | 0.0269 |
| 0.0001 | 11.6 | 49290 | 0.0269 |
| 0.3291 | 11.6 | 49300 | 0.0262 |
| 0.0001 | 11.6 | 49310 | 0.0256 |
| 0.0034 | 11.6 | 49320 | 0.0260 |
| 0.2501 | 11.61 | 49330 | 0.0252 |
| 0.0001 | 11.61 | 49340 | 0.0246 |
| 0.0001 | 11.61 | 49350 | 0.0244 |
| 0.0001 | 11.61 | 49360 | 0.0243 |
| 0.0001 | 11.62 | 49370 | 0.0243 |
| 0.0004 | 11.62 | 49380 | 0.0246 |
| 0.0002 | 11.62 | 49390 | 0.0251 |
| 0.0001 | 11.62 | 49400 | 0.0254 |
| 0.0001 | 11.63 | 49410 | 0.0256 |
| 0.0001 | 11.63 | 49420 | 0.0258 |
| 0.5139 | 11.63 | 49430 | 0.0253 |
| 0.0003 | 11.63 | 49440 | 0.0254 |
| 0.0005 | 11.64 | 49450 | 0.0260 |
| 0.0001 | 11.64 | 49460 | 0.0263 |
| 0.0001 | 11.64 | 49470 | 0.0264 |
| 0.0001 | 11.64 | 49480 | 0.0265 |
| 0.0001 | 11.64 | 49490 | 0.0264 |
| 0.0651 | 11.65 | 49500 | 0.0270 |
| 0.0001 | 11.65 | 49510 | 0.0276 |
| 0.1773 | 11.65 | 49520 | 0.0271 |
| 0.0001 | 11.65 | 49530 | 0.0264 |
| 0.0001 | 11.66 | 49540 | 0.0262 |
| 0.0002 | 11.66 | 49550 | 0.0261 |
| 0.0001 | 11.66 | 49560 | 0.0261 |
| 0.0001 | 11.66 | 49570 | 0.0262 |
| 0.4277 | 11.67 | 49580 | 0.0254 |
| 0.0145 | 11.67 | 49590 | 0.0242 |
| 0.2859 | 11.67 | 49600 | 0.0235 |
| 0.0001 | 11.67 | 49610 | 0.0229 |
| 0.1118 | 11.68 | 49620 | 0.0230 |
| 0.0031 | 11.68 | 49630 | 0.0240 |
| 0.0001 | 11.68 | 49640 | 0.0246 |
| 0.0 | 11.68 | 49650 | 0.0249 |
| 0.2677 | 11.68 | 49660 | 0.0252 |
| 0.0005 | 11.69 | 49670 | 0.0249 |
| 0.5963 | 11.69 | 49680 | 0.0228 |
| 0.0422 | 11.69 | 49690 | 0.0212 |
| 0.2657 | 11.69 | 49700 | 0.0203 |
| 0.0004 | 11.7 | 49710 | 0.0207 |
| 0.0001 | 11.7 | 49720 | 0.0207 |
| 0.3541 | 11.7 | 49730 | 0.0205 |
| 0.0001 | 11.7 | 49740 | 0.0203 |
| 0.0001 | 11.71 | 49750 | 0.0202 |
| 0.2072 | 11.71 | 49760 | 0.0205 |
| 0.0002 | 11.71 | 49770 | 0.0208 |
| 0.0343 | 11.71 | 49780 | 0.0203 |
| 0.0002 | 11.72 | 49790 | 0.0201 |
| 0.4527 | 11.72 | 49800 | 0.0199 |
| 0.0001 | 11.72 | 49810 | 0.0198 |
| 0.0665 | 11.72 | 49820 | 0.0196 |
| 0.0172 | 11.72 | 49830 | 0.0193 |
| 0.0 | 11.73 | 49840 | 0.0193 |
| 0.0001 | 11.73 | 49850 | 0.0193 |
| 0.0002 | 11.73 | 49860 | 0.0194 |
| 0.0001 | 11.73 | 49870 | 0.0196 |
| 0.0001 | 11.74 | 49880 | 0.0198 |
| 0.0112 | 11.74 | 49890 | 0.0201 |
| 0.2585 | 11.74 | 49900 | 0.0199 |
| 0.3218 | 11.74 | 49910 | 0.0195 |
| 0.2643 | 11.75 | 49920 | 0.0192 |
| 0.0001 | 11.75 | 49930 | 0.0191 |
| 0.0002 | 11.75 | 49940 | 0.0191 |
| 0.0009 | 11.75 | 49950 | 0.0194 |
| 0.0002 | 11.76 | 49960 | 0.0199 |
| 0.1389 | 11.76 | 49970 | 0.0203 |
| 0.0001 | 11.76 | 49980 | 0.0205 |
| 0.0001 | 11.76 | 49990 | 0.0206 |
| 0.0095 | 11.76 | 50000 | 0.0209 |
| 0.0001 | 11.77 | 50010 | 0.0214 |
| 0.0001 | 11.77 | 50020 | 0.0217 |
| 0.4765 | 11.77 | 50030 | 0.0218 |
| 0.0001 | 11.77 | 50040 | 0.0216 |
| 0.2976 | 11.78 | 50050 | 0.0208 |
| 0.0006 | 11.78 | 50060 | 0.0208 |
| 0.0012 | 11.78 | 50070 | 0.0215 |
| 0.0761 | 11.78 | 50080 | 0.0222 |
| 0.0001 | 11.79 | 50090 | 0.0229 |
| 0.0001 | 11.79 | 50100 | 0.0232 |
| 0.0 | 11.79 | 50110 | 0.0235 |
| 0.0001 | 11.79 | 50120 | 0.0236 |
| 0.0001 | 11.8 | 50130 | 0.0237 |
| 0.1188 | 11.8 | 50140 | 0.0231 |
| 0.0 | 11.8 | 50150 | 0.0228 |
| 0.0001 | 11.8 | 50160 | 0.0230 |
| 0.3238 | 11.8 | 50170 | 0.0226 |
| 0.2185 | 11.81 | 50180 | 0.0218 |
| 0.0001 | 11.81 | 50190 | 0.0213 |
| 0.0003 | 11.81 | 50200 | 0.0215 |
| 0.0554 | 11.81 | 50210 | 0.0211 |
| 0.0002 | 11.82 | 50220 | 0.0208 |
| 0.0001 | 11.82 | 50230 | 0.0208 |
| 0.118 | 11.82 | 50240 | 0.0204 |
| 0.0001 | 11.82 | 50250 | 0.0203 |
| 0.2577 | 11.83 | 50260 | 0.0200 |
| 0.0002 | 11.83 | 50270 | 0.0198 |
| 0.0001 | 11.83 | 50280 | 0.0198 |
| 0.0869 | 11.83 | 50290 | 0.0195 |
| 0.215 | 11.84 | 50300 | 0.0194 |
| 0.0299 | 11.84 | 50310 | 0.0190 |
| 0.2625 | 11.84 | 50320 | 0.0186 |
| 0.0001 | 11.84 | 50330 | 0.0185 |
| 0.1004 | 11.84 | 50340 | 0.0185 |
| 0.0001 | 11.85 | 50350 | 0.0187 |
| 0.0058 | 11.85 | 50360 | 0.0187 |
| 0.2726 | 11.85 | 50370 | 0.0185 |
| 0.0001 | 11.85 | 50380 | 0.0186 |
| 0.0122 | 11.86 | 50390 | 0.0187 |
| 0.0001 | 11.86 | 50400 | 0.0187 |
| 0.2623 | 11.86 | 50410 | 0.0185 |
| 0.0001 | 11.86 | 50420 | 0.0185 |
| 0.0001 | 11.87 | 50430 | 0.0185 |
| 0.0001 | 11.87 | 50440 | 0.0186 |
| 0.0552 | 11.87 | 50450 | 0.0187 |
| 0.0001 | 11.87 | 50460 | 0.0187 |
| 0.0001 | 11.88 | 50470 | 0.0188 |
| 0.0001 | 11.88 | 50480 | 0.0189 |
| 0.0001 | 11.88 | 50490 | 0.0189 |
| 0.0199 | 11.88 | 50500 | 0.0190 |
| 0.0001 | 11.88 | 50510 | 0.0192 |
| 0.0001 | 11.89 | 50520 | 0.0193 |
| 0.0005 | 11.89 | 50530 | 0.0194 |
| 0.0001 | 11.89 | 50540 | 0.0197 |
| 0.0001 | 11.89 | 50550 | 0.0199 |
| 0.0001 | 11.9 | 50560 | 0.0200 |
| 0.0006 | 11.9 | 50570 | 0.0202 |
| 0.0007 | 11.9 | 50580 | 0.0204 |
| 0.0 | 11.9 | 50590 | 0.0205 |
| 0.0823 | 11.91 | 50600 | 0.0205 |
| 0.0005 | 11.91 | 50610 | 0.0211 |
| 0.0001 | 11.91 | 50620 | 0.0214 |
| 0.2023 | 11.91 | 50630 | 0.0210 |
| 0.0001 | 11.92 | 50640 | 0.0209 |
| 0.0001 | 11.92 | 50650 | 0.0209 |
| 0.0555 | 11.92 | 50660 | 0.0211 |
| 0.0 | 11.92 | 50670 | 0.0213 |
| 0.0001 | 11.92 | 50680 | 0.0215 |
| 0.0005 | 11.93 | 50690 | 0.0216 |
| 0.0077 | 11.93 | 50700 | 0.0221 |
| 0.0001 | 11.93 | 50710 | 0.0223 |
| 0.0001 | 11.93 | 50720 | 0.0225 |
| 0.2675 | 11.94 | 50730 | 0.0221 |
| 0.0001 | 11.94 | 50740 | 0.0220 |
| 0.0001 | 11.94 | 50750 | 0.0220 |
| 0.0001 | 11.94 | 50760 | 0.0221 |
| 0.2036 | 11.95 | 50770 | 0.0216 |
| 0.0001 | 11.95 | 50780 | 0.0213 |
| 0.0001 | 11.95 | 50790 | 0.0213 |
| 0.2601 | 11.95 | 50800 | 0.0213 |
| 0.0001 | 11.96 | 50810 | 0.0214 |
| 0.0001 | 11.96 | 50820 | 0.0215 |
| 0.0691 | 11.96 | 50830 | 0.0211 |
| 0.0254 | 11.96 | 50840 | 0.0209 |
| 0.0001 | 11.96 | 50850 | 0.0210 |
| 0.4226 | 11.97 | 50860 | 0.0206 |
| 0.0001 | 11.97 | 50870 | 0.0203 |
| 0.0001 | 11.97 | 50880 | 0.0203 |
| 0.1774 | 11.97 | 50890 | 0.0204 |
| 0.0001 | 11.98 | 50900 | 0.0205 |
| 0.0701 | 11.98 | 50910 | 0.0204 |
| 0.0022 | 11.98 | 50920 | 0.0204 |
| 0.0001 | 11.98 | 50930 | 0.0203 |
| 0.4662 | 11.99 | 50940 | 0.0201 |
| 0.0001 | 11.99 | 50950 | 0.0200 |
| 0.0001 | 11.99 | 50960 | 0.0201 |
| 0.0001 | 11.99 | 50970 | 0.0201 |
| 0.0001 | 12.0 | 50980 | 0.0202 |
| 0.0002 | 12.0 | 50990 | 0.0204 |
| 0.1614 | 12.0 | 51000 | 0.0205 |
| 0.2377 | 12.0 | 51010 | 0.0203 |
| 0.0001 | 12.0 | 51020 | 0.0203 |
| 0.0068 | 12.01 | 51030 | 0.0203 |
| 0.2275 | 12.01 | 51040 | 0.0204 |
| 0.0001 | 12.01 | 51050 | 0.0207 |
| 0.0843 | 12.01 | 51060 | 0.0211 |
| 0.0161 | 12.02 | 51070 | 0.0214 |
| 0.0001 | 12.02 | 51080 | 0.0210 |
| 0.0175 | 12.02 | 51090 | 0.0207 |
| 0.4784 | 12.02 | 51100 | 0.0205 |
| 0.293 | 12.03 | 51110 | 0.0206 |
| 0.0001 | 12.03 | 51120 | 0.0207 |
| 0.0002 | 12.03 | 51130 | 0.0209 |
| 0.0001 | 12.03 | 51140 | 0.0210 |
| 0.3097 | 12.04 | 51150 | 0.0210 |
| 0.0001 | 12.04 | 51160 | 0.0209 |
| 0.2333 | 12.04 | 51170 | 0.0206 |
| 0.0001 | 12.04 | 51180 | 0.0205 |
| 0.0001 | 12.04 | 51190 | 0.0206 |
| 0.228 | 12.05 | 51200 | 0.0203 |
| 0.0001 | 12.05 | 51210 | 0.0203 |
| 0.1808 | 12.05 | 51220 | 0.0205 |
| 0.0094 | 12.05 | 51230 | 0.0211 |
| 0.0002 | 12.06 | 51240 | 0.0211 |
| 0.3041 | 12.06 | 51250 | 0.0208 |
| 0.0073 | 12.06 | 51260 | 0.0210 |
| 0.0002 | 12.06 | 51270 | 0.0213 |
| 0.2418 | 12.07 | 51280 | 0.0214 |
| 0.3057 | 12.07 | 51290 | 0.0212 |
| 0.8319 | 12.07 | 51300 | 0.0207 |
| 0.0001 | 12.07 | 51310 | 0.0205 |
| 0.0185 | 12.08 | 51320 | 0.0202 |
| 0.2445 | 12.08 | 51330 | 0.0200 |
| 0.0003 | 12.08 | 51340 | 0.0204 |
| 0.0002 | 12.08 | 51350 | 0.0207 |
| 0.037 | 12.08 | 51360 | 0.0209 |
| 0.2184 | 12.09 | 51370 | 0.0205 |
| 0.0003 | 12.09 | 51380 | 0.0208 |
| 0.0001 | 12.09 | 51390 | 0.0209 |
| 0.0001 | 12.09 | 51400 | 0.0210 |
| 0.0926 | 12.1 | 51410 | 0.0205 |
| 0.0001 | 12.1 | 51420 | 0.0201 |
| 0.2378 | 12.1 | 51430 | 0.0199 |
| 0.103 | 12.1 | 51440 | 0.0197 |
| 0.0001 | 12.11 | 51450 | 0.0197 |
| 0.0006 | 12.11 | 51460 | 0.0197 |
| 0.0001 | 12.11 | 51470 | 0.0197 |
| 0.0001 | 12.11 | 51480 | 0.0198 |
| 0.0001 | 12.12 | 51490 | 0.0198 |
| 0.0001 | 12.12 | 51500 | 0.0200 |
| 0.0001 | 12.12 | 51510 | 0.0202 |
| 0.0014 | 12.12 | 51520 | 0.0205 |
| 0.0001 | 12.12 | 51530 | 0.0213 |
| 0.0409 | 12.13 | 51540 | 0.0215 |
| 0.2691 | 12.13 | 51550 | 0.0213 |
| 0.0004 | 12.13 | 51560 | 0.0213 |
| 0.0099 | 12.13 | 51570 | 0.0212 |
| 0.4848 | 12.14 | 51580 | 0.0211 |
| 0.0 | 12.14 | 51590 | 0.0209 |
| 0.0001 | 12.14 | 51600 | 0.0208 |
| 0.0007 | 12.14 | 51610 | 0.0209 |
| 0.0001 | 12.15 | 51620 | 0.0215 |
| 0.0001 | 12.15 | 51630 | 0.0217 |
| 0.0 | 12.15 | 51640 | 0.0218 |
| 0.0001 | 12.15 | 51650 | 0.0218 |
| 0.0001 | 12.16 | 51660 | 0.0219 |
| 0.0001 | 12.16 | 51670 | 0.0219 |
| 0.0664 | 12.16 | 51680 | 0.0220 |
| 0.2493 | 12.16 | 51690 | 0.0218 |
| 0.0001 | 12.16 | 51700 | 0.0219 |
| 0.1226 | 12.17 | 51710 | 0.0221 |
| 0.0001 | 12.17 | 51720 | 0.0224 |
| 0.0001 | 12.17 | 51730 | 0.0225 |
| 0.0001 | 12.17 | 51740 | 0.0226 |
| 0.5193 | 12.18 | 51750 | 0.0224 |
| 0.0001 | 12.18 | 51760 | 0.0228 |
| 0.0001 | 12.18 | 51770 | 0.0230 |
| 0.0915 | 12.18 | 51780 | 0.0225 |
| 0.0001 | 12.19 | 51790 | 0.0223 |
| 0.1932 | 12.19 | 51800 | 0.0219 |
| 0.0 | 12.19 | 51810 | 0.0216 |
| 0.0498 | 12.19 | 51820 | 0.0213 |
| 0.0001 | 12.2 | 51830 | 0.0211 |
| 0.0001 | 12.2 | 51840 | 0.0211 |
| 0.11 | 12.2 | 51850 | 0.0209 |
| 0.005 | 12.2 | 51860 | 0.0206 |
| 0.2763 | 12.2 | 51870 | 0.0202 |
| 0.0001 | 12.21 | 51880 | 0.0201 |
| 0.0013 | 12.21 | 51890 | 0.0200 |
| 0.0071 | 12.21 | 51900 | 0.0198 |
| 0.0017 | 12.21 | 51910 | 0.0198 |
| 0.0001 | 12.22 | 51920 | 0.0198 |
| 0.001 | 12.22 | 51930 | 0.0206 |
| 0.0001 | 12.22 | 51940 | 0.0210 |
| 0.2983 | 12.22 | 51950 | 0.0207 |
| 0.0001 | 12.23 | 51960 | 0.0205 |
| 0.0001 | 12.23 | 51970 | 0.0204 |
| 0.0001 | 12.23 | 51980 | 0.0204 |
| 0.0076 | 12.23 | 51990 | 0.0205 |
| 0.0001 | 12.24 | 52000 | 0.0205 |
| 0.162 | 12.24 | 52010 | 0.0208 |
| 0.0001 | 12.24 | 52020 | 0.0210 |
| 0.0002 | 12.24 | 52030 | 0.0214 |
| 0.0444 | 12.24 | 52040 | 0.0218 |
| 0.0001 | 12.25 | 52050 | 0.0220 |
| 0.0 | 12.25 | 52060 | 0.0221 |
| 0.0001 | 12.25 | 52070 | 0.0222 |
| 0.0263 | 12.25 | 52080 | 0.0229 |
| 0.0 | 12.26 | 52090 | 0.0231 |
| 0.0222 | 12.26 | 52100 | 0.0232 |
| 0.1039 | 12.26 | 52110 | 0.0230 |
| 0.0 | 12.26 | 52120 | 0.0226 |
| 0.3523 | 12.27 | 52130 | 0.0228 |
| 0.1961 | 12.27 | 52140 | 0.0229 |
| 0.0 | 12.27 | 52150 | 0.0229 |
| 0.2073 | 12.27 | 52160 | 0.0228 |
| 0.0001 | 12.28 | 52170 | 0.0226 |
| 0.0001 | 12.28 | 52180 | 0.0225 |
| 0.0002 | 12.28 | 52190 | 0.0223 |
| 0.0073 | 12.28 | 52200 | 0.0218 |
| 0.0009 | 12.28 | 52210 | 0.0216 |
| 0.337 | 12.29 | 52220 | 0.0214 |
| 0.0001 | 12.29 | 52230 | 0.0214 |
| 0.1605 | 12.29 | 52240 | 0.0209 |
| 0.0305 | 12.29 | 52250 | 0.0203 |
| 0.0001 | 12.3 | 52260 | 0.0201 |
| 0.0655 | 12.3 | 52270 | 0.0204 |
| 0.0001 | 12.3 | 52280 | 0.0206 |
| 0.0006 | 12.3 | 52290 | 0.0206 |
| 0.0001 | 12.31 | 52300 | 0.0206 |
| 0.0001 | 12.31 | 52310 | 0.0206 |
| 0.0001 | 12.31 | 52320 | 0.0207 |
| 0.0037 | 12.31 | 52330 | 0.0210 |
| 0.0 | 12.32 | 52340 | 0.0212 |
| 0.0001 | 12.32 | 52350 | 0.0213 |
| 0.0001 | 12.32 | 52360 | 0.0213 |
| 0.0001 | 12.32 | 52370 | 0.0214 |
| 0.3049 | 12.32 | 52380 | 0.0217 |
| 0.0001 | 12.33 | 52390 | 0.0217 |
| 0.0001 | 12.33 | 52400 | 0.0217 |
| 0.0002 | 12.33 | 52410 | 0.0218 |
| 0.0001 | 12.33 | 52420 | 0.0219 |
| 0.0001 | 12.34 | 52430 | 0.0219 |
| 0.0686 | 12.34 | 52440 | 0.0221 |
| 0.0001 | 12.34 | 52450 | 0.0222 |
| 0.0627 | 12.34 | 52460 | 0.0225 |
| 0.3267 | 12.35 | 52470 | 0.0224 |
| 0.0001 | 12.35 | 52480 | 0.0222 |
| 0.0001 | 12.35 | 52490 | 0.0222 |
| 0.0003 | 12.35 | 52500 | 0.0222 |
| 0.0001 | 12.36 | 52510 | 0.0224 |
| 0.2599 | 12.36 | 52520 | 0.0220 |
| 0.288 | 12.36 | 52530 | 0.0218 |
| 0.0876 | 12.36 | 52540 | 0.0218 |
| 0.4426 | 12.36 | 52550 | 0.0217 |
| 0.0001 | 12.37 | 52560 | 0.0215 |
| 0.0005 | 12.37 | 52570 | 0.0215 |
| 0.2547 | 12.37 | 52580 | 0.0214 |
| 0.0001 | 12.37 | 52590 | 0.0212 |
| 0.0001 | 12.38 | 52600 | 0.0212 |
| 0.0005 | 12.38 | 52610 | 0.0216 |
| 0.0145 | 12.38 | 52620 | 0.0212 |
| 0.0001 | 12.38 | 52630 | 0.0210 |
| 0.0001 | 12.39 | 52640 | 0.0211 |
| 0.0629 | 12.39 | 52650 | 0.0211 |
| 0.034 | 12.39 | 52660 | 0.0211 |
| 0.0001 | 12.39 | 52670 | 0.0211 |
| 0.0463 | 12.4 | 52680 | 0.0213 |
| 0.0367 | 12.4 | 52690 | 0.0216 |
| 0.223 | 12.4 | 52700 | 0.0212 |
| 0.0001 | 12.4 | 52710 | 0.0208 |
| 0.0029 | 12.4 | 52720 | 0.0209 |
| 0.2097 | 12.41 | 52730 | 0.0208 |
| 0.1916 | 12.41 | 52740 | 0.0205 |
| 0.0001 | 12.41 | 52750 | 0.0204 |
| 0.6506 | 12.41 | 52760 | 0.0202 |
| 0.003 | 12.42 | 52770 | 0.0202 |
| 0.0001 | 12.42 | 52780 | 0.0211 |
| 0.0001 | 12.42 | 52790 | 0.0215 |
| 0.0002 | 12.42 | 52800 | 0.0218 |
| 0.0001 | 12.43 | 52810 | 0.0220 |
| 0.0234 | 12.43 | 52820 | 0.0221 |
| 0.0001 | 12.43 | 52830 | 0.0222 |
| 0.0292 | 12.43 | 52840 | 0.0222 |
| 0.4242 | 12.44 | 52850 | 0.0218 |
| 0.2847 | 12.44 | 52860 | 0.0213 |
| 0.0604 | 12.44 | 52870 | 0.0211 |
| 0.0001 | 12.44 | 52880 | 0.0207 |
| 0.0001 | 12.44 | 52890 | 0.0207 |
| 0.0001 | 12.45 | 52900 | 0.0207 |
| 0.3691 | 12.45 | 52910 | 0.0207 |
| 0.0042 | 12.45 | 52920 | 0.0205 |
| 0.0519 | 12.45 | 52930 | 0.0203 |
| 0.0269 | 12.46 | 52940 | 0.0203 |
| 0.0002 | 12.46 | 52950 | 0.0202 |
| 0.0004 | 12.46 | 52960 | 0.0201 |
| 0.0378 | 12.46 | 52970 | 0.0201 |
| 0.0375 | 12.47 | 52980 | 0.0200 |
| 0.0 | 12.47 | 52990 | 0.0201 |
| 0.0001 | 12.47 | 53000 | 0.0201 |
| 0.0001 | 12.47 | 53010 | 0.0202 |
| 0.0852 | 12.48 | 53020 | 0.0205 |
| 0.0056 | 12.48 | 53030 | 0.0203 |
| 0.0305 | 12.48 | 53040 | 0.0201 |
| 0.0001 | 12.48 | 53050 | 0.0201 |
| 0.0157 | 12.48 | 53060 | 0.0201 |
| 0.3017 | 12.49 | 53070 | 0.0201 |
| 0.175 | 12.49 | 53080 | 0.0201 |
| 0.0003 | 12.49 | 53090 | 0.0199 |
| 0.017 | 12.49 | 53100 | 0.0197 |
| 0.0188 | 12.5 | 53110 | 0.0197 |
| 0.1098 | 12.5 | 53120 | 0.0200 |
| 0.5271 | 12.5 | 53130 | 0.0202 |
| 0.0005 | 12.5 | 53140 | 0.0199 |
| 0.0002 | 12.51 | 53150 | 0.0200 |
| 0.0001 | 12.51 | 53160 | 0.0201 |
| 0.0004 | 12.51 | 53170 | 0.0202 |
| 0.0001 | 12.51 | 53180 | 0.0203 |
| 0.3061 | 12.52 | 53190 | 0.0204 |
| 0.001 | 12.52 | 53200 | 0.0209 |
| 0.0212 | 12.52 | 53210 | 0.0212 |
| 0.0001 | 12.52 | 53220 | 0.0213 |
| 0.0001 | 12.52 | 53230 | 0.0215 |
| 0.0 | 12.53 | 53240 | 0.0216 |
| 0.003 | 12.53 | 53250 | 0.0218 |
| 0.0001 | 12.53 | 53260 | 0.0219 |
| 0.0 | 12.53 | 53270 | 0.0220 |
| 0.5542 | 12.54 | 53280 | 0.0217 |
| 0.0002 | 12.54 | 53290 | 0.0214 |
| 0.0001 | 12.54 | 53300 | 0.0215 |
| 0.0001 | 12.54 | 53310 | 0.0216 |
| 0.0001 | 12.55 | 53320 | 0.0217 |
| 0.0001 | 12.55 | 53330 | 0.0218 |
| 0.0 | 12.55 | 53340 | 0.0219 |
| 0.0001 | 12.55 | 53350 | 0.0219 |
| 0.1752 | 12.56 | 53360 | 0.0215 |
| 0.0484 | 12.56 | 53370 | 0.0212 |
| 0.0 | 12.56 | 53380 | 0.0211 |
| 0.0001 | 12.56 | 53390 | 0.0211 |
| 0.0001 | 12.56 | 53400 | 0.0211 |
| 0.0001 | 12.57 | 53410 | 0.0211 |
| 0.0043 | 12.57 | 53420 | 0.0212 |
| 0.0352 | 12.57 | 53430 | 0.0214 |
| 0.0239 | 12.57 | 53440 | 0.0215 |
| 0.0001 | 12.58 | 53450 | 0.0216 |
| 0.0021 | 12.58 | 53460 | 0.0216 |
| 0.0 | 12.58 | 53470 | 0.0217 |
| 0.0001 | 12.58 | 53480 | 0.0217 |
| 0.3272 | 12.59 | 53490 | 0.0216 |
| 0.0001 | 12.59 | 53500 | 0.0216 |
| 0.0354 | 12.59 | 53510 | 0.0216 |
| 0.0001 | 12.59 | 53520 | 0.0215 |
| 0.0098 | 12.6 | 53530 | 0.0214 |
| 0.0002 | 12.6 | 53540 | 0.0214 |
| 0.0 | 12.6 | 53550 | 0.0217 |
| 0.0833 | 12.6 | 53560 | 0.0214 |
| 0.0001 | 12.6 | 53570 | 0.0208 |
| 0.0001 | 12.61 | 53580 | 0.0207 |
| 0.0004 | 12.61 | 53590 | 0.0208 |
| 0.0001 | 12.61 | 53600 | 0.0210 |
| 0.0495 | 12.61 | 53610 | 0.0212 |
| 0.0001 | 12.62 | 53620 | 0.0213 |
| 0.0499 | 12.62 | 53630 | 0.0213 |
| 0.0039 | 12.62 | 53640 | 0.0218 |
| 0.0001 | 12.62 | 53650 | 0.0221 |
| 0.278 | 12.63 | 53660 | 0.0223 |
| 0.1032 | 12.63 | 53670 | 0.0221 |
| 0.283 | 12.63 | 53680 | 0.0216 |
| 0.0001 | 12.63 | 53690 | 0.0213 |
| 0.0 | 12.64 | 53700 | 0.0212 |
| 0.0001 | 12.64 | 53710 | 0.0212 |
| 0.0027 | 12.64 | 53720 | 0.0212 |
| 0.0208 | 12.64 | 53730 | 0.0212 |
| 0.3517 | 12.64 | 53740 | 0.0212 |
| 0.0 | 12.65 | 53750 | 0.0212 |
| 0.0001 | 12.65 | 53760 | 0.0213 |
| 0.0001 | 12.65 | 53770 | 0.0215 |
| 0.0001 | 12.65 | 53780 | 0.0217 |
| 0.0 | 12.66 | 53790 | 0.0218 |
| 0.0 | 12.66 | 53800 | 0.0218 |
| 0.0 | 12.66 | 53810 | 0.0219 |
| 0.3232 | 12.66 | 53820 | 0.0217 |
| 0.0001 | 12.67 | 53830 | 0.0216 |
| 0.0004 | 12.67 | 53840 | 0.0217 |
| 0.0212 | 12.67 | 53850 | 0.0217 |
| 0.0003 | 12.67 | 53860 | 0.0221 |
| 0.1667 | 12.68 | 53870 | 0.0222 |
| 0.0 | 12.68 | 53880 | 0.0226 |
| 0.0001 | 12.68 | 53890 | 0.0229 |
| 0.0001 | 12.68 | 53900 | 0.0231 |
| 0.0311 | 12.68 | 53910 | 0.0225 |
| 0.6095 | 12.69 | 53920 | 0.0222 |
| 0.0 | 12.69 | 53930 | 0.0220 |
| 0.0001 | 12.69 | 53940 | 0.0220 |
| 0.0001 | 12.69 | 53950 | 0.0220 |
| 0.1703 | 12.7 | 53960 | 0.0218 |
| 0.0 | 12.7 | 53970 | 0.0217 |
| 0.0442 | 12.7 | 53980 | 0.0217 |
| 0.0001 | 12.7 | 53990 | 0.0217 |
| 0.0001 | 12.71 | 54000 | 0.0217 |
| 0.3161 | 12.71 | 54010 | 0.0216 |
| 0.1005 | 12.71 | 54020 | 0.0211 |
| 0.0001 | 12.71 | 54030 | 0.0209 |
| 0.0001 | 12.72 | 54040 | 0.0210 |
| 0.0584 | 12.72 | 54050 | 0.0211 |
| 0.0052 | 12.72 | 54060 | 0.0215 |
| 0.0001 | 12.72 | 54070 | 0.0219 |
| 0.0 | 12.72 | 54080 | 0.0221 |
| 0.3226 | 12.73 | 54090 | 0.0221 |
| 0.0001 | 12.73 | 54100 | 0.0220 |
| 0.0001 | 12.73 | 54110 | 0.0220 |
| 0.0001 | 12.73 | 54120 | 0.0220 |
| 0.0002 | 12.74 | 54130 | 0.0220 |
| 0.004 | 12.74 | 54140 | 0.0222 |
| 0.0547 | 12.74 | 54150 | 0.0233 |
| 0.0 | 12.74 | 54160 | 0.0238 |
| 0.0001 | 12.75 | 54170 | 0.0240 |
| 0.1631 | 12.75 | 54180 | 0.0236 |
| 0.287 | 12.75 | 54190 | 0.0227 |
| 0.0001 | 12.75 | 54200 | 0.0223 |
| 0.0002 | 12.76 | 54210 | 0.0224 |
| 0.0001 | 12.76 | 54220 | 0.0225 |
| 0.0001 | 12.76 | 54230 | 0.0226 |
| 0.0143 | 12.76 | 54240 | 0.0227 |
| 0.2873 | 12.76 | 54250 | 0.0225 |
| 0.0001 | 12.77 | 54260 | 0.0224 |
| 0.0001 | 12.77 | 54270 | 0.0224 |
| 0.0001 | 12.77 | 54280 | 0.0225 |
| 0.0001 | 12.77 | 54290 | 0.0225 |
| 0.0001 | 12.78 | 54300 | 0.0226 |
| 0.0001 | 12.78 | 54310 | 0.0227 |
| 0.3462 | 12.78 | 54320 | 0.0220 |
| 0.0015 | 12.78 | 54330 | 0.0213 |
| 0.4338 | 12.79 | 54340 | 0.0207 |
| 0.0001 | 12.79 | 54350 | 0.0201 |
| 0.3371 | 12.79 | 54360 | 0.0196 |
| 0.0329 | 12.79 | 54370 | 0.0192 |
| 0.0001 | 12.8 | 54380 | 0.0191 |
| 0.1219 | 12.8 | 54390 | 0.0187 |
| 0.0001 | 12.8 | 54400 | 0.0185 |
| 0.3022 | 12.8 | 54410 | 0.0180 |
| 0.0319 | 12.8 | 54420 | 0.0181 |
| 0.0002 | 12.81 | 54430 | 0.0183 |
| 0.0191 | 12.81 | 54440 | 0.0182 |
| 0.0002 | 12.81 | 54450 | 0.0183 |
| 0.0002 | 12.81 | 54460 | 0.0185 |
| 0.2519 | 12.82 | 54470 | 0.0186 |
| 0.0031 | 12.82 | 54480 | 0.0187 |
| 0.0001 | 12.82 | 54490 | 0.0190 |
| 0.0001 | 12.82 | 54500 | 0.0191 |
| 0.0003 | 12.83 | 54510 | 0.0193 |
| 0.0454 | 12.83 | 54520 | 0.0198 |
| 0.3396 | 12.83 | 54530 | 0.0199 |
| 0.2808 | 12.83 | 54540 | 0.0193 |
| 0.0002 | 12.84 | 54550 | 0.0192 |
| 0.0001 | 12.84 | 54560 | 0.0192 |
| 0.0002 | 12.84 | 54570 | 0.0193 |
| 0.0002 | 12.84 | 54580 | 0.0196 |
| 0.3185 | 12.84 | 54590 | 0.0197 |
| 0.0001 | 12.85 | 54600 | 0.0196 |
| 0.1269 | 12.85 | 54610 | 0.0191 |
| 0.0955 | 12.85 | 54620 | 0.0191 |
| 0.0001 | 12.85 | 54630 | 0.0193 |
| 0.0002 | 12.86 | 54640 | 0.0194 |
| 0.0001 | 12.86 | 54650 | 0.0195 |
| 0.0036 | 12.86 | 54660 | 0.0196 |
| 0.0001 | 12.86 | 54670 | 0.0197 |
| 0.0001 | 12.87 | 54680 | 0.0197 |
| 0.0001 | 12.87 | 54690 | 0.0198 |
| 0.0001 | 12.87 | 54700 | 0.0199 |
| 0.0001 | 12.87 | 54710 | 0.0200 |
| 0.0002 | 12.88 | 54720 | 0.0201 |
| 0.0001 | 12.88 | 54730 | 0.0202 |
| 0.0001 | 12.88 | 54740 | 0.0202 |
| 0.1924 | 12.88 | 54750 | 0.0205 |
| 0.1546 | 12.88 | 54760 | 0.0207 |
| 0.0004 | 12.89 | 54770 | 0.0203 |
| 0.0001 | 12.89 | 54780 | 0.0201 |
| 0.0001 | 12.89 | 54790 | 0.0201 |
| 0.0001 | 12.89 | 54800 | 0.0201 |
| 0.0364 | 12.9 | 54810 | 0.0198 |
| 0.0071 | 12.9 | 54820 | 0.0195 |
| 0.0001 | 12.9 | 54830 | 0.0195 |
| 0.0 | 12.9 | 54840 | 0.0195 |
| 0.0001 | 12.91 | 54850 | 0.0195 |
| 0.0001 | 12.91 | 54860 | 0.0196 |
| 0.232 | 12.91 | 54870 | 0.0193 |
| 0.0001 | 12.91 | 54880 | 0.0192 |
| 0.0 | 12.92 | 54890 | 0.0191 |
| 0.0001 | 12.92 | 54900 | 0.0192 |
| 0.0073 | 12.92 | 54910 | 0.0194 |
| 0.0006 | 12.92 | 54920 | 0.0200 |
| 0.0001 | 12.92 | 54930 | 0.0205 |
| 0.0 | 12.93 | 54940 | 0.0207 |
| 0.1023 | 12.93 | 54950 | 0.0202 |
| 0.0001 | 12.93 | 54960 | 0.0195 |
| 0.0238 | 12.93 | 54970 | 0.0194 |
| 0.0001 | 12.94 | 54980 | 0.0195 |
| 0.2478 | 12.94 | 54990 | 0.0196 |
| 0.0001 | 12.94 | 55000 | 0.0196 |
| 0.0001 | 12.94 | 55010 | 0.0197 |
| 0.0001 | 12.95 | 55020 | 0.0197 |
| 0.2955 | 12.95 | 55030 | 0.0194 |
| 0.0001 | 12.95 | 55040 | 0.0192 |
| 0.0001 | 12.95 | 55050 | 0.0192 |
| 0.0001 | 12.96 | 55060 | 0.0192 |
| 0.0741 | 12.96 | 55070 | 0.0187 |
| 0.0003 | 12.96 | 55080 | 0.0186 |
| 0.0 | 12.96 | 55090 | 0.0185 |
| 0.0001 | 12.96 | 55100 | 0.0186 |
| 0.2587 | 12.97 | 55110 | 0.0185 |
| 0.0078 | 12.97 | 55120 | 0.0181 |
| 0.0001 | 12.97 | 55130 | 0.0181 |
| 0.0001 | 12.97 | 55140 | 0.0182 |
| 0.0001 | 12.98 | 55150 | 0.0182 |
| 0.034 | 12.98 | 55160 | 0.0186 |
| 0.0001 | 12.98 | 55170 | 0.0194 |
| 0.0001 | 12.98 | 55180 | 0.0198 |
| 0.0615 | 12.99 | 55190 | 0.0199 |
| 0.0001 | 12.99 | 55200 | 0.0200 |
| 0.1477 | 12.99 | 55210 | 0.0196 |
| 0.0 | 12.99 | 55220 | 0.0193 |
| 0.0001 | 13.0 | 55230 | 0.0193 |
| 0.0 | 13.0 | 55240 | 0.0193 |
| 0.0005 | 13.0 | 55250 | 0.0197 |
| 0.0001 | 13.0 | 55260 | 0.0202 |
| 0.0001 | 13.0 | 55270 | 0.0204 |
| 0.0 | 13.01 | 55280 | 0.0205 |
| 0.018 | 13.01 | 55290 | 0.0206 |
| 0.0002 | 13.01 | 55300 | 0.0208 |
| 0.0 | 13.01 | 55310 | 0.0208 |
| 0.2563 | 13.02 | 55320 | 0.0207 |
| 0.0 | 13.02 | 55330 | 0.0203 |
| 0.2478 | 13.02 | 55340 | 0.0199 |
| 0.0 | 13.02 | 55350 | 0.0196 |
| 0.0106 | 13.03 | 55360 | 0.0191 |
| 0.0001 | 13.03 | 55370 | 0.0189 |
| 0.348 | 13.03 | 55380 | 0.0187 |
| 0.0 | 13.03 | 55390 | 0.0184 |
| 0.0 | 13.04 | 55400 | 0.0184 |
| 0.0501 | 13.04 | 55410 | 0.0183 |
| 0.0142 | 13.04 | 55420 | 0.0184 |
| 0.0005 | 13.04 | 55430 | 0.0186 |
| 0.0001 | 13.04 | 55440 | 0.0194 |
| 0.0001 | 13.05 | 55450 | 0.0197 |
| 0.0221 | 13.05 | 55460 | 0.0196 |
| 0.3554 | 13.05 | 55470 | 0.0193 |
| 0.0001 | 13.05 | 55480 | 0.0191 |
| 0.0976 | 13.06 | 55490 | 0.0191 |
| 0.0 | 13.06 | 55500 | 0.0191 |
| 0.0163 | 13.06 | 55510 | 0.0192 |
| 0.0066 | 13.06 | 55520 | 0.0195 |
| 0.0 | 13.07 | 55530 | 0.0197 |
| 0.7395 | 13.07 | 55540 | 0.0188 |
| 0.0001 | 13.07 | 55550 | 0.0184 |
| 0.0002 | 13.07 | 55560 | 0.0185 |
| 0.0001 | 13.08 | 55570 | 0.0186 |
| 0.0001 | 13.08 | 55580 | 0.0188 |
| 0.0001 | 13.08 | 55590 | 0.0189 |
| 0.0001 | 13.08 | 55600 | 0.0189 |
| 0.0001 | 13.08 | 55610 | 0.0191 |
| 0.0004 | 13.09 | 55620 | 0.0192 |
| 0.171 | 13.09 | 55630 | 0.0187 |
| 0.219 | 13.09 | 55640 | 0.0185 |
| 0.2445 | 13.09 | 55650 | 0.0182 |
| 0.0 | 13.1 | 55660 | 0.0180 |
| 0.2642 | 13.1 | 55670 | 0.0177 |
| 0.0134 | 13.1 | 55680 | 0.0176 |
| 0.2538 | 13.1 | 55690 | 0.0174 |
| 0.0001 | 13.11 | 55700 | 0.0174 |
| 0.0251 | 13.11 | 55710 | 0.0174 |
| 0.0001 | 13.11 | 55720 | 0.0175 |
| 0.0219 | 13.11 | 55730 | 0.0176 |
| 0.0001 | 13.12 | 55740 | 0.0176 |
| 0.0001 | 13.12 | 55750 | 0.0177 |
| 0.0009 | 13.12 | 55760 | 0.0177 |
| 0.0001 | 13.12 | 55770 | 0.0178 |
| 0.0008 | 13.12 | 55780 | 0.0178 |
| 0.0001 | 13.13 | 55790 | 0.0179 |
| 0.0001 | 13.13 | 55800 | 0.0180 |
| 0.0528 | 13.13 | 55810 | 0.0181 |
| 0.0215 | 13.13 | 55820 | 0.0182 |
| 0.0001 | 13.14 | 55830 | 0.0188 |
| 0.0035 | 13.14 | 55840 | 0.0192 |
| 0.0342 | 13.14 | 55850 | 0.0192 |
| 0.0 | 13.14 | 55860 | 0.0192 |
| 0.0003 | 13.15 | 55870 | 0.0194 |
| 0.0001 | 13.15 | 55880 | 0.0201 |
| 0.0 | 13.15 | 55890 | 0.0204 |
| 0.0001 | 13.15 | 55900 | 0.0205 |
| 0.0001 | 13.16 | 55910 | 0.0207 |
| 0.0001 | 13.16 | 55920 | 0.0208 |
| 0.0001 | 13.16 | 55930 | 0.0214 |
| 0.313 | 13.16 | 55940 | 0.0217 |
| 0.0001 | 13.16 | 55950 | 0.0216 |
| 0.0001 | 13.17 | 55960 | 0.0216 |
| 0.0001 | 13.17 | 55970 | 0.0217 |
| 0.6848 | 13.17 | 55980 | 0.0211 |
| 0.0001 | 13.17 | 55990 | 0.0205 |
| 0.0035 | 13.18 | 56000 | 0.0204 |
| 0.0001 | 13.18 | 56010 | 0.0205 |
| 0.0 | 13.18 | 56020 | 0.0207 |
| 0.0001 | 13.18 | 56030 | 0.0208 |
| 0.0001 | 13.19 | 56040 | 0.0209 |
| 0.0001 | 13.19 | 56050 | 0.0210 |
| 0.0002 | 13.19 | 56060 | 0.0211 |
| 0.0346 | 13.19 | 56070 | 0.0212 |
| 0.0001 | 13.2 | 56080 | 0.0213 |
| 0.0847 | 13.2 | 56090 | 0.0217 |
| 0.0005 | 13.2 | 56100 | 0.0220 |
| 0.0034 | 13.2 | 56110 | 0.0221 |
| 0.0001 | 13.2 | 56120 | 0.0222 |
| 0.0835 | 13.21 | 56130 | 0.0222 |
| 0.0001 | 13.21 | 56140 | 0.0221 |
| 0.0001 | 13.21 | 56150 | 0.0222 |
| 0.0881 | 13.21 | 56160 | 0.0218 |
| 0.3443 | 13.22 | 56170 | 0.0209 |
| 0.0003 | 13.22 | 56180 | 0.0209 |
| 0.0001 | 13.22 | 56190 | 0.0211 |
| 0.0001 | 13.22 | 56200 | 0.0211 |
| 0.263 | 13.23 | 56210 | 0.0211 |
| 0.0678 | 13.23 | 56220 | 0.0210 |
| 0.1887 | 13.23 | 56230 | 0.0208 |
| 0.008 | 13.23 | 56240 | 0.0203 |
| 0.0001 | 13.24 | 56250 | 0.0205 |
| 0.0001 | 13.24 | 56260 | 0.0206 |
| 0.0001 | 13.24 | 56270 | 0.0207 |
| 0.0001 | 13.24 | 56280 | 0.0208 |
| 0.0844 | 13.24 | 56290 | 0.0202 |
| 0.0001 | 13.25 | 56300 | 0.0199 |
| 0.0543 | 13.25 | 56310 | 0.0186 |
| 0.0001 | 13.25 | 56320 | 0.0182 |
| 0.0001 | 13.25 | 56330 | 0.0181 |
| 0.104 | 13.26 | 56340 | 0.0180 |
| 0.2026 | 13.26 | 56350 | 0.0181 |
| 0.0001 | 13.26 | 56360 | 0.0181 |
| 0.0001 | 13.26 | 56370 | 0.0182 |
| 0.0001 | 13.27 | 56380 | 0.0183 |
| 0.0001 | 13.27 | 56390 | 0.0183 |
| 0.2685 | 13.27 | 56400 | 0.0183 |
| 0.0025 | 13.27 | 56410 | 0.0181 |
| 0.0006 | 13.28 | 56420 | 0.0183 |
| 0.0074 | 13.28 | 56430 | 0.0185 |
| 0.0001 | 13.28 | 56440 | 0.0185 |
| 0.2179 | 13.28 | 56450 | 0.0185 |
| 0.0047 | 13.28 | 56460 | 0.0186 |
| 0.0001 | 13.29 | 56470 | 0.0187 |
| 0.1671 | 13.29 | 56480 | 0.0185 |
| 0.0001 | 13.29 | 56490 | 0.0183 |
| 0.1904 | 13.29 | 56500 | 0.0179 |
| 0.0001 | 13.3 | 56510 | 0.0178 |
| 0.0001 | 13.3 | 56520 | 0.0178 |
| 0.2439 | 13.3 | 56530 | 0.0176 |
| 0.0002 | 13.3 | 56540 | 0.0178 |
| 0.0001 | 13.31 | 56550 | 0.0179 |
| 0.0001 | 13.31 | 56560 | 0.0180 |
| 0.0001 | 13.31 | 56570 | 0.0181 |
| 0.188 | 13.31 | 56580 | 0.0185 |
| 0.0001 | 13.32 | 56590 | 0.0186 |
| 0.0001 | 13.32 | 56600 | 0.0186 |
| 0.0995 | 13.32 | 56610 | 0.0186 |
| 0.0 | 13.32 | 56620 | 0.0185 |
| 0.539 | 13.32 | 56630 | 0.0181 |
| 0.0003 | 13.33 | 56640 | 0.0176 |
| 0.0001 | 13.33 | 56650 | 0.0175 |
| 0.0001 | 13.33 | 56660 | 0.0175 |
| 0.0001 | 13.33 | 56670 | 0.0175 |
| 0.2852 | 13.34 | 56680 | 0.0176 |
| 0.0 | 13.34 | 56690 | 0.0177 |
| 0.0001 | 13.34 | 56700 | 0.0177 |
| 0.0001 | 13.34 | 56710 | 0.0178 |
| 0.0001 | 13.35 | 56720 | 0.0179 |
| 0.0001 | 13.35 | 56730 | 0.0180 |
| 0.0001 | 13.35 | 56740 | 0.0181 |
| 0.0002 | 13.35 | 56750 | 0.0185 |
| 0.0001 | 13.36 | 56760 | 0.0186 |
| 0.0001 | 13.36 | 56770 | 0.0187 |
| 0.0386 | 13.36 | 56780 | 0.0189 |
| 0.0 | 13.36 | 56790 | 0.0191 |
| 0.0192 | 13.36 | 56800 | 0.0192 |
| 0.0004 | 13.37 | 56810 | 0.0192 |
| 0.0001 | 13.37 | 56820 | 0.0194 |
| 0.0933 | 13.37 | 56830 | 0.0194 |
| 0.0 | 13.37 | 56840 | 0.0192 |
| 0.3074 | 13.38 | 56850 | 0.0187 |
| 0.0 | 13.38 | 56860 | 0.0189 |
| 0.1442 | 13.38 | 56870 | 0.0189 |
| 0.2644 | 13.38 | 56880 | 0.0185 |
| 0.0234 | 13.39 | 56890 | 0.0185 |
| 0.0001 | 13.39 | 56900 | 0.0185 |
| 0.0007 | 13.39 | 56910 | 0.0186 |
| 0.0001 | 13.39 | 56920 | 0.0187 |
| 0.2445 | 13.4 | 56930 | 0.0186 |
| 0.0 | 13.4 | 56940 | 0.0181 |
| 0.0 | 13.4 | 56950 | 0.0180 |
| 0.0301 | 13.4 | 56960 | 0.0180 |
| 0.0 | 13.4 | 56970 | 0.0180 |
| 0.0 | 13.41 | 56980 | 0.0180 |
| 0.0006 | 13.41 | 56990 | 0.0181 |
| 0.0 | 13.41 | 57000 | 0.0184 |
| 0.0326 | 13.41 | 57010 | 0.0185 |
| 0.0001 | 13.42 | 57020 | 0.0187 |
| 0.0001 | 13.42 | 57030 | 0.0189 |
| 0.003 | 13.42 | 57040 | 0.0192 |
| 0.0001 | 13.42 | 57050 | 0.0195 |
| 0.008 | 13.43 | 57060 | 0.0198 |
| 0.0 | 13.43 | 57070 | 0.0201 |
| 0.0 | 13.43 | 57080 | 0.0203 |
| 0.1603 | 13.43 | 57090 | 0.0200 |
| 0.0541 | 13.44 | 57100 | 0.0195 |
| 0.0 | 13.44 | 57110 | 0.0189 |
| 0.0006 | 13.44 | 57120 | 0.0186 |
| 0.0012 | 13.44 | 57130 | 0.0184 |
| 0.0001 | 13.44 | 57140 | 0.0186 |
| 0.0 | 13.45 | 57150 | 0.0188 |
| 0.0 | 13.45 | 57160 | 0.0189 |
| 0.0 | 13.45 | 57170 | 0.0190 |
| 0.6138 | 13.45 | 57180 | 0.0187 |
| 0.053 | 13.46 | 57190 | 0.0184 |
| 0.0001 | 13.46 | 57200 | 0.0182 |
| 0.0001 | 13.46 | 57210 | 0.0183 |
| 0.0001 | 13.46 | 57220 | 0.0186 |
| 0.0001 | 13.47 | 57230 | 0.0188 |
| 0.0483 | 13.47 | 57240 | 0.0186 |
| 0.3357 | 13.47 | 57250 | 0.0185 |
| 0.0001 | 13.47 | 57260 | 0.0185 |
| 0.0001 | 13.48 | 57270 | 0.0185 |
| 0.0002 | 13.48 | 57280 | 0.0188 |
| 0.0022 | 13.48 | 57290 | 0.0189 |
| 0.0043 | 13.48 | 57300 | 0.0194 |
| 0.3159 | 13.48 | 57310 | 0.0196 |
| 0.0 | 13.49 | 57320 | 0.0197 |
| 0.1599 | 13.49 | 57330 | 0.0198 |
| 0.0001 | 13.49 | 57340 | 0.0199 |
| 0.0001 | 13.49 | 57350 | 0.0199 |
| 0.0572 | 13.5 | 57360 | 0.0196 |
| 0.0001 | 13.5 | 57370 | 0.0194 |
| 0.0001 | 13.5 | 57380 | 0.0194 |
| 0.2631 | 13.5 | 57390 | 0.0189 |
| 0.0 | 13.51 | 57400 | 0.0187 |
| 0.0333 | 13.51 | 57410 | 0.0187 |
| 0.0 | 13.51 | 57420 | 0.0187 |
| 0.0 | 13.51 | 57430 | 0.0187 |
| 0.0001 | 13.52 | 57440 | 0.0188 |
| 0.0086 | 13.52 | 57450 | 0.0188 |
| 0.0 | 13.52 | 57460 | 0.0188 |
| 0.0003 | 13.52 | 57470 | 0.0188 |
| 0.0015 | 13.52 | 57480 | 0.0189 |
| 0.0001 | 13.53 | 57490 | 0.0191 |
| 0.1274 | 13.53 | 57500 | 0.0189 |
| 0.0004 | 13.53 | 57510 | 0.0190 |
| 0.0006 | 13.53 | 57520 | 0.0191 |
| 0.1583 | 13.54 | 57530 | 0.0191 |
| 0.0 | 13.54 | 57540 | 0.0190 |
| 0.0 | 13.54 | 57550 | 0.0189 |
| 0.0 | 13.54 | 57560 | 0.0190 |
| 0.0742 | 13.55 | 57570 | 0.0190 |
| 0.0 | 13.55 | 57580 | 0.0190 |
| 0.0 | 13.55 | 57590 | 0.0190 |
| 0.0 | 13.55 | 57600 | 0.0191 |
| 0.4237 | 13.56 | 57610 | 0.0188 |
| 0.0003 | 13.56 | 57620 | 0.0187 |
| 0.0001 | 13.56 | 57630 | 0.0187 |
| 0.0137 | 13.56 | 57640 | 0.0187 |
| 0.0019 | 13.56 | 57650 | 0.0190 |
| 0.3419 | 13.57 | 57660 | 0.0192 |
| 0.0 | 13.57 | 57670 | 0.0192 |
| 0.276 | 13.57 | 57680 | 0.0192 |
| 0.0776 | 13.57 | 57690 | 0.0191 |
| 0.5801 | 13.58 | 57700 | 0.0191 |
| 0.0002 | 13.58 | 57710 | 0.0190 |
| 0.0001 | 13.58 | 57720 | 0.0189 |
| 0.2316 | 13.58 | 57730 | 0.0187 |
| 0.0001 | 13.59 | 57740 | 0.0182 |
| 0.4906 | 13.59 | 57750 | 0.0179 |
| 0.0001 | 13.59 | 57760 | 0.0176 |
| 0.1987 | 13.59 | 57770 | 0.0175 |
| 0.0001 | 13.6 | 57780 | 0.0175 |
| 0.3036 | 13.6 | 57790 | 0.0175 |
| 0.0042 | 13.6 | 57800 | 0.0174 |
| 0.2965 | 13.6 | 57810 | 0.0173 |
| 0.0373 | 13.6 | 57820 | 0.0175 |
| 0.039 | 13.61 | 57830 | 0.0178 |
| 0.1107 | 13.61 | 57840 | 0.0181 |
| 0.0766 | 13.61 | 57850 | 0.0190 |
| 0.0004 | 13.61 | 57860 | 0.0194 |
| 0.0001 | 13.62 | 57870 | 0.0194 |
| 0.0001 | 13.62 | 57880 | 0.0194 |
| 0.0001 | 13.62 | 57890 | 0.0195 |
| 0.0043 | 13.62 | 57900 | 0.0193 |
| 0.0001 | 13.63 | 57910 | 0.0190 |
| 0.0001 | 13.63 | 57920 | 0.0189 |
| 0.1016 | 13.63 | 57930 | 0.0185 |
| 0.0 | 13.63 | 57940 | 0.0185 |
| 0.0 | 13.64 | 57950 | 0.0185 |
| 0.004 | 13.64 | 57960 | 0.0186 |
| 0.1882 | 13.64 | 57970 | 0.0193 |
| 0.0001 | 13.64 | 57980 | 0.0196 |
| 0.0001 | 13.64 | 57990 | 0.0199 |
| 0.0408 | 13.65 | 58000 | 0.0200 |
| 0.0 | 13.65 | 58010 | 0.0198 |
| 0.3154 | 13.65 | 58020 | 0.0197 |
| 0.1132 | 13.65 | 58030 | 0.0194 |
| 0.0 | 13.66 | 58040 | 0.0194 |
| 0.0001 | 13.66 | 58050 | 0.0193 |
| 0.0 | 13.66 | 58060 | 0.0194 |
| 0.1057 | 13.66 | 58070 | 0.0191 |
| 0.0001 | 13.67 | 58080 | 0.0190 |
| 0.0 | 13.67 | 58090 | 0.0189 |
| 0.0001 | 13.67 | 58100 | 0.0190 |
| 0.0133 | 13.67 | 58110 | 0.0192 |
| 0.0001 | 13.68 | 58120 | 0.0195 |
| 0.0051 | 13.68 | 58130 | 0.0197 |
| 0.0 | 13.68 | 58140 | 0.0200 |
| 0.0 | 13.68 | 58150 | 0.0201 |
| 0.0 | 13.68 | 58160 | 0.0201 |
| 0.2845 | 13.69 | 58170 | 0.0202 |
| 0.0 | 13.69 | 58180 | 0.0203 |
| 0.0001 | 13.69 | 58190 | 0.0204 |
| 0.0038 | 13.69 | 58200 | 0.0205 |
| 0.0 | 13.7 | 58210 | 0.0206 |
| 0.0001 | 13.7 | 58220 | 0.0206 |
| 0.0 | 13.7 | 58230 | 0.0207 |
| 0.0 | 13.7 | 58240 | 0.0207 |
| 0.0 | 13.71 | 58250 | 0.0207 |
| 0.0 | 13.71 | 58260 | 0.0207 |
| 0.2875 | 13.71 | 58270 | 0.0204 |
| 0.0001 | 13.71 | 58280 | 0.0203 |
| 0.2687 | 13.72 | 58290 | 0.0198 |
| 0.0001 | 13.72 | 58300 | 0.0196 |
| 0.0001 | 13.72 | 58310 | 0.0196 |
| 0.0001 | 13.72 | 58320 | 0.0197 |
| 0.0001 | 13.72 | 58330 | 0.0197 |
| 0.0007 | 13.73 | 58340 | 0.0193 |
| 0.0 | 13.73 | 58350 | 0.0192 |
| 0.0001 | 13.73 | 58360 | 0.0192 |
| 0.0001 | 13.73 | 58370 | 0.0192 |
| 0.0054 | 13.74 | 58380 | 0.0191 |
| 0.0003 | 13.74 | 58390 | 0.0192 |
| 0.0007 | 13.74 | 58400 | 0.0194 |
| 0.0001 | 13.74 | 58410 | 0.0196 |
| 0.0 | 13.75 | 58420 | 0.0197 |
| 0.0 | 13.75 | 58430 | 0.0198 |
| 0.0453 | 13.75 | 58440 | 0.0196 |
| 0.0001 | 13.75 | 58450 | 0.0194 |
| 0.0 | 13.76 | 58460 | 0.0193 |
| 0.0 | 13.76 | 58470 | 0.0194 |
| 0.2988 | 13.76 | 58480 | 0.0192 |
| 0.0086 | 13.76 | 58490 | 0.0191 |
| 0.0 | 13.76 | 58500 | 0.0190 |
| 0.1714 | 13.77 | 58510 | 0.0188 |
| 0.0236 | 13.77 | 58520 | 0.0190 |
| 0.0987 | 13.77 | 58530 | 0.0190 |
| 0.0001 | 13.77 | 58540 | 0.0190 |
| 0.0219 | 13.78 | 58550 | 0.0191 |
| 0.0001 | 13.78 | 58560 | 0.0192 |
| 0.5148 | 13.78 | 58570 | 0.0190 |
| 0.0001 | 13.78 | 58580 | 0.0187 |
| 0.2595 | 13.79 | 58590 | 0.0186 |
| 0.0001 | 13.79 | 58600 | 0.0186 |
| 0.0 | 13.79 | 58610 | 0.0187 |
| 0.0032 | 13.79 | 58620 | 0.0188 |
| 0.2611 | 13.8 | 58630 | 0.0189 |
| 0.0001 | 13.8 | 58640 | 0.0187 |
| 0.0001 | 13.8 | 58650 | 0.0187 |
| 0.0008 | 13.8 | 58660 | 0.0188 |
| 0.0 | 13.8 | 58670 | 0.0188 |
| 0.0 | 13.81 | 58680 | 0.0189 |
| 0.0 | 13.81 | 58690 | 0.0189 |
| 0.0 | 13.81 | 58700 | 0.0189 |
| 0.0039 | 13.81 | 58710 | 0.0189 |
| 0.0001 | 13.82 | 58720 | 0.0189 |
| 0.2422 | 13.82 | 58730 | 0.0190 |
| 0.0219 | 13.82 | 58740 | 0.0191 |
| 0.3385 | 13.82 | 58750 | 0.0190 |
| 0.1521 | 13.83 | 58760 | 0.0186 |
| 0.0001 | 13.83 | 58770 | 0.0185 |
| 0.049 | 13.83 | 58780 | 0.0186 |
| 0.0 | 13.83 | 58790 | 0.0186 |
| 0.31 | 13.84 | 58800 | 0.0185 |
| 0.0001 | 13.84 | 58810 | 0.0184 |
| 0.0298 | 13.84 | 58820 | 0.0184 |
| 0.0004 | 13.84 | 58830 | 0.0187 |
| 0.0362 | 13.84 | 58840 | 0.0187 |
| 0.3059 | 13.85 | 58850 | 0.0188 |
| 0.0001 | 13.85 | 58860 | 0.0186 |
| 0.0406 | 13.85 | 58870 | 0.0183 |
| 0.0001 | 13.85 | 58880 | 0.0182 |
| 0.0 | 13.86 | 58890 | 0.0182 |
| 0.0004 | 13.86 | 58900 | 0.0181 |
| 0.2097 | 13.86 | 58910 | 0.0179 |
| 0.0 | 13.86 | 58920 | 0.0178 |
| 0.0001 | 13.87 | 58930 | 0.0178 |
| 0.3109 | 13.87 | 58940 | 0.0178 |
| 0.0151 | 13.87 | 58950 | 0.0178 |
| 0.0009 | 13.87 | 58960 | 0.0179 |
| 0.0001 | 13.88 | 58970 | 0.0180 |
| 0.0054 | 13.88 | 58980 | 0.0180 |
| 0.0001 | 13.88 | 58990 | 0.0180 |
| 0.0001 | 13.88 | 59000 | 0.0181 |
| 0.0 | 13.88 | 59010 | 0.0181 |
| 0.2252 | 13.89 | 59020 | 0.0182 |
| 0.0 | 13.89 | 59030 | 0.0182 |
| 0.0501 | 13.89 | 59040 | 0.0181 |
| 0.0 | 13.89 | 59050 | 0.0180 |
| 0.028 | 13.9 | 59060 | 0.0180 |
| 0.3193 | 13.9 | 59070 | 0.0181 |
| 0.0692 | 13.9 | 59080 | 0.0181 |
| 0.0004 | 13.9 | 59090 | 0.0183 |
| 0.0806 | 13.91 | 59100 | 0.0187 |
| 0.0001 | 13.91 | 59110 | 0.0189 |
| 0.0003 | 13.91 | 59120 | 0.0191 |
| 0.0001 | 13.91 | 59130 | 0.0193 |
| 0.0001 | 13.92 | 59140 | 0.0194 |
| 0.0001 | 13.92 | 59150 | 0.0195 |
| 0.0001 | 13.92 | 59160 | 0.0195 |
| 0.0001 | 13.92 | 59170 | 0.0196 |
| 0.0001 | 13.92 | 59180 | 0.0196 |
| 0.0001 | 13.93 | 59190 | 0.0197 |
| 0.0018 | 13.93 | 59200 | 0.0198 |
| 0.0001 | 13.93 | 59210 | 0.0201 |
| 0.013 | 13.93 | 59220 | 0.0203 |
| 0.0 | 13.94 | 59230 | 0.0206 |
| 0.0 | 13.94 | 59240 | 0.0207 |
| 0.0462 | 13.94 | 59250 | 0.0208 |
| 0.0001 | 13.94 | 59260 | 0.0208 |
| 0.2505 | 13.95 | 59270 | 0.0208 |
| 0.0 | 13.95 | 59280 | 0.0205 |
| 0.0 | 13.95 | 59290 | 0.0204 |
| 0.0035 | 13.95 | 59300 | 0.0204 |
| 0.0328 | 13.96 | 59310 | 0.0202 |
| 0.0001 | 13.96 | 59320 | 0.0200 |
| 0.0006 | 13.96 | 59330 | 0.0200 |
| 0.0001 | 13.96 | 59340 | 0.0200 |
| 0.1883 | 13.96 | 59350 | 0.0196 |
| 0.3025 | 13.97 | 59360 | 0.0191 |
| 0.0001 | 13.97 | 59370 | 0.0189 |
| 0.0001 | 13.97 | 59380 | 0.0189 |
| 0.2424 | 13.97 | 59390 | 0.0188 |
| 0.0001 | 13.98 | 59400 | 0.0188 |
| 0.1212 | 13.98 | 59410 | 0.0186 |
| 0.0 | 13.98 | 59420 | 0.0184 |
| 0.0001 | 13.98 | 59430 | 0.0184 |
| 0.0001 | 13.99 | 59440 | 0.0184 |
| 0.228 | 13.99 | 59450 | 0.0184 |
| 0.2757 | 13.99 | 59460 | 0.0182 |
| 0.2244 | 13.99 | 59470 | 0.0179 |
| 0.0001 | 14.0 | 59480 | 0.0177 |
| 0.0334 | 14.0 | 59490 | 0.0175 |
| 0.2846 | 14.0 | 59500 | 0.0173 |
| 0.6522 | 14.0 | 59510 | 0.0172 |
| 0.0001 | 14.0 | 59520 | 0.0172 |
| 0.0003 | 14.01 | 59530 | 0.0174 |
| 0.0001 | 14.01 | 59540 | 0.0175 |
| 0.0001 | 14.01 | 59550 | 0.0176 |
| 0.0002 | 14.01 | 59560 | 0.0177 |
| 0.0657 | 14.02 | 59570 | 0.0179 |
| 0.0001 | 14.02 | 59580 | 0.0182 |
| 0.2869 | 14.02 | 59590 | 0.0183 |
| 0.0001 | 14.02 | 59600 | 0.0181 |
| 0.0001 | 14.03 | 59610 | 0.0181 |
| 0.0001 | 14.03 | 59620 | 0.0181 |
| 0.0001 | 14.03 | 59630 | 0.0182 |
| 0.0001 | 14.03 | 59640 | 0.0182 |
| 0.0001 | 14.04 | 59650 | 0.0183 |
| 0.0001 | 14.04 | 59660 | 0.0183 |
| 0.0001 | 14.04 | 59670 | 0.0184 |
| 0.0001 | 14.04 | 59680 | 0.0184 |
| 0.0001 | 14.04 | 59690 | 0.0185 |
| 0.0016 | 14.05 | 59700 | 0.0186 |
| 0.2511 | 14.05 | 59710 | 0.0187 |
| 0.0001 | 14.05 | 59720 | 0.0184 |
| 0.2692 | 14.05 | 59730 | 0.0181 |
| 0.291 | 14.06 | 59740 | 0.0177 |
| 0.0002 | 14.06 | 59750 | 0.0177 |
| 0.0001 | 14.06 | 59760 | 0.0177 |
| 0.0001 | 14.06 | 59770 | 0.0177 |
| 0.0001 | 14.07 | 59780 | 0.0178 |
| 0.0641 | 14.07 | 59790 | 0.0177 |
| 0.0001 | 14.07 | 59800 | 0.0177 |
| 0.0001 | 14.07 | 59810 | 0.0177 |
| 0.0001 | 14.08 | 59820 | 0.0177 |
| 0.0011 | 14.08 | 59830 | 0.0179 |
| 0.0001 | 14.08 | 59840 | 0.0182 |
| 0.0002 | 14.08 | 59850 | 0.0186 |
| 0.0 | 14.08 | 59860 | 0.0187 |
| 0.0622 | 14.09 | 59870 | 0.0190 |
| 0.0001 | 14.09 | 59880 | 0.0192 |
| 0.0001 | 14.09 | 59890 | 0.0193 |
| 0.0985 | 14.09 | 59900 | 0.0191 |
| 0.0001 | 14.1 | 59910 | 0.0189 |
| 0.257 | 14.1 | 59920 | 0.0186 |
| 0.0001 | 14.1 | 59930 | 0.0184 |
| 0.0001 | 14.1 | 59940 | 0.0184 |
| 0.0463 | 14.11 | 59950 | 0.0184 |
| 0.0442 | 14.11 | 59960 | 0.0185 |
| 0.0 | 14.11 | 59970 | 0.0185 |
| 0.0001 | 14.11 | 59980 | 0.0186 |
| 0.0001 | 14.12 | 59990 | 0.0186 |
| 0.0001 | 14.12 | 60000 | 0.0187 |
| 0.2263 | 14.12 | 60010 | 0.0186 |
| 0.0001 | 14.12 | 60020 | 0.0183 |
| 0.0001 | 14.12 | 60030 | 0.0183 |
| 0.0001 | 14.13 | 60040 | 0.0183 |
| 0.0285 | 14.13 | 60050 | 0.0182 |
| 0.0317 | 14.13 | 60060 | 0.0179 |
| 0.049 | 14.13 | 60070 | 0.0178 |
| 0.0753 | 14.14 | 60080 | 0.0177 |
| 0.0001 | 14.14 | 60090 | 0.0175 |
| 0.0002 | 14.14 | 60100 | 0.0176 |
| 0.0005 | 14.14 | 60110 | 0.0178 |
| 0.2518 | 14.15 | 60120 | 0.0177 |
| 0.0001 | 14.15 | 60130 | 0.0177 |
| 0.0001 | 14.15 | 60140 | 0.0176 |
| 0.0154 | 14.15 | 60150 | 0.0176 |
| 0.0002 | 14.16 | 60160 | 0.0177 |
| 0.0021 | 14.16 | 60170 | 0.0178 |
| 0.303 | 14.16 | 60180 | 0.0178 |
| 0.0 | 14.16 | 60190 | 0.0178 |
| 0.0215 | 14.16 | 60200 | 0.0176 |
| 0.0461 | 14.17 | 60210 | 0.0176 |
| 0.0001 | 14.17 | 60220 | 0.0178 |
| 0.0002 | 14.17 | 60230 | 0.0179 |
| 0.0056 | 14.17 | 60240 | 0.0180 |
| 0.0001 | 14.18 | 60250 | 0.0180 |
| 0.0001 | 14.18 | 60260 | 0.0181 |
| 0.0 | 14.18 | 60270 | 0.0181 |
| 0.0978 | 14.18 | 60280 | 0.0179 |
| 0.0 | 14.19 | 60290 | 0.0178 |
| 0.0048 | 14.19 | 60300 | 0.0179 |
| 0.0001 | 14.19 | 60310 | 0.0178 |
| 0.0001 | 14.19 | 60320 | 0.0178 |
| 0.0001 | 14.2 | 60330 | 0.0178 |
| 0.0002 | 14.2 | 60340 | 0.0179 |
| 0.0001 | 14.2 | 60350 | 0.0179 |
| 0.0037 | 14.2 | 60360 | 0.0179 |
| 0.0 | 14.2 | 60370 | 0.0178 |
| 0.2769 | 14.21 | 60380 | 0.0175 |
| 0.0001 | 14.21 | 60390 | 0.0174 |
| 0.0001 | 14.21 | 60400 | 0.0174 |
| 0.0322 | 14.21 | 60410 | 0.0173 |
| 0.2448 | 14.22 | 60420 | 0.0170 |
| 0.0 | 14.22 | 60430 | 0.0169 |
| 0.0108 | 14.22 | 60440 | 0.0170 |
| 0.0 | 14.22 | 60450 | 0.0171 |
| 0.0 | 14.23 | 60460 | 0.0171 |
| 0.1544 | 14.23 | 60470 | 0.0170 |
| 0.0581 | 14.23 | 60480 | 0.0170 |
| 0.0001 | 14.23 | 60490 | 0.0170 |
| 0.0001 | 14.24 | 60500 | 0.0170 |
| 0.0001 | 14.24 | 60510 | 0.0170 |
| 0.0 | 14.24 | 60520 | 0.0170 |
| 0.0001 | 14.24 | 60530 | 0.0171 |
| 0.0 | 14.24 | 60540 | 0.0172 |
| 0.0001 | 14.25 | 60550 | 0.0172 |
| 0.0001 | 14.25 | 60560 | 0.0173 |
| 0.0007 | 14.25 | 60570 | 0.0174 |
| 0.536 | 14.25 | 60580 | 0.0171 |
| 0.0 | 14.26 | 60590 | 0.0168 |
| 0.0247 | 14.26 | 60600 | 0.0167 |
| 0.0 | 14.26 | 60610 | 0.0167 |
| 0.0818 | 14.26 | 60620 | 0.0166 |
| 0.0 | 14.27 | 60630 | 0.0167 |
| 0.0 | 14.27 | 60640 | 0.0167 |
| 0.0001 | 14.27 | 60650 | 0.0168 |
| 0.1179 | 14.27 | 60660 | 0.0171 |
| 0.0045 | 14.28 | 60670 | 0.0174 |
| 0.0222 | 14.28 | 60680 | 0.0176 |
| 0.1061 | 14.28 | 60690 | 0.0179 |
| 0.198 | 14.28 | 60700 | 0.0180 |
| 0.4478 | 14.28 | 60710 | 0.0172 |
| 0.0001 | 14.29 | 60720 | 0.0170 |
| 0.2377 | 14.29 | 60730 | 0.0168 |
| 0.0144 | 14.29 | 60740 | 0.0166 |
| 0.0001 | 14.29 | 60750 | 0.0166 |
| 0.3002 | 14.3 | 60760 | 0.0165 |
| 0.329 | 14.3 | 60770 | 0.0164 |
| 0.3099 | 14.3 | 60780 | 0.0163 |
| 0.0001 | 14.3 | 60790 | 0.0162 |
| 0.4452 | 14.31 | 60800 | 0.0161 |
| 0.0001 | 14.31 | 60810 | 0.0161 |
| 0.0001 | 14.31 | 60820 | 0.0161 |
| 0.0002 | 14.31 | 60830 | 0.0162 |
| 0.184 | 14.32 | 60840 | 0.0162 |
| 0.0006 | 14.32 | 60850 | 0.0161 |
| 0.0002 | 14.32 | 60860 | 0.0162 |
| 0.2927 | 14.32 | 60870 | 0.0161 |
| 0.0009 | 14.32 | 60880 | 0.0160 |
| 0.0202 | 14.33 | 60890 | 0.0161 |
| 0.0012 | 14.33 | 60900 | 0.0162 |
| 0.0004 | 14.33 | 60910 | 0.0165 |
| 0.219 | 14.33 | 60920 | 0.0169 |
| 0.0001 | 14.34 | 60930 | 0.0171 |
| 0.0001 | 14.34 | 60940 | 0.0172 |
| 0.1986 | 14.34 | 60950 | 0.0171 |
| 0.251 | 14.34 | 60960 | 0.0169 |
| 0.0001 | 14.35 | 60970 | 0.0167 |
| 0.0123 | 14.35 | 60980 | 0.0168 |
| 0.0059 | 14.35 | 60990 | 0.0169 |
| 0.0001 | 14.35 | 61000 | 0.0171 |
| 0.0 | 14.36 | 61010 | 0.0171 |
| 0.0001 | 14.36 | 61020 | 0.0172 |
| 0.0001 | 14.36 | 61030 | 0.0172 |
| 0.4535 | 14.36 | 61040 | 0.0168 |
| 0.0001 | 14.36 | 61050 | 0.0167 |
| 0.0086 | 14.37 | 61060 | 0.0167 |
| 0.0 | 14.37 | 61070 | 0.0168 |
| 0.0001 | 14.37 | 61080 | 0.0168 |
| 0.3132 | 14.37 | 61090 | 0.0168 |
| 0.0128 | 14.38 | 61100 | 0.0167 |
| 0.0001 | 14.38 | 61110 | 0.0167 |
| 0.0001 | 14.38 | 61120 | 0.0167 |
| 0.0001 | 14.38 | 61130 | 0.0167 |
| 0.0795 | 14.39 | 61140 | 0.0165 |
| 0.1685 | 14.39 | 61150 | 0.0165 |
| 0.2087 | 14.39 | 61160 | 0.0165 |
| 0.0132 | 14.39 | 61170 | 0.0166 |
| 0.0206 | 14.4 | 61180 | 0.0167 |
| 0.0019 | 14.4 | 61190 | 0.0166 |
| 0.0001 | 14.4 | 61200 | 0.0166 |
| 0.2396 | 14.4 | 61210 | 0.0165 |
| 0.1037 | 14.4 | 61220 | 0.0165 |
| 0.0826 | 14.41 | 61230 | 0.0166 |
| 0.0002 | 14.41 | 61240 | 0.0166 |
| 0.2129 | 14.41 | 61250 | 0.0165 |
| 0.3082 | 14.41 | 61260 | 0.0167 |
| 0.0006 | 14.42 | 61270 | 0.0169 |
| 0.2415 | 14.42 | 61280 | 0.0171 |
| 0.2247 | 14.42 | 61290 | 0.0165 |
| 0.0322 | 14.42 | 61300 | 0.0164 |
| 0.0147 | 14.43 | 61310 | 0.0163 |
| 0.007 | 14.43 | 61320 | 0.0164 |
| 0.0001 | 14.43 | 61330 | 0.0165 |
| 0.0004 | 14.43 | 61340 | 0.0167 |
| 0.2823 | 14.44 | 61350 | 0.0168 |
| 0.1348 | 14.44 | 61360 | 0.0165 |
| 0.0002 | 14.44 | 61370 | 0.0165 |
| 0.0201 | 14.44 | 61380 | 0.0165 |
| 0.0001 | 14.44 | 61390 | 0.0165 |
| 0.0065 | 14.45 | 61400 | 0.0166 |
| 0.0002 | 14.45 | 61410 | 0.0166 |
| 0.0001 | 14.45 | 61420 | 0.0167 |
| 0.0001 | 14.45 | 61430 | 0.0168 |
| 0.0004 | 14.46 | 61440 | 0.0169 |
| 0.0001 | 14.46 | 61450 | 0.0172 |
| 0.0001 | 14.46 | 61460 | 0.0173 |
| 0.0001 | 14.46 | 61470 | 0.0173 |
| 0.2736 | 14.47 | 61480 | 0.0173 |
| 0.0001 | 14.47 | 61490 | 0.0173 |
| 0.0001 | 14.47 | 61500 | 0.0173 |
| 0.0 | 14.47 | 61510 | 0.0173 |
| 0.1709 | 14.48 | 61520 | 0.0173 |
| 0.0001 | 14.48 | 61530 | 0.0171 |
| 0.0001 | 14.48 | 61540 | 0.0172 |
| 0.0 | 14.48 | 61550 | 0.0173 |
| 0.1864 | 14.48 | 61560 | 0.0171 |
| 0.0001 | 14.49 | 61570 | 0.0170 |
| 0.3713 | 14.49 | 61580 | 0.0169 |
| 0.0 | 14.49 | 61590 | 0.0167 |
| 0.0001 | 14.49 | 61600 | 0.0167 |
| 0.1605 | 14.5 | 61610 | 0.0166 |
| 0.0073 | 14.5 | 61620 | 0.0164 |
| 0.0001 | 14.5 | 61630 | 0.0164 |
| 0.2936 | 14.5 | 61640 | 0.0164 |
| 0.2989 | 14.51 | 61650 | 0.0162 |
| 0.0001 | 14.51 | 61660 | 0.0162 |
| 0.2718 | 14.51 | 61670 | 0.0164 |
| 0.0001 | 14.51 | 61680 | 0.0164 |
| 0.0001 | 14.52 | 61690 | 0.0165 |
| 0.0001 | 14.52 | 61700 | 0.0166 |
| 0.0062 | 14.52 | 61710 | 0.0167 |
| 0.0001 | 14.52 | 61720 | 0.0168 |
| 0.1886 | 14.52 | 61730 | 0.0167 |
| 0.0001 | 14.53 | 61740 | 0.0165 |
| 0.0001 | 14.53 | 61750 | 0.0165 |
| 0.0002 | 14.53 | 61760 | 0.0166 |
| 0.2569 | 14.53 | 61770 | 0.0167 |
| 0.117 | 14.54 | 61780 | 0.0168 |
| 0.0001 | 14.54 | 61790 | 0.0170 |
| 0.0002 | 14.54 | 61800 | 0.0171 |
| 0.3056 | 14.54 | 61810 | 0.0170 |
| 0.2972 | 14.55 | 61820 | 0.0167 |
| 0.0176 | 14.55 | 61830 | 0.0166 |
| 0.0002 | 14.55 | 61840 | 0.0166 |
| 0.0275 | 14.55 | 61850 | 0.0168 |
| 0.0 | 14.56 | 61860 | 0.0169 |
| 0.0001 | 14.56 | 61870 | 0.0170 |
| 0.0 | 14.56 | 61880 | 0.0170 |
| 0.0 | 14.56 | 61890 | 0.0171 |
| 0.0001 | 14.56 | 61900 | 0.0171 |
| 0.0002 | 14.57 | 61910 | 0.0171 |
| 0.0001 | 14.57 | 61920 | 0.0170 |
| 0.0001 | 14.57 | 61930 | 0.0170 |
| 0.0001 | 14.57 | 61940 | 0.0171 |
| 0.0001 | 14.58 | 61950 | 0.0172 |
| 0.2358 | 14.58 | 61960 | 0.0171 |
| 0.2989 | 14.58 | 61970 | 0.0170 |
| 0.0016 | 14.58 | 61980 | 0.0169 |
| 0.0002 | 14.59 | 61990 | 0.0170 |
| 0.0549 | 14.59 | 62000 | 0.0170 |
| 0.0487 | 14.59 | 62010 | 0.0172 |
| 0.1028 | 14.59 | 62020 | 0.0172 |
| 0.003 | 14.6 | 62030 | 0.0173 |
| 0.0001 | 14.6 | 62040 | 0.0174 |
| 0.229 | 14.6 | 62050 | 0.0172 |
| 0.0005 | 14.6 | 62060 | 0.0174 |
| 0.1632 | 14.6 | 62070 | 0.0176 |
| 0.0001 | 14.61 | 62080 | 0.0177 |
| 0.0 | 14.61 | 62090 | 0.0177 |
| 0.0002 | 14.61 | 62100 | 0.0178 |
| 0.0001 | 14.61 | 62110 | 0.0179 |
| 0.0004 | 14.62 | 62120 | 0.0181 |
| 0.0365 | 14.62 | 62130 | 0.0182 |
| 0.0 | 14.62 | 62140 | 0.0181 |
| 0.273 | 14.62 | 62150 | 0.0180 |
| 0.0003 | 14.63 | 62160 | 0.0180 |
| 0.0001 | 14.63 | 62170 | 0.0181 |
| 0.0001 | 14.63 | 62180 | 0.0182 |
| 0.0135 | 14.63 | 62190 | 0.0182 |
| 0.0816 | 14.64 | 62200 | 0.0183 |
| 0.2497 | 14.64 | 62210 | 0.0184 |
| 0.0 | 14.64 | 62220 | 0.0184 |
| 0.0001 | 14.64 | 62230 | 0.0185 |
| 0.0001 | 14.64 | 62240 | 0.0185 |
| 0.3516 | 14.65 | 62250 | 0.0185 |
| 0.243 | 14.65 | 62260 | 0.0184 |
| 0.0001 | 14.65 | 62270 | 0.0184 |
| 0.0001 | 14.65 | 62280 | 0.0184 |
| 0.0591 | 14.66 | 62290 | 0.0183 |
| 0.0712 | 14.66 | 62300 | 0.0182 |
| 0.1043 | 14.66 | 62310 | 0.0178 |
| 0.0001 | 14.66 | 62320 | 0.0176 |
| 0.0001 | 14.67 | 62330 | 0.0175 |
| 0.0002 | 14.67 | 62340 | 0.0177 |
| 0.1766 | 14.67 | 62350 | 0.0175 |
| 0.0001 | 14.67 | 62360 | 0.0175 |
| 0.0001 | 14.68 | 62370 | 0.0175 |
| 0.0001 | 14.68 | 62380 | 0.0175 |
| 0.0001 | 14.68 | 62390 | 0.0176 |
| 0.0 | 14.68 | 62400 | 0.0176 |
| 0.0379 | 14.68 | 62410 | 0.0174 |
| 0.1163 | 14.69 | 62420 | 0.0173 |
| 0.2529 | 14.69 | 62430 | 0.0168 |
| 0.0001 | 14.69 | 62440 | 0.0165 |
| 0.2686 | 14.69 | 62450 | 0.0164 |
| 0.2518 | 14.7 | 62460 | 0.0163 |
| 0.0001 | 14.7 | 62470 | 0.0161 |
| 0.0001 | 14.7 | 62480 | 0.0161 |
| 0.0001 | 14.7 | 62490 | 0.0162 |
| 0.1466 | 14.71 | 62500 | 0.0161 |
| 0.0002 | 14.71 | 62510 | 0.0161 |
| 0.0001 | 14.71 | 62520 | 0.0161 |
| 0.0001 | 14.71 | 62530 | 0.0162 |
| 0.4796 | 14.72 | 62540 | 0.0161 |
| 0.1038 | 14.72 | 62550 | 0.0159 |
| 0.0001 | 14.72 | 62560 | 0.0158 |
| 0.0858 | 14.72 | 62570 | 0.0157 |
| 0.0007 | 14.72 | 62580 | 0.0158 |
| 0.0059 | 14.73 | 62590 | 0.0160 |
| 0.0001 | 14.73 | 62600 | 0.0161 |
| 0.0001 | 14.73 | 62610 | 0.0162 |
| 0.1523 | 14.73 | 62620 | 0.0163 |
| 0.0001 | 14.74 | 62630 | 0.0164 |
| 0.4473 | 14.74 | 62640 | 0.0163 |
| 0.0749 | 14.74 | 62650 | 0.0161 |
| 0.1681 | 14.74 | 62660 | 0.0160 |
| 0.0005 | 14.75 | 62670 | 0.0160 |
| 0.0362 | 14.75 | 62680 | 0.0161 |
| 0.0001 | 14.75 | 62690 | 0.0162 |
| 0.0001 | 14.75 | 62700 | 0.0163 |
| 0.0025 | 14.76 | 62710 | 0.0165 |
| 0.0001 | 14.76 | 62720 | 0.0166 |
| 0.0001 | 14.76 | 62730 | 0.0167 |
| 0.0001 | 14.76 | 62740 | 0.0167 |
| 0.0001 | 14.76 | 62750 | 0.0168 |
| 0.0001 | 14.77 | 62760 | 0.0169 |
| 0.2533 | 14.77 | 62770 | 0.0170 |
| 0.3197 | 14.77 | 62780 | 0.0171 |
| 0.0 | 14.77 | 62790 | 0.0171 |
| 0.0001 | 14.78 | 62800 | 0.0171 |
| 0.0001 | 14.78 | 62810 | 0.0171 |
| 0.2733 | 14.78 | 62820 | 0.0172 |
| 0.1469 | 14.78 | 62830 | 0.0172 |
| 0.0001 | 14.79 | 62840 | 0.0173 |
| 0.074 | 14.79 | 62850 | 0.0172 |
| 0.0001 | 14.79 | 62860 | 0.0172 |
| 0.0001 | 14.79 | 62870 | 0.0172 |
| 0.0 | 14.8 | 62880 | 0.0172 |
| 0.0001 | 14.8 | 62890 | 0.0173 |
| 0.0001 | 14.8 | 62900 | 0.0173 |
| 0.0625 | 14.8 | 62910 | 0.0175 |
| 0.0 | 14.8 | 62920 | 0.0176 |
| 0.2501 | 14.81 | 62930 | 0.0175 |
| 0.0001 | 14.81 | 62940 | 0.0173 |
| 0.0 | 14.81 | 62950 | 0.0173 |
| 0.0227 | 14.81 | 62960 | 0.0174 |
| 0.0001 | 14.82 | 62970 | 0.0175 |
| 0.0003 | 14.82 | 62980 | 0.0178 |
| 0.0474 | 14.82 | 62990 | 0.0178 |
| 0.109 | 14.82 | 63000 | 0.0178 |
| 0.2139 | 14.83 | 63010 | 0.0179 |
| 0.0 | 14.83 | 63020 | 0.0179 |
| 0.0 | 14.83 | 63030 | 0.0180 |
| 0.2147 | 14.83 | 63040 | 0.0178 |
| 0.0107 | 14.84 | 63050 | 0.0178 |
| 0.0472 | 14.84 | 63060 | 0.0176 |
| 0.0001 | 14.84 | 63070 | 0.0174 |
| 0.2379 | 14.84 | 63080 | 0.0171 |
| 0.2971 | 14.84 | 63090 | 0.0172 |
| 0.005 | 14.85 | 63100 | 0.0171 |
| 0.0001 | 14.85 | 63110 | 0.0170 |
| 0.0001 | 14.85 | 63120 | 0.0170 |
| 0.0001 | 14.85 | 63130 | 0.0171 |
| 0.0008 | 14.86 | 63140 | 0.0172 |
| 0.1764 | 14.86 | 63150 | 0.0172 |
| 0.3047 | 14.86 | 63160 | 0.0170 |
| 0.0001 | 14.86 | 63170 | 0.0169 |
| 0.0003 | 14.87 | 63180 | 0.0170 |
| 0.0002 | 14.87 | 63190 | 0.0171 |
| 0.0054 | 14.87 | 63200 | 0.0170 |
| 0.0001 | 14.87 | 63210 | 0.0170 |
| 0.0151 | 14.88 | 63220 | 0.0170 |
| 0.0001 | 14.88 | 63230 | 0.0170 |
| 0.2196 | 14.88 | 63240 | 0.0168 |
| 0.0027 | 14.88 | 63250 | 0.0170 |
| 0.0001 | 14.88 | 63260 | 0.0171 |
| 0.001 | 14.89 | 63270 | 0.0171 |
| 0.0002 | 14.89 | 63280 | 0.0171 |
| 0.2923 | 14.89 | 63290 | 0.0171 |
| 0.0001 | 14.89 | 63300 | 0.0171 |
| 0.0 | 14.9 | 63310 | 0.0170 |
| 0.0001 | 14.9 | 63320 | 0.0171 |
| 0.3871 | 14.9 | 63330 | 0.0167 |
| 0.0002 | 14.9 | 63340 | 0.0167 |
| 0.0001 | 14.91 | 63350 | 0.0168 |
| 0.0001 | 14.91 | 63360 | 0.0168 |
| 0.0001 | 14.91 | 63370 | 0.0169 |
| 0.0001 | 14.91 | 63380 | 0.0169 |
| 0.0001 | 14.92 | 63390 | 0.0170 |
| 0.0 | 14.92 | 63400 | 0.0170 |
| 0.1073 | 14.92 | 63410 | 0.0170 |
| 0.0001 | 14.92 | 63420 | 0.0170 |
| 0.0001 | 14.92 | 63430 | 0.0170 |
| 0.0001 | 14.93 | 63440 | 0.0171 |
| 0.0001 | 14.93 | 63450 | 0.0171 |
| 0.0002 | 14.93 | 63460 | 0.0172 |
| 0.1312 | 14.93 | 63470 | 0.0171 |
| 0.0001 | 14.94 | 63480 | 0.0171 |
| 0.0214 | 14.94 | 63490 | 0.0173 |
| 0.0001 | 14.94 | 63500 | 0.0175 |
| 0.0001 | 14.94 | 63510 | 0.0176 |
| 0.0 | 14.95 | 63520 | 0.0177 |
| 0.0001 | 14.95 | 63530 | 0.0177 |
| 0.0001 | 14.95 | 63540 | 0.0178 |
| 0.0001 | 14.95 | 63550 | 0.0178 |
| 0.0001 | 14.96 | 63560 | 0.0180 |
| 0.0001 | 14.96 | 63570 | 0.0181 |
| 0.0001 | 14.96 | 63580 | 0.0181 |
| 0.0 | 14.96 | 63590 | 0.0181 |
| 0.0001 | 14.96 | 63600 | 0.0182 |
| 0.0 | 14.97 | 63610 | 0.0182 |
| 0.0 | 14.97 | 63620 | 0.0183 |
| 0.0033 | 14.97 | 63630 | 0.0182 |
| 0.0 | 14.97 | 63640 | 0.0181 |
| 0.0 | 14.98 | 63650 | 0.0181 |
| 0.0002 | 14.98 | 63660 | 0.0182 |
| 0.0 | 14.98 | 63670 | 0.0183 |
| 0.0001 | 14.98 | 63680 | 0.0184 |
| 0.0005 | 14.99 | 63690 | 0.0185 |
| 0.0 | 14.99 | 63700 | 0.0186 |
| 0.0001 | 14.99 | 63710 | 0.0187 |
| 0.0 | 14.99 | 63720 | 0.0188 |
| 0.2537 | 15.0 | 63730 | 0.0189 |
| 0.0 | 15.0 | 63740 | 0.0189 |
| 0.0 | 15.0 | 63750 | 0.0190 |
| 0.0264 | 15.0 | 63760 | 0.0189 |
| 0.0004 | 15.0 | 63770 | 0.0188 |
| 0.0004 | 15.01 | 63780 | 0.0188 |
| 0.3588 | 15.01 | 63790 | 0.0183 |
| 0.0213 | 15.01 | 63800 | 0.0181 |
| 0.0015 | 15.01 | 63810 | 0.0181 |
| 0.0 | 15.02 | 63820 | 0.0181 |
| 0.0 | 15.02 | 63830 | 0.0181 |
| 0.2527 | 15.02 | 63840 | 0.0180 |
| 0.0035 | 15.02 | 63850 | 0.0179 |
| 0.2106 | 15.03 | 63860 | 0.0176 |
| 0.0001 | 15.03 | 63870 | 0.0176 |
| 0.553 | 15.03 | 63880 | 0.0173 |
| 0.0001 | 15.03 | 63890 | 0.0172 |
| 0.0381 | 15.04 | 63900 | 0.0172 |
| 0.0234 | 15.04 | 63910 | 0.0172 |
| 0.0001 | 15.04 | 63920 | 0.0170 |
| 0.0001 | 15.04 | 63930 | 0.0170 |
| 0.0002 | 15.04 | 63940 | 0.0171 |
| 0.0 | 15.05 | 63950 | 0.0172 |
| 0.1699 | 15.05 | 63960 | 0.0171 |
| 0.2457 | 15.05 | 63970 | 0.0168 |
| 0.0001 | 15.05 | 63980 | 0.0167 |
| 0.0 | 15.06 | 63990 | 0.0167 |
| 0.1139 | 15.06 | 64000 | 0.0165 |
| 0.0224 | 15.06 | 64010 | 0.0165 |
| 0.0309 | 15.06 | 64020 | 0.0167 |
| 0.0001 | 15.07 | 64030 | 0.0168 |
| 0.0 | 15.07 | 64040 | 0.0169 |
| 0.0078 | 15.07 | 64050 | 0.0170 |
| 0.0 | 15.07 | 64060 | 0.0170 |
| 0.0159 | 15.08 | 64070 | 0.0169 |
| 0.259 | 15.08 | 64080 | 0.0167 |
| 0.0006 | 15.08 | 64090 | 0.0168 |
| 0.0003 | 15.08 | 64100 | 0.0170 |
| 0.0 | 15.08 | 64110 | 0.0171 |
| 0.0 | 15.09 | 64120 | 0.0172 |
| 0.0 | 15.09 | 64130 | 0.0172 |
| 0.0186 | 15.09 | 64140 | 0.0172 |
| 0.0 | 15.09 | 64150 | 0.0171 |
| 0.138 | 15.1 | 64160 | 0.0170 |
| 0.0002 | 15.1 | 64170 | 0.0170 |
| 0.1276 | 15.1 | 64180 | 0.0170 |
| 0.0 | 15.1 | 64190 | 0.0169 |
| 0.0 | 15.11 | 64200 | 0.0169 |
| 0.3315 | 15.11 | 64210 | 0.0169 |
| 0.0036 | 15.11 | 64220 | 0.0168 |
| 0.0001 | 15.11 | 64230 | 0.0168 |
| 0.0018 | 15.12 | 64240 | 0.0167 |
| 0.0 | 15.12 | 64250 | 0.0167 |
| 0.2884 | 15.12 | 64260 | 0.0165 |
| 0.0 | 15.12 | 64270 | 0.0164 |
| 0.1424 | 15.12 | 64280 | 0.0164 |
| 0.0001 | 15.13 | 64290 | 0.0164 |
| 0.3806 | 15.13 | 64300 | 0.0163 |
| 0.1088 | 15.13 | 64310 | 0.0161 |
| 0.0 | 15.13 | 64320 | 0.0161 |
| 0.0004 | 15.14 | 64330 | 0.0161 |
| 0.0122 | 15.14 | 64340 | 0.0162 |
| 0.0349 | 15.14 | 64350 | 0.0163 |
| 0.1725 | 15.14 | 64360 | 0.0165 |
| 0.2555 | 15.15 | 64370 | 0.0165 |
| 0.0001 | 15.15 | 64380 | 0.0164 |
| 0.3215 | 15.15 | 64390 | 0.0162 |
| 0.0001 | 15.15 | 64400 | 0.0162 |
| 0.2293 | 15.16 | 64410 | 0.0160 |
| 0.6008 | 15.16 | 64420 | 0.0159 |
| 0.0002 | 15.16 | 64430 | 0.0159 |
| 0.4111 | 15.16 | 64440 | 0.0159 |
| 0.2003 | 15.16 | 64450 | 0.0157 |
| 0.0011 | 15.17 | 64460 | 0.0156 |
| 0.0001 | 15.17 | 64470 | 0.0155 |
| 0.0001 | 15.17 | 64480 | 0.0156 |
| 0.3116 | 15.17 | 64490 | 0.0157 |
| 0.0001 | 15.18 | 64500 | 0.0158 |
| 0.0001 | 15.18 | 64510 | 0.0159 |
| 0.1213 | 15.18 | 64520 | 0.0160 |
| 0.0001 | 15.18 | 64530 | 0.0160 |
| 0.0001 | 15.19 | 64540 | 0.0160 |
| 0.0003 | 15.19 | 64550 | 0.0161 |
| 0.0001 | 15.19 | 64560 | 0.0163 |
| 0.0001 | 15.19 | 64570 | 0.0164 |
| 0.0001 | 15.2 | 64580 | 0.0164 |
| 0.0001 | 15.2 | 64590 | 0.0165 |
| 0.027 | 15.2 | 64600 | 0.0166 |
| 0.0001 | 15.2 | 64610 | 0.0167 |
| 0.2946 | 15.2 | 64620 | 0.0164 |
| 0.1786 | 15.21 | 64630 | 0.0166 |
| 0.0001 | 15.21 | 64640 | 0.0167 |
| 0.0001 | 15.21 | 64650 | 0.0168 |
| 0.0001 | 15.21 | 64660 | 0.0168 |
| 0.0001 | 15.22 | 64670 | 0.0169 |
| 0.0001 | 15.22 | 64680 | 0.0169 |
| 0.1876 | 15.22 | 64690 | 0.0170 |
| 0.0001 | 15.22 | 64700 | 0.0172 |
| 0.0004 | 15.23 | 64710 | 0.0173 |
| 0.0054 | 15.23 | 64720 | 0.0172 |
| 0.0002 | 15.23 | 64730 | 0.0172 |
| 0.0157 | 15.23 | 64740 | 0.0172 |
| 0.0002 | 15.24 | 64750 | 0.0172 |
| 0.0001 | 15.24 | 64760 | 0.0172 |
| 0.0001 | 15.24 | 64770 | 0.0172 |
| 0.0001 | 15.24 | 64780 | 0.0172 |
| 0.0001 | 15.24 | 64790 | 0.0172 |
| 0.0 | 15.25 | 64800 | 0.0173 |
| 0.1655 | 15.25 | 64810 | 0.0170 |
| 0.0001 | 15.25 | 64820 | 0.0170 |
| 0.0001 | 15.25 | 64830 | 0.0170 |
| 0.2037 | 15.26 | 64840 | 0.0170 |
| 0.7796 | 15.26 | 64850 | 0.0168 |
| 0.0001 | 15.26 | 64860 | 0.0167 |
| 0.0002 | 15.26 | 64870 | 0.0167 |
| 0.0001 | 15.27 | 64880 | 0.0168 |
| 0.0001 | 15.27 | 64890 | 0.0169 |
| 0.0001 | 15.27 | 64900 | 0.0170 |
| 0.0001 | 15.27 | 64910 | 0.0170 |
| 0.0001 | 15.28 | 64920 | 0.0172 |
| 0.2842 | 15.28 | 64930 | 0.0172 |
| 0.0001 | 15.28 | 64940 | 0.0173 |
| 0.0005 | 15.28 | 64950 | 0.0176 |
| 0.0001 | 15.28 | 64960 | 0.0177 |
| 0.0002 | 15.29 | 64970 | 0.0178 |
| 0.0002 | 15.29 | 64980 | 0.0179 |
| 0.2298 | 15.29 | 64990 | 0.0178 |
| 0.0001 | 15.29 | 65000 | 0.0177 |
| 0.0 | 15.3 | 65010 | 0.0177 |
| 0.0001 | 15.3 | 65020 | 0.0178 |
| 0.0001 | 15.3 | 65030 | 0.0178 |
| 0.0001 | 15.3 | 65040 | 0.0179 |
| 0.0061 | 15.31 | 65050 | 0.0180 |
| 0.0196 | 15.31 | 65060 | 0.0180 |
| 0.0044 | 15.31 | 65070 | 0.0179 |
| 0.0159 | 15.31 | 65080 | 0.0179 |
| 0.0001 | 15.32 | 65090 | 0.0179 |
| 0.0 | 15.32 | 65100 | 0.0180 |
| 0.0 | 15.32 | 65110 | 0.0180 |
| 0.0003 | 15.32 | 65120 | 0.0180 |
| 0.0 | 15.32 | 65130 | 0.0181 |
| 0.0001 | 15.33 | 65140 | 0.0181 |
| 0.2403 | 15.33 | 65150 | 0.0180 |
| 0.0 | 15.33 | 65160 | 0.0179 |
| 0.0001 | 15.33 | 65170 | 0.0179 |
| 0.0085 | 15.34 | 65180 | 0.0179 |
| 0.0001 | 15.34 | 65190 | 0.0179 |
| 0.1294 | 15.34 | 65200 | 0.0178 |
| 0.01 | 15.34 | 65210 | 0.0177 |
| 0.0001 | 15.35 | 65220 | 0.0177 |
| 0.0001 | 15.35 | 65230 | 0.0177 |
| 0.0 | 15.35 | 65240 | 0.0177 |
| 0.053 | 15.35 | 65250 | 0.0177 |
| 0.1016 | 15.36 | 65260 | 0.0178 |
| 0.0001 | 15.36 | 65270 | 0.0178 |
| 0.2695 | 15.36 | 65280 | 0.0177 |
| 0.0169 | 15.36 | 65290 | 0.0175 |
| 0.2347 | 15.36 | 65300 | 0.0172 |
| 0.0001 | 15.37 | 65310 | 0.0170 |
| 0.0022 | 15.37 | 65320 | 0.0170 |
| 0.0 | 15.37 | 65330 | 0.0170 |
| 0.0001 | 15.37 | 65340 | 0.0171 |
| 0.0016 | 15.38 | 65350 | 0.0171 |
| 0.1423 | 15.38 | 65360 | 0.0170 |
| 0.0 | 15.38 | 65370 | 0.0170 |
| 0.0 | 15.38 | 65380 | 0.0170 |
| 0.0 | 15.39 | 65390 | 0.0170 |
| 0.2776 | 15.39 | 65400 | 0.0169 |
| 0.0 | 15.39 | 65410 | 0.0168 |
| 0.0001 | 15.39 | 65420 | 0.0168 |
| 0.0 | 15.4 | 65430 | 0.0168 |
| 0.0 | 15.4 | 65440 | 0.0168 |
| 0.007 | 15.4 | 65450 | 0.0169 |
| 0.0001 | 15.4 | 65460 | 0.0170 |
| 0.0003 | 15.4 | 65470 | 0.0173 |
| 0.0 | 15.41 | 65480 | 0.0175 |
| 0.0005 | 15.41 | 65490 | 0.0176 |
| 0.0001 | 15.41 | 65500 | 0.0177 |
| 0.0001 | 15.41 | 65510 | 0.0177 |
| 0.0001 | 15.42 | 65520 | 0.0178 |
| 0.0001 | 15.42 | 65530 | 0.0178 |
| 0.0072 | 15.42 | 65540 | 0.0178 |
| 0.0604 | 15.42 | 65550 | 0.0178 |
| 0.0001 | 15.43 | 65560 | 0.0178 |
| 0.3075 | 15.43 | 65570 | 0.0178 |
| 0.0001 | 15.43 | 65580 | 0.0177 |
| 0.0 | 15.43 | 65590 | 0.0177 |
| 0.0001 | 15.44 | 65600 | 0.0177 |
| 0.0399 | 15.44 | 65610 | 0.0178 |
| 0.0 | 15.44 | 65620 | 0.0178 |
| 0.0132 | 15.44 | 65630 | 0.0179 |
| 0.0001 | 15.44 | 65640 | 0.0179 |
| 0.0 | 15.45 | 65650 | 0.0180 |
| 0.0 | 15.45 | 65660 | 0.0180 |
| 0.0337 | 15.45 | 65670 | 0.0181 |
| 0.0717 | 15.45 | 65680 | 0.0180 |
| 0.0 | 15.46 | 65690 | 0.0179 |
| 0.0 | 15.46 | 65700 | 0.0179 |
| 0.303 | 15.46 | 65710 | 0.0179 |
| 0.0095 | 15.46 | 65720 | 0.0176 |
| 0.0003 | 15.47 | 65730 | 0.0175 |
| 0.2781 | 15.47 | 65740 | 0.0173 |
| 0.0 | 15.47 | 65750 | 0.0173 |
| 0.0001 | 15.47 | 65760 | 0.0174 |
| 0.0001 | 15.48 | 65770 | 0.0174 |
| 0.0001 | 15.48 | 65780 | 0.0175 |
| 0.0001 | 15.48 | 65790 | 0.0175 |
| 0.0 | 15.48 | 65800 | 0.0176 |
| 0.0049 | 15.48 | 65810 | 0.0177 |
| 0.0007 | 15.49 | 65820 | 0.0177 |
| 0.0871 | 15.49 | 65830 | 0.0178 |
| 0.0 | 15.49 | 65840 | 0.0178 |
| 0.1339 | 15.49 | 65850 | 0.0177 |
| 0.0001 | 15.5 | 65860 | 0.0176 |
| 0.0 | 15.5 | 65870 | 0.0175 |
| 0.0001 | 15.5 | 65880 | 0.0175 |
| 0.1481 | 15.5 | 65890 | 0.0174 |
| 0.0002 | 15.51 | 65900 | 0.0172 |
| 0.1224 | 15.51 | 65910 | 0.0172 |
| 0.0 | 15.51 | 65920 | 0.0173 |
| 0.0003 | 15.51 | 65930 | 0.0175 |
| 0.0004 | 15.52 | 65940 | 0.0176 |
| 0.0071 | 15.52 | 65950 | 0.0176 |
| 0.1019 | 15.52 | 65960 | 0.0176 |
| 0.0001 | 15.52 | 65970 | 0.0178 |
| 0.1674 | 15.52 | 65980 | 0.0176 |
| 0.0 | 15.53 | 65990 | 0.0176 |
| 0.2953 | 15.53 | 66000 | 0.0175 |
| 0.0001 | 15.53 | 66010 | 0.0174 |
| 0.0914 | 15.53 | 66020 | 0.0173 |
| 0.0 | 15.54 | 66030 | 0.0171 |
| 0.2784 | 15.54 | 66040 | 0.0169 |
| 0.5818 | 15.54 | 66050 | 0.0165 |
| 0.0001 | 15.54 | 66060 | 0.0163 |
| 0.0001 | 15.55 | 66070 | 0.0163 |
| 0.2324 | 15.55 | 66080 | 0.0161 |
| 0.0001 | 15.55 | 66090 | 0.0161 |
| 0.0001 | 15.55 | 66100 | 0.0161 |
| 0.0001 | 15.56 | 66110 | 0.0162 |
| 0.1839 | 15.56 | 66120 | 0.0162 |
| 0.0003 | 15.56 | 66130 | 0.0162 |
| 0.0001 | 15.56 | 66140 | 0.0163 |
| 0.0001 | 15.56 | 66150 | 0.0163 |
| 0.0002 | 15.57 | 66160 | 0.0163 |
| 0.0032 | 15.57 | 66170 | 0.0164 |
| 0.3013 | 15.57 | 66180 | 0.0165 |
| 0.0001 | 15.57 | 66190 | 0.0166 |
| 0.0 | 15.58 | 66200 | 0.0166 |
| 0.0001 | 15.58 | 66210 | 0.0166 |
| 0.2576 | 15.58 | 66220 | 0.0166 |
| 0.0093 | 15.58 | 66230 | 0.0165 |
| 0.0001 | 15.59 | 66240 | 0.0165 |
| 0.0001 | 15.59 | 66250 | 0.0165 |
| 0.1689 | 15.59 | 66260 | 0.0164 |
| 0.0001 | 15.59 | 66270 | 0.0164 |
| 0.0178 | 15.6 | 66280 | 0.0164 |
| 0.1617 | 15.6 | 66290 | 0.0163 |
| 0.0001 | 15.6 | 66300 | 0.0165 |
| 0.0001 | 15.6 | 66310 | 0.0166 |
| 0.0 | 15.6 | 66320 | 0.0166 |
| 0.0 | 15.61 | 66330 | 0.0166 |
| 0.0007 | 15.61 | 66340 | 0.0167 |
| 0.0663 | 15.61 | 66350 | 0.0168 |
| 0.0001 | 15.61 | 66360 | 0.0170 |
| 0.0094 | 15.62 | 66370 | 0.0171 |
| 0.0001 | 15.62 | 66380 | 0.0171 |
| 0.0001 | 15.62 | 66390 | 0.0172 |
| 0.3083 | 15.62 | 66400 | 0.0172 |
| 0.0001 | 15.63 | 66410 | 0.0172 |
| 0.2903 | 15.63 | 66420 | 0.0170 |
| 0.0002 | 15.63 | 66430 | 0.0171 |
| 0.0003 | 15.63 | 66440 | 0.0171 |
| 0.0001 | 15.64 | 66450 | 0.0171 |
| 0.0522 | 15.64 | 66460 | 0.0171 |
| 0.0001 | 15.64 | 66470 | 0.0171 |
| 0.0673 | 15.64 | 66480 | 0.0171 |
| 0.0069 | 15.64 | 66490 | 0.0170 |
| 0.0 | 15.65 | 66500 | 0.0169 |
| 0.0001 | 15.65 | 66510 | 0.0169 |
| 0.0056 | 15.65 | 66520 | 0.0169 |
| 0.0001 | 15.65 | 66530 | 0.0170 |
| 0.0001 | 15.66 | 66540 | 0.0171 |
| 0.0001 | 15.66 | 66550 | 0.0171 |
| 0.001 | 15.66 | 66560 | 0.0173 |
| 0.0001 | 15.66 | 66570 | 0.0174 |
| 0.0013 | 15.67 | 66580 | 0.0175 |
| 0.0 | 15.67 | 66590 | 0.0175 |
| 0.0001 | 15.67 | 66600 | 0.0176 |
| 0.0001 | 15.67 | 66610 | 0.0176 |
| 0.0 | 15.68 | 66620 | 0.0177 |
| 0.0 | 15.68 | 66630 | 0.0177 |
| 0.0001 | 15.68 | 66640 | 0.0178 |
| 0.0 | 15.68 | 66650 | 0.0178 |
| 0.0077 | 15.68 | 66660 | 0.0180 |
| 0.2916 | 15.69 | 66670 | 0.0181 |
| 0.325 | 15.69 | 66680 | 0.0180 |
| 0.0 | 15.69 | 66690 | 0.0178 |
| 0.1167 | 15.69 | 66700 | 0.0176 |
| 0.2827 | 15.7 | 66710 | 0.0177 |
| 0.0828 | 15.7 | 66720 | 0.0175 |
| 0.0001 | 15.7 | 66730 | 0.0174 |
| 0.0001 | 15.7 | 66740 | 0.0174 |
| 0.0 | 15.71 | 66750 | 0.0174 |
| 0.2625 | 15.71 | 66760 | 0.0174 |
| 0.1359 | 15.71 | 66770 | 0.0173 |
| 0.0001 | 15.71 | 66780 | 0.0171 |
| 0.0 | 15.72 | 66790 | 0.0171 |
| 0.0001 | 15.72 | 66800 | 0.0171 |
| 0.0001 | 15.72 | 66810 | 0.0171 |
| 0.3336 | 15.72 | 66820 | 0.0172 |
| 0.2451 | 15.72 | 66830 | 0.0171 |
| 0.0001 | 15.73 | 66840 | 0.0170 |
| 0.0001 | 15.73 | 66850 | 0.0170 |
| 0.0032 | 15.73 | 66860 | 0.0168 |
| 0.0046 | 15.73 | 66870 | 0.0166 |
| 0.0001 | 15.74 | 66880 | 0.0165 |
| 0.0001 | 15.74 | 66890 | 0.0166 |
| 0.0001 | 15.74 | 66900 | 0.0166 |
| 0.0001 | 15.74 | 66910 | 0.0166 |
| 0.0001 | 15.75 | 66920 | 0.0167 |
| 0.0 | 15.75 | 66930 | 0.0167 |
| 0.0001 | 15.75 | 66940 | 0.0167 |
| 0.0 | 15.75 | 66950 | 0.0167 |
| 0.0001 | 15.76 | 66960 | 0.0167 |
| 0.1844 | 15.76 | 66970 | 0.0168 |
| 0.0141 | 15.76 | 66980 | 0.0168 |
| 0.0934 | 15.76 | 66990 | 0.0165 |
| 0.0 | 15.76 | 67000 | 0.0164 |
| 0.0 | 15.77 | 67010 | 0.0164 |
| 0.0001 | 15.77 | 67020 | 0.0164 |
| 0.0009 | 15.77 | 67030 | 0.0164 |
| 0.0001 | 15.77 | 67040 | 0.0164 |
| 0.2344 | 15.78 | 67050 | 0.0164 |
| 0.0104 | 15.78 | 67060 | 0.0165 |
| 0.2212 | 15.78 | 67070 | 0.0167 |
| 0.0001 | 15.78 | 67080 | 0.0168 |
| 0.1179 | 15.79 | 67090 | 0.0165 |
| 0.0 | 15.79 | 67100 | 0.0164 |
| 0.0251 | 15.79 | 67110 | 0.0164 |
| 0.0654 | 15.79 | 67120 | 0.0164 |
| 0.3022 | 15.8 | 67130 | 0.0164 |
| 0.0001 | 15.8 | 67140 | 0.0161 |
| 0.0 | 15.8 | 67150 | 0.0160 |
| 0.0001 | 15.8 | 67160 | 0.0160 |
| 0.0001 | 15.8 | 67170 | 0.0160 |
| 0.1016 | 15.81 | 67180 | 0.0162 |
| 0.0001 | 15.81 | 67190 | 0.0165 |
| 0.1626 | 15.81 | 67200 | 0.0167 |
| 0.0003 | 15.81 | 67210 | 0.0167 |
| 0.0001 | 15.82 | 67220 | 0.0168 |
| 0.0001 | 15.82 | 67230 | 0.0168 |
| 0.2136 | 15.82 | 67240 | 0.0166 |
| 0.0001 | 15.82 | 67250 | 0.0166 |
| 0.0003 | 15.83 | 67260 | 0.0166 |
| 0.007 | 15.83 | 67270 | 0.0167 |
| 0.0 | 15.83 | 67280 | 0.0167 |
| 0.0 | 15.83 | 67290 | 0.0167 |
| 0.0001 | 15.84 | 67300 | 0.0167 |
| 0.0001 | 15.84 | 67310 | 0.0168 |
| 0.0005 | 15.84 | 67320 | 0.0168 |
| 0.0002 | 15.84 | 67330 | 0.0171 |
| 0.048 | 15.84 | 67340 | 0.0172 |
| 0.0 | 15.85 | 67350 | 0.0172 |
| 0.0007 | 15.85 | 67360 | 0.0173 |
| 0.199 | 15.85 | 67370 | 0.0172 |
| 0.0 | 15.85 | 67380 | 0.0171 |
| 0.4751 | 15.86 | 67390 | 0.0168 |
| 0.0009 | 15.86 | 67400 | 0.0166 |
| 0.0 | 15.86 | 67410 | 0.0166 |
| 0.0001 | 15.86 | 67420 | 0.0166 |
| 0.0011 | 15.87 | 67430 | 0.0166 |
| 0.0001 | 15.87 | 67440 | 0.0168 |
| 0.0014 | 15.87 | 67450 | 0.0169 |
| 0.0218 | 15.87 | 67460 | 0.0171 |
| 0.0 | 15.88 | 67470 | 0.0172 |
| 0.0002 | 15.88 | 67480 | 0.0172 |
| 0.0001 | 15.88 | 67490 | 0.0172 |
| 0.0001 | 15.88 | 67500 | 0.0172 |
| 0.0001 | 15.88 | 67510 | 0.0172 |
| 0.0001 | 15.89 | 67520 | 0.0172 |
| 0.0001 | 15.89 | 67530 | 0.0173 |
| 0.0 | 15.89 | 67540 | 0.0173 |
| 0.0006 | 15.89 | 67550 | 0.0172 |
| 0.0016 | 15.9 | 67560 | 0.0170 |
| 0.0001 | 15.9 | 67570 | 0.0170 |
| 0.0 | 15.9 | 67580 | 0.0170 |
| 0.0001 | 15.9 | 67590 | 0.0170 |
| 0.0888 | 15.91 | 67600 | 0.0169 |
| 0.1602 | 15.91 | 67610 | 0.0169 |
| 0.0001 | 15.91 | 67620 | 0.0170 |
| 0.0001 | 15.91 | 67630 | 0.0170 |
| 0.0881 | 15.92 | 67640 | 0.0172 |
| 0.0003 | 15.92 | 67650 | 0.0175 |
| 0.0 | 15.92 | 67660 | 0.0177 |
| 0.0001 | 15.92 | 67670 | 0.0179 |
| 0.0001 | 15.92 | 67680 | 0.0179 |
| 0.0 | 15.93 | 67690 | 0.0180 |
| 0.0001 | 15.93 | 67700 | 0.0180 |
| 0.0009 | 15.93 | 67710 | 0.0181 |
| 0.0001 | 15.93 | 67720 | 0.0181 |
| 0.0654 | 15.94 | 67730 | 0.0180 |
| 0.0 | 15.94 | 67740 | 0.0179 |
| 0.0001 | 15.94 | 67750 | 0.0179 |
| 0.0 | 15.94 | 67760 | 0.0179 |
| 0.0005 | 15.95 | 67770 | 0.0179 |
| 0.0 | 15.95 | 67780 | 0.0180 |
| 0.0 | 15.95 | 67790 | 0.0180 |
| 0.0209 | 15.95 | 67800 | 0.0180 |
| 0.2274 | 15.96 | 67810 | 0.0180 |
| 0.0128 | 15.96 | 67820 | 0.0180 |
| 0.0001 | 15.96 | 67830 | 0.0181 |
| 0.0 | 15.96 | 67840 | 0.0182 |
| 0.0 | 15.96 | 67850 | 0.0182 |
| 0.0 | 15.97 | 67860 | 0.0182 |
| 0.0 | 15.97 | 67870 | 0.0182 |
| 0.0 | 15.97 | 67880 | 0.0182 |
| 0.0 | 15.97 | 67890 | 0.0183 |
| 0.0001 | 15.98 | 67900 | 0.0183 |
| 0.0 | 15.98 | 67910 | 0.0183 |
| 0.0003 | 15.98 | 67920 | 0.0183 |
| 0.0048 | 15.98 | 67930 | 0.0182 |
| 0.0 | 15.99 | 67940 | 0.0181 |
| 0.0 | 15.99 | 67950 | 0.0181 |
| 0.3255 | 15.99 | 67960 | 0.0180 |
| 0.0127 | 15.99 | 67970 | 0.0180 |
| 0.0 | 16.0 | 67980 | 0.0180 |
| 0.1098 | 16.0 | 67990 | 0.0180 |
| 0.3037 | 16.0 | 68000 | 0.0179 |
| 0.1279 | 16.0 | 68010 | 0.0176 |
| 0.0 | 16.0 | 68020 | 0.0175 |
| 0.0001 | 16.01 | 68030 | 0.0175 |
| 0.0001 | 16.01 | 68040 | 0.0175 |
| 0.0 | 16.01 | 68050 | 0.0176 |
| 0.0 | 16.01 | 68060 | 0.0176 |
| 0.0001 | 16.02 | 68070 | 0.0176 |
| 0.2554 | 16.02 | 68080 | 0.0177 |
| 0.5381 | 16.02 | 68090 | 0.0175 |
| 0.0 | 16.02 | 68100 | 0.0172 |
| 0.0 | 16.03 | 68110 | 0.0171 |
| 0.0003 | 16.03 | 68120 | 0.0171 |
| 0.0001 | 16.03 | 68130 | 0.0172 |
| 0.0001 | 16.03 | 68140 | 0.0172 |
| 0.4776 | 16.04 | 68150 | 0.0171 |
| 0.0 | 16.04 | 68160 | 0.0168 |
| 0.0001 | 16.04 | 68170 | 0.0168 |
| 0.0 | 16.04 | 68180 | 0.0168 |
| 0.0001 | 16.04 | 68190 | 0.0168 |
| 0.0 | 16.05 | 68200 | 0.0168 |
| 0.0001 | 16.05 | 68210 | 0.0168 |
| 0.0 | 16.05 | 68220 | 0.0169 |
| 0.0 | 16.05 | 68230 | 0.0169 |
| 0.0001 | 16.06 | 68240 | 0.0170 |
| 0.0318 | 16.06 | 68250 | 0.0172 |
| 0.0001 | 16.06 | 68260 | 0.0172 |
| 0.0 | 16.06 | 68270 | 0.0173 |
| 0.0026 | 16.07 | 68280 | 0.0173 |
| 0.0 | 16.07 | 68290 | 0.0173 |
| 0.0 | 16.07 | 68300 | 0.0174 |
| 0.0067 | 16.07 | 68310 | 0.0173 |
| 0.0169 | 16.08 | 68320 | 0.0173 |
| 0.2674 | 16.08 | 68330 | 0.0172 |
| 0.0001 | 16.08 | 68340 | 0.0172 |
| 0.0003 | 16.08 | 68350 | 0.0172 |
| 0.0787 | 16.08 | 68360 | 0.0173 |
| 0.0001 | 16.09 | 68370 | 0.0174 |
| 0.0045 | 16.09 | 68380 | 0.0174 |
| 0.2583 | 16.09 | 68390 | 0.0171 |
| 0.0046 | 16.09 | 68400 | 0.0169 |
| 0.0001 | 16.1 | 68410 | 0.0168 |
| 0.0001 | 16.1 | 68420 | 0.0167 |
| 0.0424 | 16.1 | 68430 | 0.0168 |
| 0.0 | 16.1 | 68440 | 0.0168 |
| 0.0 | 16.11 | 68450 | 0.0168 |
| 0.0 | 16.11 | 68460 | 0.0168 |
| 0.2448 | 16.11 | 68470 | 0.0167 |
| 0.2976 | 16.11 | 68480 | 0.0167 |
| 0.0001 | 16.12 | 68490 | 0.0168 |
| 0.2392 | 16.12 | 68500 | 0.0167 |
| 0.0 | 16.12 | 68510 | 0.0166 |
| 0.0001 | 16.12 | 68520 | 0.0166 |
| 0.0001 | 16.12 | 68530 | 0.0166 |
| 0.0001 | 16.13 | 68540 | 0.0166 |
| 0.0001 | 16.13 | 68550 | 0.0166 |
| 0.2153 | 16.13 | 68560 | 0.0165 |
| 0.0001 | 16.13 | 68570 | 0.0164 |
| 0.0001 | 16.14 | 68580 | 0.0164 |
| 0.0143 | 16.14 | 68590 | 0.0164 |
| 0.1081 | 16.14 | 68600 | 0.0163 |
| 0.0009 | 16.14 | 68610 | 0.0163 |
| 0.0001 | 16.15 | 68620 | 0.0164 |
| 0.0001 | 16.15 | 68630 | 0.0165 |
| 0.0001 | 16.15 | 68640 | 0.0165 |
| 0.0 | 16.15 | 68650 | 0.0166 |
| 0.2839 | 16.16 | 68660 | 0.0166 |
| 0.0004 | 16.16 | 68670 | 0.0166 |
| 0.0002 | 16.16 | 68680 | 0.0167 |
| 0.0001 | 16.16 | 68690 | 0.0168 |
| 0.321 | 16.16 | 68700 | 0.0168 |
| 0.0 | 16.17 | 68710 | 0.0167 |
| 0.0001 | 16.17 | 68720 | 0.0167 |
| 0.292 | 16.17 | 68730 | 0.0166 |
| 0.0487 | 16.17 | 68740 | 0.0166 |
| 0.0001 | 16.18 | 68750 | 0.0167 |
| 0.0467 | 16.18 | 68760 | 0.0166 |
| 0.0021 | 16.18 | 68770 | 0.0166 |
| 0.0001 | 16.18 | 68780 | 0.0166 |
| 0.0197 | 16.19 | 68790 | 0.0166 |
| 0.0359 | 16.19 | 68800 | 0.0167 |
| 0.0001 | 16.19 | 68810 | 0.0168 |
| 0.0001 | 16.19 | 68820 | 0.0169 |
| 0.0 | 16.2 | 68830 | 0.0170 |
| 0.0001 | 16.2 | 68840 | 0.0170 |
| 0.0 | 16.2 | 68850 | 0.0170 |
| 0.0 | 16.2 | 68860 | 0.0170 |
| 0.0 | 16.2 | 68870 | 0.0171 |
| 0.0023 | 16.21 | 68880 | 0.0172 |
| 0.0196 | 16.21 | 68890 | 0.0172 |
| 0.0 | 16.21 | 68900 | 0.0173 |
| 0.0002 | 16.21 | 68910 | 0.0173 |
| 0.0356 | 16.22 | 68920 | 0.0173 |
| 0.0001 | 16.22 | 68930 | 0.0172 |
| 0.7088 | 16.22 | 68940 | 0.0167 |
| 0.0065 | 16.22 | 68950 | 0.0166 |
| 0.3385 | 16.23 | 68960 | 0.0167 |
| 0.0001 | 16.23 | 68970 | 0.0167 |
| 0.0155 | 16.23 | 68980 | 0.0168 |
| 0.0001 | 16.23 | 68990 | 0.0169 |
| 0.1879 | 16.24 | 69000 | 0.0169 |
| 0.1186 | 16.24 | 69010 | 0.0168 |
| 0.0001 | 16.24 | 69020 | 0.0168 |
| 0.0 | 16.24 | 69030 | 0.0169 |
| 0.0001 | 16.24 | 69040 | 0.0169 |
| 0.2257 | 16.25 | 69050 | 0.0168 |
| 0.0001 | 16.25 | 69060 | 0.0167 |
| 0.0001 | 16.25 | 69070 | 0.0168 |
| 0.0007 | 16.25 | 69080 | 0.0169 |
| 0.0001 | 16.26 | 69090 | 0.0170 |
| 0.3003 | 16.26 | 69100 | 0.0170 |
| 0.0001 | 16.26 | 69110 | 0.0169 |
| 0.0001 | 16.26 | 69120 | 0.0168 |
| 0.0 | 16.27 | 69130 | 0.0168 |
| 0.001 | 16.27 | 69140 | 0.0169 |
| 0.353 | 16.27 | 69150 | 0.0169 |
| 0.0001 | 16.27 | 69160 | 0.0168 |
| 0.0437 | 16.28 | 69170 | 0.0168 |
| 0.0184 | 16.28 | 69180 | 0.0167 |
| 0.0153 | 16.28 | 69190 | 0.0166 |
| 0.0 | 16.28 | 69200 | 0.0165 |
| 0.0001 | 16.28 | 69210 | 0.0165 |
| 0.0001 | 16.29 | 69220 | 0.0165 |
| 0.0001 | 16.29 | 69230 | 0.0166 |
| 0.0 | 16.29 | 69240 | 0.0166 |
| 0.0066 | 16.29 | 69250 | 0.0167 |
| 0.0001 | 16.3 | 69260 | 0.0168 |
| 0.0 | 16.3 | 69270 | 0.0168 |
| 0.0 | 16.3 | 69280 | 0.0168 |
| 0.2887 | 16.3 | 69290 | 0.0168 |
| 0.0001 | 16.31 | 69300 | 0.0167 |
| 0.0 | 16.31 | 69310 | 0.0166 |
| 0.0 | 16.31 | 69320 | 0.0166 |
| 0.0001 | 16.31 | 69330 | 0.0167 |
| 0.2021 | 16.32 | 69340 | 0.0166 |
| 0.0029 | 16.32 | 69350 | 0.0166 |
| 0.0025 | 16.32 | 69360 | 0.0166 |
| 0.0941 | 16.32 | 69370 | 0.0165 |
| 0.0001 | 16.32 | 69380 | 0.0165 |
| 0.0001 | 16.33 | 69390 | 0.0165 |
| 0.0001 | 16.33 | 69400 | 0.0165 |
| 0.1827 | 16.33 | 69410 | 0.0166 |
| 0.0874 | 16.33 | 69420 | 0.0163 |
| 0.0001 | 16.34 | 69430 | 0.0162 |
| 0.0001 | 16.34 | 69440 | 0.0162 |
| 0.0001 | 16.34 | 69450 | 0.0162 |
| 0.1322 | 16.34 | 69460 | 0.0162 |
| 0.0515 | 16.35 | 69470 | 0.0160 |
| 0.0 | 16.35 | 69480 | 0.0159 |
| 0.0001 | 16.35 | 69490 | 0.0159 |
| 0.0001 | 16.35 | 69500 | 0.0159 |
| 0.0004 | 16.36 | 69510 | 0.0160 |
| 0.0023 | 16.36 | 69520 | 0.0161 |
| 0.1462 | 16.36 | 69530 | 0.0160 |
| 0.2775 | 16.36 | 69540 | 0.0160 |
| 0.0001 | 16.36 | 69550 | 0.0160 |
| 0.0 | 16.37 | 69560 | 0.0160 |
| 0.0001 | 16.37 | 69570 | 0.0160 |
| 0.0001 | 16.37 | 69580 | 0.0161 |
| 0.0001 | 16.37 | 69590 | 0.0161 |
| 0.0 | 16.38 | 69600 | 0.0161 |
| 0.0 | 16.38 | 69610 | 0.0161 |
| 0.1819 | 16.38 | 69620 | 0.0160 |
| 0.2115 | 16.38 | 69630 | 0.0159 |
| 0.0 | 16.39 | 69640 | 0.0158 |
| 0.0 | 16.39 | 69650 | 0.0158 |
| 0.0047 | 16.39 | 69660 | 0.0158 |
| 0.1974 | 16.39 | 69670 | 0.0157 |
| 0.0 | 16.4 | 69680 | 0.0157 |
| 0.2832 | 16.4 | 69690 | 0.0157 |
| 0.0001 | 16.4 | 69700 | 0.0156 |
| 0.0559 | 16.4 | 69710 | 0.0156 |
| 0.0 | 16.4 | 69720 | 0.0157 |
| 0.0002 | 16.41 | 69730 | 0.0157 |
| 0.0136 | 16.41 | 69740 | 0.0158 |
| 0.1154 | 16.41 | 69750 | 0.0158 |
| 0.0001 | 16.41 | 69760 | 0.0158 |
| 0.0001 | 16.42 | 69770 | 0.0159 |
| 0.0001 | 16.42 | 69780 | 0.0159 |
| 0.5815 | 16.42 | 69790 | 0.0158 |
| 0.0 | 16.42 | 69800 | 0.0158 |
| 0.0866 | 16.43 | 69810 | 0.0157 |
| 0.0001 | 16.43 | 69820 | 0.0156 |
| 0.0274 | 16.43 | 69830 | 0.0157 |
| 0.0194 | 16.43 | 69840 | 0.0158 |
| 0.0001 | 16.44 | 69850 | 0.0159 |
| 0.0001 | 16.44 | 69860 | 0.0159 |
| 0.0001 | 16.44 | 69870 | 0.0159 |
| 0.1102 | 16.44 | 69880 | 0.0159 |
| 0.0007 | 16.44 | 69890 | 0.0160 |
| 0.0 | 16.45 | 69900 | 0.0162 |
| 0.3546 | 16.45 | 69910 | 0.0164 |
| 0.0001 | 16.45 | 69920 | 0.0163 |
| 0.0 | 16.45 | 69930 | 0.0163 |
| 0.0001 | 16.46 | 69940 | 0.0163 |
| 0.0001 | 16.46 | 69950 | 0.0163 |
| 0.0001 | 16.46 | 69960 | 0.0164 |
| 0.0877 | 16.46 | 69970 | 0.0165 |
| 0.3119 | 16.47 | 69980 | 0.0165 |
| 0.0006 | 16.47 | 69990 | 0.0167 |
| 0.0001 | 16.47 | 70000 | 0.0168 |
| 0.0001 | 16.47 | 70010 | 0.0169 |
| 0.0001 | 16.48 | 70020 | 0.0169 |
| 0.0 | 16.48 | 70030 | 0.0170 |
| 0.1537 | 16.48 | 70040 | 0.0169 |
| 0.0009 | 16.48 | 70050 | 0.0167 |
| 0.0044 | 16.48 | 70060 | 0.0168 |
| 0.0001 | 16.49 | 70070 | 0.0169 |
| 0.3076 | 16.49 | 70080 | 0.0169 |
| 0.0001 | 16.49 | 70090 | 0.0167 |
| 0.0 | 16.49 | 70100 | 0.0166 |
| 0.0319 | 16.5 | 70110 | 0.0166 |
| 0.0001 | 16.5 | 70120 | 0.0166 |
| 0.0 | 16.5 | 70130 | 0.0166 |
| 0.0001 | 16.5 | 70140 | 0.0166 |
| 0.0001 | 16.51 | 70150 | 0.0166 |
| 0.0 | 16.51 | 70160 | 0.0167 |
| 0.1518 | 16.51 | 70170 | 0.0166 |
| 0.0001 | 16.51 | 70180 | 0.0165 |
| 0.0001 | 16.52 | 70190 | 0.0165 |
| 0.3792 | 16.52 | 70200 | 0.0165 |
| 0.0001 | 16.52 | 70210 | 0.0166 |
| 0.2332 | 16.52 | 70220 | 0.0165 |
| 0.0 | 16.52 | 70230 | 0.0165 |
| 0.0 | 16.53 | 70240 | 0.0164 |
| 0.065 | 16.53 | 70250 | 0.0165 |
| 0.0245 | 16.53 | 70260 | 0.0164 |
| 0.3186 | 16.53 | 70270 | 0.0163 |
| 0.4999 | 16.54 | 70280 | 0.0162 |
| 0.0143 | 16.54 | 70290 | 0.0161 |
| 0.0033 | 16.54 | 70300 | 0.0162 |
| 0.0 | 16.54 | 70310 | 0.0162 |
| 0.0001 | 16.55 | 70320 | 0.0162 |
| 0.0013 | 16.55 | 70330 | 0.0163 |
| 0.0007 | 16.55 | 70340 | 0.0163 |
| 0.001 | 16.55 | 70350 | 0.0163 |
| 0.0001 | 16.56 | 70360 | 0.0163 |
| 0.0381 | 16.56 | 70370 | 0.0163 |
| 0.3316 | 16.56 | 70380 | 0.0162 |
| 0.0022 | 16.56 | 70390 | 0.0162 |
| 0.002 | 16.56 | 70400 | 0.0162 |
| 0.0001 | 16.57 | 70410 | 0.0163 |
| 0.0 | 16.57 | 70420 | 0.0163 |
| 0.0001 | 16.57 | 70430 | 0.0164 |
| 0.0001 | 16.57 | 70440 | 0.0164 |
| 0.0006 | 16.58 | 70450 | 0.0164 |
| 0.0084 | 16.58 | 70460 | 0.0165 |
| 0.0 | 16.58 | 70470 | 0.0164 |
| 0.0291 | 16.58 | 70480 | 0.0165 |
| 0.0001 | 16.59 | 70490 | 0.0166 |
| 0.0001 | 16.59 | 70500 | 0.0166 |
| 0.0932 | 16.59 | 70510 | 0.0167 |
| 0.0001 | 16.59 | 70520 | 0.0167 |
| 0.0002 | 16.6 | 70530 | 0.0168 |
| 0.0 | 16.6 | 70540 | 0.0168 |
| 0.0 | 16.6 | 70550 | 0.0169 |
| 0.0 | 16.6 | 70560 | 0.0169 |
| 0.0001 | 16.6 | 70570 | 0.0169 |
| 0.0001 | 16.61 | 70580 | 0.0169 |
| 0.1303 | 16.61 | 70590 | 0.0169 |
| 0.0103 | 16.61 | 70600 | 0.0169 |
| 0.0041 | 16.61 | 70610 | 0.0168 |
| 0.0001 | 16.62 | 70620 | 0.0168 |
| 0.0001 | 16.62 | 70630 | 0.0168 |
| 0.3161 | 16.62 | 70640 | 0.0168 |
| 0.0031 | 16.62 | 70650 | 0.0169 |
| 0.0 | 16.63 | 70660 | 0.0170 |
| 0.0001 | 16.63 | 70670 | 0.0170 |
| 0.0002 | 16.63 | 70680 | 0.0171 |
| 0.431 | 16.63 | 70690 | 0.0170 |
| 0.0001 | 16.64 | 70700 | 0.0170 |
| 0.0 | 16.64 | 70710 | 0.0170 |
| 0.0189 | 16.64 | 70720 | 0.0170 |
| 0.18 | 16.64 | 70730 | 0.0169 |
| 0.0035 | 16.64 | 70740 | 0.0168 |
| 0.0 | 16.65 | 70750 | 0.0168 |
| 0.0002 | 16.65 | 70760 | 0.0168 |
| 0.1354 | 16.65 | 70770 | 0.0167 |
| 0.0 | 16.65 | 70780 | 0.0166 |
| 0.0001 | 16.66 | 70790 | 0.0166 |
| 0.0 | 16.66 | 70800 | 0.0166 |
| 0.0 | 16.66 | 70810 | 0.0166 |
| 0.0013 | 16.66 | 70820 | 0.0168 |
| 0.0001 | 16.67 | 70830 | 0.0168 |
| 0.0 | 16.67 | 70840 | 0.0168 |
| 0.0 | 16.67 | 70850 | 0.0168 |
| 0.0484 | 16.67 | 70860 | 0.0169 |
| 0.0416 | 16.68 | 70870 | 0.0169 |
| 0.0425 | 16.68 | 70880 | 0.0169 |
| 0.006 | 16.68 | 70890 | 0.0170 |
| 0.0 | 16.68 | 70900 | 0.0171 |
| 0.2872 | 16.68 | 70910 | 0.0172 |
| 0.22 | 16.69 | 70920 | 0.0171 |
| 0.0001 | 16.69 | 70930 | 0.0171 |
| 0.0856 | 16.69 | 70940 | 0.0171 |
| 0.0 | 16.69 | 70950 | 0.0171 |
| 0.0 | 16.7 | 70960 | 0.0171 |
| 0.0001 | 16.7 | 70970 | 0.0171 |
| 0.0001 | 16.7 | 70980 | 0.0171 |
| 0.2381 | 16.7 | 70990 | 0.0171 |
| 0.0001 | 16.71 | 71000 | 0.0172 |
| 0.0001 | 16.71 | 71010 | 0.0173 |
| 0.0001 | 16.71 | 71020 | 0.0173 |
| 0.0 | 16.71 | 71030 | 0.0173 |
| 0.0 | 16.72 | 71040 | 0.0174 |
| 0.0001 | 16.72 | 71050 | 0.0174 |
| 0.0001 | 16.72 | 71060 | 0.0174 |
| 0.0001 | 16.72 | 71070 | 0.0175 |
| 0.0704 | 16.72 | 71080 | 0.0175 |
| 0.0128 | 16.73 | 71090 | 0.0177 |
| 0.0315 | 16.73 | 71100 | 0.0178 |
| 0.0001 | 16.73 | 71110 | 0.0179 |
| 0.0 | 16.73 | 71120 | 0.0179 |
| 0.0 | 16.74 | 71130 | 0.0179 |
| 0.0036 | 16.74 | 71140 | 0.0180 |
| 0.0016 | 16.74 | 71150 | 0.0180 |
| 0.0 | 16.74 | 71160 | 0.0180 |
| 0.0 | 16.75 | 71170 | 0.0180 |
| 0.0025 | 16.75 | 71180 | 0.0181 |
| 0.0011 | 16.75 | 71190 | 0.0181 |
| 0.0 | 16.75 | 71200 | 0.0181 |
| 0.0001 | 16.76 | 71210 | 0.0181 |
| 0.2854 | 16.76 | 71220 | 0.0181 |
| 0.0 | 16.76 | 71230 | 0.0180 |
| 0.0 | 16.76 | 71240 | 0.0179 |
| 0.3132 | 16.76 | 71250 | 0.0179 |
| 0.0001 | 16.77 | 71260 | 0.0178 |
| 0.0 | 16.77 | 71270 | 0.0178 |
| 0.0001 | 16.77 | 71280 | 0.0178 |
| 0.0 | 16.77 | 71290 | 0.0179 |
| 0.3755 | 16.78 | 71300 | 0.0176 |
| 0.0001 | 16.78 | 71310 | 0.0176 |
| 0.2083 | 16.78 | 71320 | 0.0175 |
| 0.2775 | 16.78 | 71330 | 0.0174 |
| 0.0 | 16.79 | 71340 | 0.0174 |
| 0.0 | 16.79 | 71350 | 0.0174 |
| 0.0001 | 16.79 | 71360 | 0.0174 |
| 0.0001 | 16.79 | 71370 | 0.0174 |
| 0.0016 | 16.8 | 71380 | 0.0175 |
| 0.0 | 16.8 | 71390 | 0.0176 |
| 0.2059 | 16.8 | 71400 | 0.0176 |
| 0.0 | 16.8 | 71410 | 0.0174 |
| 0.0 | 16.8 | 71420 | 0.0174 |
| 0.0 | 16.81 | 71430 | 0.0174 |
| 0.0 | 16.81 | 71440 | 0.0174 |
| 0.0218 | 16.81 | 71450 | 0.0175 |
| 0.2953 | 16.81 | 71460 | 0.0174 |
| 0.0619 | 16.82 | 71470 | 0.0174 |
| 0.0001 | 16.82 | 71480 | 0.0174 |
| 0.0002 | 16.82 | 71490 | 0.0174 |
| 0.4001 | 16.82 | 71500 | 0.0173 |
| 0.0 | 16.83 | 71510 | 0.0173 |
| 0.0001 | 16.83 | 71520 | 0.0173 |
| 0.0001 | 16.83 | 71530 | 0.0173 |
| 0.0 | 16.83 | 71540 | 0.0173 |
| 0.2755 | 16.84 | 71550 | 0.0172 |
| 0.2846 | 16.84 | 71560 | 0.0171 |
| 0.0007 | 16.84 | 71570 | 0.0171 |
| 0.0141 | 16.84 | 71580 | 0.0171 |
| 0.1911 | 16.84 | 71590 | 0.0171 |
| 0.0 | 16.85 | 71600 | 0.0171 |
| 0.0001 | 16.85 | 71610 | 0.0171 |
| 0.0001 | 16.85 | 71620 | 0.0171 |
| 0.0001 | 16.85 | 71630 | 0.0171 |
| 0.0001 | 16.86 | 71640 | 0.0172 |
| 0.0 | 16.86 | 71650 | 0.0172 |
| 0.0002 | 16.86 | 71660 | 0.0173 |
| 0.0007 | 16.86 | 71670 | 0.0173 |
| 0.0 | 16.87 | 71680 | 0.0174 |
| 0.0002 | 16.87 | 71690 | 0.0175 |
| 0.0204 | 16.87 | 71700 | 0.0175 |
| 0.0003 | 16.87 | 71710 | 0.0175 |
| 0.0001 | 16.88 | 71720 | 0.0176 |
| 0.0 | 16.88 | 71730 | 0.0176 |
| 0.0 | 16.88 | 71740 | 0.0177 |
| 0.0001 | 16.88 | 71750 | 0.0177 |
| 0.0 | 16.88 | 71760 | 0.0177 |
| 0.1413 | 16.89 | 71770 | 0.0176 |
| 0.0174 | 16.89 | 71780 | 0.0174 |
| 0.0804 | 16.89 | 71790 | 0.0174 |
| 0.0001 | 16.89 | 71800 | 0.0175 |
| 0.0 | 16.9 | 71810 | 0.0175 |
| 0.0 | 16.9 | 71820 | 0.0175 |
| 0.0 | 16.9 | 71830 | 0.0175 |
| 0.0002 | 16.9 | 71840 | 0.0175 |
| 0.0001 | 16.91 | 71850 | 0.0174 |
| 0.0001 | 16.91 | 71860 | 0.0174 |
| 0.0162 | 16.91 | 71870 | 0.0173 |
| 0.2311 | 16.91 | 71880 | 0.0172 |
| 0.0 | 16.92 | 71890 | 0.0172 |
| 0.0001 | 16.92 | 71900 | 0.0172 |
| 0.0922 | 16.92 | 71910 | 0.0172 |
| 0.3448 | 16.92 | 71920 | 0.0172 |
| 0.0 | 16.92 | 71930 | 0.0170 |
| 0.0537 | 16.93 | 71940 | 0.0169 |
| 0.0 | 16.93 | 71950 | 0.0167 |
| 0.086 | 16.93 | 71960 | 0.0166 |
| 0.0 | 16.93 | 71970 | 0.0165 |
| 0.1938 | 16.94 | 71980 | 0.0166 |
| 0.0001 | 16.94 | 71990 | 0.0167 |
| 0.0 | 16.94 | 72000 | 0.0167 |
| 0.0001 | 16.94 | 72010 | 0.0167 |
| 0.0001 | 16.95 | 72020 | 0.0168 |
| 0.0002 | 16.95 | 72030 | 0.0168 |
| 0.0001 | 16.95 | 72040 | 0.0169 |
| 0.0 | 16.95 | 72050 | 0.0170 |
| 0.0057 | 16.96 | 72060 | 0.0170 |
| 0.0 | 16.96 | 72070 | 0.0171 |
| 0.0 | 16.96 | 72080 | 0.0171 |
| 0.0015 | 16.96 | 72090 | 0.0171 |
| 0.0002 | 16.96 | 72100 | 0.0172 |
| 0.2642 | 16.97 | 72110 | 0.0173 |
| 0.0 | 16.97 | 72120 | 0.0172 |
| 0.0 | 16.97 | 72130 | 0.0171 |
| 0.0 | 16.97 | 72140 | 0.0171 |
| 0.2477 | 16.98 | 72150 | 0.0170 |
| 0.5299 | 16.98 | 72160 | 0.0168 |
| 0.0109 | 16.98 | 72170 | 0.0169 |
| 0.0001 | 16.98 | 72180 | 0.0170 |
| 0.0001 | 16.99 | 72190 | 0.0170 |
| 0.0063 | 16.99 | 72200 | 0.0170 |
| 0.0001 | 16.99 | 72210 | 0.0170 |
| 0.0002 | 16.99 | 72220 | 0.0171 |
| 0.0001 | 17.0 | 72230 | 0.0172 |
| 0.2751 | 17.0 | 72240 | 0.0172 |
| 0.0298 | 17.0 | 72250 | 0.0170 |
| 0.0055 | 17.0 | 72260 | 0.0170 |
| 0.0 | 17.0 | 72270 | 0.0170 |
| 0.0349 | 17.01 | 72280 | 0.0170 |
| 0.0407 | 17.01 | 72290 | 0.0172 |
| 0.0 | 17.01 | 72300 | 0.0173 |
| 0.0001 | 17.01 | 72310 | 0.0173 |
| 0.0369 | 17.02 | 72320 | 0.0172 |
| 0.0001 | 17.02 | 72330 | 0.0172 |
| 0.0114 | 17.02 | 72340 | 0.0171 |
| 0.0 | 17.02 | 72350 | 0.0170 |
| 0.0 | 17.03 | 72360 | 0.0170 |
| 0.0 | 17.03 | 72370 | 0.0170 |
| 0.0001 | 17.03 | 72380 | 0.0170 |
| 0.0027 | 17.03 | 72390 | 0.0170 |
| 0.0001 | 17.04 | 72400 | 0.0172 |
| 0.0023 | 17.04 | 72410 | 0.0171 |
| 0.221 | 17.04 | 72420 | 0.0170 |
| 0.0145 | 17.04 | 72430 | 0.0169 |
| 0.0 | 17.04 | 72440 | 0.0168 |
| 0.0001 | 17.05 | 72450 | 0.0168 |
| 0.0003 | 17.05 | 72460 | 0.0167 |
| 0.0 | 17.05 | 72470 | 0.0167 |
| 0.0001 | 17.05 | 72480 | 0.0167 |
| 0.0 | 17.06 | 72490 | 0.0167 |
| 0.0001 | 17.06 | 72500 | 0.0167 |
| 0.0363 | 17.06 | 72510 | 0.0167 |
| 0.2278 | 17.06 | 72520 | 0.0166 |
| 0.0267 | 17.07 | 72530 | 0.0165 |
| 0.0001 | 17.07 | 72540 | 0.0164 |
| 0.0 | 17.07 | 72550 | 0.0164 |
| 0.0001 | 17.07 | 72560 | 0.0164 |
| 0.1462 | 17.08 | 72570 | 0.0164 |
| 0.0001 | 17.08 | 72580 | 0.0164 |
| 0.0001 | 17.08 | 72590 | 0.0164 |
| 0.0001 | 17.08 | 72600 | 0.0165 |
| 0.0002 | 17.08 | 72610 | 0.0164 |
| 0.0887 | 17.09 | 72620 | 0.0165 |
| 0.0003 | 17.09 | 72630 | 0.0166 |
| 0.0001 | 17.09 | 72640 | 0.0167 |
| 0.0001 | 17.09 | 72650 | 0.0168 |
| 0.0001 | 17.1 | 72660 | 0.0168 |
| 0.2854 | 17.1 | 72670 | 0.0169 |
| 0.0 | 17.1 | 72680 | 0.0169 |
| 0.003 | 17.1 | 72690 | 0.0169 |
| 0.3039 | 17.11 | 72700 | 0.0170 |
| 0.0001 | 17.11 | 72710 | 0.0171 |
| 0.0059 | 17.11 | 72720 | 0.0171 |
| 0.0312 | 17.11 | 72730 | 0.0172 |
| 0.0001 | 17.12 | 72740 | 0.0173 |
| 0.0 | 17.12 | 72750 | 0.0173 |
| 0.0 | 17.12 | 72760 | 0.0173 |
| 0.0001 | 17.12 | 72770 | 0.0173 |
| 0.0 | 17.12 | 72780 | 0.0173 |
| 0.0621 | 17.13 | 72790 | 0.0172 |
| 0.0 | 17.13 | 72800 | 0.0172 |
| 0.0001 | 17.13 | 72810 | 0.0172 |
| 0.0 | 17.13 | 72820 | 0.0172 |
| 0.0 | 17.14 | 72830 | 0.0172 |
| 0.0 | 17.14 | 72840 | 0.0172 |
| 0.0 | 17.14 | 72850 | 0.0173 |
| 0.0 | 17.14 | 72860 | 0.0173 |
| 0.0038 | 17.15 | 72870 | 0.0175 |
| 0.0 | 17.15 | 72880 | 0.0176 |
| 0.3546 | 17.15 | 72890 | 0.0175 |
| 0.0 | 17.15 | 72900 | 0.0175 |
| 0.0521 | 17.16 | 72910 | 0.0172 |
| 0.0 | 17.16 | 72920 | 0.0171 |
| 0.008 | 17.16 | 72930 | 0.0171 |
| 0.0001 | 17.16 | 72940 | 0.0170 |
| 0.0 | 17.16 | 72950 | 0.0170 |
| 0.0 | 17.17 | 72960 | 0.0170 |
| 0.0001 | 17.17 | 72970 | 0.0171 |
| 0.0006 | 17.17 | 72980 | 0.0171 |
| 0.0009 | 17.17 | 72990 | 0.0172 |
| 0.0 | 17.18 | 73000 | 0.0172 |
| 0.2732 | 17.18 | 73010 | 0.0172 |
| 0.0 | 17.18 | 73020 | 0.0171 |
| 0.0049 | 17.18 | 73030 | 0.0171 |
| 0.0 | 17.19 | 73040 | 0.0171 |
| 0.0 | 17.19 | 73050 | 0.0172 |
| 0.0 | 17.19 | 73060 | 0.0172 |
| 0.0026 | 17.19 | 73070 | 0.0171 |
| 0.0001 | 17.2 | 73080 | 0.0171 |
| 0.5366 | 17.2 | 73090 | 0.0171 |
| 0.0019 | 17.2 | 73100 | 0.0170 |
| 0.2095 | 17.2 | 73110 | 0.0169 |
| 0.0003 | 17.2 | 73120 | 0.0168 |
| 0.0959 | 17.21 | 73130 | 0.0168 |
| 0.0002 | 17.21 | 73140 | 0.0168 |
| 0.0 | 17.21 | 73150 | 0.0168 |
| 0.0016 | 17.21 | 73160 | 0.0168 |
| 0.0077 | 17.22 | 73170 | 0.0168 |
| 0.0001 | 17.22 | 73180 | 0.0169 |
| 0.0004 | 17.22 | 73190 | 0.0169 |
| 0.0 | 17.22 | 73200 | 0.0169 |
| 0.0138 | 17.23 | 73210 | 0.0169 |
| 0.0 | 17.23 | 73220 | 0.0169 |
| 0.0 | 17.23 | 73230 | 0.0170 |
| 0.0001 | 17.23 | 73240 | 0.0170 |
| 0.2907 | 17.24 | 73250 | 0.0169 |
| 0.0789 | 17.24 | 73260 | 0.0169 |
| 0.0618 | 17.24 | 73270 | 0.0168 |
| 0.3176 | 17.24 | 73280 | 0.0166 |
| 0.3405 | 17.24 | 73290 | 0.0165 |
| 0.0001 | 17.25 | 73300 | 0.0164 |
| 0.0043 | 17.25 | 73310 | 0.0164 |
| 0.1297 | 17.25 | 73320 | 0.0163 |
| 0.0002 | 17.25 | 73330 | 0.0163 |
| 0.331 | 17.26 | 73340 | 0.0164 |
| 0.0 | 17.26 | 73350 | 0.0164 |
| 0.122 | 17.26 | 73360 | 0.0164 |
| 0.0001 | 17.26 | 73370 | 0.0164 |
| 0.0001 | 17.27 | 73380 | 0.0165 |
| 0.0483 | 17.27 | 73390 | 0.0166 |
| 0.2502 | 17.27 | 73400 | 0.0165 |
| 0.0161 | 17.27 | 73410 | 0.0165 |
| 0.2255 | 17.28 | 73420 | 0.0164 |
| 0.0339 | 17.28 | 73430 | 0.0163 |
| 0.0001 | 17.28 | 73440 | 0.0163 |
| 0.3435 | 17.28 | 73450 | 0.0162 |
| 0.0004 | 17.28 | 73460 | 0.0162 |
| 0.0092 | 17.29 | 73470 | 0.0163 |
| 0.0 | 17.29 | 73480 | 0.0163 |
| 0.0 | 17.29 | 73490 | 0.0163 |
| 0.0 | 17.29 | 73500 | 0.0163 |
| 0.0002 | 17.3 | 73510 | 0.0164 |
| 0.0002 | 17.3 | 73520 | 0.0164 |
| 0.0 | 17.3 | 73530 | 0.0165 |
| 0.0001 | 17.3 | 73540 | 0.0166 |
| 0.0 | 17.31 | 73550 | 0.0166 |
| 0.0 | 17.31 | 73560 | 0.0166 |
| 0.2552 | 17.31 | 73570 | 0.0166 |
| 0.0011 | 17.31 | 73580 | 0.0166 |
| 0.0017 | 17.32 | 73590 | 0.0167 |
| 0.3021 | 17.32 | 73600 | 0.0168 |
| 0.0 | 17.32 | 73610 | 0.0169 |
| 0.0002 | 17.32 | 73620 | 0.0170 |
| 0.0003 | 17.32 | 73630 | 0.0170 |
| 0.0 | 17.33 | 73640 | 0.0171 |
| 0.1302 | 17.33 | 73650 | 0.0171 |
| 0.0001 | 17.33 | 73660 | 0.0170 |
| 0.0 | 17.33 | 73670 | 0.0170 |
| 0.0 | 17.34 | 73680 | 0.0170 |
| 0.0003 | 17.34 | 73690 | 0.0171 |
| 0.0001 | 17.34 | 73700 | 0.0172 |
| 0.0006 | 17.34 | 73710 | 0.0174 |
| 0.0003 | 17.35 | 73720 | 0.0176 |
| 0.0 | 17.35 | 73730 | 0.0177 |
| 0.0 | 17.35 | 73740 | 0.0177 |
| 0.0139 | 17.35 | 73750 | 0.0177 |
| 0.2991 | 17.36 | 73760 | 0.0177 |
| 0.0 | 17.36 | 73770 | 0.0177 |
| 0.0 | 17.36 | 73780 | 0.0177 |
| 0.001 | 17.36 | 73790 | 0.0177 |
| 0.0 | 17.36 | 73800 | 0.0177 |
| 0.2169 | 17.37 | 73810 | 0.0177 |
| 0.3112 | 17.37 | 73820 | 0.0177 |
| 0.0 | 17.37 | 73830 | 0.0177 |
| 0.0006 | 17.37 | 73840 | 0.0177 |
| 0.0 | 17.38 | 73850 | 0.0178 |
| 0.0 | 17.38 | 73860 | 0.0179 |
| 0.0743 | 17.38 | 73870 | 0.0178 |
| 0.1112 | 17.38 | 73880 | 0.0177 |
| 0.0 | 17.39 | 73890 | 0.0177 |
| 0.1213 | 17.39 | 73900 | 0.0175 |
| 0.0 | 17.39 | 73910 | 0.0175 |
| 0.2468 | 17.39 | 73920 | 0.0174 |
| 0.0031 | 17.4 | 73930 | 0.0174 |
| 0.1057 | 17.4 | 73940 | 0.0174 |
| 0.2483 | 17.4 | 73950 | 0.0173 |
| 0.0001 | 17.4 | 73960 | 0.0173 |
| 0.2639 | 17.4 | 73970 | 0.0173 |
| 0.0001 | 17.41 | 73980 | 0.0172 |
| 0.0001 | 17.41 | 73990 | 0.0172 |
| 0.0 | 17.41 | 74000 | 0.0172 |
| 0.0003 | 17.41 | 74010 | 0.0172 |
| 0.3341 | 17.42 | 74020 | 0.0172 |
| 0.0001 | 17.42 | 74030 | 0.0171 |
| 0.0001 | 17.42 | 74040 | 0.0172 |
| 0.2116 | 17.42 | 74050 | 0.0171 |
| 0.0002 | 17.43 | 74060 | 0.0171 |
| 0.2884 | 17.43 | 74070 | 0.0170 |
| 0.0898 | 17.43 | 74080 | 0.0169 |
| 0.0 | 17.43 | 74090 | 0.0170 |
| 0.0001 | 17.44 | 74100 | 0.0170 |
| 0.0 | 17.44 | 74110 | 0.0170 |
| 0.0127 | 17.44 | 74120 | 0.0170 |
| 0.0001 | 17.44 | 74130 | 0.0169 |
| 0.0007 | 17.44 | 74140 | 0.0169 |
| 0.0001 | 17.45 | 74150 | 0.0171 |
| 0.0001 | 17.45 | 74160 | 0.0171 |
| 0.0001 | 17.45 | 74170 | 0.0172 |
| 0.0002 | 17.45 | 74180 | 0.0173 |
| 0.0751 | 17.46 | 74190 | 0.0173 |
| 0.2849 | 17.46 | 74200 | 0.0172 |
| 0.0001 | 17.46 | 74210 | 0.0172 |
| 0.0001 | 17.46 | 74220 | 0.0172 |
| 0.0001 | 17.47 | 74230 | 0.0172 |
| 0.0001 | 17.47 | 74240 | 0.0172 |
| 0.0 | 17.47 | 74250 | 0.0173 |
| 0.0001 | 17.47 | 74260 | 0.0173 |
| 0.3774 | 17.48 | 74270 | 0.0172 |
| 0.0001 | 17.48 | 74280 | 0.0171 |
| 0.0001 | 17.48 | 74290 | 0.0171 |
| 0.0 | 17.48 | 74300 | 0.0171 |
| 0.0001 | 17.48 | 74310 | 0.0171 |
| 0.0001 | 17.49 | 74320 | 0.0171 |
| 0.0001 | 17.49 | 74330 | 0.0172 |
| 0.0001 | 17.49 | 74340 | 0.0172 |
| 0.0001 | 17.49 | 74350 | 0.0172 |
| 0.0001 | 17.5 | 74360 | 0.0172 |
| 0.0 | 17.5 | 74370 | 0.0172 |
| 0.0001 | 17.5 | 74380 | 0.0172 |
| 0.3248 | 17.5 | 74390 | 0.0172 |
| 0.2861 | 17.51 | 74400 | 0.0172 |
| 0.0001 | 17.51 | 74410 | 0.0171 |
| 0.4169 | 17.51 | 74420 | 0.0170 |
| 0.0001 | 17.51 | 74430 | 0.0169 |
| 0.0001 | 17.52 | 74440 | 0.0169 |
| 0.0468 | 17.52 | 74450 | 0.0170 |
| 0.1809 | 17.52 | 74460 | 0.0169 |
| 0.0001 | 17.52 | 74470 | 0.0169 |
| 0.2507 | 17.52 | 74480 | 0.0168 |
| 0.2641 | 17.53 | 74490 | 0.0167 |
| 0.0033 | 17.53 | 74500 | 0.0166 |
| 0.0005 | 17.53 | 74510 | 0.0166 |
| 0.0039 | 17.53 | 74520 | 0.0165 |
| 0.0 | 17.54 | 74530 | 0.0165 |
| 0.0 | 17.54 | 74540 | 0.0165 |
| 0.0 | 17.54 | 74550 | 0.0165 |
| 0.0001 | 17.54 | 74560 | 0.0165 |
| 0.0 | 17.55 | 74570 | 0.0165 |
| 0.0001 | 17.55 | 74580 | 0.0165 |
| 0.0004 | 17.55 | 74590 | 0.0166 |
| 0.2545 | 17.55 | 74600 | 0.0167 |
| 0.0001 | 17.56 | 74610 | 0.0166 |
| 0.0001 | 17.56 | 74620 | 0.0166 |
| 0.0001 | 17.56 | 74630 | 0.0166 |
| 0.2706 | 17.56 | 74640 | 0.0165 |
| 0.0001 | 17.56 | 74650 | 0.0165 |
| 0.0001 | 17.57 | 74660 | 0.0166 |
| 0.0001 | 17.57 | 74670 | 0.0166 |
| 0.0 | 17.57 | 74680 | 0.0167 |
| 0.0 | 17.57 | 74690 | 0.0167 |
| 0.2301 | 17.58 | 74700 | 0.0166 |
| 0.009 | 17.58 | 74710 | 0.0166 |
| 0.001 | 17.58 | 74720 | 0.0166 |
| 0.0001 | 17.58 | 74730 | 0.0166 |
| 0.0567 | 17.59 | 74740 | 0.0166 |
| 0.0001 | 17.59 | 74750 | 0.0166 |
| 0.0 | 17.59 | 74760 | 0.0166 |
| 0.0001 | 17.59 | 74770 | 0.0166 |
| 0.0079 | 17.6 | 74780 | 0.0167 |
| 0.3419 | 17.6 | 74790 | 0.0168 |
| 0.0007 | 17.6 | 74800 | 0.0168 |
| 0.0016 | 17.6 | 74810 | 0.0169 |
| 0.0001 | 17.6 | 74820 | 0.0168 |
| 0.0713 | 17.61 | 74830 | 0.0168 |
| 0.31 | 17.61 | 74840 | 0.0167 |
| 0.0 | 17.61 | 74850 | 0.0167 |
| 0.1634 | 17.61 | 74860 | 0.0166 |
| 0.0001 | 17.62 | 74870 | 0.0166 |
| 0.0059 | 17.62 | 74880 | 0.0166 |
| 0.0 | 17.62 | 74890 | 0.0166 |
| 0.0042 | 17.62 | 74900 | 0.0165 |
| 0.249 | 17.63 | 74910 | 0.0164 |
| 0.1928 | 17.63 | 74920 | 0.0163 |
| 0.0001 | 17.63 | 74930 | 0.0162 |
| 0.0002 | 17.63 | 74940 | 0.0163 |
| 0.0001 | 17.64 | 74950 | 0.0164 |
| 0.0001 | 17.64 | 74960 | 0.0164 |
| 0.1108 | 17.64 | 74970 | 0.0164 |
| 0.0001 | 17.64 | 74980 | 0.0164 |
| 0.0001 | 17.64 | 74990 | 0.0164 |
| 0.0001 | 17.65 | 75000 | 0.0164 |
| 0.0 | 17.65 | 75010 | 0.0164 |
| 0.3324 | 17.65 | 75020 | 0.0164 |
| 0.2564 | 17.65 | 75030 | 0.0163 |
| 0.0 | 17.66 | 75040 | 0.0163 |
| 0.0001 | 17.66 | 75050 | 0.0163 |
| 0.0683 | 17.66 | 75060 | 0.0162 |
| 0.0 | 17.66 | 75070 | 0.0162 |
| 0.0001 | 17.67 | 75080 | 0.0162 |
| 0.0 | 17.67 | 75090 | 0.0162 |
| 0.0001 | 17.67 | 75100 | 0.0162 |
| 0.0002 | 17.67 | 75110 | 0.0162 |
| 0.5886 | 17.68 | 75120 | 0.0161 |
| 0.0001 | 17.68 | 75130 | 0.0160 |
| 0.0001 | 17.68 | 75140 | 0.0159 |
| 0.0002 | 17.68 | 75150 | 0.0160 |
| 0.0001 | 17.68 | 75160 | 0.0160 |
| 0.0 | 17.69 | 75170 | 0.0161 |
| 0.0001 | 17.69 | 75180 | 0.0161 |
| 0.0001 | 17.69 | 75190 | 0.0161 |
| 0.0002 | 17.69 | 75200 | 0.0162 |
| 0.0001 | 17.7 | 75210 | 0.0163 |
| 0.0061 | 17.7 | 75220 | 0.0163 |
| 0.0001 | 17.7 | 75230 | 0.0163 |
| 0.0001 | 17.7 | 75240 | 0.0164 |
| 0.3515 | 17.71 | 75250 | 0.0163 |
| 0.0 | 17.71 | 75260 | 0.0163 |
| 0.2449 | 17.71 | 75270 | 0.0163 |
| 0.0001 | 17.71 | 75280 | 0.0161 |
| 0.0357 | 17.72 | 75290 | 0.0161 |
| 0.0006 | 17.72 | 75300 | 0.0161 |
| 0.0001 | 17.72 | 75310 | 0.0162 |
| 0.0001 | 17.72 | 75320 | 0.0163 |
| 0.0001 | 17.72 | 75330 | 0.0163 |
| 0.0 | 17.73 | 75340 | 0.0163 |
| 0.2082 | 17.73 | 75350 | 0.0162 |
| 0.0665 | 17.73 | 75360 | 0.0162 |
| 0.0001 | 17.73 | 75370 | 0.0162 |
| 0.0001 | 17.74 | 75380 | 0.0162 |
| 0.0001 | 17.74 | 75390 | 0.0162 |
| 0.034 | 17.74 | 75400 | 0.0162 |
| 0.0001 | 17.74 | 75410 | 0.0162 |
| 0.1441 | 17.75 | 75420 | 0.0162 |
| 0.0001 | 17.75 | 75430 | 0.0161 |
| 0.0002 | 17.75 | 75440 | 0.0161 |
| 0.0001 | 17.75 | 75450 | 0.0161 |
| 0.3068 | 17.76 | 75460 | 0.0161 |
| 0.0001 | 17.76 | 75470 | 0.0161 |
| 0.0131 | 17.76 | 75480 | 0.0160 |
| 0.0367 | 17.76 | 75490 | 0.0160 |
| 0.0001 | 17.76 | 75500 | 0.0160 |
| 0.0001 | 17.77 | 75510 | 0.0160 |
| 0.0005 | 17.77 | 75520 | 0.0160 |
| 0.0001 | 17.77 | 75530 | 0.0160 |
| 0.0431 | 17.77 | 75540 | 0.0160 |
| 0.0001 | 17.78 | 75550 | 0.0159 |
| 0.0001 | 17.78 | 75560 | 0.0159 |
| 0.0834 | 17.78 | 75570 | 0.0159 |
| 0.0017 | 17.78 | 75580 | 0.0159 |
| 0.0001 | 17.79 | 75590 | 0.0159 |
| 0.0002 | 17.79 | 75600 | 0.0160 |
| 0.0001 | 17.79 | 75610 | 0.0161 |
| 0.0001 | 17.79 | 75620 | 0.0161 |
| 0.0 | 17.8 | 75630 | 0.0161 |
| 0.0002 | 17.8 | 75640 | 0.0161 |
| 0.0001 | 17.8 | 75650 | 0.0161 |
| 0.0 | 17.8 | 75660 | 0.0161 |
| 0.0 | 17.8 | 75670 | 0.0161 |
| 0.0 | 17.81 | 75680 | 0.0161 |
| 0.1285 | 17.81 | 75690 | 0.0162 |
| 0.0007 | 17.81 | 75700 | 0.0162 |
| 0.0 | 17.81 | 75710 | 0.0162 |
| 0.0 | 17.82 | 75720 | 0.0162 |
| 0.0001 | 17.82 | 75730 | 0.0162 |
| 0.0001 | 17.82 | 75740 | 0.0163 |
| 0.096 | 17.82 | 75750 | 0.0163 |
| 0.0 | 17.83 | 75760 | 0.0163 |
| 0.2793 | 17.83 | 75770 | 0.0162 |
| 0.0565 | 17.83 | 75780 | 0.0162 |
| 0.0002 | 17.83 | 75790 | 0.0161 |
| 0.0001 | 17.84 | 75800 | 0.0161 |
| 0.0001 | 17.84 | 75810 | 0.0161 |
| 0.0001 | 17.84 | 75820 | 0.0161 |
| 0.0001 | 17.84 | 75830 | 0.0161 |
| 0.0 | 17.84 | 75840 | 0.0162 |
| 0.0001 | 17.85 | 75850 | 0.0162 |
| 0.0 | 17.85 | 75860 | 0.0162 |
| 0.0054 | 17.85 | 75870 | 0.0162 |
| 0.0 | 17.85 | 75880 | 0.0162 |
| 0.0 | 17.86 | 75890 | 0.0162 |
| 0.0001 | 17.86 | 75900 | 0.0163 |
| 0.049 | 17.86 | 75910 | 0.0162 |
| 0.0 | 17.86 | 75920 | 0.0161 |
| 0.0071 | 17.87 | 75930 | 0.0162 |
| 0.1896 | 17.87 | 75940 | 0.0162 |
| 0.0 | 17.87 | 75950 | 0.0161 |
| 0.0001 | 17.87 | 75960 | 0.0161 |
| 0.0002 | 17.88 | 75970 | 0.0162 |
| 0.0 | 17.88 | 75980 | 0.0162 |
| 0.0003 | 17.88 | 75990 | 0.0163 |
| 0.0 | 17.88 | 76000 | 0.0164 |
| 0.4425 | 17.88 | 76010 | 0.0163 |
| 0.0 | 17.89 | 76020 | 0.0162 |
| 0.0005 | 17.89 | 76030 | 0.0162 |
| 0.0001 | 17.89 | 76040 | 0.0162 |
| 0.3203 | 17.89 | 76050 | 0.0162 |
| 0.0 | 17.9 | 76060 | 0.0162 |
| 0.0 | 17.9 | 76070 | 0.0162 |
| 0.2278 | 17.9 | 76080 | 0.0161 |
| 0.0143 | 17.9 | 76090 | 0.0161 |
| 0.0001 | 17.91 | 76100 | 0.0161 |
| 0.1333 | 17.91 | 76110 | 0.0160 |
| 0.18 | 17.91 | 76120 | 0.0160 |
| 0.0 | 17.91 | 76130 | 0.0159 |
| 0.0 | 17.92 | 76140 | 0.0159 |
| 0.0002 | 17.92 | 76150 | 0.0160 |
| 0.0001 | 17.92 | 76160 | 0.0160 |
| 0.0001 | 17.92 | 76170 | 0.0161 |
| 0.0191 | 17.92 | 76180 | 0.0161 |
| 0.0001 | 17.93 | 76190 | 0.0162 |
| 0.2343 | 17.93 | 76200 | 0.0162 |
| 0.0012 | 17.93 | 76210 | 0.0162 |
| 0.0001 | 17.93 | 76220 | 0.0163 |
| 0.0003 | 17.94 | 76230 | 0.0164 |
| 0.1078 | 17.94 | 76240 | 0.0164 |
| 0.0 | 17.94 | 76250 | 0.0164 |
| 0.0 | 17.94 | 76260 | 0.0164 |
| 0.0005 | 17.95 | 76270 | 0.0164 |
| 0.0034 | 17.95 | 76280 | 0.0165 |
| 0.3307 | 17.95 | 76290 | 0.0164 |
| 0.0 | 17.95 | 76300 | 0.0164 |
| 0.1434 | 17.96 | 76310 | 0.0164 |
| 0.0001 | 17.96 | 76320 | 0.0164 |
| 0.0 | 17.96 | 76330 | 0.0163 |
| 0.0115 | 17.96 | 76340 | 0.0164 |
| 0.0001 | 17.96 | 76350 | 0.0165 |
| 0.0 | 17.97 | 76360 | 0.0165 |
| 0.0 | 17.97 | 76370 | 0.0165 |
| 0.0131 | 17.97 | 76380 | 0.0165 |
| 0.3182 | 17.97 | 76390 | 0.0164 |
| 0.2344 | 17.98 | 76400 | 0.0163 |
| 0.2722 | 17.98 | 76410 | 0.0163 |
| 0.0001 | 17.98 | 76420 | 0.0163 |
| 0.0001 | 17.98 | 76430 | 0.0163 |
| 0.0675 | 17.99 | 76440 | 0.0163 |
| 0.0104 | 17.99 | 76450 | 0.0163 |
| 0.0 | 17.99 | 76460 | 0.0163 |
| 0.0001 | 17.99 | 76470 | 0.0163 |
| 0.0001 | 18.0 | 76480 | 0.0163 |
| 0.0001 | 18.0 | 76490 | 0.0163 |
| 0.2028 | 18.0 | 76500 | 0.0164 |
| 0.0067 | 18.0 | 76510 | 0.0164 |
| 0.0083 | 18.0 | 76520 | 0.0164 |
| 0.0 | 18.01 | 76530 | 0.0163 |
| 0.0001 | 18.01 | 76540 | 0.0164 |
| 0.0001 | 18.01 | 76550 | 0.0164 |
| 0.0626 | 18.01 | 76560 | 0.0165 |
| 0.0001 | 18.02 | 76570 | 0.0165 |
| 0.1927 | 18.02 | 76580 | 0.0165 |
| 0.2597 | 18.02 | 76590 | 0.0164 |
| 0.0005 | 18.02 | 76600 | 0.0164 |
| 0.0001 | 18.03 | 76610 | 0.0165 |
| 0.0 | 18.03 | 76620 | 0.0165 |
| 0.0916 | 18.03 | 76630 | 0.0164 |
| 0.3546 | 18.03 | 76640 | 0.0163 |
| 0.0001 | 18.04 | 76650 | 0.0163 |
| 0.0001 | 18.04 | 76660 | 0.0163 |
| 0.0001 | 18.04 | 76670 | 0.0163 |
| 0.0 | 18.04 | 76680 | 0.0163 |
| 0.0001 | 18.04 | 76690 | 0.0163 |
| 0.0016 | 18.05 | 76700 | 0.0163 |
| 0.0001 | 18.05 | 76710 | 0.0163 |
| 0.2033 | 18.05 | 76720 | 0.0162 |
| 0.0072 | 18.05 | 76730 | 0.0162 |
| 0.0002 | 18.06 | 76740 | 0.0162 |
| 0.3743 | 18.06 | 76750 | 0.0161 |
| 0.0078 | 18.06 | 76760 | 0.0161 |
| 0.1377 | 18.06 | 76770 | 0.0160 |
| 0.0001 | 18.07 | 76780 | 0.0160 |
| 0.0003 | 18.07 | 76790 | 0.0160 |
| 0.0482 | 18.07 | 76800 | 0.0161 |
| 0.0 | 18.07 | 76810 | 0.0161 |
| 0.0 | 18.08 | 76820 | 0.0162 |
| 0.0626 | 18.08 | 76830 | 0.0162 |
| 0.0001 | 18.08 | 76840 | 0.0162 |
| 0.0 | 18.08 | 76850 | 0.0162 |
| 0.0884 | 18.08 | 76860 | 0.0162 |
| 0.0 | 18.09 | 76870 | 0.0162 |
| 0.0 | 18.09 | 76880 | 0.0162 |
| 0.0001 | 18.09 | 76890 | 0.0162 |
| 0.0108 | 18.09 | 76900 | 0.0162 |
| 0.0001 | 18.1 | 76910 | 0.0163 |
| 0.0 | 18.1 | 76920 | 0.0163 |
| 0.012 | 18.1 | 76930 | 0.0164 |
| 0.0001 | 18.1 | 76940 | 0.0164 |
| 0.0 | 18.11 | 76950 | 0.0165 |
| 0.2371 | 18.11 | 76960 | 0.0165 |
| 0.0 | 18.11 | 76970 | 0.0164 |
| 0.0 | 18.11 | 76980 | 0.0164 |
| 0.0007 | 18.12 | 76990 | 0.0165 |
| 0.0001 | 18.12 | 77000 | 0.0165 |
| 0.216 | 18.12 | 77010 | 0.0165 |
| 0.0775 | 18.12 | 77020 | 0.0165 |
| 0.0002 | 18.12 | 77030 | 0.0165 |
| 0.0001 | 18.13 | 77040 | 0.0166 |
| 0.0 | 18.13 | 77050 | 0.0167 |
| 0.0 | 18.13 | 77060 | 0.0167 |
| 0.2445 | 18.13 | 77070 | 0.0166 |
| 0.0004 | 18.14 | 77080 | 0.0165 |
| 0.0003 | 18.14 | 77090 | 0.0166 |
| 0.0001 | 18.14 | 77100 | 0.0166 |
| 0.0001 | 18.14 | 77110 | 0.0166 |
| 0.0001 | 18.15 | 77120 | 0.0166 |
| 0.0 | 18.15 | 77130 | 0.0167 |
| 0.0 | 18.15 | 77140 | 0.0167 |
| 0.0001 | 18.15 | 77150 | 0.0167 |
| 0.0 | 18.16 | 77160 | 0.0167 |
| 0.0001 | 18.16 | 77170 | 0.0167 |
| 0.0 | 18.16 | 77180 | 0.0167 |
| 0.5617 | 18.16 | 77190 | 0.0166 |
| 0.0 | 18.16 | 77200 | 0.0164 |
| 0.0 | 18.17 | 77210 | 0.0164 |
| 0.0001 | 18.17 | 77220 | 0.0164 |
| 0.008 | 18.17 | 77230 | 0.0164 |
| 0.0328 | 18.17 | 77240 | 0.0165 |
| 0.0 | 18.18 | 77250 | 0.0165 |
| 0.3182 | 18.18 | 77260 | 0.0165 |
| 0.0002 | 18.18 | 77270 | 0.0165 |
| 0.0002 | 18.18 | 77280 | 0.0165 |
| 0.0 | 18.19 | 77290 | 0.0165 |
| 0.0688 | 18.19 | 77300 | 0.0165 |
| 0.0001 | 18.19 | 77310 | 0.0165 |
| 0.0323 | 18.19 | 77320 | 0.0164 |
| 0.0001 | 18.2 | 77330 | 0.0164 |
| 0.0 | 18.2 | 77340 | 0.0164 |
| 0.0001 | 18.2 | 77350 | 0.0164 |
| 0.0001 | 18.2 | 77360 | 0.0165 |
| 0.0918 | 18.2 | 77370 | 0.0164 |
| 0.0 | 18.21 | 77380 | 0.0164 |
| 0.0 | 18.21 | 77390 | 0.0164 |
| 0.0001 | 18.21 | 77400 | 0.0164 |
| 0.0456 | 18.21 | 77410 | 0.0164 |
| 0.0001 | 18.22 | 77420 | 0.0165 |
| 0.0 | 18.22 | 77430 | 0.0165 |
| 0.0 | 18.22 | 77440 | 0.0165 |
| 0.0 | 18.22 | 77450 | 0.0165 |
| 0.0001 | 18.23 | 77460 | 0.0166 |
| 0.0001 | 18.23 | 77470 | 0.0166 |
| 0.0001 | 18.23 | 77480 | 0.0166 |
| 0.349 | 18.23 | 77490 | 0.0166 |
| 0.0 | 18.24 | 77500 | 0.0167 |
| 0.0 | 18.24 | 77510 | 0.0167 |
| 0.269 | 18.24 | 77520 | 0.0167 |
| 0.349 | 18.24 | 77530 | 0.0166 |
| 0.0003 | 18.24 | 77540 | 0.0166 |
| 0.2117 | 18.25 | 77550 | 0.0166 |
| 0.006 | 18.25 | 77560 | 0.0165 |
| 0.353 | 18.25 | 77570 | 0.0164 |
| 0.0001 | 18.25 | 77580 | 0.0163 |
| 0.1801 | 18.26 | 77590 | 0.0163 |
| 0.0 | 18.26 | 77600 | 0.0162 |
| 0.0016 | 18.26 | 77610 | 0.0162 |
| 0.537 | 18.26 | 77620 | 0.0162 |
| 0.0001 | 18.27 | 77630 | 0.0162 |
| 0.0001 | 18.27 | 77640 | 0.0162 |
| 0.0099 | 18.27 | 77650 | 0.0161 |
| 0.0155 | 18.27 | 77660 | 0.0161 |
| 0.0001 | 18.28 | 77670 | 0.0160 |
| 0.0001 | 18.28 | 77680 | 0.0160 |
| 0.1329 | 18.28 | 77690 | 0.0160 |
| 0.0 | 18.28 | 77700 | 0.0160 |
| 0.0001 | 18.28 | 77710 | 0.0160 |
| 0.0001 | 18.29 | 77720 | 0.0160 |
| 0.2672 | 18.29 | 77730 | 0.0160 |
| 0.261 | 18.29 | 77740 | 0.0158 |
| 0.0001 | 18.29 | 77750 | 0.0158 |
| 0.0001 | 18.3 | 77760 | 0.0159 |
| 0.0002 | 18.3 | 77770 | 0.0159 |
| 0.0978 | 18.3 | 77780 | 0.0159 |
| 0.0001 | 18.3 | 77790 | 0.0160 |
| 0.0 | 18.31 | 77800 | 0.0160 |
| 0.0599 | 18.31 | 77810 | 0.0160 |
| 0.0001 | 18.31 | 77820 | 0.0159 |
| 0.0001 | 18.31 | 77830 | 0.0160 |
| 0.0002 | 18.32 | 77840 | 0.0160 |
| 0.2694 | 18.32 | 77850 | 0.0160 |
| 0.0088 | 18.32 | 77860 | 0.0159 |
| 0.2623 | 18.32 | 77870 | 0.0159 |
| 0.0211 | 18.32 | 77880 | 0.0158 |
| 0.3211 | 18.33 | 77890 | 0.0157 |
| 0.0001 | 18.33 | 77900 | 0.0157 |
| 0.0114 | 18.33 | 77910 | 0.0157 |
| 0.2307 | 18.33 | 77920 | 0.0157 |
| 0.0573 | 18.34 | 77930 | 0.0158 |
| 0.0001 | 18.34 | 77940 | 0.0158 |
| 0.0016 | 18.34 | 77950 | 0.0158 |
| 0.1253 | 18.34 | 77960 | 0.0158 |
| 0.0391 | 18.35 | 77970 | 0.0158 |
| 0.325 | 18.35 | 77980 | 0.0158 |
| 0.0015 | 18.35 | 77990 | 0.0158 |
| 0.2125 | 18.35 | 78000 | 0.0158 |
| 0.1988 | 18.36 | 78010 | 0.0157 |
| 0.0001 | 18.36 | 78020 | 0.0156 |
| 0.032 | 18.36 | 78030 | 0.0156 |
| 0.0016 | 18.36 | 78040 | 0.0156 |
| 0.0001 | 18.36 | 78050 | 0.0156 |
| 0.0001 | 18.37 | 78060 | 0.0156 |
| 0.0001 | 18.37 | 78070 | 0.0156 |
| 0.0696 | 18.37 | 78080 | 0.0156 |
| 0.0714 | 18.37 | 78090 | 0.0156 |
| 0.0162 | 18.38 | 78100 | 0.0156 |
| 0.0001 | 18.38 | 78110 | 0.0156 |
| 0.0001 | 18.38 | 78120 | 0.0156 |
| 0.0001 | 18.38 | 78130 | 0.0157 |
| 0.0001 | 18.39 | 78140 | 0.0157 |
| 0.0 | 18.39 | 78150 | 0.0157 |
| 0.0001 | 18.39 | 78160 | 0.0157 |
| 0.1775 | 18.39 | 78170 | 0.0157 |
| 0.0001 | 18.4 | 78180 | 0.0157 |
| 0.0034 | 18.4 | 78190 | 0.0157 |
| 0.0 | 18.4 | 78200 | 0.0157 |
| 0.0 | 18.4 | 78210 | 0.0157 |
| 0.0001 | 18.4 | 78220 | 0.0157 |
| 0.0001 | 18.41 | 78230 | 0.0157 |
| 0.0001 | 18.41 | 78240 | 0.0157 |
| 0.1145 | 18.41 | 78250 | 0.0157 |
| 0.0001 | 18.41 | 78260 | 0.0157 |
| 0.0001 | 18.42 | 78270 | 0.0157 |
| 0.0 | 18.42 | 78280 | 0.0157 |
| 0.2626 | 18.42 | 78290 | 0.0156 |
| 0.001 | 18.42 | 78300 | 0.0156 |
| 0.0 | 18.43 | 78310 | 0.0156 |
| 0.0751 | 18.43 | 78320 | 0.0156 |
| 0.0007 | 18.43 | 78330 | 0.0157 |
| 0.0001 | 18.43 | 78340 | 0.0157 |
| 0.0 | 18.44 | 78350 | 0.0157 |
| 0.0001 | 18.44 | 78360 | 0.0157 |
| 0.0674 | 18.44 | 78370 | 0.0158 |
| 0.0081 | 18.44 | 78380 | 0.0158 |
| 0.0 | 18.44 | 78390 | 0.0158 |
| 0.0001 | 18.45 | 78400 | 0.0158 |
| 0.3541 | 18.45 | 78410 | 0.0159 |
| 0.0 | 18.45 | 78420 | 0.0159 |
| 0.1889 | 18.45 | 78430 | 0.0158 |
| 0.0 | 18.46 | 78440 | 0.0158 |
| 0.1841 | 18.46 | 78450 | 0.0157 |
| 0.1677 | 18.46 | 78460 | 0.0156 |
| 0.0001 | 18.46 | 78470 | 0.0156 |
| 0.0001 | 18.47 | 78480 | 0.0155 |
| 0.0445 | 18.47 | 78490 | 0.0156 |
| 0.0001 | 18.47 | 78500 | 0.0156 |
| 0.0 | 18.47 | 78510 | 0.0157 |
| 0.0001 | 18.48 | 78520 | 0.0157 |
| 0.0001 | 18.48 | 78530 | 0.0157 |
| 0.0001 | 18.48 | 78540 | 0.0157 |
| 0.0001 | 18.48 | 78550 | 0.0157 |
| 0.0108 | 18.48 | 78560 | 0.0157 |
| 0.0132 | 18.49 | 78570 | 0.0157 |
| 0.286 | 18.49 | 78580 | 0.0157 |
| 0.1043 | 18.49 | 78590 | 0.0156 |
| 0.0002 | 18.49 | 78600 | 0.0156 |
| 0.0001 | 18.5 | 78610 | 0.0156 |
| 0.0 | 18.5 | 78620 | 0.0156 |
| 0.0 | 18.5 | 78630 | 0.0156 |
| 0.0 | 18.5 | 78640 | 0.0156 |
| 0.0 | 18.51 | 78650 | 0.0156 |
| 0.0004 | 18.51 | 78660 | 0.0157 |
| 0.0 | 18.51 | 78670 | 0.0158 |
| 0.0 | 18.51 | 78680 | 0.0158 |
| 0.0 | 18.52 | 78690 | 0.0158 |
| 0.0126 | 18.52 | 78700 | 0.0158 |
| 0.0008 | 18.52 | 78710 | 0.0159 |
| 0.0 | 18.52 | 78720 | 0.0159 |
| 0.0 | 18.52 | 78730 | 0.0159 |
| 0.0001 | 18.53 | 78740 | 0.0159 |
| 0.0 | 18.53 | 78750 | 0.0160 |
| 0.002 | 18.53 | 78760 | 0.0160 |
| 0.0148 | 18.53 | 78770 | 0.0161 |
| 0.0001 | 18.54 | 78780 | 0.0161 |
| 0.2806 | 18.54 | 78790 | 0.0161 |
| 0.0001 | 18.54 | 78800 | 0.0160 |
| 0.0003 | 18.54 | 78810 | 0.0160 |
| 0.0127 | 18.55 | 78820 | 0.0161 |
| 0.0016 | 18.55 | 78830 | 0.0161 |
| 0.0 | 18.55 | 78840 | 0.0161 |
| 0.0001 | 18.55 | 78850 | 0.0161 |
| 0.0199 | 18.56 | 78860 | 0.0161 |
| 0.0 | 18.56 | 78870 | 0.0162 |
| 0.0 | 18.56 | 78880 | 0.0162 |
| 0.0001 | 18.56 | 78890 | 0.0162 |
| 0.0001 | 18.56 | 78900 | 0.0163 |
| 0.1343 | 18.57 | 78910 | 0.0163 |
| 0.0001 | 18.57 | 78920 | 0.0162 |
| 0.0009 | 18.57 | 78930 | 0.0162 |
| 0.2884 | 18.57 | 78940 | 0.0162 |
| 0.0 | 18.58 | 78950 | 0.0162 |
| 0.0001 | 18.58 | 78960 | 0.0162 |
| 0.0 | 18.58 | 78970 | 0.0162 |
| 0.0001 | 18.58 | 78980 | 0.0162 |
| 0.0 | 18.59 | 78990 | 0.0162 |
| 0.0003 | 18.59 | 79000 | 0.0162 |
| 0.3563 | 18.59 | 79010 | 0.0162 |
| 0.309 | 18.59 | 79020 | 0.0161 |
| 0.2838 | 18.6 | 79030 | 0.0159 |
| 0.007 | 18.6 | 79040 | 0.0159 |
| 0.0001 | 18.6 | 79050 | 0.0158 |
| 0.0008 | 18.6 | 79060 | 0.0158 |
| 0.0 | 18.6 | 79070 | 0.0158 |
| 0.0193 | 18.61 | 79080 | 0.0157 |
| 0.0001 | 18.61 | 79090 | 0.0157 |
| 0.0104 | 18.61 | 79100 | 0.0157 |
| 0.298 | 18.61 | 79110 | 0.0157 |
| 0.0 | 18.62 | 79120 | 0.0157 |
| 0.0001 | 18.62 | 79130 | 0.0157 |
| 0.0716 | 18.62 | 79140 | 0.0157 |
| 0.0001 | 18.62 | 79150 | 0.0157 |
| 0.0001 | 18.63 | 79160 | 0.0157 |
| 0.0001 | 18.63 | 79170 | 0.0157 |
| 0.1928 | 18.63 | 79180 | 0.0157 |
| 0.0001 | 18.63 | 79190 | 0.0157 |
| 0.0 | 18.64 | 79200 | 0.0157 |
| 0.0001 | 18.64 | 79210 | 0.0157 |
| 0.0001 | 18.64 | 79220 | 0.0157 |
| 0.0 | 18.64 | 79230 | 0.0157 |
| 0.0001 | 18.64 | 79240 | 0.0158 |
| 0.0001 | 18.65 | 79250 | 0.0158 |
| 0.3031 | 18.65 | 79260 | 0.0158 |
| 0.0001 | 18.65 | 79270 | 0.0157 |
| 0.4242 | 18.65 | 79280 | 0.0156 |
| 0.0 | 18.66 | 79290 | 0.0156 |
| 0.0411 | 18.66 | 79300 | 0.0155 |
| 0.0017 | 18.66 | 79310 | 0.0156 |
| 0.0001 | 18.66 | 79320 | 0.0157 |
| 0.0 | 18.67 | 79330 | 0.0157 |
| 0.0 | 18.67 | 79340 | 0.0157 |
| 0.0001 | 18.67 | 79350 | 0.0157 |
| 0.0001 | 18.67 | 79360 | 0.0157 |
| 0.0 | 18.68 | 79370 | 0.0157 |
| 0.0001 | 18.68 | 79380 | 0.0158 |
| 0.0001 | 18.68 | 79390 | 0.0158 |
| 0.0113 | 18.68 | 79400 | 0.0158 |
| 0.0 | 18.68 | 79410 | 0.0158 |
| 0.0108 | 18.69 | 79420 | 0.0158 |
| 0.4969 | 18.69 | 79430 | 0.0158 |
| 0.0082 | 18.69 | 79440 | 0.0157 |
| 0.0001 | 18.69 | 79450 | 0.0157 |
| 0.0001 | 18.7 | 79460 | 0.0156 |
| 0.0 | 18.7 | 79470 | 0.0156 |
| 0.0 | 18.7 | 79480 | 0.0156 |
| 0.0005 | 18.7 | 79490 | 0.0156 |
| 0.0006 | 18.71 | 79500 | 0.0156 |
| 0.0004 | 18.71 | 79510 | 0.0156 |
| 0.0283 | 18.71 | 79520 | 0.0156 |
| 0.0 | 18.71 | 79530 | 0.0156 |
| 0.0 | 18.72 | 79540 | 0.0155 |
| 0.0 | 18.72 | 79550 | 0.0155 |
| 0.0001 | 18.72 | 79560 | 0.0156 |
| 0.0003 | 18.72 | 79570 | 0.0156 |
| 0.0001 | 18.72 | 79580 | 0.0157 |
| 0.0007 | 18.73 | 79590 | 0.0157 |
| 0.3058 | 18.73 | 79600 | 0.0157 |
| 0.0 | 18.73 | 79610 | 0.0158 |
| 0.0016 | 18.73 | 79620 | 0.0158 |
| 0.0 | 18.74 | 79630 | 0.0159 |
| 0.0431 | 18.74 | 79640 | 0.0159 |
| 0.0 | 18.74 | 79650 | 0.0159 |
| 0.2433 | 18.74 | 79660 | 0.0159 |
| 0.0012 | 18.75 | 79670 | 0.0159 |
| 0.0784 | 18.75 | 79680 | 0.0160 |
| 0.0001 | 18.75 | 79690 | 0.0160 |
| 0.0 | 18.75 | 79700 | 0.0160 |
| 0.0 | 18.76 | 79710 | 0.0160 |
| 0.0001 | 18.76 | 79720 | 0.0160 |
| 0.0 | 18.76 | 79730 | 0.0160 |
| 0.0 | 18.76 | 79740 | 0.0160 |
| 0.1117 | 18.76 | 79750 | 0.0160 |
| 0.0 | 18.77 | 79760 | 0.0160 |
| 0.0001 | 18.77 | 79770 | 0.0160 |
| 0.0001 | 18.77 | 79780 | 0.0160 |
| 0.0001 | 18.77 | 79790 | 0.0160 |
| 0.0 | 18.78 | 79800 | 0.0160 |
| 0.0193 | 18.78 | 79810 | 0.0160 |
| 0.2026 | 18.78 | 79820 | 0.0160 |
| 0.0024 | 18.78 | 79830 | 0.0159 |
| 0.0 | 18.79 | 79840 | 0.0159 |
| 0.0001 | 18.79 | 79850 | 0.0159 |
| 0.0001 | 18.79 | 79860 | 0.0159 |
| 0.0 | 18.79 | 79870 | 0.0159 |
| 0.0006 | 18.8 | 79880 | 0.0159 |
| 0.0001 | 18.8 | 79890 | 0.0159 |
| 0.0001 | 18.8 | 79900 | 0.0159 |
| 0.0006 | 18.8 | 79910 | 0.0160 |
| 0.0 | 18.8 | 79920 | 0.0160 |
| 0.0155 | 18.81 | 79930 | 0.0161 |
| 0.0093 | 18.81 | 79940 | 0.0161 |
| 0.0 | 18.81 | 79950 | 0.0161 |
| 0.0971 | 18.81 | 79960 | 0.0161 |
| 0.0 | 18.82 | 79970 | 0.0160 |
| 0.3628 | 18.82 | 79980 | 0.0160 |
| 0.0001 | 18.82 | 79990 | 0.0160 |
| 0.0 | 18.82 | 80000 | 0.0160 |
| 0.0001 | 18.83 | 80010 | 0.0160 |
| 0.0001 | 18.83 | 80020 | 0.0160 |
| 0.042 | 18.83 | 80030 | 0.0160 |
| 0.0 | 18.83 | 80040 | 0.0160 |
| 0.0 | 18.84 | 80050 | 0.0160 |
| 0.0 | 18.84 | 80060 | 0.0160 |
| 0.2442 | 18.84 | 80070 | 0.0160 |
| 0.0001 | 18.84 | 80080 | 0.0159 |
| 0.0 | 18.84 | 80090 | 0.0159 |
| 0.0004 | 18.85 | 80100 | 0.0159 |
| 0.0 | 18.85 | 80110 | 0.0159 |
| 0.284 | 18.85 | 80120 | 0.0159 |
| 0.0 | 18.85 | 80130 | 0.0158 |
| 0.0 | 18.86 | 80140 | 0.0158 |
| 0.0004 | 18.86 | 80150 | 0.0159 |
| 0.0002 | 18.86 | 80160 | 0.0160 |
| 0.0001 | 18.86 | 80170 | 0.0160 |
| 0.0001 | 18.87 | 80180 | 0.0161 |
| 0.0 | 18.87 | 80190 | 0.0161 |
| 0.2286 | 18.87 | 80200 | 0.0160 |
| 0.0 | 18.87 | 80210 | 0.0160 |
| 0.0577 | 18.88 | 80220 | 0.0160 |
| 0.0001 | 18.88 | 80230 | 0.0160 |
| 0.0001 | 18.88 | 80240 | 0.0161 |
| 0.0102 | 18.88 | 80250 | 0.0161 |
| 0.0 | 18.88 | 80260 | 0.0161 |
| 0.0 | 18.89 | 80270 | 0.0161 |
| 0.0 | 18.89 | 80280 | 0.0162 |
| 0.0001 | 18.89 | 80290 | 0.0162 |
| 0.0001 | 18.89 | 80300 | 0.0162 |
| 0.0 | 18.9 | 80310 | 0.0162 |
| 0.0001 | 18.9 | 80320 | 0.0162 |
| 0.0 | 18.9 | 80330 | 0.0162 |
| 0.0 | 18.9 | 80340 | 0.0162 |
| 0.2226 | 18.91 | 80350 | 0.0162 |
| 0.0002 | 18.91 | 80360 | 0.0161 |
| 0.0002 | 18.91 | 80370 | 0.0161 |
| 0.0478 | 18.91 | 80380 | 0.0162 |
| 0.0004 | 18.92 | 80390 | 0.0162 |
| 0.0 | 18.92 | 80400 | 0.0162 |
| 0.2993 | 18.92 | 80410 | 0.0162 |
| 0.1497 | 18.92 | 80420 | 0.0162 |
| 0.0001 | 18.92 | 80430 | 0.0162 |
| 0.0 | 18.93 | 80440 | 0.0162 |
| 0.0001 | 18.93 | 80450 | 0.0162 |
| 0.2078 | 18.93 | 80460 | 0.0162 |
| 0.0 | 18.93 | 80470 | 0.0162 |
| 0.0001 | 18.94 | 80480 | 0.0162 |
| 0.2904 | 18.94 | 80490 | 0.0162 |
| 0.0 | 18.94 | 80500 | 0.0162 |
| 0.0 | 18.94 | 80510 | 0.0162 |
| 0.0001 | 18.95 | 80520 | 0.0162 |
| 0.1025 | 18.95 | 80530 | 0.0162 |
| 0.2893 | 18.95 | 80540 | 0.0162 |
| 0.1132 | 18.95 | 80550 | 0.0161 |
| 0.252 | 18.96 | 80560 | 0.0161 |
| 0.0008 | 18.96 | 80570 | 0.0161 |
| 0.3479 | 18.96 | 80580 | 0.0160 |
| 0.0 | 18.96 | 80590 | 0.0160 |
| 0.0001 | 18.96 | 80600 | 0.0160 |
| 0.0046 | 18.97 | 80610 | 0.0160 |
| 0.0 | 18.97 | 80620 | 0.0160 |
| 0.2956 | 18.97 | 80630 | 0.0159 |
| 0.0487 | 18.97 | 80640 | 0.0159 |
| 0.0 | 18.98 | 80650 | 0.0159 |
| 0.2289 | 18.98 | 80660 | 0.0159 |
| 0.0007 | 18.98 | 80670 | 0.0159 |
| 0.0056 | 18.98 | 80680 | 0.0159 |
| 0.0 | 18.99 | 80690 | 0.0159 |
| 0.1881 | 18.99 | 80700 | 0.0159 |
| 0.0006 | 18.99 | 80710 | 0.0159 |
| 0.054 | 18.99 | 80720 | 0.0159 |
| 0.0001 | 19.0 | 80730 | 0.0158 |
| 0.0 | 19.0 | 80740 | 0.0158 |
| 0.0001 | 19.0 | 80750 | 0.0159 |
| 0.0075 | 19.0 | 80760 | 0.0159 |
| 0.0001 | 19.0 | 80770 | 0.0160 |
| 0.0 | 19.01 | 80780 | 0.0160 |
| 0.0001 | 19.01 | 80790 | 0.0160 |
| 0.0228 | 19.01 | 80800 | 0.0160 |
| 0.0001 | 19.01 | 80810 | 0.0160 |
| 0.0 | 19.02 | 80820 | 0.0160 |
| 0.0 | 19.02 | 80830 | 0.0160 |
| 0.2431 | 19.02 | 80840 | 0.0160 |
| 0.0 | 19.02 | 80850 | 0.0160 |
| 0.3517 | 19.03 | 80860 | 0.0160 |
| 0.0 | 19.03 | 80870 | 0.0160 |
| 0.0001 | 19.03 | 80880 | 0.0160 |
| 0.0005 | 19.03 | 80890 | 0.0160 |
| 0.0174 | 19.04 | 80900 | 0.0160 |
| 0.1305 | 19.04 | 80910 | 0.0160 |
| 0.0 | 19.04 | 80920 | 0.0160 |
| 0.2742 | 19.04 | 80930 | 0.0160 |
| 0.0 | 19.04 | 80940 | 0.0160 |
| 0.0 | 19.05 | 80950 | 0.0159 |
| 0.0002 | 19.05 | 80960 | 0.0160 |
| 0.0 | 19.05 | 80970 | 0.0160 |
| 0.0001 | 19.05 | 80980 | 0.0160 |
| 0.0001 | 19.06 | 80990 | 0.0160 |
| 0.0001 | 19.06 | 81000 | 0.0160 |
| 0.0001 | 19.06 | 81010 | 0.0160 |
| 0.0018 | 19.06 | 81020 | 0.0160 |
| 0.0004 | 19.07 | 81030 | 0.0160 |
| 0.0643 | 19.07 | 81040 | 0.0161 |
| 0.0196 | 19.07 | 81050 | 0.0161 |
| 0.0 | 19.07 | 81060 | 0.0161 |
| 0.0 | 19.08 | 81070 | 0.0161 |
| 0.0001 | 19.08 | 81080 | 0.0161 |
| 0.0001 | 19.08 | 81090 | 0.0161 |
| 0.0021 | 19.08 | 81100 | 0.0161 |
| 0.0 | 19.08 | 81110 | 0.0161 |
| 0.0003 | 19.09 | 81120 | 0.0161 |
| 0.0037 | 19.09 | 81130 | 0.0162 |
| 0.0005 | 19.09 | 81140 | 0.0162 |
| 0.0029 | 19.09 | 81150 | 0.0162 |
| 0.0 | 19.1 | 81160 | 0.0162 |
| 0.0 | 19.1 | 81170 | 0.0162 |
| 0.0 | 19.1 | 81180 | 0.0162 |
| 0.0369 | 19.1 | 81190 | 0.0162 |
| 0.2726 | 19.11 | 81200 | 0.0162 |
| 0.0 | 19.11 | 81210 | 0.0163 |
| 0.071 | 19.11 | 81220 | 0.0163 |
| 0.0001 | 19.11 | 81230 | 0.0163 |
| 0.0001 | 19.12 | 81240 | 0.0163 |
| 0.0 | 19.12 | 81250 | 0.0163 |
| 0.012 | 19.12 | 81260 | 0.0163 |
| 0.0001 | 19.12 | 81270 | 0.0163 |
| 0.0 | 19.12 | 81280 | 0.0163 |
| 0.0002 | 19.13 | 81290 | 0.0163 |
| 0.0001 | 19.13 | 81300 | 0.0163 |
| 0.1815 | 19.13 | 81310 | 0.0163 |
| 0.0 | 19.13 | 81320 | 0.0163 |
| 0.0001 | 19.14 | 81330 | 0.0163 |
| 0.2349 | 19.14 | 81340 | 0.0163 |
| 0.0001 | 19.14 | 81350 | 0.0163 |
| 0.0017 | 19.14 | 81360 | 0.0163 |
| 0.0005 | 19.15 | 81370 | 0.0163 |
| 0.0396 | 19.15 | 81380 | 0.0163 |
| 0.0001 | 19.15 | 81390 | 0.0163 |
| 0.0002 | 19.15 | 81400 | 0.0164 |
| 0.2098 | 19.16 | 81410 | 0.0164 |
| 0.0 | 19.16 | 81420 | 0.0164 |
| 0.0 | 19.16 | 81430 | 0.0164 |
| 0.0 | 19.16 | 81440 | 0.0164 |
| 0.0001 | 19.16 | 81450 | 0.0164 |
| 0.1547 | 19.17 | 81460 | 0.0164 |
| 0.0001 | 19.17 | 81470 | 0.0163 |
| 0.0 | 19.17 | 81480 | 0.0163 |
| 0.1807 | 19.17 | 81490 | 0.0163 |
| 0.0332 | 19.18 | 81500 | 0.0163 |
| 0.0002 | 19.18 | 81510 | 0.0163 |
| 0.0001 | 19.18 | 81520 | 0.0163 |
| 0.0 | 19.18 | 81530 | 0.0163 |
| 0.0001 | 19.19 | 81540 | 0.0163 |
| 0.1462 | 19.19 | 81550 | 0.0163 |
| 0.0001 | 19.19 | 81560 | 0.0162 |
| 0.0 | 19.19 | 81570 | 0.0162 |
| 0.2769 | 19.2 | 81580 | 0.0162 |
| 0.322 | 19.2 | 81590 | 0.0161 |
| 0.0003 | 19.2 | 81600 | 0.0161 |
| 0.0 | 19.2 | 81610 | 0.0162 |
| 0.0 | 19.2 | 81620 | 0.0162 |
| 0.1991 | 19.21 | 81630 | 0.0162 |
| 0.1824 | 19.21 | 81640 | 0.0161 |
| 0.0 | 19.21 | 81650 | 0.0161 |
| 0.278 | 19.21 | 81660 | 0.0161 |
| 0.0002 | 19.22 | 81670 | 0.0161 |
| 0.0001 | 19.22 | 81680 | 0.0161 |
| 0.1099 | 19.22 | 81690 | 0.0161 |
| 0.0 | 19.22 | 81700 | 0.0161 |
| 0.0001 | 19.23 | 81710 | 0.0161 |
| 0.0001 | 19.23 | 81720 | 0.0161 |
| 0.0 | 19.23 | 81730 | 0.0161 |
| 0.2761 | 19.23 | 81740 | 0.0161 |
| 0.0 | 19.24 | 81750 | 0.0161 |
| 0.267 | 19.24 | 81760 | 0.0161 |
| 0.5601 | 19.24 | 81770 | 0.0161 |
| 0.0 | 19.24 | 81780 | 0.0161 |
| 0.0001 | 19.24 | 81790 | 0.0161 |
| 0.0089 | 19.25 | 81800 | 0.0161 |
| 0.0001 | 19.25 | 81810 | 0.0161 |
| 0.0004 | 19.25 | 81820 | 0.0161 |
| 0.2672 | 19.25 | 81830 | 0.0161 |
| 0.0 | 19.26 | 81840 | 0.0160 |
| 0.0028 | 19.26 | 81850 | 0.0160 |
| 0.0001 | 19.26 | 81860 | 0.0160 |
| 0.0372 | 19.26 | 81870 | 0.0160 |
| 0.0773 | 19.27 | 81880 | 0.0160 |
| 0.0 | 19.27 | 81890 | 0.0159 |
| 0.0182 | 19.27 | 81900 | 0.0160 |
| 0.1412 | 19.27 | 81910 | 0.0160 |
| 0.0031 | 19.28 | 81920 | 0.0160 |
| 0.0 | 19.28 | 81930 | 0.0160 |
| 0.0564 | 19.28 | 81940 | 0.0160 |
| 0.4822 | 19.28 | 81950 | 0.0160 |
| 0.0288 | 19.28 | 81960 | 0.0160 |
| 0.0 | 19.29 | 81970 | 0.0160 |
| 0.0001 | 19.29 | 81980 | 0.0160 |
| 0.0 | 19.29 | 81990 | 0.0160 |
| 0.0 | 19.29 | 82000 | 0.0160 |
| 0.0833 | 19.3 | 82010 | 0.0160 |
| 0.2071 | 19.3 | 82020 | 0.0159 |
| 0.0983 | 19.3 | 82030 | 0.0159 |
| 0.0009 | 19.3 | 82040 | 0.0159 |
| 0.0001 | 19.31 | 82050 | 0.0158 |
| 0.0 | 19.31 | 82060 | 0.0158 |
| 0.0001 | 19.31 | 82070 | 0.0158 |
| 0.0001 | 19.31 | 82080 | 0.0158 |
| 0.0 | 19.32 | 82090 | 0.0158 |
| 0.0007 | 19.32 | 82100 | 0.0159 |
| 0.1219 | 19.32 | 82110 | 0.0159 |
| 0.0 | 19.32 | 82120 | 0.0160 |
| 0.0001 | 19.32 | 82130 | 0.0160 |
| 0.0 | 19.33 | 82140 | 0.0160 |
| 0.0001 | 19.33 | 82150 | 0.0160 |
| 0.0001 | 19.33 | 82160 | 0.0160 |
| 0.0001 | 19.33 | 82170 | 0.0160 |
| 0.0001 | 19.34 | 82180 | 0.0160 |
| 0.0 | 19.34 | 82190 | 0.0160 |
| 0.0001 | 19.34 | 82200 | 0.0160 |
| 0.0001 | 19.34 | 82210 | 0.0160 |
| 0.1505 | 19.35 | 82220 | 0.0161 |
| 0.0001 | 19.35 | 82230 | 0.0161 |
| 0.0 | 19.35 | 82240 | 0.0161 |
| 0.0015 | 19.35 | 82250 | 0.0161 |
| 0.0 | 19.36 | 82260 | 0.0161 |
| 0.0001 | 19.36 | 82270 | 0.0162 |
| 0.3453 | 19.36 | 82280 | 0.0161 |
| 0.0002 | 19.36 | 82290 | 0.0161 |
| 0.0883 | 19.36 | 82300 | 0.0161 |
| 0.0002 | 19.37 | 82310 | 0.0161 |
| 0.0001 | 19.37 | 82320 | 0.0161 |
| 0.0 | 19.37 | 82330 | 0.0162 |
| 0.0 | 19.37 | 82340 | 0.0162 |
| 0.0001 | 19.38 | 82350 | 0.0162 |
| 0.0 | 19.38 | 82360 | 0.0162 |
| 0.0 | 19.38 | 82370 | 0.0162 |
| 0.0003 | 19.38 | 82380 | 0.0162 |
| 0.0001 | 19.39 | 82390 | 0.0162 |
| 0.0 | 19.39 | 82400 | 0.0162 |
| 0.0001 | 19.39 | 82410 | 0.0162 |
| 0.002 | 19.39 | 82420 | 0.0162 |
| 0.0 | 19.4 | 82430 | 0.0162 |
| 0.2602 | 19.4 | 82440 | 0.0162 |
| 0.0 | 19.4 | 82450 | 0.0162 |
| 0.0 | 19.4 | 82460 | 0.0162 |
| 0.0018 | 19.4 | 82470 | 0.0162 |
| 0.0001 | 19.41 | 82480 | 0.0162 |
| 0.0 | 19.41 | 82490 | 0.0162 |
| 0.2818 | 19.41 | 82500 | 0.0162 |
| 0.016 | 19.41 | 82510 | 0.0162 |
| 0.0 | 19.42 | 82520 | 0.0162 |
| 0.0001 | 19.42 | 82530 | 0.0162 |
| 0.0001 | 19.42 | 82540 | 0.0162 |
| 0.0001 | 19.42 | 82550 | 0.0162 |
| 0.0 | 19.43 | 82560 | 0.0162 |
| 0.0001 | 19.43 | 82570 | 0.0162 |
| 0.1272 | 19.43 | 82580 | 0.0162 |
| 0.0 | 19.43 | 82590 | 0.0162 |
| 0.0001 | 19.44 | 82600 | 0.0162 |
| 0.0163 | 19.44 | 82610 | 0.0162 |
| 0.0022 | 19.44 | 82620 | 0.0163 |
| 0.0795 | 19.44 | 82630 | 0.0163 |
| 0.0078 | 19.44 | 82640 | 0.0163 |
| 0.0 | 19.45 | 82650 | 0.0163 |
| 0.2398 | 19.45 | 82660 | 0.0163 |
| 0.0803 | 19.45 | 82670 | 0.0163 |
| 0.0 | 19.45 | 82680 | 0.0163 |
| 0.0001 | 19.46 | 82690 | 0.0163 |
| 0.0004 | 19.46 | 82700 | 0.0163 |
| 0.0285 | 19.46 | 82710 | 0.0163 |
| 0.0 | 19.46 | 82720 | 0.0163 |
| 0.0005 | 19.47 | 82730 | 0.0163 |
| 0.0002 | 19.47 | 82740 | 0.0163 |
| 0.0 | 19.47 | 82750 | 0.0163 |
| 0.0 | 19.47 | 82760 | 0.0163 |
| 0.0 | 19.48 | 82770 | 0.0163 |
| 0.0 | 19.48 | 82780 | 0.0163 |
| 0.0 | 19.48 | 82790 | 0.0163 |
| 0.0001 | 19.48 | 82800 | 0.0163 |
| 0.0006 | 19.48 | 82810 | 0.0163 |
| 0.0974 | 19.49 | 82820 | 0.0163 |
| 0.0002 | 19.49 | 82830 | 0.0164 |
| 0.0 | 19.49 | 82840 | 0.0164 |
| 0.1894 | 19.49 | 82850 | 0.0164 |
| 0.0001 | 19.5 | 82860 | 0.0163 |
| 0.0024 | 19.5 | 82870 | 0.0163 |
| 0.1037 | 19.5 | 82880 | 0.0164 |
| 0.0025 | 19.5 | 82890 | 0.0164 |
| 0.0001 | 19.51 | 82900 | 0.0164 |
| 0.2402 | 19.51 | 82910 | 0.0164 |
| 0.0 | 19.51 | 82920 | 0.0164 |
| 0.0142 | 19.51 | 82930 | 0.0164 |
| 0.0128 | 19.52 | 82940 | 0.0164 |
| 0.0 | 19.52 | 82950 | 0.0164 |
| 0.0001 | 19.52 | 82960 | 0.0164 |
| 0.0067 | 19.52 | 82970 | 0.0164 |
| 0.0 | 19.52 | 82980 | 0.0164 |
| 0.0576 | 19.53 | 82990 | 0.0164 |
| 0.0363 | 19.53 | 83000 | 0.0164 |
| 0.0001 | 19.53 | 83010 | 0.0164 |
| 0.0018 | 19.53 | 83020 | 0.0164 |
| 0.0 | 19.54 | 83030 | 0.0164 |
| 0.0001 | 19.54 | 83040 | 0.0164 |
| 0.0 | 19.54 | 83050 | 0.0164 |
| 0.0002 | 19.54 | 83060 | 0.0164 |
| 0.2919 | 19.55 | 83070 | 0.0164 |
| 0.0 | 19.55 | 83080 | 0.0164 |
| 0.2001 | 19.55 | 83090 | 0.0164 |
| 0.1108 | 19.55 | 83100 | 0.0164 |
| 0.0 | 19.56 | 83110 | 0.0164 |
| 0.1293 | 19.56 | 83120 | 0.0164 |
| 0.1255 | 19.56 | 83130 | 0.0164 |
| 0.0016 | 19.56 | 83140 | 0.0163 |
| 0.0 | 19.56 | 83150 | 0.0163 |
| 0.0 | 19.57 | 83160 | 0.0163 |
| 0.0002 | 19.57 | 83170 | 0.0163 |
| 0.0 | 19.57 | 83180 | 0.0164 |
| 0.0 | 19.57 | 83190 | 0.0164 |
| 0.0001 | 19.58 | 83200 | 0.0164 |
| 0.0 | 19.58 | 83210 | 0.0164 |
| 0.0 | 19.58 | 83220 | 0.0164 |
| 0.2286 | 19.58 | 83230 | 0.0164 |
| 0.0009 | 19.59 | 83240 | 0.0163 |
| 0.0 | 19.59 | 83250 | 0.0163 |
| 0.0 | 19.59 | 83260 | 0.0163 |
| 0.0 | 19.59 | 83270 | 0.0163 |
| 0.0 | 19.6 | 83280 | 0.0163 |
| 0.0147 | 19.6 | 83290 | 0.0163 |
| 0.0001 | 19.6 | 83300 | 0.0163 |
| 0.0085 | 19.6 | 83310 | 0.0163 |
| 0.0761 | 19.6 | 83320 | 0.0163 |
| 0.0 | 19.61 | 83330 | 0.0163 |
| 0.3758 | 19.61 | 83340 | 0.0163 |
| 0.0001 | 19.61 | 83350 | 0.0163 |
| 0.3384 | 19.61 | 83360 | 0.0163 |
| 0.0 | 19.62 | 83370 | 0.0163 |
| 0.0795 | 19.62 | 83380 | 0.0163 |
| 0.0 | 19.62 | 83390 | 0.0163 |
| 0.1505 | 19.62 | 83400 | 0.0163 |
| 0.0 | 19.63 | 83410 | 0.0164 |
| 0.0005 | 19.63 | 83420 | 0.0164 |
| 0.0 | 19.63 | 83430 | 0.0164 |
| 0.0 | 19.63 | 83440 | 0.0164 |
| 0.0001 | 19.64 | 83450 | 0.0164 |
| 0.0002 | 19.64 | 83460 | 0.0164 |
| 0.0 | 19.64 | 83470 | 0.0164 |
| 0.0 | 19.64 | 83480 | 0.0164 |
| 0.0 | 19.64 | 83490 | 0.0164 |
| 0.0001 | 19.65 | 83500 | 0.0164 |
| 0.0 | 19.65 | 83510 | 0.0164 |
| 0.2232 | 19.65 | 83520 | 0.0164 |
| 0.2736 | 19.65 | 83530 | 0.0164 |
| 0.0001 | 19.66 | 83540 | 0.0164 |
| 0.0 | 19.66 | 83550 | 0.0164 |
| 0.0001 | 19.66 | 83560 | 0.0164 |
| 0.0 | 19.66 | 83570 | 0.0164 |
| 0.0009 | 19.67 | 83580 | 0.0164 |
| 0.0 | 19.67 | 83590 | 0.0164 |
| 0.0027 | 19.67 | 83600 | 0.0164 |
| 0.0 | 19.67 | 83610 | 0.0164 |
| 0.0001 | 19.68 | 83620 | 0.0164 |
| 0.0001 | 19.68 | 83630 | 0.0164 |
| 0.0011 | 19.68 | 83640 | 0.0164 |
| 0.0267 | 19.68 | 83650 | 0.0165 |
| 0.0001 | 19.68 | 83660 | 0.0165 |
| 0.0002 | 19.69 | 83670 | 0.0165 |
| 0.0001 | 19.69 | 83680 | 0.0165 |
| 0.1943 | 19.69 | 83690 | 0.0165 |
| 0.0 | 19.69 | 83700 | 0.0165 |
| 0.0 | 19.7 | 83710 | 0.0165 |
| 0.0001 | 19.7 | 83720 | 0.0165 |
| 0.0 | 19.7 | 83730 | 0.0165 |
| 0.484 | 19.7 | 83740 | 0.0165 |
| 0.0001 | 19.71 | 83750 | 0.0164 |
| 0.0 | 19.71 | 83760 | 0.0164 |
| 0.0 | 19.71 | 83770 | 0.0164 |
| 0.1702 | 19.71 | 83780 | 0.0164 |
| 0.1155 | 19.72 | 83790 | 0.0164 |
| 0.0 | 19.72 | 83800 | 0.0164 |
| 0.113 | 19.72 | 83810 | 0.0164 |
| 0.0001 | 19.72 | 83820 | 0.0164 |
| 0.0016 | 19.72 | 83830 | 0.0164 |
| 0.0 | 19.73 | 83840 | 0.0164 |
| 0.0 | 19.73 | 83850 | 0.0164 |
| 0.0 | 19.73 | 83860 | 0.0164 |
| 0.1219 | 19.73 | 83870 | 0.0164 |
| 0.0002 | 19.74 | 83880 | 0.0164 |
| 0.0001 | 19.74 | 83890 | 0.0164 |
| 0.0 | 19.74 | 83900 | 0.0164 |
| 0.1329 | 19.74 | 83910 | 0.0164 |
| 0.0762 | 19.75 | 83920 | 0.0164 |
| 0.0001 | 19.75 | 83930 | 0.0164 |
| 0.0001 | 19.75 | 83940 | 0.0164 |
| 0.0002 | 19.75 | 83950 | 0.0164 |
| 0.0001 | 19.76 | 83960 | 0.0164 |
| 0.0 | 19.76 | 83970 | 0.0164 |
| 0.0 | 19.76 | 83980 | 0.0164 |
| 0.0633 | 19.76 | 83990 | 0.0165 |
| 0.0001 | 19.76 | 84000 | 0.0165 |
| 0.0 | 19.77 | 84010 | 0.0165 |
| 0.0 | 19.77 | 84020 | 0.0165 |
| 0.1817 | 19.77 | 84030 | 0.0165 |
| 0.0001 | 19.77 | 84040 | 0.0165 |
| 0.0 | 19.78 | 84050 | 0.0165 |
| 0.0 | 19.78 | 84060 | 0.0165 |
| 0.0015 | 19.78 | 84070 | 0.0165 |
| 0.3774 | 19.78 | 84080 | 0.0165 |
| 0.0 | 19.79 | 84090 | 0.0165 |
| 0.0001 | 19.79 | 84100 | 0.0165 |
| 0.0 | 19.79 | 84110 | 0.0165 |
| 0.0001 | 19.79 | 84120 | 0.0165 |
| 0.4395 | 19.8 | 84130 | 0.0164 |
| 0.0 | 19.8 | 84140 | 0.0164 |
| 0.0 | 19.8 | 84150 | 0.0164 |
| 0.0 | 19.8 | 84160 | 0.0164 |
| 0.3431 | 19.8 | 84170 | 0.0164 |
| 0.0479 | 19.81 | 84180 | 0.0164 |
| 0.0 | 19.81 | 84190 | 0.0164 |
| 0.1788 | 19.81 | 84200 | 0.0164 |
| 0.0002 | 19.81 | 84210 | 0.0164 |
| 0.0001 | 19.82 | 84220 | 0.0164 |
| 0.0013 | 19.82 | 84230 | 0.0164 |
| 0.0 | 19.82 | 84240 | 0.0165 |
| 0.0 | 19.82 | 84250 | 0.0165 |
| 0.0 | 19.83 | 84260 | 0.0165 |
| 0.0001 | 19.83 | 84270 | 0.0165 |
| 0.0001 | 19.83 | 84280 | 0.0165 |
| 0.3192 | 19.83 | 84290 | 0.0165 |
| 0.0001 | 19.84 | 84300 | 0.0165 |
| 0.0 | 19.84 | 84310 | 0.0165 |
| 0.0001 | 19.84 | 84320 | 0.0165 |
| 0.2201 | 19.84 | 84330 | 0.0164 |
| 0.0 | 19.84 | 84340 | 0.0164 |
| 0.0 | 19.85 | 84350 | 0.0164 |
| 0.2692 | 19.85 | 84360 | 0.0164 |
| 0.1767 | 19.85 | 84370 | 0.0164 |
| 0.0 | 19.85 | 84380 | 0.0164 |
| 0.0029 | 19.86 | 84390 | 0.0164 |
| 0.078 | 19.86 | 84400 | 0.0164 |
| 0.0 | 19.86 | 84410 | 0.0164 |
| 0.0 | 19.86 | 84420 | 0.0164 |
| 0.0 | 19.87 | 84430 | 0.0164 |
| 0.0036 | 19.87 | 84440 | 0.0164 |
| 0.3 | 19.87 | 84450 | 0.0164 |
| 0.0 | 19.87 | 84460 | 0.0164 |
| 0.11 | 19.88 | 84470 | 0.0164 |
| 0.0001 | 19.88 | 84480 | 0.0164 |
| 0.0001 | 19.88 | 84490 | 0.0164 |
| 0.0 | 19.88 | 84500 | 0.0164 |
| 0.0001 | 19.88 | 84510 | 0.0164 |
| 0.3587 | 19.89 | 84520 | 0.0164 |
| 0.0001 | 19.89 | 84530 | 0.0164 |
| 0.0392 | 19.89 | 84540 | 0.0164 |
| 0.2611 | 19.89 | 84550 | 0.0164 |
| 0.0017 | 19.9 | 84560 | 0.0164 |
| 0.0001 | 19.9 | 84570 | 0.0164 |
| 0.0 | 19.9 | 84580 | 0.0164 |
| 0.0 | 19.9 | 84590 | 0.0164 |
| 0.2921 | 19.91 | 84600 | 0.0164 |
| 0.0383 | 19.91 | 84610 | 0.0164 |
| 0.0 | 19.91 | 84620 | 0.0164 |
| 0.3177 | 19.91 | 84630 | 0.0164 |
| 0.0001 | 19.92 | 84640 | 0.0164 |
| 0.0013 | 19.92 | 84650 | 0.0164 |
| 0.0001 | 19.92 | 84660 | 0.0164 |
| 0.3556 | 19.92 | 84670 | 0.0164 |
| 0.3457 | 19.92 | 84680 | 0.0164 |
| 0.2998 | 19.93 | 84690 | 0.0164 |
| 0.0881 | 19.93 | 84700 | 0.0164 |
| 0.0001 | 19.93 | 84710 | 0.0164 |
| 0.0798 | 19.93 | 84720 | 0.0164 |
| 0.2417 | 19.94 | 84730 | 0.0164 |
| 0.0308 | 19.94 | 84740 | 0.0164 |
| 0.0 | 19.94 | 84750 | 0.0164 |
| 0.0 | 19.94 | 84760 | 0.0164 |
| 0.1669 | 19.95 | 84770 | 0.0164 |
| 0.26 | 19.95 | 84780 | 0.0164 |
| 0.0 | 19.95 | 84790 | 0.0164 |
| 0.0 | 19.95 | 84800 | 0.0164 |
| 0.0 | 19.96 | 84810 | 0.0164 |
| 0.0011 | 19.96 | 84820 | 0.0164 |
| 0.0001 | 19.96 | 84830 | 0.0164 |
| 0.0 | 19.96 | 84840 | 0.0164 |
| 0.5716 | 19.96 | 84850 | 0.0164 |
| 0.0001 | 19.97 | 84860 | 0.0164 |
| 0.235 | 19.97 | 84870 | 0.0164 |
| 0.0 | 19.97 | 84880 | 0.0164 |
| 0.001 | 19.97 | 84890 | 0.0164 |
| 0.0 | 19.98 | 84900 | 0.0164 |
| 0.0209 | 19.98 | 84910 | 0.0164 |
| 0.2252 | 19.98 | 84920 | 0.0164 |
| 0.0203 | 19.98 | 84930 | 0.0164 |
| 0.0001 | 19.99 | 84940 | 0.0164 |
| 0.0 | 19.99 | 84950 | 0.0164 |
| 0.0002 | 19.99 | 84960 | 0.0164 |
| 0.0001 | 19.99 | 84970 | 0.0164 |
| 0.3327 | 20.0 | 84980 | 0.0164 |
| 0.0001 | 20.0 | 84990 | 0.0164 |
| 0.195 | 20.0 | 85000 | 0.0164 |
### Framework versions
- Transformers 4.35.2
- Pytorch 2.1.0+cu121
- Datasets 2.14.7
- Tokenizers 0.15.0
|
OpenPipe/sample-lora-pii-redaction
|
OpenPipe
| 2023-12-23T00:36:30Z | 5 | 0 |
transformers
|
[
"transformers",
"safetensors",
"mistral",
"text-generation",
"generated_from_trainer",
"base_model:OpenPipe/mistral-ft-optimized-1218",
"base_model:quantized:OpenPipe/mistral-ft-optimized-1218",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"4-bit",
"bitsandbytes",
"region:us"
] |
text-generation
| 2023-12-23T00:36:14Z |
---
license: apache-2.0
base_model: OpenPipe/mistral-ft-optimized-1218
tags:
- generated_from_trainer
model-index:
- name: models/loras/OpenPipe/ft-development-0b0f52d6-bc53-4443-bbad-4a6103c95501-pii-7b-optimized
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
[<img src="https://raw.githubusercontent.com/OpenAccess-AI-Collective/axolotl/main/image/axolotl-badge-web.png" alt="Built with Axolotl" width="200" height="32"/>](https://github.com/OpenAccess-AI-Collective/axolotl)
# models/loras/OpenPipe/ft-development-0b0f52d6-bc53-4443-bbad-4a6103c95501-pii-7b-optimized
This model is a fine-tuned version of [OpenPipe/mistral-ft-optimized-1218](https://huggingface.co/OpenPipe/mistral-ft-optimized-1218) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0174
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 8
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 10
- num_epochs: 2
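The cosine schedule with a 10-step linear warmup listed above can be sketched in plain Python. This is an illustrative reimplementation, not the Trainer's internal code; `total_steps` stands in for the run's full step count:

```python
import math

def cosine_lr_with_warmup(step, total_steps, base_lr=2e-4, warmup_steps=10):
    """Linear warmup to base_lr, then cosine decay to zero."""
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    return base_lr * 0.5 * (1.0 + math.cos(math.pi * progress))

# Effective batch size = train_batch_size * gradient_accumulation_steps
# = 2 * 4 = 8, matching the total_train_batch_size above.
effective_batch = 2 * 4
```

The learning rate peaks at `base_lr` exactly when warmup ends and decays to zero by the final step.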
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 0.42 | 0.02 | 1 | 0.3989 |
| 0.0258 | 0.21 | 13 | 0.0304 |
| 0.025 | 0.43 | 26 | 0.0220 |
| 0.0146 | 0.64 | 39 | 0.0204 |
| 0.0208 | 0.85 | 52 | 0.0196 |
| 0.0136 | 1.07 | 65 | 0.0187 |
| 0.0148 | 1.28 | 78 | 0.0181 |
| 0.0178 | 1.49 | 91 | 0.0180 |
| 0.0204 | 1.7 | 104 | 0.0175 |
| 0.0128 | 1.92 | 117 | 0.0174 |
### Framework versions
- Transformers 4.34.1
- Pytorch 2.0.1+cu117
- Datasets 2.14.6
- Tokenizers 0.14.1
|
c-wang/drl-course-unit5-pyramid
|
c-wang
| 2023-12-23T00:35:59Z | 0 | 0 |
ml-agents
|
[
"ml-agents",
"tensorboard",
"onnx",
"Pyramids",
"deep-reinforcement-learning",
"reinforcement-learning",
"ML-Agents-Pyramids",
"region:us"
] |
reinforcement-learning
| 2023-12-23T00:35:55Z |
---
library_name: ml-agents
tags:
- Pyramids
- deep-reinforcement-learning
- reinforcement-learning
- ML-Agents-Pyramids
---
# **ppo** Agent playing **Pyramids**
This is a trained model of a **ppo** agent playing **Pyramids**
using the [Unity ML-Agents Library](https://github.com/Unity-Technologies/ml-agents).
## Usage (with ML-Agents)
The Documentation: https://unity-technologies.github.io/ml-agents/ML-Agents-Toolkit-Documentation/
We wrote a complete tutorial to help you train your first agent using ML-Agents and publish it to the Hub:
- A *short tutorial* where you teach Huggy the Dog 🐶 to fetch the stick and then play with him directly in your
browser: https://huggingface.co/learn/deep-rl-course/unitbonus1/introduction
- A *longer tutorial* to understand how ML-Agents works:
https://huggingface.co/learn/deep-rl-course/unit5/introduction
### Resume the training
```bash
mlagents-learn <your_configuration_file_path.yaml> --run-id=<run_id> --resume
```
### Watch your Agent play
You can watch your agent **playing directly in your browser**:
1. If the environment is part of the ML-Agents official environments, go to https://huggingface.co/unity
2. Find your model_id: c-wang/drl-course-unit5-pyramid
3. Select your *.nn /*.onnx file
4. Click on Watch the agent play 👀
|
bdsaglam/llama-2-7b-chat-hf-kg-cons-multi-peft-1703274332
|
bdsaglam
| 2023-12-23T00:15:04Z | 0 | 0 |
peft
|
[
"peft",
"region:us"
] | null | 2023-12-23T00:14:52Z |
---
library_name: peft
---
## Training procedure
The following `bitsandbytes` quantization config was used during training:
- load_in_8bit: False
- load_in_4bit: True
- llm_int8_threshold: 6.0
- llm_int8_skip_modules: None
- llm_int8_enable_fp32_cpu_offload: False
- llm_int8_has_fp16_weight: False
- bnb_4bit_quant_type: nf4
- bnb_4bit_use_double_quant: False
- bnb_4bit_compute_dtype: float16
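The same settings can be expressed with `transformers`' `BitsAndBytesConfig` — a sketch whose field names mirror the list above; verify them against your installed `transformers` version:

```python
import torch
from transformers import BitsAndBytesConfig

# Mirrors the bitsandbytes quantization settings listed above.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    load_in_8bit=False,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_use_double_quant=False,
    bnb_4bit_compute_dtype=torch.float16,
)
```

Pass this object as `quantization_config=bnb_config` to `AutoModelForCausalLM.from_pretrained` to reproduce the 4-bit NF4 setup.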
### Framework versions
- PEFT 0.4.0
|
Shepherd17/ppo-SpaceInvadersNoFrameskip-v4
|
Shepherd17
| 2023-12-23T00:11:03Z | 0 | 0 |
stable-baselines3
|
[
"stable-baselines3",
"LunarLander-v2",
"deep-reinforcement-learning",
"reinforcement-learning",
"model-index",
"region:us"
] |
reinforcement-learning
| 2023-12-23T00:10:17Z |
---
library_name: stable-baselines3
tags:
- LunarLander-v2
- deep-reinforcement-learning
- reinforcement-learning
- stable-baselines3
model-index:
- name: PPO
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: LunarLander-v2
type: LunarLander-v2
metrics:
- type: mean_reward
value: 294.25 +/- 16.95
name: mean_reward
verified: false
---
# **PPO** Agent playing **LunarLander-v2**
This is a trained model of a **PPO** agent playing **LunarLander-v2**
using the [stable-baselines3 library](https://github.com/DLR-RM/stable-baselines3).
## Usage (with Stable-baselines3)
The snippet below is a minimal loading sketch; the checkpoint filename is an assumption, so check this repo's file list on the Hub.
```python
from stable_baselines3 import PPO
from huggingface_sb3 import load_from_hub

# Filename is assumed; verify against this repo's file listing.
checkpoint = load_from_hub("Shepherd17/ppo-SpaceInvadersNoFrameskip-v4", "ppo-LunarLander-v2.zip")
model = PPO.load(checkpoint)
```
|
dev2k/my_awesome_model
|
dev2k
| 2023-12-23T00:08:03Z | 5 | 0 |
transformers
|
[
"transformers",
"tensorboard",
"safetensors",
"distilbert",
"text-classification",
"generated_from_trainer",
"base_model:distilbert/distilbert-base-uncased",
"base_model:finetune:distilbert/distilbert-base-uncased",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-classification
| 2023-12-23T00:07:43Z |
---
license: apache-2.0
base_model: distilbert-base-uncased
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: my_awesome_model
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# my_awesome_model
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1385
- Accuracy: 1.0
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2
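The Adam update with the betas and epsilon listed above can be sketched for a single scalar parameter — an illustrative reimplementation, not the optimizer's actual code:

```python
def adam_step(param, grad, m, v, t, lr=2e-05, b1=0.9, b2=0.999, eps=1e-08):
    """One bias-corrected Adam step for a scalar parameter (step count t starts at 1)."""
    m = b1 * m + (1 - b1) * grad          # first-moment (mean) estimate
    v = b2 * v + (1 - b2) * grad * grad   # second-moment (uncentered variance) estimate
    m_hat = m / (1 - b1 ** t)             # bias correction for zero-initialized moments
    v_hat = v / (1 - b2 ** t)
    param = param - lr * m_hat / (v_hat ** 0.5 + eps)
    return param, m, v
```

On the very first step the bias correction makes the update size approximately `lr`, regardless of the gradient's magnitude.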
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 7 | 0.2584 | 1.0 |
| No log | 2.0 | 14 | 0.1385 | 1.0 |
### Framework versions
- Transformers 4.35.2
- Pytorch 2.1.0+cu121
- Datasets 2.16.0
- Tokenizers 0.15.0
|
arya555/email_classification
|
arya555
| 2023-12-23T00:07:29Z | 6 | 0 |
transformers
|
[
"transformers",
"pytorch",
"safetensors",
"roberta",
"text-classification",
"generated_from_trainer",
"base_model:FacebookAI/roberta-base",
"base_model:finetune:FacebookAI/roberta-base",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-classification
| 2023-09-17T18:43:20Z |
---
license: mit
base_model: roberta-base
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: email_classification
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# email_classification
This model is a fine-tuned version of [roberta-base](https://huggingface.co/roberta-base) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5632
- Accuracy: 0.9038
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 6
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 1.2987 | 1.0 | 121 | 1.1291 | 0.6058 |
| 0.6602 | 2.0 | 242 | 0.8249 | 0.75 |
| 0.4545 | 3.0 | 363 | 0.4199 | 0.8942 |
| 0.2338 | 4.0 | 484 | 0.5669 | 0.9038 |
| 0.083 | 5.0 | 605 | 0.5672 | 0.9038 |
| 0.0057 | 6.0 | 726 | 0.5632 | 0.9038 |
### Framework versions
- Transformers 4.36.2
- Pytorch 2.0.0
- Datasets 2.16.0
- Tokenizers 0.15.0
|
Maxlin12/gpt2-finetuned-wikitext2
|
Maxlin12
| 2023-12-23T00:01:46Z | 3 | 0 |
transformers
|
[
"transformers",
"tf",
"gpt2",
"text-generation",
"generated_from_keras_callback",
"base_model:openai-community/gpt2",
"base_model:finetune:openai-community/gpt2",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-generation
| 2023-12-22T21:02:47Z |
---
license: mit
base_model: gpt2
tags:
- generated_from_keras_callback
model-index:
- name: Maxlin12/gpt2-finetuned-wikitext2
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# Maxlin12/gpt2-finetuned-wikitext2
This model is a fine-tuned version of [gpt2](https://huggingface.co/gpt2) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 6.4917
- Validation Loss: 6.3408
- Epoch: 1
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 2e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
### Training results
| Train Loss | Validation Loss | Epoch |
|:----------:|:---------------:|:-----:|
| 7.3091 | 6.7586 | 0 |
| 6.4917 | 6.3408 | 1 |
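For a causal language model, the validation loss in the table maps directly to perplexity via `exp(loss)` — a quick way to read these numbers:

```python
import math

val_loss = 6.3408                 # final validation loss from the table above
perplexity = math.exp(val_loss)   # perplexity = exp(cross-entropy loss)
print(f"perplexity ≈ {perplexity:.1f}")
```

A drop in validation loss from 6.7586 to 6.3408 over the two epochs therefore corresponds to a sizable reduction in perplexity.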
### Framework versions
- Transformers 4.35.2
- TensorFlow 2.15.0
- Datasets 2.16.0
- Tokenizers 0.15.0
|
hqbui/ppo-SnowballTarget
|
hqbui
| 2023-12-22T23:42:21Z | 0 | 0 |
ml-agents
|
[
"ml-agents",
"tensorboard",
"onnx",
"SnowballTarget",
"deep-reinforcement-learning",
"reinforcement-learning",
"ML-Agents-SnowballTarget",
"region:us"
] |
reinforcement-learning
| 2023-12-22T23:42:18Z |
---
library_name: ml-agents
tags:
- SnowballTarget
- deep-reinforcement-learning
- reinforcement-learning
- ML-Agents-SnowballTarget
---
# **ppo** Agent playing **SnowballTarget**
This is a trained model of a **ppo** agent playing **SnowballTarget**
using the [Unity ML-Agents Library](https://github.com/Unity-Technologies/ml-agents).
## Usage (with ML-Agents)
The Documentation: https://unity-technologies.github.io/ml-agents/ML-Agents-Toolkit-Documentation/
We wrote a complete tutorial to learn to train your first agent using ML-Agents and publish it to the Hub:
- A *short tutorial* where you teach Huggy the Dog 🐶 to fetch the stick and then play with him directly in your
browser: https://huggingface.co/learn/deep-rl-course/unitbonus1/introduction
- A *longer tutorial* to understand how ML-Agents works:
https://huggingface.co/learn/deep-rl-course/unit5/introduction
### Resume the training
```bash
mlagents-learn <your_configuration_file_path.yaml> --run-id=<run_id> --resume
```
### Watch your Agent play
You can watch your agent **playing directly in your browser**
1. If the environment is part of ML-Agents official environments, go to https://huggingface.co/unity
2. Find your model_id: hqbui/ppo-SnowballTarget
3. Select your *.nn /*.onnx file
4. Click on Watch the agent play 👀
|
Alena-Poluboyarinova/bert-base-cased-finetuned-wikitext2
|
Alena-Poluboyarinova
| 2023-12-22T23:37:53Z | 3 | 0 |
transformers
|
[
"transformers",
"tf",
"tensorboard",
"bert",
"fill-mask",
"generated_from_keras_callback",
"base_model:google-bert/bert-base-cased",
"base_model:finetune:google-bert/bert-base-cased",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
fill-mask
| 2023-12-22T23:15:51Z |
---
license: apache-2.0
base_model: bert-base-cased
tags:
- generated_from_keras_callback
model-index:
- name: Alena-Poluboyarinova/bert-base-cased-finetuned-wikitext2
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# Alena-Poluboyarinova/bert-base-cased-finetuned-wikitext2
This model is a fine-tuned version of [bert-base-cased](https://huggingface.co/bert-base-cased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 6.9592
- Validation Loss: 6.8829
- Epoch: 1
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 2e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
### Training results
| Train Loss | Validation Loss | Epoch |
|:----------:|:---------------:|:-----:|
| 7.4204 | 7.0579 | 0 |
| 6.9592 | 6.8829 | 1 |
### Framework versions
- Transformers 4.35.2
- TensorFlow 2.15.0
- Datasets 2.16.0
- Tokenizers 0.15.0
|
JLei/climate_fever_roberta-base-fact-checking
|
JLei
| 2023-12-22T23:25:27Z | 6 | 0 |
transformers
|
[
"transformers",
"safetensors",
"bart",
"text-classification",
"en",
"dataset:Jasontth/climate_fever_plus",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-classification
| 2023-12-22T05:32:35Z |
---
license: mit
datasets:
- Jasontth/climate_fever_plus
language:
- en
pipeline_tag: text-classification
---
|
JConnor/mixtral-cat2-finetune-1
|
JConnor
| 2023-12-22T23:14:49Z | 0 | 0 |
peft
|
[
"peft",
"safetensors",
"generated_from_trainer",
"base_model:mistralai/Mixtral-8x7B-Instruct-v0.1",
"base_model:adapter:mistralai/Mixtral-8x7B-Instruct-v0.1",
"license:apache-2.0",
"region:us"
] | null | 2023-12-22T23:13:54Z |
---
license: apache-2.0
library_name: peft
tags:
- generated_from_trainer
base_model: mistralai/Mixtral-8x7B-Instruct-v0.1
model-index:
- name: mixtral-cat2-finetune-1
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# mixtral-cat2-finetune-1
This model is a fine-tuned version of [mistralai/Mixtral-8x7B-Instruct-v0.1](https://huggingface.co/mistralai/Mixtral-8x7B-Instruct-v0.1) on an unknown dataset.
It achieves the following results on the evaluation set:
- eval_loss: 0.3493
- eval_runtime: 912.56
- eval_samples_per_second: 0.548
- eval_steps_per_second: 0.069
- epoch: 0.36
- step: 1950
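As a quick sanity check (an illustration, not part of the original card), the reported runtime and throughput are internally consistent with an evaluation set of roughly 500 samples:

```python
# Figures reported in the evaluation summary above
eval_runtime_s = 912.56
samples_per_second = 0.548

# runtime × throughput recovers the approximate evaluation set size
approx_eval_samples = eval_runtime_s * samples_per_second
print(round(approx_eval_samples))
```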
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2.5e-05
- train_batch_size: 2
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1
- training_steps: 10960
### Framework versions
- PEFT 0.7.2.dev0
- Transformers 4.37.0.dev0
- Pytorch 2.1.2+cu121
- Datasets 2.15.0
- Tokenizers 0.15.0
|
Azaliya-Vagizova/bert-base-cased-finetuned-wikitext2
|
Azaliya-Vagizova
| 2023-12-22T23:07:28Z | 3 | 0 |
transformers
|
[
"transformers",
"tf",
"tensorboard",
"bert",
"fill-mask",
"generated_from_keras_callback",
"base_model:google-bert/bert-base-cased",
"base_model:finetune:google-bert/bert-base-cased",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
fill-mask
| 2023-12-22T22:46:47Z |
---
license: apache-2.0
base_model: bert-base-cased
tags:
- generated_from_keras_callback
model-index:
- name: Azaliya-Vagizova/bert-base-cased-finetuned-wikitext2
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# Azaliya-Vagizova/bert-base-cased-finetuned-wikitext2
This model is a fine-tuned version of [bert-base-cased](https://huggingface.co/bert-base-cased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 6.9600
- Validation Loss: 6.8787
- Epoch: 1
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 2e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
### Training results
| Train Loss | Validation Loss | Epoch |
|:----------:|:---------------:|:-----:|
| 7.4268 | 7.0439 | 0 |
| 6.9600 | 6.8787 | 1 |
### Framework versions
- Transformers 4.35.2
- TensorFlow 2.15.0
- Datasets 2.16.0
- Tokenizers 0.15.0
|
winehertz/gpt2-finetuned-wikitext2
|
winehertz
| 2023-12-22T23:04:55Z | 5 | 0 |
transformers
|
[
"transformers",
"tf",
"gpt2",
"text-generation",
"generated_from_keras_callback",
"base_model:openai-community/gpt2",
"base_model:finetune:openai-community/gpt2",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-generation
| 2023-12-22T22:44:38Z |
---
license: mit
base_model: gpt2
tags:
- generated_from_keras_callback
model-index:
- name: winehertz/gpt2-finetuned-wikitext2
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# winehertz/gpt2-finetuned-wikitext2
This model is a fine-tuned version of [gpt2](https://huggingface.co/gpt2) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 6.4986
- Validation Loss: 6.3582
- Epoch: 1
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 2e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
### Training results
| Train Loss | Validation Loss | Epoch |
|:----------:|:---------------:|:-----:|
| 7.3193 | 6.7700 | 0 |
| 6.4986 | 6.3582 | 1 |
### Framework versions
- Transformers 4.35.2
- TensorFlow 2.15.0
- Datasets 2.16.0
- Tokenizers 0.15.0
|
andregn/Realistic_Vision_V6.0-inpainting
|
andregn
| 2023-12-22T22:53:33Z | 49 | 2 |
diffusers
|
[
"diffusers",
"safetensors",
"v8",
"license:creativeml-openrail-m",
"autotrain_compatible",
"endpoints_compatible",
"diffusers:StableDiffusionPipeline",
"region:us"
] |
text-to-image
| 2023-07-11T20:10:40Z |
---
license: creativeml-openrail-m
---
<b>The recommended negative prompt:</b><br>
(deformed iris, deformed pupils, semi-realistic, cgi, 3d, render, sketch, cartoon, drawing, anime:1.4), text, close up, cropped, out of frame, worst quality, low quality, jpeg artifacts, ugly, duplicate, morbid, mutilated, extra fingers, mutated hands, poorly drawn hands, poorly drawn face, mutation, deformed, blurry, dehydrated, bad anatomy, bad proportions, extra limbs, cloned face, disfigured, gross proportions, malformed limbs, missing arms, missing legs, extra arms, extra legs, fused fingers, too many fingers, long neck<br>
<b>OR</b><br>
(deformed iris, deformed pupils, semi-realistic, cgi, 3d, render, sketch, cartoon, drawing, anime, mutated hands and fingers:1.4), (deformed, distorted, disfigured:1.3), poorly drawn, bad anatomy, wrong anatomy, extra limb, missing limb, floating limbs, disconnected limbs, mutation, mutated, ugly, disgusting, amputation
<b>Recommended parameters for generation:</b><br>
Euler A or DPM++ SDE Karras<br>
CFG Scale 3.5 - 15<br>
Hires. fix with 4x-UltraSharp upscaler<br>
0 Hires steps and Denoising strength 0.25-0.7<br>
Upscale by 1.1-2.0
|
userdiawatch/gpt2-finetuned-wikitext2
|
userdiawatch
| 2023-12-22T22:46:11Z | 5 | 0 |
transformers
|
[
"transformers",
"tf",
"gpt2",
"text-generation",
"generated_from_keras_callback",
"base_model:openai-community/gpt2",
"base_model:finetune:openai-community/gpt2",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-generation
| 2023-12-22T19:15:18Z |
---
license: mit
base_model: gpt2
tags:
- generated_from_keras_callback
model-index:
- name: userdiawatch/gpt2-finetuned-wikitext2
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# userdiawatch/gpt2-finetuned-wikitext2
This model is a fine-tuned version of [gpt2](https://huggingface.co/gpt2) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 6.4967
- Validation Loss: 6.3487
- Epoch: 1
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 2e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
### Training results
| Train Loss | Validation Loss | Epoch |
|:----------:|:---------------:|:-----:|
| 7.3150 | 6.7667 | 0 |
| 6.4967 | 6.3487 | 1 |
### Framework versions
- Transformers 4.35.2
- TensorFlow 2.15.0
- Datasets 2.16.0
- Tokenizers 0.15.0
|
imagepipeline/Weight_Slider
|
imagepipeline
| 2023-12-22T22:22:54Z | 0 | 0 | null |
[
"imagepipeline",
"imagepipeline.io",
"text-to-image",
"ultra-realistic",
"license:creativeml-openrail-m",
"region:us"
] |
text-to-image
| 2023-12-22T22:22:52Z |
---
license: creativeml-openrail-m
tags:
- imagepipeline
- imagepipeline.io
- text-to-image
- ultra-realistic
pinned: false
pipeline_tag: text-to-image
---
## Weight_Slider
<img src="https://via.placeholder.com/468x300?text=App+Screenshot+Here" alt="Generated by Image Pipeline" style="border-radius: 10px;">
**This lora model is uploaded on [imagepipeline.io](https://imagepipeline.io/)**
Model details - Slider for Weight
[](https://imagepipeline.io/models/Weight_Slider?id=0b5ff6f2-ce18-48ab-a506-5597e98e3490/)
## How to try this model ?
You can try using it locally or send an API call to test the output quality.
Get your `API_KEY` from [imagepipeline.io](https://imagepipeline.io/). No payment required.
Coding in `php`, `javascript`, `node`, etc.? Check out our documentation
[](https://docs.imagepipeline.io/docs/introduction)
```python
import requests
import json
url = "https://imagepipeline.io/sd/text2image/v1/run"
payload = json.dumps({
"model_id": "sd1.5",
"prompt": "ultra realistic close up portrait ((beautiful pale cyberpunk female with heavy black eyeliner)), blue eyes, shaved side haircut, hyper detail, cinematic lighting, magic neon, dark red city, Canon EOS R3, nikon, f/1.4, ISO 200, 1/160s, 8K, RAW, unedited, symmetrical balance, in-frame, 8K",
"negative_prompt": "painting, extra fingers, mutated hands, poorly drawn hands, poorly drawn face, deformed, ugly, blurry, bad anatomy, bad proportions, extra limbs, cloned face, skinny, glitchy, double torso, extra arms, extra hands, mangled fingers, missing lips, ugly face, distorted face, extra legs, anime",
"width": "512",
"height": "512",
"samples": "1",
"num_inference_steps": "30",
"safety_checker": false,
"guidance_scale": 7.5,
"multi_lingual": "no",
"embeddings": "",
"lora_models": "0b5ff6f2-ce18-48ab-a506-5597e98e3490",
"lora_weights": "0.5"
})
headers = {
'Content-Type': 'application/json',
'API-Key': 'your_api_key'
}
response = requests.request("POST", url, headers=headers, data=payload)
print(response.text)
```
Get more ready to use `MODELS` like this for `SD 1.5` and `SDXL` :
[](https://imagepipeline.io/models)
### API Reference
#### Generate Image
```http
https://api.imagepipeline.io/sd/text2image/v1
```
| Headers | Type | Description |
|:----------------------| :------- |:-------------------------------------------------------------------------------------------------------------------|
| `API-Key` | `str` | Get your `API_KEY` from [imagepipeline.io](https://imagepipeline.io/) |
| `Content-Type` | `str` | application/json - content type of the request body |
| Parameter | Type | Description |
| :-------- | :------- | :------------------------- |
| `model_id` | `str` | Your base model, find available lists in [models page](https://imagepipeline.io/models) or upload your own|
| `prompt` | `str` | Text Prompt. Check our [Prompt Guide](https://docs.imagepipeline.io/docs/SD-1.5/docs/extras/prompt-guide) for tips |
| `num_inference_steps` | `int [1-50]` | Noise is removed with each step, resulting in a higher-quality image over time. Ideal value 30-50 (without LCM) |
| `guidance_scale` | `float [1-20]` | Higher guidance scale prioritizes text prompt relevance but sacrifices image quality. Ideal value 7.5-12.5 |
| `lora_models` | `str, array` | Pass the model_id(s) of LoRA models that can be found in models page |
| `lora_weights` | `str, array` | Strength of the LoRA effect |
### Feedback
If you have any feedback, please reach out to us at hello@imagepipeline.io
#### 🔗 Visit Website
[](https://imagepipeline.io/)
If you are the original author of this model, please [click here](https://airtable.com/apprTaRnJbDJ8ufOx/shr4g7o9B6fWfOlUR) to add credits
|
bartowski/CodeNinja-1.0-OpenChat-7B-exl2
|
bartowski
| 2023-12-22T22:18:50Z | 1 | 0 | null |
[
"code",
"text-generation-inference",
"text-generation",
"en",
"dataset:glaiveai/glaive-code-assistant-v2",
"dataset:TokenBender/code_instructions_122k_alpaca_style",
"license:mit",
"region:us"
] |
text-generation
| 2023-12-22T20:50:08Z |
---
license: mit
datasets:
- glaiveai/glaive-code-assistant-v2
- TokenBender/code_instructions_122k_alpaca_style
language:
- en
metrics:
- code_eval
pipeline_tag: text-generation
tags:
- code
- text-generation-inference
quantized_by: bartowski
---
## Exllama v2 Quantizations of CodeNinja-1.0-OpenChat-7B
Using <a href="https://github.com/turboderp/exllamav2/releases/tag/v0.0.11">turboderp's ExLlamaV2 v0.0.11</a> for quantization.
Each branch contains quantizations at an individual bits-per-weight level, with the `main` branch containing only the measurement.json for further conversions.
Conversion was done using the default calibration dataset.
Default arguments were used, except that when the bits per weight is above 6.0 the lm_head layer is quantized at 8 bits per weight instead of the default 6.
Original model: https://huggingface.co/beowolx/CodeNinja-1.0-OpenChat-7B
<a href="https://huggingface.co/bartowski/CodeNinja-1.0-OpenChat-7B-exl2/tree/4_0">4.0 bits per weight</a>
<a href="https://huggingface.co/bartowski/CodeNinja-1.0-OpenChat-7B-exl2/tree/5_0">5.0 bits per weight</a>
<a href="https://huggingface.co/bartowski/CodeNinja-1.0-OpenChat-7B-exl2/tree/6_0">6.0 bits per weight</a>
<a href="https://huggingface.co/bartowski/CodeNinja-1.0-OpenChat-7B-exl2/tree/8_0">8.0 bits per weight</a>
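As a rough rule of thumb (an approximation, not an official figure), the download size of a quantization scales with bits per weight: roughly parameters × bpw / 8 bytes, plus a small overhead for metadata and the higher-precision lm_head:

```python
def approx_size_gb(n_params: float, bits_per_weight: float) -> float:
    """Rough quantized file size estimate: parameters * bpw / 8 bytes, in GB."""
    return n_params * bits_per_weight / 8 / 1e9

# A 7B-parameter model at the available bit rates
for bpw in (4.0, 5.0, 6.0, 8.0):
    print(f"{bpw} bpw ≈ {approx_size_gb(7e9, bpw):.1f} GB")
```

This is useful for picking the largest branch that fits your VRAM budget; actual on-disk sizes will differ slightly.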
## Download instructions
With git:
```shell
git clone --single-branch --branch 4_0 https://huggingface.co/bartowski/CodeNinja-1.0-OpenChat-7B-exl2
```
With huggingface hub (credit to TheBloke for instructions):
```shell
pip3 install huggingface-hub
```
To download the `main` branch (only useful if you only care about measurement.json) to a folder called `CodeNinja-1.0-OpenChat-7B-exl2`:
```shell
mkdir CodeNinja-1.0-OpenChat-7B-exl2
huggingface-cli download bartowski/CodeNinja-1.0-OpenChat-7B-exl2 --local-dir CodeNinja-1.0-OpenChat-7B-exl2 --local-dir-use-symlinks False
```
To download from a different branch, add the `--revision` parameter:
```shell
mkdir CodeNinja-1.0-OpenChat-7B-exl2
huggingface-cli download bartowski/CodeNinja-1.0-OpenChat-7B-exl2 --revision 4_0 --local-dir CodeNinja-1.0-OpenChat-7B-exl2 --local-dir-use-symlinks False
```
|
imagepipeline/th3_pit
|
imagepipeline
| 2023-12-22T22:18:16Z | 0 | 0 | null |
[
"imagepipeline",
"imagepipeline.io",
"text-to-image",
"ultra-realistic",
"license:creativeml-openrail-m",
"region:us"
] |
text-to-image
| 2023-12-22T22:18:15Z |
---
license: creativeml-openrail-m
tags:
- imagepipeline
- imagepipeline.io
- text-to-image
- ultra-realistic
pinned: false
pipeline_tag: text-to-image
---
## th3_pit
<img src="https://via.placeholder.com/468x300?text=App+Screenshot+Here" alt="Generated by Image Pipeline" style="border-radius: 10px;">
**This lora model is uploaded on [imagepipeline.io](https://imagepipeline.io/)**
Model details - th3 pit
[](https://imagepipeline.io/models/th3_pit?id=b92f9a0c-165d-4c5e-9dd8-8501fc47273b/)
## How to try this model ?
You can try using it locally or send an API call to test the output quality.
Get your `API_KEY` from [imagepipeline.io](https://imagepipeline.io/). No payment required.
Coding in `php`, `javascript`, `node`, etc.? Check out our documentation
[](https://docs.imagepipeline.io/docs/introduction)
```python
import requests
import json
url = "https://imagepipeline.io/sd/text2image/v1/run"
payload = json.dumps({
"model_id": "sd1.5",
"prompt": "ultra realistic close up portrait ((beautiful pale cyberpunk female with heavy black eyeliner)), blue eyes, shaved side haircut, hyper detail, cinematic lighting, magic neon, dark red city, Canon EOS R3, nikon, f/1.4, ISO 200, 1/160s, 8K, RAW, unedited, symmetrical balance, in-frame, 8K",
"negative_prompt": "painting, extra fingers, mutated hands, poorly drawn hands, poorly drawn face, deformed, ugly, blurry, bad anatomy, bad proportions, extra limbs, cloned face, skinny, glitchy, double torso, extra arms, extra hands, mangled fingers, missing lips, ugly face, distorted face, extra legs, anime",
"width": "512",
"height": "512",
"samples": "1",
"num_inference_steps": "30",
"safety_checker": false,
"guidance_scale": 7.5,
"multi_lingual": "no",
"embeddings": "",
"lora_models": "b92f9a0c-165d-4c5e-9dd8-8501fc47273b",
"lora_weights": "0.5"
})
headers = {
'Content-Type': 'application/json',
'API-Key': 'your_api_key'
}
response = requests.request("POST", url, headers=headers, data=payload)
print(response.text)
```
Get more ready to use `MODELS` like this for `SD 1.5` and `SDXL` :
[](https://imagepipeline.io/models)
### API Reference
#### Generate Image
```http
https://api.imagepipeline.io/sd/text2image/v1
```
| Headers | Type | Description |
|:----------------------| :------- |:-------------------------------------------------------------------------------------------------------------------|
| `API-Key` | `str` | Get your `API_KEY` from [imagepipeline.io](https://imagepipeline.io/) |
| `Content-Type` | `str` | application/json - content type of the request body |
| Parameter | Type | Description |
| :-------- | :------- | :------------------------- |
| `model_id` | `str` | Your base model, find available lists in [models page](https://imagepipeline.io/models) or upload your own|
| `prompt` | `str` | Text Prompt. Check our [Prompt Guide](https://docs.imagepipeline.io/docs/SD-1.5/docs/extras/prompt-guide) for tips |
| `num_inference_steps` | `int [1-50]` | Noise is removed with each step, resulting in a higher-quality image over time. Ideal value 30-50 (without LCM) |
| `guidance_scale` | `float [1-20]` | Higher guidance scale prioritizes text prompt relevance but sacrifices image quality. Ideal value 7.5-12.5 |
| `lora_models` | `str, array` | Pass the model_id(s) of LoRA models that can be found in models page |
| `lora_weights` | `str, array` | Strength of the LoRA effect |
### Feedback
If you have any feedback, please reach out to us at hello@imagepipeline.io
#### 🔗 Visit Website
[](https://imagepipeline.io/)
If you are the original author of this model, please [click here](https://airtable.com/apprTaRnJbDJ8ufOx/shr4g7o9B6fWfOlUR) to add credits
|
imagepipeline/Ahip
|
imagepipeline
| 2023-12-22T22:16:27Z | 0 | 0 | null |
[
"imagepipeline",
"imagepipeline.io",
"text-to-image",
"ultra-realistic",
"license:creativeml-openrail-m",
"region:us"
] |
text-to-image
| 2023-12-22T22:16:25Z |
---
license: creativeml-openrail-m
tags:
- imagepipeline
- imagepipeline.io
- text-to-image
- ultra-realistic
pinned: false
pipeline_tag: text-to-image
---
## Ahip
<img src="https://via.placeholder.com/468x300?text=App+Screenshot+Here" alt="Generated by Image Pipeline" style="border-radius: 10px;">
**This lora model is uploaded on [imagepipeline.io](https://imagepipeline.io/)**
Model details - Hip improvement
[](https://imagepipeline.io/models/Ahip?id=d1a38f4d-eec4-4020-8430-5bfa57575ea4/)
## How to try this model ?
You can try using it locally or send an API call to test the output quality.
Get your `API_KEY` from [imagepipeline.io](https://imagepipeline.io/). No payment required.
Coding in `php`, `javascript`, `node`, etc.? Check out our documentation
[](https://docs.imagepipeline.io/docs/introduction)
```python
import requests
import json
url = "https://imagepipeline.io/sd/text2image/v1/run"
payload = json.dumps({
"model_id": "sd1.5",
"prompt": "ultra realistic close up portrait ((beautiful pale cyberpunk female with heavy black eyeliner)), blue eyes, shaved side haircut, hyper detail, cinematic lighting, magic neon, dark red city, Canon EOS R3, nikon, f/1.4, ISO 200, 1/160s, 8K, RAW, unedited, symmetrical balance, in-frame, 8K",
"negative_prompt": "painting, extra fingers, mutated hands, poorly drawn hands, poorly drawn face, deformed, ugly, blurry, bad anatomy, bad proportions, extra limbs, cloned face, skinny, glitchy, double torso, extra arms, extra hands, mangled fingers, missing lips, ugly face, distorted face, extra legs, anime",
"width": "512",
"height": "512",
"samples": "1",
"num_inference_steps": "30",
"safety_checker": false,
"guidance_scale": 7.5,
"multi_lingual": "no",
"embeddings": "",
"lora_models": "d1a38f4d-eec4-4020-8430-5bfa57575ea4",
"lora_weights": "0.5"
})
headers = {
'Content-Type': 'application/json',
'API-Key': 'your_api_key'
}
response = requests.request("POST", url, headers=headers, data=payload)
print(response.text)
```
Get more ready to use `MODELS` like this for `SD 1.5` and `SDXL` :
[](https://imagepipeline.io/models)
### API Reference
#### Generate Image
```http
https://api.imagepipeline.io/sd/text2image/v1
```
| Headers | Type | Description |
|:----------------------| :------- |:-------------------------------------------------------------------------------------------------------------------|
| `API-Key` | `str` | Get your `API_KEY` from [imagepipeline.io](https://imagepipeline.io/) |
| `Content-Type` | `str` | application/json - content type of the request body |
| Parameter | Type | Description |
| :-------- | :------- | :------------------------- |
| `model_id` | `str` | Your base model, find available lists in [models page](https://imagepipeline.io/models) or upload your own|
| `prompt` | `str` | Text Prompt. Check our [Prompt Guide](https://docs.imagepipeline.io/docs/SD-1.5/docs/extras/prompt-guide) for tips |
| `num_inference_steps` | `int [1-50]` | Noise is removed with each step, resulting in a higher-quality image over time. Ideal value 30-50 (without LCM) |
| `guidance_scale` | `float [1-20]` | Higher guidance scale prioritizes text prompt relevance but sacrifices image quality. Ideal value 7.5-12.5 |
| `lora_models` | `str, array` | Pass the model_id(s) of LoRA models that can be found in models page |
| `lora_weights` | `str, array` | Strength of the LoRA effect |
### Feedback
If you have any feedback, please reach out to us at hello@imagepipeline.io
#### 🔗 Visit Website
[](https://imagepipeline.io/)
If you are the original author of this model, please [click here](https://airtable.com/apprTaRnJbDJ8ufOx/shr4g7o9B6fWfOlUR) to add credits
|
imagepipeline/Fullbreasts
|
imagepipeline
| 2023-12-22T22:13:55Z | 0 | 1 | null |
[
"imagepipeline",
"imagepipeline.io",
"text-to-image",
"ultra-realistic",
"license:creativeml-openrail-m",
"region:us"
] |
text-to-image
| 2023-12-22T22:13:53Z |
---
license: creativeml-openrail-m
tags:
- imagepipeline
- imagepipeline.io
- text-to-image
- ultra-realistic
pinned: false
pipeline_tag: text-to-image
---
## Fullbreasts
<img src="https://via.placeholder.com/468x300?text=App+Screenshot+Here" alt="Generated by Image Pipeline" style="border-radius: 10px;">
**This lora model is uploaded on [imagepipeline.io](https://imagepipeline.io/)**
Model details - Perfect Full Round Breasts & Slim Waist
[](https://imagepipeline.io/models/Fullbreasts?id=69913ba3-c67c-4965-ae1b-6cab0e67168f/)
## How to try this model ?
You can try using it locally or send an API call to test the output quality.
Get your `API_KEY` from [imagepipeline.io](https://imagepipeline.io/). No payment required.
Coding in `php`, `javascript`, `node`, etc.? Check out our documentation
[](https://docs.imagepipeline.io/docs/introduction)
```python
import requests
import json
url = "https://imagepipeline.io/sd/text2image/v1/run"
payload = json.dumps({
"model_id": "sd1.5",
"prompt": "ultra realistic close up portrait ((beautiful pale cyberpunk female with heavy black eyeliner)), blue eyes, shaved side haircut, hyper detail, cinematic lighting, magic neon, dark red city, Canon EOS R3, nikon, f/1.4, ISO 200, 1/160s, 8K, RAW, unedited, symmetrical balance, in-frame, 8K",
"negative_prompt": "painting, extra fingers, mutated hands, poorly drawn hands, poorly drawn face, deformed, ugly, blurry, bad anatomy, bad proportions, extra limbs, cloned face, skinny, glitchy, double torso, extra arms, extra hands, mangled fingers, missing lips, ugly face, distorted face, extra legs, anime",
"width": "512",
"height": "512",
"samples": "1",
"num_inference_steps": "30",
"safety_checker": false,
"guidance_scale": 7.5,
"multi_lingual": "no",
"embeddings": "",
"lora_models": "69913ba3-c67c-4965-ae1b-6cab0e67168f",
"lora_weights": "0.5"
})
headers = {
'Content-Type': 'application/json',
'API-Key': 'your_api_key'
}
response = requests.request("POST", url, headers=headers, data=payload)
print(response.text)
```
Get more ready to use `MODELS` like this for `SD 1.5` and `SDXL` :
[](https://imagepipeline.io/models)
### API Reference
#### Generate Image
```http
https://api.imagepipeline.io/sd/text2image/v1
```
| Headers | Type | Description |
|:----------------------| :------- |:-------------------------------------------------------------------------------------------------------------------|
| `API-Key` | `str` | Get your `API_KEY` from [imagepipeline.io](https://imagepipeline.io/) |
| `Content-Type` | `str` | application/json - content type of the request body |
| Parameter | Type | Description |
| :-------- | :------- | :------------------------- |
| `model_id` | `str` | Your base model, find available lists in [models page](https://imagepipeline.io/models) or upload your own|
| `prompt` | `str` | Text Prompt. Check our [Prompt Guide](https://docs.imagepipeline.io/docs/SD-1.5/docs/extras/prompt-guide) for tips |
| `num_inference_steps` | `int [1-50]` | Noise is removed with each step, resulting in a higher-quality image over time. Ideal value 30-50 (without LCM) |
| `guidance_scale` | `float [1-20]` | Higher guidance scale prioritizes text prompt relevance but sacrifices image quality. Ideal value 7.5-12.5 |
| `lora_models` | `str, array` | Pass the model_id(s) of LoRA models that can be found in models page |
| `lora_weights` | `str, array` | Strength of the LoRA effect |
### Feedback
If you have any feedback, please reach out to us at hello@imagepipeline.io
#### 🔗 Visit Website
[](https://imagepipeline.io/)
If you are the original author of this model, please [click here](https://airtable.com/apprTaRnJbDJ8ufOx/shr4g7o9B6fWfOlUR) to add credits
|
s3nh/phi-2_dolly_instruction_polish
|
s3nh
| 2023-12-22T22:13:16Z | 0 | 2 |
peft
|
[
"peft",
"safetensors",
"phi-msft",
"generated_from_trainer",
"custom_code",
"base_model:microsoft/phi-2",
"base_model:adapter:microsoft/phi-2",
"license:other",
"region:us"
] | null | 2023-12-22T22:11:11Z |
---
license: other
library_name: peft
tags:
- generated_from_trainer
base_model: microsoft/phi-2
model-index:
- name: phi-2-sft-out
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
[<img src="https://raw.githubusercontent.com/OpenAccess-AI-Collective/axolotl/main/image/axolotl-badge-web.png" alt="Built with Axolotl" width="200" height="32"/>](https://github.com/OpenAccess-AI-Collective/axolotl)
# phi-2-sft-out
This model is a fine-tuned version of [microsoft/phi-2](https://huggingface.co/microsoft/phi-2) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 1.2813
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-06
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- optimizer: Adam with betas=(0.9,0.95) and epsilon=1e-05
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 10
- num_epochs: 4
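As a rough sketch (not the exact HF Trainer implementation), the warmup-then-cosine schedule above, with base learning rate 3e-06 and 10 warmup steps, behaves like this; the total step count of about 84,640 is an assumption inferred from the results table (4 epochs at roughly 21,160 steps each):

```python
import math

def cosine_schedule_lr(step, base_lr=3e-6, warmup_steps=10, total_steps=84640):
    # Linear warmup over the first `warmup_steps` steps
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    # Cosine decay from base_lr down to 0 over the remaining steps
    progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    return base_lr * 0.5 * (1.0 + math.cos(math.pi * progress))
```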
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:-----:|:---------------:|
| No log | 0.0 | 1 | 1.7973 |
| 1.9767 | 0.25 | 5290 | 1.4832 |
| 1.8474 | 0.5 | 10580 | 1.4356 |
| 1.8121 | 0.75 | 15870 | 1.4022 |
| 1.8333 | 1.0 | 21160 | 1.3678 |
| 1.6601 | 1.25 | 26450 | 1.3508 |
| 1.5452 | 1.5 | 31740 | 1.3357 |
| 1.7381 | 1.75 | 37030 | 1.3191 |
| 1.6256 | 2.0 | 42320 | 1.3090 |
| 1.5521 | 2.25 | 47610 | 1.2961 |
| 1.8318 | 2.5 | 52900 | 1.2910 |
| 1.6761 | 2.75 | 58190 | 1.2901 |
| 1.6312 | 3.0 | 63480 | 1.2879 |
| 1.7003 | 3.25 | 68770 | 1.2820 |
| 1.6915 | 3.5 | 74060 | 1.2814 |
| 1.5757 | 3.75 | 79350 | 1.2813 |
### Framework versions
- Transformers 4.37.0.dev0
- Pytorch 2.1.2+cu121
- Datasets 2.16.0
- Tokenizers 0.15.0
## Training procedure
The following `bitsandbytes` quantization config was used during training:
- quant_method: bitsandbytes
- load_in_8bit: False
- load_in_4bit: True
- llm_int8_threshold: 6.0
- llm_int8_skip_modules: None
- llm_int8_enable_fp32_cpu_offload: False
- llm_int8_has_fp16_weight: False
- bnb_4bit_quant_type: nf4
- bnb_4bit_use_double_quant: True
- bnb_4bit_compute_dtype: bfloat16
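These flags correspond to a QLoRA-style 4-bit NF4 setup. A minimal sketch of reproducing this config (assuming `transformers` and `bitsandbytes` are installed; this is not the exact training script):

```python
import torch
from transformers import BitsAndBytesConfig

# Mirrors the quantization flags listed above
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_use_double_quant=True,
    bnb_4bit_compute_dtype=torch.bfloat16,
)
# Pass as `quantization_config=bnb_config` to AutoModelForCausalLM.from_pretrained
```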
### Framework versions
- PEFT 0.6.0
|
Aryan-401/distilhubert-finetuned-gtzan
|
Aryan-401
| 2023-12-22T21:59:49Z | 5 | 0 |
transformers
|
[
"transformers",
"tensorboard",
"safetensors",
"hubert",
"audio-classification",
"generated_from_trainer",
"dataset:marsyas/gtzan",
"base_model:ntu-spml/distilhubert",
"base_model:finetune:ntu-spml/distilhubert",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] |
audio-classification
| 2023-12-22T12:20:23Z |
---
license: apache-2.0
base_model: ntu-spml/distilhubert
tags:
- generated_from_trainer
datasets:
- marsyas/gtzan
metrics:
- accuracy
model-index:
- name: distilhubert-finetuned-gtzan
results:
- task:
name: Audio Classification
type: audio-classification
dataset:
name: GTZAN
type: marsyas/gtzan
config: all
split: train
args: all
metrics:
- name: Accuracy
type: accuracy
value: 0.85
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilhubert-finetuned-gtzan
This model is a fine-tuned version of [ntu-spml/distilhubert](https://huggingface.co/ntu-spml/distilhubert) on the GTZAN dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7207
- Accuracy: 0.85
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 15
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 2.0917 | 1.0 | 113 | 1.9773 | 0.48 |
| 1.3655 | 2.0 | 226 | 1.3419 | 0.62 |
| 0.9901 | 3.0 | 339 | 0.9640 | 0.75 |
| 0.8565 | 4.0 | 452 | 0.7732 | 0.81 |
| 0.6259 | 5.0 | 565 | 0.7502 | 0.8 |
| 0.4507 | 6.0 | 678 | 0.6888 | 0.81 |
| 0.4018 | 7.0 | 791 | 0.7404 | 0.8 |
| 0.1275 | 8.0 | 904 | 0.6718 | 0.83 |
| 0.1077 | 9.0 | 1017 | 0.6175 | 0.86 |
| 0.028 | 10.0 | 1130 | 0.6317 | 0.86 |
| 0.0867 | 11.0 | 1243 | 0.6053 | 0.88 |
| 0.0149 | 12.0 | 1356 | 0.7164 | 0.85 |
| 0.0108 | 13.0 | 1469 | 0.7224 | 0.85 |
| 0.0101 | 14.0 | 1582 | 0.7101 | 0.84 |
| 0.0096 | 15.0 | 1695 | 0.7207 | 0.85 |
### Framework versions
- Transformers 4.35.2
- Pytorch 2.1.0+cu121
- Datasets 2.16.0
- Tokenizers 0.15.0
|
SimplCup/JotaroKujo
|
SimplCup
| 2023-12-22T21:56:23Z | 0 | 0 | null |
[
"license:cc-by-nc-nd-4.0",
"region:us"
] | null | 2023-12-22T21:56:03Z |
---
license: cc-by-nc-nd-4.0
---
|
alperiox/weapons-lora
|
alperiox
| 2023-12-22T21:45:06Z | 3 | 0 |
diffusers
|
[
"diffusers",
"stable-diffusion",
"stable-diffusion-diffusers",
"text-to-image",
"lora",
"base_model:runwayml/stable-diffusion-v1-5",
"base_model:adapter:runwayml/stable-diffusion-v1-5",
"license:creativeml-openrail-m",
"region:us"
] |
text-to-image
| 2023-12-16T13:03:30Z |
---
license: creativeml-openrail-m
base_model: runwayml/stable-diffusion-v1-5
tags:
- stable-diffusion
- stable-diffusion-diffusers
- text-to-image
- diffusers
- lora
inference: true
---
# LoRA text2image fine-tuning - alperiox/weapons-lora
These are LoRA adaptation weights for runwayml/stable-diffusion-v1-5, fine-tuned on the alperiox/cctv_captions dataset. Some example images are shown below.


|
Dan1212121212/MLM
|
Dan1212121212
| 2023-12-22T21:24:46Z | 5 | 0 |
transformers
|
[
"transformers",
"safetensors",
"roberta",
"fill-mask",
"generated_from_trainer",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
fill-mask
| 2023-12-22T21:23:42Z |
---
tags:
- generated_from_trainer
model-index:
- name: MLM
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# MLM
This model is a fine-tuned version of [](https://huggingface.co/) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 6.4494
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 64
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 200
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 6.0372 | 10.87 | 500 | 6.3028 |
| 6.0257 | 21.74 | 1000 | 6.2374 |
| 6.0084 | 32.61 | 1500 | 6.2937 |
| 5.9784 | 43.48 | 2000 | 6.1836 |
| 5.9738 | 54.35 | 2500 | 6.5184 |
| 5.9795 | 65.22 | 3000 | 6.2282 |
| 5.9726 | 76.09 | 3500 | 6.3921 |
| 5.9744 | 86.96 | 4000 | 6.2515 |
| 5.9694 | 97.83 | 4500 | 6.4494 |
### Framework versions
- Transformers 4.35.2
- Pytorch 2.1.0+cu121
- Datasets 2.16.0
- Tokenizers 0.15.0
|
Aryan-401/ast-finetuned-audioset-10-10-0.4593-finetuned-gtzan
|
Aryan-401
| 2023-12-22T21:12:25Z | 5 | 0 |
transformers
|
[
"transformers",
"tensorboard",
"safetensors",
"audio-spectrogram-transformer",
"audio-classification",
"generated_from_trainer",
"dataset:marsyas/gtzan",
"base_model:MIT/ast-finetuned-audioset-10-10-0.4593",
"base_model:finetune:MIT/ast-finetuned-audioset-10-10-0.4593",
"license:bsd-3-clause",
"model-index",
"endpoints_compatible",
"region:us"
] |
audio-classification
| 2023-12-22T19:54:42Z |
---
license: bsd-3-clause
base_model: MIT/ast-finetuned-audioset-10-10-0.4593
tags:
- generated_from_trainer
datasets:
- marsyas/gtzan
metrics:
- accuracy
model-index:
- name: ast-finetuned-audioset-10-10-0.4593-finetuned-gtzan
results:
- task:
name: Audio Classification
type: audio-classification
dataset:
name: GTZAN
type: marsyas/gtzan
config: all
split: train
args: all
metrics:
- name: Accuracy
type: accuracy
value: 0.88
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# ast-finetuned-audioset-10-10-0.4593-finetuned-gtzan
This model is a fine-tuned version of [MIT/ast-finetuned-audioset-10-10-0.4593](https://huggingface.co/MIT/ast-finetuned-audioset-10-10-0.4593) on the GTZAN dataset.
It achieves the following results on the evaluation set:
- Loss: 1.1549
- Accuracy: 0.88
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 15
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 3.3456 | 1.0 | 225 | 1.2066 | 0.74 |
| 0.8545 | 2.0 | 450 | 0.9730 | 0.77 |
| 0.2187 | 3.0 | 675 | 1.0979 | 0.74 |
| 0.0901 | 4.0 | 900 | 1.5440 | 0.83 |
| 0.139 | 5.0 | 1125 | 1.1210 | 0.89 |
| 0.2476 | 6.0 | 1350 | 1.1494 | 0.87 |
| 0.0 | 7.0 | 1575 | 1.3533 | 0.86 |
| 0.0002 | 8.0 | 1800 | 1.6525 | 0.87 |
| 0.0 | 9.0 | 2025 | 1.0403 | 0.88 |
| 0.0 | 10.0 | 2250 | 1.6173 | 0.85 |
| 0.0 | 11.0 | 2475 | 1.2292 | 0.88 |
| 0.0 | 12.0 | 2700 | 1.2066 | 0.88 |
| 0.0 | 13.0 | 2925 | 1.1427 | 0.88 |
| 0.0 | 14.0 | 3150 | 1.1401 | 0.88 |
| 0.0 | 15.0 | 3375 | 1.1549 | 0.88 |
### Framework versions
- Transformers 4.36.2
- Pytorch 2.0.0
- Datasets 2.1.0
- Tokenizers 0.15.0
|
ksushausenko/bert-base-cased-finetuned-wikitext2
|
ksushausenko
| 2023-12-22T21:08:27Z | 4 | 0 |
transformers
|
[
"transformers",
"tf",
"tensorboard",
"bert",
"fill-mask",
"generated_from_keras_callback",
"base_model:google-bert/bert-base-cased",
"base_model:finetune:google-bert/bert-base-cased",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
fill-mask
| 2023-12-22T20:47:10Z |
---
license: apache-2.0
base_model: bert-base-cased
tags:
- generated_from_keras_callback
model-index:
- name: ksushausenko/bert-base-cased-finetuned-wikitext2
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# ksushausenko/bert-base-cased-finetuned-wikitext2
This model is a fine-tuned version of [bert-base-cased](https://huggingface.co/bert-base-cased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 6.9617
- Validation Loss: 6.8848
- Epoch: 1
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 2e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
### Training results
| Train Loss | Validation Loss | Epoch |
|:----------:|:---------------:|:-----:|
| 7.4354 | 7.0595 | 0 |
| 6.9617 | 6.8848 | 1 |
### Framework versions
- Transformers 4.35.2
- TensorFlow 2.15.0
- Datasets 2.16.0
- Tokenizers 0.15.0
|
toddwilson147/pixelcopter-reinforce
|
toddwilson147
| 2023-12-22T21:07:40Z | 0 | 0 | null |
[
"Pixelcopter-PLE-v0",
"reinforce",
"reinforcement-learning",
"custom-implementation",
"deep-rl-class",
"model-index",
"region:us"
] |
reinforcement-learning
| 2023-12-22T21:07:11Z |
---
tags:
- Pixelcopter-PLE-v0
- reinforce
- reinforcement-learning
- custom-implementation
- deep-rl-class
model-index:
- name: pixelcopter-reinforce
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: Pixelcopter-PLE-v0
type: Pixelcopter-PLE-v0
metrics:
- type: mean_reward
value: 56.60 +/- 41.93
name: mean_reward
verified: false
---
# **Reinforce** Agent playing **Pixelcopter-PLE-v0**
This is a trained model of a **Reinforce** agent playing **Pixelcopter-PLE-v0** .
To learn how to use this model and train your own, check Unit 4 of the Deep Reinforcement Learning Course: https://huggingface.co/deep-rl-course/unit4/introduction
|
ThuyNT03/KLTN_COQE_viT5_total_ASOPL
|
ThuyNT03
| 2023-12-22T21:05:37Z | 4 | 0 |
transformers
|
[
"transformers",
"tensorboard",
"safetensors",
"t5",
"text2text-generation",
"generated_from_trainer",
"base_model:VietAI/vit5-large",
"base_model:finetune:VietAI/vit5-large",
"license:mit",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text2text-generation
| 2023-12-22T19:55:31Z |
---
license: mit
base_model: VietAI/vit5-large
tags:
- generated_from_trainer
model-index:
- name: KLTN_COQE_viT5_total_ASOPL
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# KLTN_COQE_viT5_total_ASOPL
This model is a fine-tuned version of [VietAI/vit5-large](https://huggingface.co/VietAI/vit5-large) on the None dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 30
- mixed_precision_training: Native AMP
### Training results
### Framework versions
- Transformers 4.35.2
- Pytorch 2.0.0
- Datasets 2.1.0
- Tokenizers 0.15.0
|
evenicole/zephyr-7b-enem-nlp
|
evenicole
| 2023-12-22T21:04:32Z | 4 | 0 |
peft
|
[
"peft",
"safetensors",
"generated_from_trainer",
"base_model:HuggingFaceH4/zephyr-7b-beta",
"base_model:adapter:HuggingFaceH4/zephyr-7b-beta",
"license:mit",
"region:us"
] | null | 2023-12-22T21:04:19Z |
---
license: mit
library_name: peft
tags:
- generated_from_trainer
base_model: HuggingFaceH4/zephyr-7b-beta
model-index:
- name: zephyr-7b-enem-nlp
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# zephyr-7b-enem-nlp
This model is a fine-tuned version of [HuggingFaceH4/zephyr-7b-beta](https://huggingface.co/HuggingFaceH4/zephyr-7b-beta) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 1.4433
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 4e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 100
- num_epochs: 3
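The total train batch size above follows directly from gradient accumulation; a trivial sketch of the relationship:

```python
def effective_batch_size(per_device_batch, accumulation_steps, num_devices=1):
    # total_train_batch_size = train_batch_size * gradient_accumulation_steps * devices
    return per_device_batch * accumulation_steps * num_devices

# Matches the hyperparameters listed above: 16 * 2 = 32
```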
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 1.6904 | 2.13 | 100 | 1.4433 |
### Framework versions
- PEFT 0.7.1.dev0
- Transformers 4.36.0.dev0
- Pytorch 2.0.1+cu117
- Datasets 2.14.7
- Tokenizers 0.15.0
|
LoneStriker/CodeNinja-1.0-OpenChat-7B-8.0bpw-h8-exl2
|
LoneStriker
| 2023-12-22T21:00:55Z | 8 | 1 |
transformers
|
[
"transformers",
"safetensors",
"mistral",
"text-generation",
"code",
"text-generation-inference",
"conversational",
"en",
"dataset:glaiveai/glaive-code-assistant-v2",
"dataset:TokenBender/code_instructions_122k_alpaca_style",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-generation
| 2023-12-22T20:57:49Z |
---
license: mit
datasets:
- glaiveai/glaive-code-assistant-v2
- TokenBender/code_instructions_122k_alpaca_style
language:
- en
metrics:
- code_eval
pipeline_tag: text-generation
tags:
- code
- text-generation-inference
---
<p align="center">
<img width="700px" alt="DeepSeek Coder" src="https://cdn-uploads.huggingface.co/production/uploads/64b566ab04fa6584c03b5247/5COagfF6EwrV4utZJ-ClI.png">
</p>
<hr>
# CodeNinja: Your Advanced Coding Assistant
## Overview
CodeNinja is an enhanced version of the renowned model [openchat/openchat-3.5-1210](https://huggingface.co/openchat/openchat-3.5-1210), created through supervised fine-tuning (SFT) on two expansive datasets encompassing over 400,000 coding instructions. Designed to be an indispensable tool for coders, CodeNinja aims to integrate seamlessly into your daily coding routine.
Discover the quantized versions at: [beowolx/CodeNinja-1.0-OpenChat-7B-GGUF](https://huggingface.co/beowolx/CodeNinja-1.0-OpenChat-7B-GGUF).
### Key Features
- **Expansive Training Database**: CodeNinja has been refined with datasets from [glaiveai/glaive-code-assistant-v2](https://huggingface.co/datasets/glaiveai/glaive-code-assistant-v2) and [TokenBender/code_instructions_122k_alpaca_style](https://huggingface.co/datasets/TokenBender/code_instructions_122k_alpaca_style), incorporating around 400,000 coding instructions across various languages including Python, C, C++, Rust, Java, JavaScript, and more.
- **Flexibility and Scalability**: Available in a 7B model size, CodeNinja is adaptable for local runtime environments.
- **Exceptional Performance**: Achieves top-tier results among publicly accessible coding models, particularly notable on benchmarks like HumanEval.
- **Advanced Code Completion**: With a substantial context window size of 8192, it supports comprehensive project-level code completion.
## Prompt Format
CodeNinja maintains the same prompt structure as OpenChat 3.5. Effective utilization requires adherence to this format:
```
GPT4 Correct User: Hello<|end_of_turn|>GPT4 Correct Assistant: Hi<|end_of_turn|>GPT4 Correct User: How are you today?<|end_of_turn|>GPT4 Correct Assistant:
```
🚨 Important: Ensure the use of `<|end_of_turn|>` as the end-of-generation token.
**Adhering to this format is crucial for optimal results.**
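For illustration, the prompt above can be assembled with a small helper (a sketch, not part of the OpenChat library; in practice the tokenizer's `apply_chat_template` builds this for you):

```python
def build_openchat_prompt(turns):
    # turns: list of (role, text) pairs, role is "user" or "assistant"
    parts = []
    for role, text in turns:
        tag = "GPT4 Correct User" if role == "user" else "GPT4 Correct Assistant"
        parts.append(f"{tag}: {text}<|end_of_turn|>")
    # Trailing assistant prefix cues the model to generate its reply
    parts.append("GPT4 Correct Assistant:")
    return "".join(parts)
```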
## Usage Instructions
### Using LM Studio
The simplest way to engage with CodeNinja is via the [quantized versions](https://huggingface.co/beowolx/CodeNinja-1.0-OpenChat-7B-GGUF) on [LM Studio](https://lmstudio.ai/). Ensure you select the "OpenChat" preset, which incorporates the necessary prompt format. The preset is also available in this [gist](https://gist.github.com/beowolx/b219466681c02ff67baf8f313a3ad817).
### Using the Transformers Library
```python
from transformers import AutoTokenizer, AutoModelForCausalLM
import torch

# Initialize the model
model_path = "beowolx/CodeNinja-1.0-OpenChat-7B"
model = AutoModelForCausalLM.from_pretrained(model_path, device_map="auto")

# Load the OpenChat tokenizer
tokenizer = AutoTokenizer.from_pretrained("openchat/openchat-3.5-1210", use_fast=True)

def generate_one_completion(prompt: str):
    # Build token IDs with the chat template; add_generation_prompt=True already
    # appends the "GPT4 Correct Assistant:" prefix, so no empty assistant
    # placeholder message is needed
    messages = [{"role": "user", "content": prompt}]
    input_ids = tokenizer.apply_chat_template(messages, add_generation_prompt=True)

    # Produce the completion
    generate_ids = model.generate(
        torch.tensor([input_ids]).to("cuda"),
        max_length=256,
        pad_token_id=tokenizer.pad_token_id,
        eos_token_id=tokenizer.eos_token_id,
    )

    # Decode and trim the completion
    completion = tokenizer.decode(generate_ids[0], skip_special_tokens=True)
    completion = completion.split("\n\n\n")[0].strip()
    return completion
```
## License
CodeNinja is licensed under the MIT License, with model usage subject to the Model License.
## Contact
For queries or support, please open an issue in the repository.
|
LoneStriker/CodeNinja-1.0-OpenChat-7B-6.0bpw-h6-exl2
|
LoneStriker
| 2023-12-22T20:53:37Z | 5 | 0 |
transformers
|
[
"transformers",
"safetensors",
"mistral",
"text-generation",
"code",
"text-generation-inference",
"conversational",
"en",
"dataset:glaiveai/glaive-code-assistant-v2",
"dataset:TokenBender/code_instructions_122k_alpaca_style",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-generation
| 2023-12-22T20:51:12Z |
---
license: mit
datasets:
- glaiveai/glaive-code-assistant-v2
- TokenBender/code_instructions_122k_alpaca_style
language:
- en
metrics:
- code_eval
pipeline_tag: text-generation
tags:
- code
- text-generation-inference
---
<p align="center">
<img width="700px" alt="DeepSeek Coder" src="https://cdn-uploads.huggingface.co/production/uploads/64b566ab04fa6584c03b5247/5COagfF6EwrV4utZJ-ClI.png">
</p>
<hr>
# CodeNinja: Your Advanced Coding Assistant
## Overview
CodeNinja is an enhanced version of the renowned model [openchat/openchat-3.5-1210](https://huggingface.co/openchat/openchat-3.5-1210), created through supervised fine-tuning (SFT) on two expansive datasets encompassing over 400,000 coding instructions. Designed to be an indispensable tool for coders, CodeNinja aims to integrate seamlessly into your daily coding routine.
Discover the quantized versions at: [beowolx/CodeNinja-1.0-OpenChat-7B-GGUF](https://huggingface.co/beowolx/CodeNinja-1.0-OpenChat-7B-GGUF).
### Key Features
- **Expansive Training Database**: CodeNinja has been refined with datasets from [glaiveai/glaive-code-assistant-v2](https://huggingface.co/datasets/glaiveai/glaive-code-assistant-v2) and [TokenBender/code_instructions_122k_alpaca_style](https://huggingface.co/datasets/TokenBender/code_instructions_122k_alpaca_style), incorporating around 400,000 coding instructions across various languages including Python, C, C++, Rust, Java, JavaScript, and more.
- **Flexibility and Scalability**: Available in a 7B model size, CodeNinja is adaptable for local runtime environments.
- **Exceptional Performance**: Achieves top-tier results among publicly accessible coding models, particularly notable on benchmarks like HumanEval.
- **Advanced Code Completion**: With a substantial context window size of 8192, it supports comprehensive project-level code completion.
## Prompt Format
CodeNinja maintains the same prompt structure as OpenChat 3.5. Effective utilization requires adherence to this format:
```
GPT4 Correct User: Hello<|end_of_turn|>GPT4 Correct Assistant: Hi<|end_of_turn|>GPT4 Correct User: How are you today?<|end_of_turn|>GPT4 Correct Assistant:
```
🚨 Important: Ensure the use of `<|end_of_turn|>` as the end-of-generation token.
**Adhering to this format is crucial for optimal results.**
## Usage Instructions
### Using LM Studio
The simplest way to engage with CodeNinja is via the [quantized versions](https://huggingface.co/beowolx/CodeNinja-1.0-OpenChat-7B-GGUF) on [LM Studio](https://lmstudio.ai/). Ensure you select the "OpenChat" preset, which incorporates the necessary prompt format. The preset is also available in this [gist](https://gist.github.com/beowolx/b219466681c02ff67baf8f313a3ad817).
### Using the Transformers Library
```python
from transformers import AutoTokenizer, AutoModelForCausalLM
import torch

# Initialize the model
model_path = "beowolx/CodeNinja-1.0-OpenChat-7B"
model = AutoModelForCausalLM.from_pretrained(model_path, device_map="auto")

# Load the OpenChat tokenizer
tokenizer = AutoTokenizer.from_pretrained("openchat/openchat-3.5-1210", use_fast=True)

def generate_one_completion(prompt: str):
    # Build token IDs with the chat template; add_generation_prompt=True already
    # appends the "GPT4 Correct Assistant:" prefix, so no empty assistant
    # placeholder message is needed
    messages = [{"role": "user", "content": prompt}]
    input_ids = tokenizer.apply_chat_template(messages, add_generation_prompt=True)

    # Produce the completion
    generate_ids = model.generate(
        torch.tensor([input_ids]).to("cuda"),
        max_length=256,
        pad_token_id=tokenizer.pad_token_id,
        eos_token_id=tokenizer.eos_token_id,
    )

    # Decode and trim the completion
    completion = tokenizer.decode(generate_ids[0], skip_special_tokens=True)
    completion = completion.split("\n\n\n")[0].strip()
    return completion
```
## License
CodeNinja is licensed under the MIT License, with model usage subject to the Model License.
## Contact
For queries or support, please open an issue in the repository.
|
NazmusAshrafi/large_dataset_stock_twitter_topic_Bert
|
NazmusAshrafi
| 2023-12-22T20:52:40Z | 7 | 0 |
transformers
|
[
"transformers",
"safetensors",
"distilbert",
"text-classification",
"generated_from_trainer",
"base_model:distilbert/distilbert-base-uncased",
"base_model:finetune:distilbert/distilbert-base-uncased",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-classification
| 2023-12-22T19:27:15Z |
---
license: apache-2.0
base_model: distilbert-base-uncased
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: large_dataset_stock_twitter_topic_Bert
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# large_dataset_stock_twitter_topic_Bert
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0389
- Accuracy: 0.9925
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.0791 | 1.0 | 4938 | 0.0652 | 0.9854 |
| 0.021 | 2.0 | 9876 | 0.0389 | 0.9925 |
### Framework versions
- Transformers 4.35.2
- Pytorch 2.1.0+cu121
- Datasets 2.16.0
- Tokenizers 0.15.0
|
LoneStriker/CodeNinja-1.0-OpenChat-7B-5.0bpw-h6-exl2
|
LoneStriker
| 2023-12-22T20:46:40Z | 6 | 0 |
transformers
|
[
"transformers",
"safetensors",
"mistral",
"text-generation",
"code",
"text-generation-inference",
"conversational",
"en",
"dataset:glaiveai/glaive-code-assistant-v2",
"dataset:TokenBender/code_instructions_122k_alpaca_style",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-generation
| 2023-12-22T20:44:38Z |
---
license: mit
datasets:
- glaiveai/glaive-code-assistant-v2
- TokenBender/code_instructions_122k_alpaca_style
language:
- en
metrics:
- code_eval
pipeline_tag: text-generation
tags:
- code
- text-generation-inference
---
<p align="center">
<img width="700px" alt="DeepSeek Coder" src="https://cdn-uploads.huggingface.co/production/uploads/64b566ab04fa6584c03b5247/5COagfF6EwrV4utZJ-ClI.png">
</p>
<hr>
# CodeNinja: Your Advanced Coding Assistant
## Overview
CodeNinja is an enhanced version of the renowned model [openchat/openchat-3.5-1210](https://huggingface.co/openchat/openchat-3.5-1210), created through supervised fine-tuning (SFT) on two expansive datasets encompassing over 400,000 coding instructions. Designed to be an indispensable tool for coders, CodeNinja aims to integrate seamlessly into your daily coding routine.
Discover the quantized versions at: [beowolx/CodeNinja-1.0-OpenChat-7B-GGUF](https://huggingface.co/beowolx/CodeNinja-1.0-OpenChat-7B-GGUF).
### Key Features
- **Expansive Training Database**: CodeNinja has been refined with datasets from [glaiveai/glaive-code-assistant-v2](https://huggingface.co/datasets/glaiveai/glaive-code-assistant-v2) and [TokenBender/code_instructions_122k_alpaca_style](https://huggingface.co/datasets/TokenBender/code_instructions_122k_alpaca_style), incorporating around 400,000 coding instructions across various languages including Python, C, C++, Rust, Java, JavaScript, and more.
- **Flexibility and Scalability**: Available in a 7B model size, CodeNinja is adaptable for local runtime environments.
- **Exceptional Performance**: Achieves top-tier results among publicly accessible coding models, particularly notable on benchmarks like HumanEval.
- **Advanced Code Completion**: With a substantial context window size of 8192, it supports comprehensive project-level code completion.
## Prompt Format
CodeNinja maintains the same prompt structure as OpenChat 3.5. Effective utilization requires adherence to this format:
```
GPT4 Correct User: Hello<|end_of_turn|>GPT4 Correct Assistant: Hi<|end_of_turn|>GPT4 Correct User: How are you today?<|end_of_turn|>GPT4 Correct Assistant:
```
🚨 Important: Ensure the use of `<|end_of_turn|>` as the end-of-generation token.
**Adhering to this format is crucial for optimal results.**
## Usage Instructions
### Using LM Studio
The simplest way to engage with CodeNinja is via the [quantized versions](https://huggingface.co/beowolx/CodeNinja-1.0-OpenChat-7B-GGUF) on [LM Studio](https://lmstudio.ai/). Ensure you select the "OpenChat" preset, which incorporates the necessary prompt format. The preset is also available in this [gist](https://gist.github.com/beowolx/b219466681c02ff67baf8f313a3ad817).
### Using the Transformers Library
```python
from transformers import AutoTokenizer, AutoModelForCausalLM
import torch

# Initialize the model
model_path = "beowolx/CodeNinja-1.0-OpenChat-7B"
model = AutoModelForCausalLM.from_pretrained(model_path, device_map="auto")

# Load the OpenChat tokenizer
tokenizer = AutoTokenizer.from_pretrained("openchat/openchat-3.5-1210", use_fast=True)

def generate_one_completion(prompt: str):
    # Build token IDs with the chat template; add_generation_prompt=True already
    # appends the "GPT4 Correct Assistant:" prefix, so no empty assistant
    # placeholder message is needed
    messages = [{"role": "user", "content": prompt}]
    input_ids = tokenizer.apply_chat_template(messages, add_generation_prompt=True)

    # Produce the completion
    generate_ids = model.generate(
        torch.tensor([input_ids]).to("cuda"),
        max_length=256,
        pad_token_id=tokenizer.pad_token_id,
        eos_token_id=tokenizer.eos_token_id,
    )

    # Decode and trim the completion
    completion = tokenizer.decode(generate_ids[0], skip_special_tokens=True)
    completion = completion.split("\n\n\n")[0].strip()
    return completion
```
## License
CodeNinja is licensed under the MIT License, with model usage subject to the Model License.
## Contact
For queries or support, please open an issue in the repository.
|
ksushausenko/gpt2-finetuned-wikitext2
|
ksushausenko
| 2023-12-22T20:42:06Z | 2 | 0 |
transformers
|
[
"transformers",
"tf",
"gpt2",
"text-generation",
"generated_from_keras_callback",
"base_model:openai-community/gpt2",
"base_model:finetune:openai-community/gpt2",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-generation
| 2023-12-22T20:20:53Z |
---
license: mit
base_model: gpt2
tags:
- generated_from_keras_callback
model-index:
- name: ksushausenko/gpt2-finetuned-wikitext2
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# ksushausenko/gpt2-finetuned-wikitext2
This model is a fine-tuned version of [gpt2](https://huggingface.co/gpt2) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 6.4929
- Validation Loss: 6.3476
- Epoch: 1
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 2e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
### Training results
| Train Loss | Validation Loss | Epoch |
|:----------:|:---------------:|:-----:|
| 7.3180 | 6.7637 | 0 |
| 6.4929 | 6.3476 | 1 |
### Framework versions
- Transformers 4.35.2
- TensorFlow 2.15.0
- Datasets 2.16.0
- Tokenizers 0.15.0
|
TheBloke/CodeNinja-1.0-OpenChat-7B-GPTQ
|
TheBloke
| 2023-12-22T20:38:47Z | 26 | 6 |
transformers
|
[
"transformers",
"safetensors",
"mistral",
"text-generation",
"code",
"text-generation-inference",
"conversational",
"en",
"dataset:glaiveai/glaive-code-assistant-v2",
"dataset:TokenBender/code_instructions_122k_alpaca_style",
"base_model:beowolx/CodeNinja-1.0-OpenChat-7B",
"base_model:quantized:beowolx/CodeNinja-1.0-OpenChat-7B",
"license:mit",
"autotrain_compatible",
"4-bit",
"gptq",
"region:us"
] |
text-generation
| 2023-12-22T20:10:18Z |
---
base_model: beowolx/CodeNinja-1.0-OpenChat-7B
datasets:
- glaiveai/glaive-code-assistant-v2
- TokenBender/code_instructions_122k_alpaca_style
inference: false
language:
- en
license: mit
metrics:
- code_eval
model_creator: beowulf
model_name: CodeNinja 1.0 Openchat 7B
model_type: mistral
pipeline_tag: text-generation
prompt_template: 'GPT4 Correct User: {prompt}<|end_of_turn|>GPT4 Correct Assistant:
'
quantized_by: TheBloke
tags:
- code
- text-generation-inference
---
<!-- markdownlint-disable MD041 -->
<!-- header start -->
<!-- 200823 -->
<div style="width: auto; margin-left: auto; margin-right: auto">
<img src="https://i.imgur.com/EBdldam.jpg" alt="TheBlokeAI" style="width: 100%; min-width: 400px; display: block; margin: auto;">
</div>
<div style="display: flex; justify-content: space-between; width: 100%;">
<div style="display: flex; flex-direction: column; align-items: flex-start;">
<p style="margin-top: 0.5em; margin-bottom: 0em;"><a href="https://discord.gg/theblokeai">Chat & support: TheBloke's Discord server</a></p>
</div>
<div style="display: flex; flex-direction: column; align-items: flex-end;">
<p style="margin-top: 0.5em; margin-bottom: 0em;"><a href="https://www.patreon.com/TheBlokeAI">Want to contribute? TheBloke's Patreon page</a></p>
</div>
</div>
<div style="text-align:center; margin-top: 0em; margin-bottom: 0em"><p style="margin-top: 0.25em; margin-bottom: 0em;">TheBloke's LLM work is generously supported by a grant from <a href="https://a16z.com">andreessen horowitz (a16z)</a></p></div>
<hr style="margin-top: 1.0em; margin-bottom: 1.0em;">
<!-- header end -->
# CodeNinja 1.0 Openchat 7B - GPTQ
- Model creator: [beowulf](https://huggingface.co/beowolx)
- Original model: [CodeNinja 1.0 Openchat 7B](https://huggingface.co/beowolx/CodeNinja-1.0-OpenChat-7B)
<!-- description start -->
# Description
This repo contains GPTQ model files for [beowulf's CodeNinja 1.0 Openchat 7B](https://huggingface.co/beowolx/CodeNinja-1.0-OpenChat-7B).
Multiple GPTQ parameter permutations are provided; see Provided Files below for details of the options provided, their parameters, and the software used to create them.
These files were quantised using hardware kindly provided by [Massed Compute](https://massedcompute.com/).
<!-- description end -->
<!-- repositories-available start -->
## Repositories available
* [AWQ model(s) for GPU inference.](https://huggingface.co/TheBloke/CodeNinja-1.0-OpenChat-7B-AWQ)
* [GPTQ models for GPU inference, with multiple quantisation parameter options.](https://huggingface.co/TheBloke/CodeNinja-1.0-OpenChat-7B-GPTQ)
* [2, 3, 4, 5, 6 and 8-bit GGUF models for CPU+GPU inference](https://huggingface.co/TheBloke/CodeNinja-1.0-OpenChat-7B-GGUF)
* [beowulf's original unquantised fp16 model in pytorch format, for GPU inference and for further conversions](https://huggingface.co/beowolx/CodeNinja-1.0-OpenChat-7B)
<!-- repositories-available end -->
<!-- prompt-template start -->
## Prompt template: OpenChat-Correct
```
GPT4 Correct User: {prompt}<|end_of_turn|>GPT4 Correct Assistant:
```
<!-- prompt-template end -->
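As an illustration, the template above can be filled programmatically. This is a minimal sketch; `build_prompt` is a hypothetical helper, not part of this repo or the OpenChat tooling:

```python
# Sketch: fill the OpenChat-Correct template for a single-turn prompt.
# `build_prompt` is an illustrative helper, not an official API.
END_OF_TURN = "<|end_of_turn|>"

def build_prompt(user_message: str) -> str:
    # The assistant cue is left open so the model continues from it.
    return f"GPT4 Correct User: {user_message}{END_OF_TURN}GPT4 Correct Assistant:"

prompt = build_prompt("Write a Python function that reverses a string")
print(prompt)
```

The resulting string can be passed directly as the `prompt` in the TGI and Python examples below in place of the hand-written `prompt_template`.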
<!-- README_GPTQ.md-compatible clients start -->
## Known compatible clients / servers
GPTQ models are currently supported on Linux (NVidia/AMD) and Windows (NVidia only). macOS users: please use GGUF models.
These GPTQ models are known to work in the following inference servers/webuis.
- [text-generation-webui](https://github.com/oobabooga/text-generation-webui)
- [KoboldAI United](https://github.com/henk717/koboldai)
- [LoLLMS Web UI](https://github.com/ParisNeo/lollms-webui)
- [Hugging Face Text Generation Inference (TGI)](https://github.com/huggingface/text-generation-inference)
This may not be a complete list; if you know of others, please let me know!
<!-- README_GPTQ.md-compatible clients end -->
<!-- README_GPTQ.md-provided-files start -->
## Provided files, and GPTQ parameters
Multiple quantisation parameters are provided, to allow you to choose the best one for your hardware and requirements.
Each separate quant is in a different branch. See below for instructions on fetching from different branches.
Most GPTQ files are made with AutoGPTQ. Mistral models are currently made with Transformers.
<details>
<summary>Explanation of GPTQ parameters</summary>
- Bits: The bit size of the quantised model.
- GS: GPTQ group size. Higher numbers use less VRAM, but have lower quantisation accuracy. "None" is the lowest possible value.
- Act Order: True or False. Also known as `desc_act`. True results in better quantisation accuracy. Some GPTQ clients have had issues with models that use Act Order plus Group Size, but this is generally resolved now.
- Damp %: A GPTQ parameter that affects how samples are processed for quantisation. 0.01 is default, but 0.1 results in slightly better accuracy.
- GPTQ dataset: The calibration dataset used during quantisation. Using a dataset more appropriate to the model's training can improve quantisation accuracy. Note that the GPTQ calibration dataset is not the same as the dataset used to train the model - please refer to the original model repo for details of the training dataset(s).
- Sequence Length: The length of the dataset sequences used for quantisation. Ideally this is the same as the model sequence length. For some very long sequence models (16+K), a lower sequence length may have to be used. Note that a lower sequence length does not limit the sequence length of the quantised model. It only impacts the quantisation accuracy on longer inference sequences.
- ExLlama Compatibility: Whether this file can be loaded with ExLlama, which currently only supports Llama and Mistral models in 4-bit.
</details>
| Branch | Bits | GS | Act Order | Damp % | GPTQ Dataset | Seq Len | Size | ExLlama | Desc |
| ------ | ---- | -- | --------- | ------ | ------------ | ------- | ---- | ------- | ---- |
| [main](https://huggingface.co/TheBloke/CodeNinja-1.0-OpenChat-7B-GPTQ/tree/main) | 4 | 128 | Yes | 0.1 | [Evol Instruct Code](https://huggingface.co/datasets/nickrosh/Evol-Instruct-Code-80k-v1/viewer/) | 4096 | 4.16 GB | Yes | 4-bit, with Act Order and group size 128g. Uses even less VRAM than 64g, but with slightly lower accuracy. |
| [gptq-4bit-32g-actorder_True](https://huggingface.co/TheBloke/CodeNinja-1.0-OpenChat-7B-GPTQ/tree/gptq-4bit-32g-actorder_True) | 4 | 32 | Yes | 0.1 | [Evol Instruct Code](https://huggingface.co/datasets/nickrosh/Evol-Instruct-Code-80k-v1/viewer/) | 4096 | 4.57 GB | Yes | 4-bit, with Act Order and group size 32g. Gives highest possible inference quality, with maximum VRAM usage. |
| [gptq-8bit--1g-actorder_True](https://huggingface.co/TheBloke/CodeNinja-1.0-OpenChat-7B-GPTQ/tree/gptq-8bit--1g-actorder_True) | 8 | None | Yes | 0.1 | [Evol Instruct Code](https://huggingface.co/datasets/nickrosh/Evol-Instruct-Code-80k-v1/viewer/) | 4096 | 7.52 GB | No | 8-bit, with Act Order. No group size, to lower VRAM requirements. |
| [gptq-8bit-128g-actorder_True](https://huggingface.co/TheBloke/CodeNinja-1.0-OpenChat-7B-GPTQ/tree/gptq-8bit-128g-actorder_True) | 8 | 128 | Yes | 0.1 | [Evol Instruct Code](https://huggingface.co/datasets/nickrosh/Evol-Instruct-Code-80k-v1/viewer/) | 4096 | 7.68 GB | No | 8-bit, with group size 128g for higher inference quality and with Act Order for even higher accuracy. |
| [gptq-8bit-32g-actorder_True](https://huggingface.co/TheBloke/CodeNinja-1.0-OpenChat-7B-GPTQ/tree/gptq-8bit-32g-actorder_True) | 8 | 32 | Yes | 0.1 | [Evol Instruct Code](https://huggingface.co/datasets/nickrosh/Evol-Instruct-Code-80k-v1/viewer/) | 4096 | 8.17 GB | No | 8-bit, with group size 32g and Act Order for maximum inference quality. |
| [gptq-4bit-64g-actorder_True](https://huggingface.co/TheBloke/CodeNinja-1.0-OpenChat-7B-GPTQ/tree/gptq-4bit-64g-actorder_True) | 4 | 64 | Yes | 0.1 | [Evol Instruct Code](https://huggingface.co/datasets/nickrosh/Evol-Instruct-Code-80k-v1/viewer/) | 4096 | 4.30 GB | Yes | 4-bit, with Act Order and group size 64g. Uses less VRAM than 32g, but with slightly lower accuracy. |
<!-- README_GPTQ.md-provided-files end -->
<!-- README_GPTQ.md-download-from-branches start -->
## How to download, including from branches
### In text-generation-webui
To download from the `main` branch, enter `TheBloke/CodeNinja-1.0-OpenChat-7B-GPTQ` in the "Download model" box.
To download from another branch, add `:branchname` to the end of the download name, eg `TheBloke/CodeNinja-1.0-OpenChat-7B-GPTQ:gptq-4bit-32g-actorder_True`
### From the command line
I recommend using the `huggingface-hub` Python library:
```shell
pip3 install huggingface-hub
```
To download the `main` branch to a folder called `CodeNinja-1.0-OpenChat-7B-GPTQ`:
```shell
mkdir CodeNinja-1.0-OpenChat-7B-GPTQ
huggingface-cli download TheBloke/CodeNinja-1.0-OpenChat-7B-GPTQ --local-dir CodeNinja-1.0-OpenChat-7B-GPTQ --local-dir-use-symlinks False
```
To download from a different branch, add the `--revision` parameter:
```shell
mkdir CodeNinja-1.0-OpenChat-7B-GPTQ
huggingface-cli download TheBloke/CodeNinja-1.0-OpenChat-7B-GPTQ --revision gptq-4bit-32g-actorder_True --local-dir CodeNinja-1.0-OpenChat-7B-GPTQ --local-dir-use-symlinks False
```
<details>
<summary>More advanced huggingface-cli download usage</summary>
If you remove the `--local-dir-use-symlinks False` parameter, the files will instead be stored in the central Hugging Face cache directory (default location on Linux is: `~/.cache/huggingface`), and symlinks will be added to the specified `--local-dir`, pointing to their real location in the cache. This allows for interrupted downloads to be resumed, and allows you to quickly clone the repo to multiple places on disk without triggering a download again. The downside, and the reason why I don't list that as the default option, is that the files are then hidden away in a cache folder and it's harder to know where your disk space is being used, and to clear it up if/when you want to remove a download model.
The cache location can be changed with the `HF_HOME` environment variable, and/or the `--cache-dir` parameter to `huggingface-cli`.
For more documentation on downloading with `huggingface-cli`, please see: [HF -> Hub Python Library -> Download files -> Download from the CLI](https://huggingface.co/docs/huggingface_hub/guides/download#download-from-the-cli).
To accelerate downloads on fast connections (1Gbit/s or higher), install `hf_transfer`:
```shell
pip3 install hf_transfer
```
And set environment variable `HF_HUB_ENABLE_HF_TRANSFER` to `1`:
```shell
mkdir CodeNinja-1.0-OpenChat-7B-GPTQ
HF_HUB_ENABLE_HF_TRANSFER=1 huggingface-cli download TheBloke/CodeNinja-1.0-OpenChat-7B-GPTQ --local-dir CodeNinja-1.0-OpenChat-7B-GPTQ --local-dir-use-symlinks False
```
Windows Command Line users: You can set the environment variable by running `set HF_HUB_ENABLE_HF_TRANSFER=1` before the download command.
</details>
### With `git` (**not** recommended)
To clone a specific branch with `git`, use a command like this:
```shell
git clone --single-branch --branch gptq-4bit-32g-actorder_True https://huggingface.co/TheBloke/CodeNinja-1.0-OpenChat-7B-GPTQ
```
Note that using Git with HF repos is strongly discouraged. It will be much slower than using `huggingface-hub`, and will use twice as much disk space as it has to store the model files twice (it stores every byte both in the intended target folder, and again in the `.git` folder as a blob.)
<!-- README_GPTQ.md-download-from-branches end -->
<!-- README_GPTQ.md-text-generation-webui start -->
## How to easily download and use this model in [text-generation-webui](https://github.com/oobabooga/text-generation-webui)
Please make sure you're using the latest version of [text-generation-webui](https://github.com/oobabooga/text-generation-webui).
It is strongly recommended to use the text-generation-webui one-click-installers unless you're sure you know how to make a manual install.
1. Click the **Model tab**.
2. Under **Download custom model or LoRA**, enter `TheBloke/CodeNinja-1.0-OpenChat-7B-GPTQ`.
- To download from a specific branch, enter for example `TheBloke/CodeNinja-1.0-OpenChat-7B-GPTQ:gptq-4bit-32g-actorder_True`
- see Provided Files above for the list of branches for each option.
3. Click **Download**.
4. The model will start downloading. Once it's finished it will say "Done".
5. In the top left, click the refresh icon next to **Model**.
6. In the **Model** dropdown, choose the model you just downloaded: `CodeNinja-1.0-OpenChat-7B-GPTQ`
7. The model will automatically load, and is now ready for use!
8. If you want any custom settings, set them and then click **Save settings for this model** followed by **Reload the Model** in the top right.
- Note that you do not need to and should not set manual GPTQ parameters any more. These are set automatically from the file `quantize_config.json`.
9. Once you're ready, click the **Text Generation** tab and enter a prompt to get started!
<!-- README_GPTQ.md-text-generation-webui end -->
<!-- README_GPTQ.md-use-from-tgi start -->
## Serving this model from Text Generation Inference (TGI)
It's recommended to use TGI version 1.1.0 or later. The official Docker container is: `ghcr.io/huggingface/text-generation-inference:1.1.0`
Example Docker parameters:
```shell
--model-id TheBloke/CodeNinja-1.0-OpenChat-7B-GPTQ --port 3000 --quantize gptq --max-input-length 3696 --max-total-tokens 4096 --max-batch-prefill-tokens 4096
```
Example Python code for interfacing with TGI (requires huggingface-hub 0.17.0 or later):
```shell
pip3 install huggingface-hub
```
```python
from huggingface_hub import InferenceClient
endpoint_url = "https://your-endpoint-url-here"
prompt = "Tell me about AI"
prompt_template=f'''GPT4 Correct User: {prompt}<|end_of_turn|>GPT4 Correct Assistant:
'''
client = InferenceClient(endpoint_url)
response = client.text_generation(prompt,
max_new_tokens=128,
do_sample=True,
temperature=0.7,
top_p=0.95,
top_k=40,
repetition_penalty=1.1)
print(f"Model output: {response}")
```
<!-- README_GPTQ.md-use-from-tgi end -->
<!-- README_GPTQ.md-use-from-python start -->
## Python code example: inference from this GPTQ model
### Install the necessary packages
Requires: Transformers 4.33.0 or later, Optimum 1.12.0 or later, and AutoGPTQ 0.4.2 or later.
```shell
pip3 install --upgrade transformers optimum
# If using PyTorch 2.1 + CUDA 12.x:
pip3 install --upgrade auto-gptq
# or, if using PyTorch 2.1 + CUDA 11.x:
pip3 install --upgrade auto-gptq --extra-index-url https://huggingface.github.io/autogptq-index/whl/cu118/
```
If you are using PyTorch 2.0, you will need to install AutoGPTQ from source. Likewise if you have problems with the pre-built wheels, you should try building from source:
```shell
pip3 uninstall -y auto-gptq
git clone https://github.com/PanQiWei/AutoGPTQ
cd AutoGPTQ
git checkout v0.5.1
pip3 install .
```
### Example Python code
```python
from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline
model_name_or_path = "TheBloke/CodeNinja-1.0-OpenChat-7B-GPTQ"
# To use a different branch, change revision
# For example: revision="gptq-4bit-32g-actorder_True"
model = AutoModelForCausalLM.from_pretrained(model_name_or_path,
device_map="auto",
trust_remote_code=False,
revision="main")
tokenizer = AutoTokenizer.from_pretrained(model_name_or_path, use_fast=True)
prompt = "Write a story about llamas"
system_message = "You are a story writing assistant"
prompt_template=f'''GPT4 Correct User: {prompt}<|end_of_turn|>GPT4 Correct Assistant:
'''
print("\n\n*** Generate:")
input_ids = tokenizer(prompt_template, return_tensors='pt').input_ids.cuda()
output = model.generate(inputs=input_ids, temperature=0.7, do_sample=True, top_p=0.95, top_k=40, max_new_tokens=512)
print(tokenizer.decode(output[0]))
# Inference can also be done using transformers' pipeline
print("*** Pipeline:")
pipe = pipeline(
"text-generation",
model=model,
tokenizer=tokenizer,
max_new_tokens=512,
do_sample=True,
temperature=0.7,
top_p=0.95,
top_k=40,
repetition_penalty=1.1
)
print(pipe(prompt_template)[0]['generated_text'])
```
<!-- README_GPTQ.md-use-from-python end -->
<!-- README_GPTQ.md-compatibility start -->
## Compatibility
The files provided are tested to work with Transformers. For non-Mistral models, AutoGPTQ can also be used directly.
[ExLlama](https://github.com/turboderp/exllama) is compatible with Llama architecture models (including Mistral, Yi, DeepSeek, SOLAR, etc) in 4-bit. Please see the Provided Files table above for per-file compatibility.
For a list of clients/servers, please see "Known compatible clients / servers", above.
<!-- README_GPTQ.md-compatibility end -->
<!-- footer start -->
<!-- 200823 -->
## Discord
For further support, and discussions on these models and AI in general, join us at:
[TheBloke AI's Discord server](https://discord.gg/theblokeai)
## Thanks, and how to contribute
Thanks to the [chirper.ai](https://chirper.ai) team!
Thanks to Clay from [gpus.llm-utils.org](https://gpus.llm-utils.org)!
I've had a lot of people ask if they can contribute. I enjoy providing models and helping people, and would love to be able to spend even more time doing it, as well as expanding into new projects like fine tuning/training.
If you're able and willing to contribute it will be most gratefully received and will help me to keep providing more models, and to start work on new AI projects.
Donaters will get priority support on any and all AI/LLM/model questions and requests, access to a private Discord room, plus other benefits.
* Patreon: https://patreon.com/TheBlokeAI
* Ko-Fi: https://ko-fi.com/TheBlokeAI
**Special thanks to**: Aemon Algiz.
**Patreon special mentions**: Michael Levine, 阿明, Trailburnt, Nikolai Manek, John Detwiler, Randy H, Will Dee, Sebastain Graf, NimbleBox.ai, Eugene Pentland, Emad Mostaque, Ai Maven, Jim Angel, Jeff Scroggin, Michael Davis, Manuel Alberto Morcote, Stephen Murray, Robert, Justin Joy, Luke @flexchar, Brandon Frisco, Elijah Stavena, S_X, Dan Guido, Undi ., Komninos Chatzipapas, Shadi, theTransient, Lone Striker, Raven Klaugh, jjj, Cap'n Zoog, Michel-Marie MAUDET (LINAGORA), Matthew Berman, David, Fen Risland, Omer Bin Jawed, Luke Pendergrass, Kalila, OG, Erik Bjäreholt, Rooh Singh, Joseph William Delisle, Dan Lewis, TL, John Villwock, AzureBlack, Brad, Pedro Madruga, Caitlyn Gatomon, K, jinyuan sun, Mano Prime, Alex, Jeffrey Morgan, Alicia Loh, Illia Dulskyi, Chadd, transmissions 11, fincy, Rainer Wilmers, ReadyPlayerEmma, knownsqashed, Mandus, biorpg, Deo Leter, Brandon Phillips, SuperWojo, Sean Connelly, Iucharbius, Jack West, Harry Royden McLaughlin, Nicholas, terasurfer, Vitor Caleffi, Duane Dunston, Johann-Peter Hartmann, David Ziegler, Olakabola, Ken Nordquist, Trenton Dambrowitz, Tom X Nguyen, Vadim, Ajan Kanaga, Leonard Tan, Clay Pascal, Alexandros Triantafyllidis, JM33133, Xule, vamX, ya boyyy, subjectnull, Talal Aujan, Alps Aficionado, wassieverse, Ari Malik, James Bentley, Woland, Spencer Kim, Michael Dempsey, Fred von Graf, Elle, zynix, William Richards, Stanislav Ovsiannikov, Edmond Seymore, Jonathan Leane, Martin Kemka, usrbinkat, Enrico Ros
Thank you to all my generous patrons and donaters!
And thank you again to a16z for their generous grant.
<!-- footer end -->
# Original model card: beowulf's CodeNinja 1.0 Openchat 7B
<p align="center">
<img width="700px" alt="DeepSeek Coder" src="https://cdn-uploads.huggingface.co/production/uploads/64b566ab04fa6584c03b5247/5COagfF6EwrV4utZJ-ClI.png">
</p>
<hr>
# CodeNinja: Your Advanced Coding Assistant
## Overview
CodeNinja is an enhanced version of the renowned model [openchat/openchat-3.5-1210](https://huggingface.co/openchat/openchat-3.5-1210). It represents a breakthrough in coding assistance, having been fine-tuned through Supervised Fine Tuning on two expansive datasets, encompassing over 400,000 coding instructions. Designed to be an indispensable tool for coders, CodeNinja aims to integrate seamlessly into your daily coding routine.
Discover the quantized versions at: [beowolx/CodeNinja-1.0-OpenChat-7B-GGUF](https://huggingface.co/beowolx/CodeNinja-1.0-OpenChat-7B-GGUF).
### Key Features
- **Expansive Training Database**: CodeNinja has been refined with datasets from [glaiveai/glaive-code-assistant-v2](https://huggingface.co/datasets/glaiveai/glaive-code-assistant-v2) and [TokenBender/code_instructions_122k_alpaca_style](https://huggingface.co/datasets/TokenBender/code_instructions_122k_alpaca_style), incorporating around 400,000 coding instructions across various languages including Python, C, C++, Rust, Java, JavaScript, and more.
- **Flexibility and Scalability**: Available in a 7B model size, CodeNinja is adaptable for local runtime environments.
- **Exceptional Performance**: Achieves top-tier results among publicly accessible coding models, particularly notable on benchmarks like HumanEval.
- **Advanced Code Completion**: With a substantial context window size of 8192, it supports comprehensive project-level code completion.
## Prompt Format
CodeNinja maintains the same prompt structure as OpenChat 3.5. Effective utilization requires adherence to this format:
```
GPT4 Correct User: Hello<|end_of_turn|>GPT4 Correct Assistant: Hi<|end_of_turn|>GPT4 Correct User: How are you today?<|end_of_turn|>GPT4 Correct Assistant:
```
🚨 Important: Ensure the use of `<|end_of_turn|>` as the end-of-generation token.
**Adhering to this format is crucial for optimal results.**
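The multi-turn format above can also be rendered from a list of messages. The following is a hedged sketch, assuming the exact turn layout shown in the example; `render_chat` is an illustrative helper, not part of the model's tooling:

```python
# Sketch: render a multi-turn chat in the OpenChat 3.5 format shown above.
# `render_chat` is an illustrative helper, not an official API.
END_OF_TURN = "<|end_of_turn|>"
ROLE_PREFIX = {"user": "GPT4 Correct User: ", "assistant": "GPT4 Correct Assistant: "}

def render_chat(messages):
    parts = [ROLE_PREFIX[m["role"]] + m["content"] + END_OF_TURN for m in messages]
    # End with the bare assistant cue so generation continues the conversation.
    return "".join(parts) + "GPT4 Correct Assistant:"

chat = [
    {"role": "user", "content": "Hello"},
    {"role": "assistant", "content": "Hi"},
    {"role": "user", "content": "How are you today?"},
]
print(render_chat(chat))
```

In practice, `tokenizer.apply_chat_template` (as used in the Transformers example below) produces this format for you; the sketch only makes the layout explicit.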
## Usage Instructions
### Using LM Studio
The simplest way to engage with CodeNinja is via the [quantized versions](https://huggingface.co/beowolx/CodeNinja-1.0-OpenChat-7B-GGUF) on [LM Studio](https://lmstudio.ai/). Ensure you select the "OpenChat" preset, which incorporates the necessary prompt format. The preset is also available in this [gist](https://gist.github.com/beowolx/b219466681c02ff67baf8f313a3ad817).
### Using the Transformers Library
```python
from transformers import AutoTokenizer, AutoModelForCausalLM
import torch
# Initialize the model
model_path = "beowolx/CodeNinja-1.0-OpenChat-7B"
model = AutoModelForCausalLM.from_pretrained(model_path, device_map="auto")
# Load the OpenChat tokenizer
tokenizer = AutoTokenizer.from_pretrained("openchat/openchat-3.5-1210", use_fast=True)
def generate_one_completion(prompt: str):
messages = [
{"role": "user", "content": prompt},
{"role": "assistant", "content": ""} # Model response placeholder
]
# Generate token IDs using the chat template
input_ids = tokenizer.apply_chat_template(messages, add_generation_prompt=True)
# Produce completion
generate_ids = model.generate(
torch.tensor([input_ids]).to("cuda"),
max_length=256,
pad_token_id=tokenizer.pad_token_id,
eos_token_id=tokenizer.eos_token_id
)
# Process the completion
completion = tokenizer.decode(generate_ids[0], skip_special_tokens=True)
completion = completion.split("\n\n\n")[0].strip()
return completion
```
## License
CodeNinja is licensed under the MIT License, with model usage subject to the Model License.
## Contact
For queries or support, please open an issue in the repository.
|
sarvamai/OpenHathi-7B-Hi-v0.1-Base
|
sarvamai
| 2023-12-22T20:37:42Z | 2,253 | 107 |
transformers
|
[
"transformers",
"safetensors",
"gguf",
"llama",
"text-generation",
"hi",
"license:llama2",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-generation
| 2023-12-13T13:41:11Z |
---
license: llama2
language:
- hi
---
This repository contains the first model in the OpenHathi series of models that will be released by Sarvam AI. This is a 7B-parameter model, based on Llama2, trained on Hindi, English, and Hinglish. More details about the model, its training procedure, and evaluations can be found [here](https://www.sarvam.ai/blog/announcing-openhathi-series).
Note: this is a base model and not meant to be used as is. We recommend first fine-tuning it on the task(s) you are interested in.
```python
# Usage
import torch
from transformers import LlamaTokenizer, LlamaForCausalLM
tokenizer = LlamaTokenizer.from_pretrained('sarvamai/OpenHathi-7B-Hi-v0.1-Base')
model = LlamaForCausalLM.from_pretrained('sarvamai/OpenHathi-7B-Hi-v0.1-Base', torch_dtype=torch.bfloat16)
prompt = "मैं एक अच्छा हाथी हूँ"
inputs = tokenizer(prompt, return_tensors="pt")
# Generate
generate_ids = model.generate(inputs.input_ids, max_length=30)
tokenizer.batch_decode(generate_ids, skip_special_tokens=True, clean_up_tokenization_spaces=False)[0]
```
|
LoneStriker/CodeNinja-1.0-OpenChat-7B-3.0bpw-h6-exl2
|
LoneStriker
| 2023-12-22T20:33:34Z | 7 | 1 |
transformers
|
[
"transformers",
"safetensors",
"mistral",
"text-generation",
"code",
"text-generation-inference",
"conversational",
"en",
"dataset:glaiveai/glaive-code-assistant-v2",
"dataset:TokenBender/code_instructions_122k_alpaca_style",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-generation
| 2023-12-22T20:32:13Z |
---
license: mit
datasets:
- glaiveai/glaive-code-assistant-v2
- TokenBender/code_instructions_122k_alpaca_style
language:
- en
metrics:
- code_eval
pipeline_tag: text-generation
tags:
- code
- text-generation-inference
---
<p align="center">
<img width="700px" alt="DeepSeek Coder" src="https://cdn-uploads.huggingface.co/production/uploads/64b566ab04fa6584c03b5247/5COagfF6EwrV4utZJ-ClI.png">
</p>
<hr>
# CodeNinja: Your Advanced Coding Assistant
## Overview
CodeNinja is an enhanced version of the renowned model [openchat/openchat-3.5-1210](https://huggingface.co/openchat/openchat-3.5-1210). It represents a breakthrough in coding assistance, having been fine-tuned through Supervised Fine Tuning on two expansive datasets, encompassing over 400,000 coding instructions. Designed to be an indispensable tool for coders, CodeNinja aims to integrate seamlessly into your daily coding routine.
Discover the quantized versions at: [beowolx/CodeNinja-1.0-OpenChat-7B-GGUF](https://huggingface.co/beowolx/CodeNinja-1.0-OpenChat-7B-GGUF).
### Key Features
- **Expansive Training Database**: CodeNinja has been refined with datasets from [glaiveai/glaive-code-assistant-v2](https://huggingface.co/datasets/glaiveai/glaive-code-assistant-v2) and [TokenBender/code_instructions_122k_alpaca_style](https://huggingface.co/datasets/TokenBender/code_instructions_122k_alpaca_style), incorporating around 400,000 coding instructions across various languages including Python, C, C++, Rust, Java, JavaScript, and more.
- **Flexibility and Scalability**: Available in a 7B model size, CodeNinja is adaptable for local runtime environments.
- **Exceptional Performance**: Achieves top-tier results among publicly accessible coding models, particularly notable on benchmarks like HumanEval.
- **Advanced Code Completion**: With a substantial context window size of 8192, it supports comprehensive project-level code completion.
## Prompt Format
CodeNinja maintains the same prompt structure as OpenChat 3.5. Effective utilization requires adherence to this format:
```
GPT4 Correct User: Hello<|end_of_turn|>GPT4 Correct Assistant: Hi<|end_of_turn|>GPT4 Correct User: How are you today?<|end_of_turn|>GPT4 Correct Assistant:
```
🚨 Important: Ensure the use of `<|end_of_turn|>` as the end-of-generation token.
**Adhering to this format is crucial for optimal results.**
## Usage Instructions
### Using LM Studio
The simplest way to engage with CodeNinja is via the [quantized versions](https://huggingface.co/beowolx/CodeNinja-1.0-OpenChat-7B-GGUF) on [LM Studio](https://lmstudio.ai/). Ensure you select the "OpenChat" preset, which incorporates the necessary prompt format. The preset is also available in this [gist](https://gist.github.com/beowolx/b219466681c02ff67baf8f313a3ad817).
### Using the Transformers Library
```python
from transformers import AutoTokenizer, AutoModelForCausalLM
import torch
# Initialize the model
model_path = "beowolx/CodeNinja-1.0-OpenChat-7B"
model = AutoModelForCausalLM.from_pretrained(model_path, device_map="auto")
# Load the OpenChat tokenizer
tokenizer = AutoTokenizer.from_pretrained("openchat/openchat-3.5-1210", use_fast=True)
def generate_one_completion(prompt: str):
messages = [
{"role": "user", "content": prompt},
{"role": "assistant", "content": ""} # Model response placeholder
]
# Generate token IDs using the chat template
input_ids = tokenizer.apply_chat_template(messages, add_generation_prompt=True)
# Produce completion
generate_ids = model.generate(
torch.tensor([input_ids]).to("cuda"),
max_length=256,
pad_token_id=tokenizer.pad_token_id,
eos_token_id=tokenizer.eos_token_id
)
# Process the completion
completion = tokenizer.decode(generate_ids[0], skip_special_tokens=True)
completion = completion.split("\n\n\n")[0].strip()
return completion
```
## License
CodeNinja is licensed under the MIT License, with model usage subject to the Model License.
## Contact
For queries or support, please open an issue in the repository.
|
ElnaggarLab/ankh2-large
|
ElnaggarLab
| 2023-12-22T20:23:32Z | 27 | 3 |
transformers
|
[
"transformers",
"t5",
"text2text-generation",
"biology",
"protein",
"protein language model",
"protein embedding",
"dataset:agemagician/uniref50",
"arxiv:2301.06568",
"license:cc-by-nc-sa-4.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text2text-generation
| 2023-12-22T19:24:16Z |
---
license: cc-by-nc-sa-4.0
tags:
- biology
- protein
- protein language model
- protein embedding
datasets:
- agemagician/uniref50
---
# Important
The model will be uploaded soon; please stay tuned.
# ANKH2-Large model
Pretrained model on protein sequences using a masked language modeling (MLM) objective. It was introduced in
[this paper](https://arxiv.org/abs/2301.06568) and first released in
[this repository](https://github.com/agemagician/Ankh). This model is trained on uppercase amino acids: it only works with capital letter amino acids.
## Model description
ANKH2-Large is based on the `ANKH-Large` model and was pretrained on a large corpus of protein sequences in a self-supervised fashion.
This means it was pretrained on the raw protein sequences only, with no humans labelling them in any way (which is why it can use lots of
publicly available data) with an automatic process to generate inputs and labels from those protein sequences.
Two important differences between this ANKH2-Large model and the original ANKH-Large version are:
1. The model was trained for a larger number of epochs.
2. The activation function was changed to SiLU.
It has been shown that the features extracted from this self-supervised model (LM-embeddings) capture important biophysical properties governing protein shape.
This implies the model learned some of the grammar of the language of life as realized in protein sequences.
## Intended uses & limitations
The model could be used for protein feature extraction or to be fine-tuned on downstream tasks.
We have noticed that on some tasks you can gain more accuracy by fine-tuning the model with the LoRA method rather than using it as a feature extractor.
We have also noticed that for feature extraction, it is better to use the features extracted from the encoder rather than from the decoder.
### How to use
Here is how to use this model to extract the features of a given protein sequence in PyTorch:
```python
import torch
from transformers import AutoTokenizer, T5EncoderModel

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
# Replace with this repository's id once the checkpoint is uploaded.
model_name = "<this-repository-id>"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = T5EncoderModel.from_pretrained(model_name).to(device)
model.eval()

sequence_examples = ["PRTEINO", "SEQWENCE"]
# tokenize sequences and pad up to the longest sequence in the batch
ids = tokenizer.batch_encode_plus(sequence_examples, add_special_tokens=True, padding="longest")
input_ids = torch.tensor(ids['input_ids']).to(device)
attention_mask = torch.tensor(ids['attention_mask']).to(device)
# generate embeddings
with torch.no_grad():
    embedding_repr = model(input_ids=input_ids, attention_mask=attention_mask)
# extract embeddings for the first ([0,:]) sequence in the batch while removing padded & special tokens ([0,:7])
emb_0 = embedding_repr.last_hidden_state[0,:7] # shape (7 x 1536)
print(f"Shape of per-residue embedding of first sequences: {emb_0.shape}")
# do the same for the second ([1,:]) sequence in the batch while taking into account different sequence lengths ([1,:8])
emb_1 = embedding_repr.last_hidden_state[1,:8] # shape (8 x 1536)
# if you want to derive a single representation (per-protein embedding) for the whole protein
emb_0_per_protein = emb_0.mean(dim=0) # shape (1536)
print(f"Shape of per-protein embedding of first sequences: {emb_0_per_protein.shape}")
```
## Training data
The ANKH2-Large model was pretrained on [UniRef50](https://www.uniprot.org/help/uniref), a dataset consisting of 60 million protein sequences.
## Training procedure
### Preprocessing
The protein sequences are uppercased and tokenized using a single space and a vocabulary size of 25.
The inputs of the model are then of the form:
```
Protein Sequence </s>
```
The preprocessing step was performed on the fly, by cutting and padding the protein sequences up to 512 tokens.
The details of the masking procedure for each sequence are as follows:
- 20% of the amino acids are masked.
- In 100% of the cases, the masked amino acids are replaced by an `<extra_id_num>` token, where "num" is a number in the range 0 to 115.
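The sentinel-replacement step described above can be sketched in plain Python. This is a toy illustration only: `mask_sequence` and its defaults are hypothetical, not the actual Ankh preprocessing code.

```python
import random

def mask_sequence(sequence, mask_rate=0.2, seed=0):
    """Replace ~mask_rate of residues with T5 sentinel tokens <extra_id_0>, <extra_id_1>, ..."""
    rng = random.Random(seed)
    tokens = list(sequence)
    n_mask = max(1, round(len(tokens) * mask_rate))
    # choose positions to mask, left to right, one sentinel per masked residue
    positions = sorted(rng.sample(range(len(tokens)), n_mask))
    for sentinel_id, pos in enumerate(positions):
        tokens[pos] = f"<extra_id_{sentinel_id}>"
    return " ".join(tokens)

print(mask_sequence("PRTEINOSEQWENCE"))
```

With a 20% mask rate, a 15-residue sequence gets 3 sentinel tokens.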
### Pretraining
The model was trained on a single TPU Pod v4-256 for 45 epochs in total, using a sequence length of 512 (batch size 1k).
It was trained using the ANKH-Large model as an initial checkpoint, rather than from scratch.
It has approximately 2B parameters in total and uses an encoder-decoder architecture.
Pre-training used the Adafactor optimizer with a linear-warmup, linear-decay learning-rate schedule.
## Evaluation results
When the model is used for feature extraction ("FE") or parameter-efficient fine-tuning ("LoRA"), it achieves the following results:
Test results:
| Task/Dataset | Method | secondary structure (3-states) | secondary structure (8-states) | Localization | Membrane | Solubility | Fluorescence |
|:-----:|:-----:|:-----:|:-----:|:-----:|:-----:|:-----:|:-----:|
| CASP12 | FE | coming soon | coming soon | | | | |
| CASP12 | LoRA | coming soon | coming soon | | | | |
| TS115 | FE | coming soon | coming soon | | | | |
| TS115 | LoRA | coming soon | coming soon | | | | |
| CB513 | FE | coming soon | coming soon | | | | |
| CB513 | LoRA | coming soon | coming soon | | | | |
| DeepLoc | FE | | | coming soon | coming soon | | |
| DeepLoc | LoRA | | | coming soon | coming soon | | |
| Solubility | FE | | | | | coming soon | |
| Solubility | LoRA | | | | | 74% | |
| Fluorescence | FE | | | | | | coming soon |
| Fluorescence | LoRA | | | | | | 68% |
### BibTeX entry and citation info
```bibtex
@article{elnaggar2023ankh,
title={Ankh☥: Optimized protein language model unlocks general-purpose modelling},
author={Elnaggar, Ahmed and Essam, Hazem and Salah-Eldin, Wafaa and Moustafa, Walid and Elkerdawy, Mohamed and Rochereau, Charlotte and Rost, Burkhard},
journal={bioRxiv},
pages={2023--01},
year={2023},
publisher={Cold Spring Harbor Laboratory}
}
```
> Created by [Ahmed Elnaggar/@Elnaggar_AI](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/)
|
tdnathmlenthusiast/102_oxford_flower_classifier
|
tdnathmlenthusiast
| 2023-12-22T20:13:23Z | 0 | 0 | null |
[
"license:apache-2.0",
"region:us"
] | null | 2023-12-22T19:52:17Z |
---
title: Flower Classifier(Oxford 102 Dataset)
emoji: 📉
colorFrom: blue
colorTo: green
sdk: gradio
sdk_version: 3.38.0
app_file: app.py
pinned: false
license: apache-2.0
---
|
ThuyNT03/KLTN_CSI_COQE_total
|
ThuyNT03
| 2023-12-22T20:08:20Z | 5 | 0 |
transformers
|
[
"transformers",
"tensorboard",
"safetensors",
"roberta",
"text-classification",
"generated_from_trainer",
"base_model:vinai/phobert-base-v2",
"base_model:finetune:vinai/phobert-base-v2",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-classification
| 2023-12-22T19:19:22Z |
---
base_model: vinai/phobert-base-v2
tags:
- generated_from_trainer
model-index:
- name: KLTN_CSI_COQE_total
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# KLTN_CSI_COQE_total
This model is a fine-tuned version of [vinai/phobert-base-v2](https://huggingface.co/vinai/phobert-base-v2) on an unknown dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 15
### Training results
### Framework versions
- Transformers 4.35.2
- Pytorch 2.1.0+cu121
- Datasets 2.16.0
- Tokenizers 0.15.0
|
sid220/asl-now-fingerspelling
|
sid220
| 2023-12-22T19:59:14Z | 5 | 0 |
keras
|
[
"keras",
"en",
"dataset:sid220/asl-now-fingerspelling",
"doi:10.57967/hf/1516",
"license:mit",
"region:us"
] | null | 2023-12-13T15:14:41Z |
---
license: mit
datasets:
- sid220/asl-now-fingerspelling
language:
- en
metrics:
- accuracy
library_name: keras
---
# ASLNow!
ASLNow! is a web app designed to make learning ASL fingerspelling easy and fun! You can try it live at [asl-now.vercel.app](https://asl-now.vercel.app/).
Demo: [https://www.youtube.com/watch?v=Wi5tAxVasq8](https://www.youtube.com/watch?v=Wi5tAxVasq8)
## Model
This model, trained on the isolated fingerspelling dataset, is licensed under the MIT License. It will be updated frequently as more data is collected.
### Format

#### Input
21 hand landmarks, each composed of `x`, `y` and `z` coordinates. The `x` and `y` coordinates are normalized to `[0.0, 1.0]` by the image width and height, respectively. The `z` coordinate represents the landmark depth, with the depth at the wrist being the origin. The smaller the value, the closer the landmark is to the camera. The magnitude of `z` uses roughly the same scale as `x`.

From: [https://developers.google.com/mediapipe/solutions/vision/hand_landmarker](https://developers.google.com/mediapipe/solutions/vision/hand_landmarker)
Example:
```
[
# Landmark 1
[x, y, z],
# Landmark 2
[x, y, z],
...
# Landmark 20
[x, y, z]
# Landmark 21
[x, y, z]
]
```
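Before inference, the 21 landmarks are typically flattened into a single feature vector. A minimal sketch is below; the `landmarks` values are placeholders, and the flattening layout is an assumption about this model's input rather than confirmed by the card.

```python
import numpy as np

# Hypothetical landmark array: 21 landmarks x (x, y, z), laid out as in the example above.
landmarks = np.zeros((21, 3), dtype=np.float32)
landmarks[0] = [0.5, 0.5, 0.0]  # wrist roughly at the image centre; z origin is the wrist

# A dense Keras classifier would typically consume the flattened 63-value vector:
features = landmarks.reshape(1, -1)
print(features.shape)  # (1, 63)
```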
#### Output
The probability of each class, where classes are defined as such:
```json
{
"A": 0,
"B": 1,
"C": 2,
"D": 3,
"E": 4,
"F": 5,
"G": 6,
"H": 7,
"I": 8,
"J": 9,
"K": 10,
"L": 11,
"M": 12,
"N": 13,
"O": 14,
"P": 15,
"Q": 16,
"R": 17,
"S": 18,
"T": 19,
"U": 20,
"V": 21,
"W": 22,
"X": 23,
"Y": 24,
"Z": 25
}
```
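A minimal sketch of turning the model's class probabilities back into a letter: only the A–Z class mapping comes from the card; the probability vector here is made up for illustration.

```python
import string

# The class mapping above is just A..Z -> 0..25, so it can be rebuilt programmatically:
classes = {letter: i for i, letter in enumerate(string.ascii_uppercase)}
index_to_letter = {i: letter for letter, i in classes.items()}

# Hypothetical probability vector from the model, confident the sign is "H":
probs = [0.0] * 26
probs[classes["H"]] = 0.9
predicted = index_to_letter[max(range(len(probs)), key=probs.__getitem__)]
print(predicted)  # H
```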
|
ThuyNT03/KLTN_COQE_viT5_total_SPAOL
|
ThuyNT03
| 2023-12-22T19:55:16Z | 4 | 0 |
transformers
|
[
"transformers",
"tensorboard",
"safetensors",
"t5",
"text2text-generation",
"generated_from_trainer",
"base_model:VietAI/vit5-large",
"base_model:finetune:VietAI/vit5-large",
"license:mit",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text2text-generation
| 2023-12-22T09:07:28Z |
---
license: mit
base_model: VietAI/vit5-large
tags:
- generated_from_trainer
model-index:
- name: KLTN_COQE_viT5_total_SPAOL
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# KLTN_COQE_viT5_total_SPAOL
This model is a fine-tuned version of [VietAI/vit5-large](https://huggingface.co/VietAI/vit5-large) on the None dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 30
- mixed_precision_training: Native AMP
### Training results
### Framework versions
- Transformers 4.35.2
- Pytorch 2.0.0
- Datasets 2.1.0
- Tokenizers 0.15.0
|
C0uchP0tat0/gpt2medium-finetuned
|
C0uchP0tat0
| 2023-12-22T19:54:30Z | 5 | 0 |
transformers
|
[
"transformers",
"tensorboard",
"safetensors",
"gpt2",
"text-generation",
"generated_from_trainer",
"base_model:ai-forever/rugpt3medium_based_on_gpt2",
"base_model:finetune:ai-forever/rugpt3medium_based_on_gpt2",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-generation
| 2023-12-22T19:30:49Z |
---
base_model: ai-forever/rugpt3medium_based_on_gpt2
tags:
- generated_from_trainer
model-index:
- name: gpt2medium-finetuned
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# gpt2medium-finetuned
This model is a fine-tuned version of [ai-forever/rugpt3medium_based_on_gpt2](https://huggingface.co/ai-forever/rugpt3medium_based_on_gpt2) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1476
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.005
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 3
- total_train_batch_size: 24
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 1000
- num_epochs: 25
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 3.0464 | 5.0 | 5 | 2.7436 |
| 2.2106 | 10.0 | 10 | 1.6388 |
| 1.1598 | 15.0 | 15 | 0.6276 |
| 0.3638 | 20.0 | 20 | 0.1913 |
| 0.0696 | 25.0 | 25 | 0.1476 |
### Framework versions
- Transformers 4.35.2
- Pytorch 2.1.0+cu121
- Datasets 2.16.0
- Tokenizers 0.15.0
|
ThuyNT03/KLTN_COQE_viT5_total_POSAL
|
ThuyNT03
| 2023-12-22T19:52:39Z | 5 | 0 |
transformers
|
[
"transformers",
"tensorboard",
"safetensors",
"t5",
"text2text-generation",
"generated_from_trainer",
"base_model:VietAI/vit5-large",
"base_model:finetune:VietAI/vit5-large",
"license:mit",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text2text-generation
| 2023-12-22T18:44:26Z |
---
license: mit
base_model: VietAI/vit5-large
tags:
- generated_from_trainer
model-index:
- name: KLTN_COQE_viT5_total_POSAL
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# KLTN_COQE_viT5_total_POSAL
This model is a fine-tuned version of [VietAI/vit5-large](https://huggingface.co/VietAI/vit5-large) on the None dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 30
- mixed_precision_training: Native AMP
### Training results
### Framework versions
- Transformers 4.36.0
- Pytorch 2.0.0
- Datasets 2.1.0
- Tokenizers 0.15.0
|
afrideva/phi-2-sft-dpo-gpt4_en-ep1-GGUF
|
afrideva
| 2023-12-22T19:45:13Z | 38 | 2 | null |
[
"gguf",
"ggml",
"quantized",
"q2_k",
"q3_k_m",
"q4_k_m",
"q5_k_m",
"q6_k",
"q8_0",
"text-generation",
"base_model:Yhyu13/phi-2-sft-dpo-gpt4_en-ep1",
"base_model:quantized:Yhyu13/phi-2-sft-dpo-gpt4_en-ep1",
"license:other",
"region:us"
] |
text-generation
| 2023-12-22T19:37:38Z |
---
base_model: Yhyu13/phi-2-sft-dpo-gpt4_en-ep1
inference: false
license: other
license_link: https://huggingface.co/microsoft/phi-2/resolve/main/LICENSE
license_name: microsoft-research-license
model_creator: Yhyu13
model_name: phi-2-sft-dpo-gpt4_en-ep1
pipeline_tag: text-generation
quantized_by: afrideva
tags:
- gguf
- ggml
- quantized
- q2_k
- q3_k_m
- q4_k_m
- q5_k_m
- q6_k
- q8_0
---
# Yhyu13/phi-2-sft-dpo-gpt4_en-ep1-GGUF
Quantized GGUF model files for [phi-2-sft-dpo-gpt4_en-ep1](https://huggingface.co/Yhyu13/phi-2-sft-dpo-gpt4_en-ep1) from [Yhyu13](https://huggingface.co/Yhyu13)
| Name | Quant method | Size |
| ---- | ---- | ---- |
| [phi-2-sft-dpo-gpt4_en-ep1.fp16.gguf](https://huggingface.co/afrideva/phi-2-sft-dpo-gpt4_en-ep1-GGUF/resolve/main/phi-2-sft-dpo-gpt4_en-ep1.fp16.gguf) | fp16 | 5.56 GB |
| [phi-2-sft-dpo-gpt4_en-ep1.q2_k.gguf](https://huggingface.co/afrideva/phi-2-sft-dpo-gpt4_en-ep1-GGUF/resolve/main/phi-2-sft-dpo-gpt4_en-ep1.q2_k.gguf) | q2_k | 1.17 GB |
| [phi-2-sft-dpo-gpt4_en-ep1.q3_k_m.gguf](https://huggingface.co/afrideva/phi-2-sft-dpo-gpt4_en-ep1-GGUF/resolve/main/phi-2-sft-dpo-gpt4_en-ep1.q3_k_m.gguf) | q3_k_m | 1.48 GB |
| [phi-2-sft-dpo-gpt4_en-ep1.q4_k_m.gguf](https://huggingface.co/afrideva/phi-2-sft-dpo-gpt4_en-ep1-GGUF/resolve/main/phi-2-sft-dpo-gpt4_en-ep1.q4_k_m.gguf) | q4_k_m | 1.79 GB |
| [phi-2-sft-dpo-gpt4_en-ep1.q5_k_m.gguf](https://huggingface.co/afrideva/phi-2-sft-dpo-gpt4_en-ep1-GGUF/resolve/main/phi-2-sft-dpo-gpt4_en-ep1.q5_k_m.gguf) | q5_k_m | 2.07 GB |
| [phi-2-sft-dpo-gpt4_en-ep1.q6_k.gguf](https://huggingface.co/afrideva/phi-2-sft-dpo-gpt4_en-ep1-GGUF/resolve/main/phi-2-sft-dpo-gpt4_en-ep1.q6_k.gguf) | q6_k | 2.29 GB |
| [phi-2-sft-dpo-gpt4_en-ep1.q8_0.gguf](https://huggingface.co/afrideva/phi-2-sft-dpo-gpt4_en-ep1-GGUF/resolve/main/phi-2-sft-dpo-gpt4_en-ep1.q8_0.gguf) | q8_0 | 2.96 GB |
## Original Model Card:
This is the merged model for LoRA https://huggingface.co/Yhyu13/phi-2-sft-dpo-gpt4_en-ep1-lora
This model is a DPO improvement over the base model https://huggingface.co/Yhyu13/phi-2-sft-alpaca_gpt4_en-ep1, which achieves better-than-text-davinci-003 performance on AlpacaEval as judged by ChatGPT.
|
ntc-ai/SDXL-LoRA-slider.puffed-out-cheeks
|
ntc-ai
| 2023-12-22T19:42:07Z | 101 | 0 |
diffusers
|
[
"diffusers",
"text-to-image",
"stable-diffusion-xl",
"lora",
"template:sd-lora",
"template:sdxl-lora",
"sdxl-sliders",
"ntcai.xyz-sliders",
"concept",
"en",
"base_model:stabilityai/stable-diffusion-xl-base-1.0",
"base_model:adapter:stabilityai/stable-diffusion-xl-base-1.0",
"license:mit",
"region:us"
] |
text-to-image
| 2023-12-22T19:42:04Z |
---
language:
- en
thumbnail: "images/evaluate/puffed out cheeks.../puffed out cheeks_17_3.0.png"
widget:
- text: puffed out cheeks
output:
url: images/puffed out cheeks_17_3.0.png
- text: puffed out cheeks
output:
url: images/puffed out cheeks_19_3.0.png
- text: puffed out cheeks
output:
url: images/puffed out cheeks_20_3.0.png
- text: puffed out cheeks
output:
url: images/puffed out cheeks_21_3.0.png
- text: puffed out cheeks
output:
url: images/puffed out cheeks_22_3.0.png
tags:
- text-to-image
- stable-diffusion-xl
- lora
- template:sd-lora
- template:sdxl-lora
- sdxl-sliders
- ntcai.xyz-sliders
- concept
- diffusers
license: "mit"
inference: false
instance_prompt: "puffed out cheeks"
base_model: "stabilityai/stable-diffusion-xl-base-1.0"
---
# ntcai.xyz slider - puffed out cheeks (SDXL LoRA)
| Strength: -3 | Strength: 0 | Strength: 3 |
| --- | --- | --- |
| <img src="images/puffed out cheeks_17_-3.0.png" width=256 height=256 /> | <img src="images/puffed out cheeks_17_0.0.png" width=256 height=256 /> | <img src="images/puffed out cheeks_17_3.0.png" width=256 height=256 /> |
| <img src="images/puffed out cheeks_19_-3.0.png" width=256 height=256 /> | <img src="images/puffed out cheeks_19_0.0.png" width=256 height=256 /> | <img src="images/puffed out cheeks_19_3.0.png" width=256 height=256 /> |
| <img src="images/puffed out cheeks_20_-3.0.png" width=256 height=256 /> | <img src="images/puffed out cheeks_20_0.0.png" width=256 height=256 /> | <img src="images/puffed out cheeks_20_3.0.png" width=256 height=256 /> |
## Download
Weights for this model are available in Safetensors format.
## Trigger words
You can apply this LoRA with trigger words for additional effect:
```
puffed out cheeks
```
## Use in diffusers
```python
from diffusers import StableDiffusionXLPipeline
from diffusers import EulerAncestralDiscreteScheduler
import torch
pipe = StableDiffusionXLPipeline.from_single_file("https://huggingface.co/martyn/sdxl-turbo-mario-merge-top-rated/blob/main/topRatedTurboxlLCM_v10.safetensors")
pipe.to("cuda")
pipe.scheduler = EulerAncestralDiscreteScheduler.from_config(pipe.scheduler.config)
# Load the LoRA
pipe.load_lora_weights('ntc-ai/SDXL-LoRA-slider.puffed-out-cheeks', weight_name='puffed out cheeks.safetensors', adapter_name="puffed out cheeks")
# Activate the LoRA
pipe.set_adapters(["puffed out cheeks"], adapter_weights=[2.0])
prompt = "medieval rich kingpin sitting in a tavern, puffed out cheeks"
negative_prompt = "nsfw"
width = 512
height = 512
num_inference_steps = 10
guidance_scale = 2
image = pipe(prompt, negative_prompt=negative_prompt, width=width, height=height, guidance_scale=guidance_scale, num_inference_steps=num_inference_steps).images[0]
image.save('result.png')
```
## Support the Patreon
If you like this model please consider [joining our Patreon](https://www.patreon.com/NTCAI).
By joining our Patreon, you'll gain access to an ever-growing library of over 550+ unique and diverse LoRAs, covering a wide range of styles and genres. You'll also receive early access to new models and updates, exclusive behind-the-scenes content, and the powerful LoRA slider creator, allowing you to craft your own custom LoRAs and experiment with endless possibilities.
Your support on Patreon will allow us to continue developing and refining new models.
## Other resources
- [CivitAI](https://civitai.com/user/ntc) - Follow ntc on Civit for even more LoRAs
- [ntcai.xyz](https://ntcai.xyz) - See ntcai.xyz to find more articles and LoRAs
|
DmitryNvm/sd15-lora-dreambooth
|
DmitryNvm
| 2023-12-22T19:41:17Z | 0 | 0 |
diffusers
|
[
"diffusers",
"tensorboard",
"safetensors",
"stable-diffusion",
"stable-diffusion-diffusers",
"text-to-image",
"dreambooth",
"base_model:runwayml/stable-diffusion-v1-5",
"base_model:finetune:runwayml/stable-diffusion-v1-5",
"license:creativeml-openrail-m",
"autotrain_compatible",
"endpoints_compatible",
"diffusers:StableDiffusionPipeline",
"region:us"
] |
text-to-image
| 2023-12-22T19:30:10Z |
---
license: creativeml-openrail-m
base_model: runwayml/stable-diffusion-v1-5
instance_prompt: a photo of sks dog
tags:
- stable-diffusion
- stable-diffusion-diffusers
- text-to-image
- diffusers
- dreambooth
inference: true
---
# DreamBooth - DmitryNvm/sd15-lora-dreambooth
This is a dreambooth model derived from runwayml/stable-diffusion-v1-5. The weights were trained on a photo of sks dog using [DreamBooth](https://dreambooth.github.io/).
You can find some example images below.
DreamBooth for the text encoder was enabled: False.
|
omarelsayeed/Search_Test_e5_cosine_sim
|
omarelsayeed
| 2023-12-22T19:35:01Z | 4 | 0 |
sentence-transformers
|
[
"sentence-transformers",
"pytorch",
"xlm-roberta",
"feature-extraction",
"sentence-similarity",
"autotrain_compatible",
"text-embeddings-inference",
"endpoints_compatible",
"region:us"
] |
sentence-similarity
| 2023-12-22T19:34:04Z |
---
pipeline_tag: sentence-similarity
tags:
- sentence-transformers
- feature-extraction
- sentence-similarity
---
# {MODEL_NAME}
This is a [sentence-transformers](https://www.SBERT.net) model: It maps sentences & paragraphs to a 1024 dimensional dense vector space and can be used for tasks like clustering or semantic search.
<!--- Describe your model here -->
## Usage (Sentence-Transformers)
Using this model becomes easy when you have [sentence-transformers](https://www.SBERT.net) installed:
```
pip install -U sentence-transformers
```
Then you can use the model like this:
```python
from sentence_transformers import SentenceTransformer
sentences = ["This is an example sentence", "Each sentence is converted"]
model = SentenceTransformer('{MODEL_NAME}')
embeddings = model.encode(sentences)
print(embeddings)
```
## Evaluation Results
<!--- Describe how your model was evaluated -->
For an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: [https://seb.sbert.net](https://seb.sbert.net?model_name={MODEL_NAME})
## Training
The model was trained with the parameters:
**DataLoader**:
`torch.utils.data.dataloader.DataLoader` of length 8404 with parameters:
```
{'batch_size': 4, 'sampler': 'torch.utils.data.sampler.RandomSampler', 'batch_sampler': 'torch.utils.data.sampler.BatchSampler'}
```
**Loss**:
`__main__.LoggingCosineSim`
Parameters of the fit()-Method:
```
{
"epochs": 1,
"evaluation_steps": 0,
"evaluator": "NoneType",
"max_grad_norm": 1,
"optimizer_class": "<class 'torch.optim.adamw.AdamW'>",
"optimizer_params": {
"lr": 5e-06
},
"scheduler": "WarmupLinear",
"steps_per_epoch": null,
"warmup_steps": 200,
"weight_decay": 0.01
}
```
## Full Model Architecture
```
SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: XLMRobertaModel
(1): Pooling({'word_embedding_dimension': 1024, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False})
(2): Normalize()
)
```
## Citing & Authors
<!--- Describe where people can find more information -->
|