| modelId (string, 5 to 139 chars) | author (string, 2 to 42 chars) | last_modified (timestamp[us, tz=UTC], 2020-02-15 11:33:14 to 2025-09-09 18:59:16) | downloads (int64, 0 to 223M) | likes (int64, 0 to 11.7k) | library_name (string, 551 classes) | tags (list, 1 to 4.05k items) | pipeline_tag (string, 55 classes) | createdAt (timestamp[us, tz=UTC], 2022-03-02 23:29:04 to 2025-09-09 18:27:33) | card (string, 11 to 1.01M chars) |
|---|---|---|---|---|---|---|---|---|---|
GuiTap/bert-large-uncased-finetuned-ner-harem
|
GuiTap
| 2024-12-04T16:56:43Z | 110 | 0 |
transformers
|
[
"transformers",
"tensorboard",
"safetensors",
"bert",
"token-classification",
"generated_from_trainer",
"base_model:google-bert/bert-large-uncased",
"base_model:finetune:google-bert/bert-large-uncased",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
token-classification
| 2024-11-26T15:07:33Z |
---
library_name: transformers
license: apache-2.0
base_model: google-bert/bert-large-uncased
tags:
- generated_from_trainer
metrics:
- precision
- recall
- f1
- accuracy
model-index:
- name: bert-large-uncased-finetuned-ner-harem
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-large-uncased-finetuned-ner-harem
This model is a fine-tuned version of [google-bert/bert-large-uncased](https://huggingface.co/google-bert/bert-large-uncased) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3109
- Precision: 0.6895
- Recall: 0.6442
- F1: 0.6661
- Accuracy: 0.9512
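The card does not include an inference example, so here is a minimal sketch using the 🤗 Transformers `pipeline` API; the checkpoint name comes from this card, while the `aggregation_strategy` choice and the Portuguese sample sentence (HAREM is a Portuguese NER corpus) are illustrative assumptions.
```python
from transformers import pipeline

# Token-classification pipeline for the fine-tuned NER checkpoint named in this card.
ner = pipeline(
    "token-classification",
    model="GuiTap/bert-large-uncased-finetuned-ner-harem",
    aggregation_strategy="simple",  # merge sub-word tokens into whole entity spans
)

# Illustrative input; HAREM is a Portuguese NER benchmark.
print(ner("Maria mora em Lisboa e trabalha na Universidade do Porto."))
```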
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 16
- optimizer: adamw_torch with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 10
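As a rough illustration only, the listed hyperparameters correspond to a `TrainingArguments` configuration along these lines; this is a hedged reconstruction, not the author's actual training script, and `output_dir` is hypothetical.
```python
from transformers import TrainingArguments

# Hedged reconstruction of the hyperparameters listed above; output_dir is made up.
training_args = TrainingArguments(
    output_dir="bert-large-uncased-finetuned-ner-harem",
    learning_rate=2e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    gradient_accumulation_steps=8,   # effective train batch size: 2 * 8 = 16
    num_train_epochs=10,
    lr_scheduler_type="linear",
    optim="adamw_torch",
    seed=42,
)
```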
### Training results
| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|:-------------:|:------:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
| No log | 0.9978 | 281 | 0.2896 | 0.5442 | 0.4772 | 0.5085 | 0.9238 |
| 0.3496 | 1.9973 | 562 | 0.2340 | 0.6811 | 0.5295 | 0.5958 | 0.9412 |
| 0.3496 | 2.9969 | 843 | 0.2240 | 0.5876 | 0.5599 | 0.5734 | 0.9409 |
| 0.1372 | 3.9964 | 1124 | 0.2540 | 0.6910 | 0.6223 | 0.6548 | 0.9403 |
| 0.1372 | 4.9960 | 1405 | 0.2598 | 0.6433 | 0.6358 | 0.6395 | 0.9439 |
| 0.0648 | 5.9956 | 1686 | 0.2377 | 0.6945 | 0.6442 | 0.6684 | 0.9497 |
| 0.0648 | 6.9951 | 1967 | 0.2822 | 0.6965 | 0.6425 | 0.6684 | 0.9501 |
| 0.0316 | 7.9982 | 2249 | 0.2958 | 0.7044 | 0.6509 | 0.6766 | 0.9518 |
| 0.0148 | 8.9978 | 2530 | 0.3006 | 0.6944 | 0.6476 | 0.6702 | 0.9496 |
| 0.0148 | 9.9938 | 2810 | 0.3109 | 0.6895 | 0.6442 | 0.6661 | 0.9512 |
### Framework versions
- Transformers 4.46.2
- Pytorch 2.5.1+cu121
- Datasets 3.1.0
- Tokenizers 0.20.3
|
jaba99/Phi-3-mini-4k-lora-tuned-4b
|
jaba99
| 2024-12-04T16:54:13Z | 6 | 0 |
transformers
|
[
"transformers",
"safetensors",
"mistral",
"text-generation",
"text-generation-inference",
"unsloth",
"trl",
"sft",
"conversational",
"en",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"4-bit",
"bitsandbytes",
"region:us"
] |
text-generation
| 2024-12-03T20:05:25Z |
---
base_model: unsloth/phi-3-mini-4k-instruct-bnb-4bit
language:
- en
license: apache-2.0
tags:
- text-generation-inference
- transformers
- unsloth
- mistral
- trl
- sft
---
# Uploaded model
- **Developed by:** jaba99
- **License:** apache-2.0
- **Finetuned from model:** unsloth/phi-3-mini-4k-instruct-bnb-4bit
This Mistral model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Hugging Face's TRL library.
[<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
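The card gives no loading example; a minimal sketch with plain 🤗 Transformers follows. The repository name is taken from this card, while the chat-style prompt and generation settings are assumptions, and loading the 4-bit weights requires `bitsandbytes` (plus `accelerate` for `device_map="auto"`).
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "jaba99/Phi-3-mini-4k-lora-tuned-4b"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForCausalLM.from_pretrained(repo, device_map="auto")  # 4-bit bitsandbytes checkpoint

# Assumed chat-style prompt; adjust to however the model was actually tuned.
messages = [{"role": "user", "content": "Summarize what a LoRA adapter is in one sentence."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
output = model.generate(inputs, max_new_tokens=64)
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
```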
|
kclayto1/mt5-small-finetuned-amazon-en-es
|
kclayto1
| 2024-12-04T16:53:35Z | 12 | 0 |
transformers
|
[
"transformers",
"tensorboard",
"safetensors",
"mt5",
"text2text-generation",
"summarization",
"generated_from_trainer",
"base_model:google/mt5-small",
"base_model:finetune:google/mt5-small",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
summarization
| 2024-12-02T21:29:05Z |
---
library_name: transformers
license: apache-2.0
base_model: google/mt5-small
tags:
- summarization
- generated_from_trainer
metrics:
- rouge
model-index:
- name: mt5-small-finetuned-amazon-en-es
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# mt5-small-finetuned-amazon-en-es
This model is a fine-tuned version of [google/mt5-small](https://huggingface.co/google/mt5-small) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 2.9719
- Rouge1: 12.8375
- Rouge2: 4.873
- Rougel: 12.7298
- Rougelsum: 12.7317
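A minimal, hedged usage sketch for this summarization checkpoint; the review text and the length limits are illustrative, not taken from the card.
```python
from transformers import pipeline

# Summarization pipeline for the fine-tuned mT5 checkpoint named in this card.
summarizer = pipeline("summarization", model="kclayto1/mt5-small-finetuned-amazon-en-es")

# Illustrative review text; the model name suggests Amazon reviews in English/Spanish.
review = (
    "I bought this coffee grinder last month and it has worked perfectly every morning. "
    "It is quiet, easy to clean, and the grind size is consistent."
)
print(summarizer(review, max_length=30, min_length=5)[0]["summary_text"])
```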
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5.6e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: adamw_torch with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 8
### Training results
| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum |
|:-------------:|:-----:|:-----:|:---------------:|:-------:|:------:|:-------:|:---------:|
| 5.97 | 1.0 | 2202 | 3.2260 | 9.8577 | 3.6445 | 9.7485 | 9.7848 |
| 3.647 | 2.0 | 4404 | 3.0734 | 11.4598 | 4.3082 | 11.3677 | 11.4138 |
| 3.3891 | 3.0 | 6606 | 3.0070 | 12.0211 | 4.3801 | 11.7115 | 11.7438 |
| 3.2655 | 4.0 | 8808 | 2.9878 | 11.8522 | 4.3727 | 11.6773 | 11.7271 |
| 3.1735 | 5.0 | 11010 | 2.9825 | 12.6673 | 4.3875 | 12.5586 | 12.6202 |
| 3.1156 | 6.0 | 13212 | 2.9798 | 13.0236 | 5.2414 | 12.8478 | 12.9195 |
| 3.0672 | 7.0 | 15414 | 2.9771 | 12.6105 | 4.8806 | 12.5344 | 12.545 |
| 3.044 | 8.0 | 17616 | 2.9719 | 12.8375 | 4.873 | 12.7298 | 12.7317 |
### Framework versions
- Transformers 4.46.2
- Pytorch 2.5.1+cu121
- Datasets 3.1.0
- Tokenizers 0.20.3
|
gokulsrinivasagan/distilbert_base_lda_qnli
|
gokulsrinivasagan
| 2024-12-04T16:50:53Z | 119 | 0 |
transformers
|
[
"transformers",
"tensorboard",
"safetensors",
"distilbert",
"text-classification",
"generated_from_trainer",
"en",
"dataset:glue",
"base_model:gokulsrinivasagan/distilbert_base_lda",
"base_model:finetune:gokulsrinivasagan/distilbert_base_lda",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-classification
| 2024-12-04T16:37:55Z |
---
library_name: transformers
language:
- en
base_model: gokulsrinivasagan/distilbert_base_lda
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- accuracy
model-index:
- name: distilbert_base_lda_qnli
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: GLUE QNLI
type: glue
args: qnli
metrics:
- name: Accuracy
type: accuracy
value: 0.8228079809628409
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert_base_lda_qnli
This model is a fine-tuned version of [gokulsrinivasagan/distilbert_base_lda](https://huggingface.co/gokulsrinivasagan/distilbert_base_lda) on the GLUE QNLI dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3940
- Accuracy: 0.8228
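QNLI pairs a question with a candidate answer sentence, so inputs go in as a pair; a minimal sketch is below. The returned label names depend on the checkpoint's config (they may simply be `LABEL_0`/`LABEL_1`), and the example pair is illustrative.
```python
from transformers import pipeline

# QNLI is a question/sentence entailment task, so the two texts are passed as a pair.
clf = pipeline("text-classification", model="gokulsrinivasagan/distilbert_base_lda_qnli")

print(clf({"text": "Where is the Eiffel Tower located?",
           "text_pair": "The Eiffel Tower is a wrought-iron tower in Paris, France."}))
```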
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 256
- eval_batch_size: 256
- seed: 10
- optimizer: adamw_torch with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 50
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.4991 | 1.0 | 410 | 0.4308 | 0.8034 |
| 0.3844 | 2.0 | 820 | 0.3940 | 0.8228 |
| 0.3017 | 3.0 | 1230 | 0.4049 | 0.8309 |
| 0.2254 | 4.0 | 1640 | 0.5099 | 0.8023 |
| 0.164 | 5.0 | 2050 | 0.5351 | 0.8083 |
| 0.1187 | 6.0 | 2460 | 0.6020 | 0.8148 |
| 0.0911 | 7.0 | 2870 | 0.6820 | 0.8144 |
### Framework versions
- Transformers 4.46.3
- Pytorch 2.2.1+cu118
- Datasets 2.17.0
- Tokenizers 0.20.3
|
GigoFof/llama-3.2-3b-lv2-CSP-RU-ChatBot
|
GigoFof
| 2024-12-04T16:50:07Z | 113 | 0 |
transformers
|
[
"transformers",
"safetensors",
"llama",
"text-generation",
"text-classification",
"ru",
"en",
"base_model:meta-llama/Llama-3.2-3B-Instruct",
"base_model:finetune:meta-llama/Llama-3.2-3B-Instruct",
"license:mit",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-classification
| 2024-12-04T10:52:31Z |
---
library_name: transformers
license: mit
language:
- ru
- en
base_model:
- meta-llama/Llama-3.2-3B-Instruct
pipeline_tag: text-classification
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
With this model you can talk about cultural values.
- **Developed by:** Artamonov K.A.
- **Model type:** Textual
- **Language(s) (NLP):** Russian, English
- **License:** MIT
- **Finetuned from model [optional]:** Llama-3.2-3B-Instruct
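The card's metadata lists `text-classification`, but the base model is an instruction-tuned causal LM, so the hedged sketch below assumes chat-style text generation; the prompt (asking about Russian cultural values) is illustrative.
```python
from transformers import pipeline

# Chat-style generation is assumed here, since the base model is Llama-3.2-3B-Instruct.
chat = pipeline("text-generation", model="GigoFof/llama-3.2-3b-lv2-CSP-RU-ChatBot")

# "Tell me about Russian cultural values." (illustrative prompt)
messages = [{"role": "user", "content": "Расскажи о культурных ценностях России."}]
result = chat(messages, max_new_tokens=128)
print(result[0]["generated_text"][-1]["content"])  # assistant reply
```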
|
Dawid511/speecht5_finetuned_librispeech_polish
|
Dawid511
| 2024-12-04T16:46:20Z | 75 | 0 |
transformers
|
[
"transformers",
"tensorboard",
"safetensors",
"speecht5",
"text-to-audio",
"generated_from_trainer",
"base_model:microsoft/speecht5_tts",
"base_model:finetune:microsoft/speecht5_tts",
"license:mit",
"endpoints_compatible",
"region:us"
] |
text-to-audio
| 2024-12-04T16:06:28Z |
---
library_name: transformers
license: mit
base_model: microsoft/speecht5_tts
tags:
- generated_from_trainer
model-index:
- name: speecht5_finetuned_librispeech_polish
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# speecht5_finetuned_librispeech_polish
This model is a fine-tuned version of [microsoft/speecht5_tts](https://huggingface.co/microsoft/speecht5_tts) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3712
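A hedged inference sketch for this SpeechT5 checkpoint, following the base model's documented usage: the HiFi-GAN vocoder and the CMU ARCTIC x-vector speaker embedding are assumptions borrowed from the `microsoft/speecht5_tts` examples, and the Polish sentence is illustrative.
```python
import torch
import soundfile as sf
from datasets import load_dataset
from transformers import SpeechT5Processor, SpeechT5ForTextToSpeech, SpeechT5HifiGan

repo = "Dawid511/speecht5_finetuned_librispeech_polish"
processor = SpeechT5Processor.from_pretrained(repo)
model = SpeechT5ForTextToSpeech.from_pretrained(repo)
vocoder = SpeechT5HifiGan.from_pretrained("microsoft/speecht5_hifigan")  # assumed vocoder

# SpeechT5 needs a speaker embedding; the x-vector dataset below is the usual stand-in.
embeddings = load_dataset("Matthijs/cmu-arctic-xvectors", split="validation")
speaker_embeddings = torch.tensor(embeddings[7306]["xvector"]).unsqueeze(0)

inputs = processor(text="Dzień dobry, jak się masz?", return_tensors="pt")
speech = model.generate_speech(inputs["input_ids"], speaker_embeddings, vocoder=vocoder)
sf.write("speech.wav", speech.numpy(), samplerate=16000)
```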
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 4
- eval_batch_size: 2
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 32
- optimizer: adamw_torch with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 100
- training_steps: 500
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| 0.4598 | 1.3851 | 100 | 0.4189 |
| 0.4308 | 2.7703 | 200 | 0.3925 |
| 0.4123 | 4.1589 | 300 | 0.3818 |
| 0.4004 | 5.5440 | 400 | 0.3754 |
| 0.3973 | 6.9292 | 500 | 0.3712 |
### Framework versions
- Transformers 4.46.2
- Pytorch 2.5.1+cu121
- Datasets 3.1.0
- Tokenizers 0.20.3
|
Muadil/t5-large_sum_DPO_25k_8_1ep
|
Muadil
| 2024-12-04T16:44:15Z | 105 | 0 |
transformers
|
[
"transformers",
"safetensors",
"t5",
"text2text-generation",
"arxiv:1910.09700",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text2text-generation
| 2024-12-04T16:43:21Z |
---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
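In lieu of the missing snippet, a minimal hedged sketch: the repository name comes from this card, and the summarize-style prompt is an assumption based on the `sum` in the model name.
```python
from transformers import pipeline

# text2text-generation pipeline for the T5 checkpoint named in this card.
generator = pipeline("text2text-generation", model="Muadil/t5-large_sum_DPO_25k_8_1ep")

# The "sum" in the model name suggests summarization, so a summarize-style prompt is assumed.
text = "summarize: The committee met for three hours and agreed to postpone the vote until next quarter."
print(generator(text, max_new_tokens=40)[0]["generated_text"])
```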
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
|
gokulsrinivasagan/bert-base-uncased_mrpc
|
gokulsrinivasagan
| 2024-12-04T16:42:31Z | 119 | 0 |
transformers
|
[
"transformers",
"tensorboard",
"safetensors",
"bert",
"text-classification",
"generated_from_trainer",
"en",
"dataset:glue",
"base_model:google-bert/bert-base-uncased",
"base_model:finetune:google-bert/bert-base-uncased",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-classification
| 2024-11-22T04:55:47Z |
---
library_name: transformers
language:
- en
license: apache-2.0
base_model: google-bert/bert-base-uncased
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- accuracy
- f1
model-index:
- name: bert-base-uncased_mrpc
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: GLUE MRPC
type: glue
args: mrpc
metrics:
- name: Accuracy
type: accuracy
value: 0.7867647058823529
- name: F1
type: f1
value: 0.8481675392670157
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased_mrpc
This model is a fine-tuned version of [google-bert/bert-base-uncased](https://huggingface.co/google-bert/bert-base-uncased) on the GLUE MRPC dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4455
- Accuracy: 0.7868
- F1: 0.8482
- Combined Score: 0.8175
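MRPC is a sentence-pair paraphrase task, so a minimal usage sketch passes the two sentences as a pair; the label names depend on the checkpoint config and the example pair is illustrative.
```python
from transformers import pipeline

# Paraphrase classification on sentence pairs, as in GLUE MRPC.
clf = pipeline("text-classification", model="gokulsrinivasagan/bert-base-uncased_mrpc")

print(clf({"text": "The company posted strong earnings this quarter.",
           "text_pair": "Quarterly profits at the firm came in well above expectations."}))
```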
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 256
- eval_batch_size: 256
- seed: 10
- optimizer: adamw_torch with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 50
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | Combined Score |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|:--------------:|
| 0.5933 | 1.0 | 15 | 0.5066 | 0.7745 | 0.8351 | 0.8048 |
| 0.4605 | 2.0 | 30 | 0.4455 | 0.7868 | 0.8482 | 0.8175 |
| 0.31 | 3.0 | 45 | 0.5169 | 0.8162 | 0.8777 | 0.8469 |
| 0.1871 | 4.0 | 60 | 0.4473 | 0.8407 | 0.8862 | 0.8634 |
| 0.1453 | 5.0 | 75 | 0.5061 | 0.8235 | 0.8672 | 0.8453 |
| 0.0963 | 6.0 | 90 | 0.5724 | 0.8284 | 0.8797 | 0.8541 |
| 0.0515 | 7.0 | 105 | 0.7238 | 0.8333 | 0.8863 | 0.8598 |
### Framework versions
- Transformers 4.46.3
- Pytorch 2.2.1+cu118
- Datasets 2.17.0
- Tokenizers 0.20.3
|
gokulsrinivasagan/bert_uncased_L-2_H-256_A-4_mrpc
|
gokulsrinivasagan
| 2024-12-04T16:40:46Z | 113 | 0 |
transformers
|
[
"transformers",
"tensorboard",
"safetensors",
"bert",
"text-classification",
"generated_from_trainer",
"en",
"dataset:glue",
"base_model:google/bert_uncased_L-2_H-256_A-4",
"base_model:finetune:google/bert_uncased_L-2_H-256_A-4",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-classification
| 2024-12-04T16:40:02Z |
---
library_name: transformers
language:
- en
license: apache-2.0
base_model: google/bert_uncased_L-2_H-256_A-4
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- accuracy
- f1
model-index:
- name: bert_uncased_L-2_H-256_A-4_mrpc
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: GLUE MRPC
type: glue
args: mrpc
metrics:
- name: Accuracy
type: accuracy
value: 0.7475490196078431
- name: F1
type: f1
value: 0.835725677830941
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert_uncased_L-2_H-256_A-4_mrpc
This model is a fine-tuned version of [google/bert_uncased_L-2_H-256_A-4](https://huggingface.co/google/bert_uncased_L-2_H-256_A-4) on the GLUE MRPC dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5344
- Accuracy: 0.7475
- F1: 0.8357
- Combined Score: 0.7916
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 256
- eval_batch_size: 256
- seed: 10
- optimizer: adamw_torch with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 50
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | Combined Score |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|:--------------:|
| 0.619 | 1.0 | 15 | 0.5956 | 0.6887 | 0.8146 | 0.7517 |
| 0.5893 | 2.0 | 30 | 0.5835 | 0.7010 | 0.8179 | 0.7594 |
| 0.5612 | 3.0 | 45 | 0.5597 | 0.7059 | 0.8171 | 0.7615 |
| 0.5397 | 4.0 | 60 | 0.5398 | 0.7377 | 0.8320 | 0.7849 |
| 0.5063 | 5.0 | 75 | 0.5358 | 0.7426 | 0.8336 | 0.7881 |
| 0.476 | 6.0 | 90 | 0.5344 | 0.7475 | 0.8357 | 0.7916 |
| 0.4361 | 7.0 | 105 | 0.5515 | 0.7451 | 0.8349 | 0.7900 |
| 0.4014 | 8.0 | 120 | 0.5508 | 0.75 | 0.8365 | 0.7933 |
| 0.3684 | 9.0 | 135 | 0.5901 | 0.7304 | 0.8254 | 0.7779 |
| 0.3396 | 10.0 | 150 | 0.5755 | 0.7426 | 0.8276 | 0.7851 |
| 0.3061 | 11.0 | 165 | 0.5943 | 0.75 | 0.8317 | 0.7908 |
### Framework versions
- Transformers 4.46.3
- Pytorch 2.2.1+cu118
- Datasets 2.17.0
- Tokenizers 0.20.3
|
gokulsrinivasagan/bert_uncased_L-2_H-256_A-4_cola
|
gokulsrinivasagan
| 2024-12-04T16:39:43Z | 123 | 0 |
transformers
|
[
"transformers",
"tensorboard",
"safetensors",
"bert",
"text-classification",
"generated_from_trainer",
"en",
"dataset:glue",
"base_model:google/bert_uncased_L-2_H-256_A-4",
"base_model:finetune:google/bert_uncased_L-2_H-256_A-4",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-classification
| 2024-12-04T16:38:47Z |
---
library_name: transformers
language:
- en
license: apache-2.0
base_model: google/bert_uncased_L-2_H-256_A-4
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
- accuracy
model-index:
- name: bert_uncased_L-2_H-256_A-4_cola
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: GLUE COLA
type: glue
args: cola
metrics:
- name: Matthews Correlation
type: matthews_correlation
value: 0.0
- name: Accuracy
type: accuracy
value: 0.6912751793861389
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert_uncased_L-2_H-256_A-4_cola
This model is a fine-tuned version of [google/bert_uncased_L-2_H-256_A-4](https://huggingface.co/google/bert_uncased_L-2_H-256_A-4) on the GLUE COLA dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6078
- Matthews Correlation: 0.0
- Accuracy: 0.6913
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 256
- eval_batch_size: 256
- seed: 10
- optimizer: adamw_torch with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 50
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|:--------:|
| 0.6144 | 1.0 | 34 | 0.6150 | 0.0 | 0.6913 |
| 0.6036 | 2.0 | 68 | 0.6160 | 0.0 | 0.6913 |
| 0.5981 | 3.0 | 102 | 0.6136 | 0.0 | 0.6913 |
| 0.5876 | 4.0 | 136 | 0.6078 | 0.0 | 0.6913 |
| 0.5762 | 5.0 | 170 | 0.6084 | 0.0549 | 0.6913 |
| 0.5546 | 6.0 | 204 | 0.6115 | 0.1563 | 0.6980 |
| 0.533 | 7.0 | 238 | 0.6301 | 0.1339 | 0.6961 |
| 0.5116 | 8.0 | 272 | 0.6459 | 0.1041 | 0.6846 |
| 0.4936 | 9.0 | 306 | 0.6675 | 0.1149 | 0.6894 |
### Framework versions
- Transformers 4.46.3
- Pytorch 2.2.1+cu118
- Datasets 2.17.0
- Tokenizers 0.20.3
|
mradermacher/Mistral-7b-Sumi-v0.2-GGUF
|
mradermacher
| 2024-12-04T16:38:55Z | 73 | 0 |
transformers
|
[
"transformers",
"gguf",
"unsloth",
"en",
"base_model:Oisu/Mistral-7b-Sumi-v0.2",
"base_model:quantized:Oisu/Mistral-7b-Sumi-v0.2",
"endpoints_compatible",
"region:us"
] | null | 2024-12-04T14:56:14Z |
---
base_model: Oisu/Mistral-7b-Sumi-v0.2
language:
- en
library_name: transformers
quantized_by: mradermacher
tags:
- unsloth
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: nicoboss -->
static quants of https://huggingface.co/Oisu/Mistral-7b-Sumi-v0.2
<!-- provided-files -->
weighted/imatrix quants seem not to be available (by me) at this time. If they do not show up a week or so after the static ones, I have probably not planned for them. Feel free to request them by opening a Community Discussion.
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including on how to concatenate multi-part files.
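As one concrete (but not the only) route, the sketch below assumes `llama-cpp-python` and a locally downloaded Q4_K_M file from the table further down.
```python
from llama_cpp import Llama

# Load one of the GGUF files from this repo (download it first, e.g. with huggingface-cli).
llm = Llama(
    model_path="Mistral-7b-Sumi-v0.2.Q4_K_M.gguf",  # file name from the quant table below
    n_ctx=4096,  # context window; adjust to available RAM
)

output = llm("Write a haiku about quantization.", max_tokens=64)
print(output["choices"][0]["text"])
```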
## Provided Quants
(sorted by size, not necessarily quality; IQ-quants are often preferable to similar-sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/Mistral-7b-Sumi-v0.2-GGUF/resolve/main/Mistral-7b-Sumi-v0.2.Q2_K.gguf) | Q2_K | 2.8 | |
| [GGUF](https://huggingface.co/mradermacher/Mistral-7b-Sumi-v0.2-GGUF/resolve/main/Mistral-7b-Sumi-v0.2.Q3_K_S.gguf) | Q3_K_S | 3.3 | |
| [GGUF](https://huggingface.co/mradermacher/Mistral-7b-Sumi-v0.2-GGUF/resolve/main/Mistral-7b-Sumi-v0.2.Q3_K_M.gguf) | Q3_K_M | 3.6 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/Mistral-7b-Sumi-v0.2-GGUF/resolve/main/Mistral-7b-Sumi-v0.2.Q3_K_L.gguf) | Q3_K_L | 3.9 | |
| [GGUF](https://huggingface.co/mradermacher/Mistral-7b-Sumi-v0.2-GGUF/resolve/main/Mistral-7b-Sumi-v0.2.IQ4_XS.gguf) | IQ4_XS | 4.0 | |
| [GGUF](https://huggingface.co/mradermacher/Mistral-7b-Sumi-v0.2-GGUF/resolve/main/Mistral-7b-Sumi-v0.2.Q4_0_4_4.gguf) | Q4_0_4_4 | 4.2 | fast on arm, low quality |
| [GGUF](https://huggingface.co/mradermacher/Mistral-7b-Sumi-v0.2-GGUF/resolve/main/Mistral-7b-Sumi-v0.2.Q4_K_S.gguf) | Q4_K_S | 4.2 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/Mistral-7b-Sumi-v0.2-GGUF/resolve/main/Mistral-7b-Sumi-v0.2.Q4_K_M.gguf) | Q4_K_M | 4.5 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/Mistral-7b-Sumi-v0.2-GGUF/resolve/main/Mistral-7b-Sumi-v0.2.Q5_K_S.gguf) | Q5_K_S | 5.1 | |
| [GGUF](https://huggingface.co/mradermacher/Mistral-7b-Sumi-v0.2-GGUF/resolve/main/Mistral-7b-Sumi-v0.2.Q5_K_M.gguf) | Q5_K_M | 5.2 | |
| [GGUF](https://huggingface.co/mradermacher/Mistral-7b-Sumi-v0.2-GGUF/resolve/main/Mistral-7b-Sumi-v0.2.Q6_K.gguf) | Q6_K | 6.0 | very good quality |
| [GGUF](https://huggingface.co/mradermacher/Mistral-7b-Sumi-v0.2-GGUF/resolve/main/Mistral-7b-Sumi-v0.2.Q8_0.gguf) | Q8_0 | 7.8 | fast, best quality |
| [GGUF](https://huggingface.co/mradermacher/Mistral-7b-Sumi-v0.2-GGUF/resolve/main/Mistral-7b-Sumi-v0.2.f16.gguf) | f16 | 14.6 | 16 bpw, overkill |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.
<!-- end -->
|
gokulsrinivasagan/bert_uncased_L-4_H-256_A-4_mrpc
|
gokulsrinivasagan
| 2024-12-04T16:38:54Z | 110 | 0 |
transformers
|
[
"transformers",
"tensorboard",
"safetensors",
"bert",
"text-classification",
"generated_from_trainer",
"en",
"dataset:glue",
"base_model:google/bert_uncased_L-4_H-256_A-4",
"base_model:finetune:google/bert_uncased_L-4_H-256_A-4",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-classification
| 2024-12-04T16:38:03Z |
---
library_name: transformers
language:
- en
license: apache-2.0
base_model: google/bert_uncased_L-4_H-256_A-4
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- accuracy
- f1
model-index:
- name: bert_uncased_L-4_H-256_A-4_mrpc
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: GLUE MRPC
type: glue
args: mrpc
metrics:
- name: Accuracy
type: accuracy
value: 0.7720588235294118
- name: F1
type: f1
value: 0.8393782383419689
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert_uncased_L-4_H-256_A-4_mrpc
This model is a fine-tuned version of [google/bert_uncased_L-4_H-256_A-4](https://huggingface.co/google/bert_uncased_L-4_H-256_A-4) on the GLUE MRPC dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5071
- Accuracy: 0.7721
- F1: 0.8394
- Combined Score: 0.8057
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 256
- eval_batch_size: 256
- seed: 10
- optimizer: adamw_torch with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 50
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | Combined Score |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|:--------------:|
| 0.6375 | 1.0 | 15 | 0.6024 | 0.6936 | 0.8170 | 0.7553 |
| 0.594 | 2.0 | 30 | 0.5776 | 0.6985 | 0.8167 | 0.7576 |
| 0.5504 | 3.0 | 45 | 0.5475 | 0.7279 | 0.8274 | 0.7777 |
| 0.5155 | 4.0 | 60 | 0.5083 | 0.7598 | 0.8345 | 0.7971 |
| 0.4668 | 5.0 | 75 | 0.5116 | 0.7598 | 0.8345 | 0.7971 |
| 0.4292 | 6.0 | 90 | 0.5237 | 0.7696 | 0.8433 | 0.8065 |
| 0.3859 | 7.0 | 105 | 0.5071 | 0.7721 | 0.8394 | 0.8057 |
| 0.3455 | 8.0 | 120 | 0.5300 | 0.7721 | 0.8426 | 0.8073 |
| 0.3049 | 9.0 | 135 | 0.5408 | 0.7721 | 0.8410 | 0.8065 |
| 0.2735 | 10.0 | 150 | 0.5337 | 0.7745 | 0.8425 | 0.8085 |
| 0.2454 | 11.0 | 165 | 0.5962 | 0.7647 | 0.84 | 0.8024 |
| 0.2117 | 12.0 | 180 | 0.5756 | 0.7794 | 0.8469 | 0.8132 |
### Framework versions
- Transformers 4.46.3
- Pytorch 2.2.1+cu118
- Datasets 2.17.0
- Tokenizers 0.20.3
|
gokulsrinivasagan/bert-base-uncased_cola
|
gokulsrinivasagan
| 2024-12-04T16:38:43Z | 108 | 0 |
transformers
|
[
"transformers",
"tensorboard",
"safetensors",
"bert",
"text-classification",
"generated_from_trainer",
"en",
"dataset:glue",
"base_model:google-bert/bert-base-uncased",
"base_model:finetune:google-bert/bert-base-uncased",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-classification
| 2024-11-22T04:48:21Z |
---
library_name: transformers
language:
- en
license: apache-2.0
base_model: google-bert/bert-base-uncased
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
- accuracy
model-index:
- name: bert-base-uncased_cola
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: GLUE COLA
type: glue
args: cola
metrics:
- name: Matthews Correlation
type: matthews_correlation
value: 0.5683668297227801
- name: Accuracy
type: accuracy
value: 0.8245446085929871
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased_cola
This model is a fine-tuned version of [google-bert/bert-base-uncased](https://huggingface.co/google-bert/bert-base-uncased) on the GLUE COLA dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4191
- Matthews Correlation: 0.5684
- Accuracy: 0.8245
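CoLA is single-sentence acceptability classification; a minimal hedged sketch follows, with illustrative sentences and checkpoint-dependent label names.
```python
from transformers import pipeline

# Grammatical-acceptability classification, as in GLUE CoLA.
clf = pipeline("text-classification", model="gokulsrinivasagan/bert-base-uncased_cola")

for sentence in ["The book was read by the student.", "The book readed student the by."]:
    print(sentence, "->", clf(sentence))
```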
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 256
- eval_batch_size: 256
- seed: 10
- optimizer: adamw_torch with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 50
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|:--------:|
| 0.5311 | 1.0 | 34 | 0.4443 | 0.5154 | 0.8035 |
| 0.3306 | 2.0 | 68 | 0.4191 | 0.5684 | 0.8245 |
| 0.2104 | 3.0 | 102 | 0.5792 | 0.5730 | 0.8265 |
| 0.1325 | 4.0 | 136 | 0.5178 | 0.5883 | 0.8322 |
| 0.0962 | 5.0 | 170 | 0.6488 | 0.5779 | 0.8274 |
| 0.0751 | 6.0 | 204 | 0.7336 | 0.5449 | 0.8159 |
| 0.0685 | 7.0 | 238 | 0.7193 | 0.5650 | 0.8236 |
### Framework versions
- Transformers 4.46.3
- Pytorch 2.2.1+cu118
- Datasets 2.17.0
- Tokenizers 0.20.3
|
ELVISIO/jina_embeddings_v3_finetuned_online_contrastive_01
|
ELVISIO
| 2024-12-04T16:37:48Z | 5 | 0 |
sentence-transformers
|
[
"sentence-transformers",
"safetensors",
"xlm-roberta",
"sentence-similarity",
"feature-extraction",
"generated_from_trainer",
"dataset_size:10000",
"loss:OnlineContrastiveLoss",
"custom_code",
"arxiv:1908.10084",
"base_model:jinaai/jina-embeddings-v3",
"base_model:finetune:jinaai/jina-embeddings-v3",
"autotrain_compatible",
"text-embeddings-inference",
"endpoints_compatible",
"region:us"
] |
sentence-similarity
| 2024-12-02T05:10:41Z |
---
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- generated_from_trainer
- dataset_size:10000
- loss:OnlineContrastiveLoss
base_model: jinaai/jina-embeddings-v3
widget:
- source_sentence: i be try to picture the pitch for dark angel . i be think matrix
and i be think bladerunner and i be think that chick that play faith in angel
and wear shiny black leather or some chick just like her and leave that one with
u . only get this . we will do it without any plot and dialogue and character
and decent action or budget and just some loud bang and a hot chick in shiny black
leather straddle a big throbbing bike . fanboys dig loud bang and hot chick in
shiny black leather straddle big throbbing bike and right . flashy and shallow
and dreary and formulaic and passionless and tedious and dull and dumb and humourless
and desultory and barely competent . live action anime without any action and
or indeed any life . sf just the way joe fanboy like it and in fact . negative
.
sentences:
- This is a semantically positive review.
- This is a semantically negative review.
- This is a semantically positive review.
- source_sentence: despite the high rating give to this film by imdb user and this
be nothing more than your typical girl with a bad childhood obsessively stalks
married man film . the attractive justine priestly brief nude scene may attract
voyeur and but the film be hackneyed tripe . half out of .
sentences:
- This is a semantically positive review.
- This is a semantically positive review.
- This is a semantically positive review.
- source_sentence: this movie portray ruth a a womanizing and hard drinking and gambling
and overeat sport figure with a little baseball thrown in . babe ruth early life
be quite interesting and this be for all intent and purpose be omit in this film
. also and lou gehrig be barely cover and this be a well know relationship and
good bad or indifferent and it should have be cover well than it be . his life
be more than all bad . he be an american hero and an icon that a lot of baseball
great pattern their life after . i feel that i be be fair to the memory of a great
baseball player that this film completely ignore . shame on the maker of this
film for capitalize on his fault and not his greatness .
sentences:
- This is a semantically positive review.
- This is a semantically negative review.
- This is a semantically positive review.
- source_sentence: the silent one panel cartoon henry come to fleischer studio and
bill a the world funny human in this dull little cartoon . betty and long past
her prime and thanks to the production code and be run a pet shop and leave henry
in charge for far too long five minute . a bore .
sentences:
- This is a semantically positive review.
- This is a semantically negative review.
- This is a semantically negative review.
- source_sentence: zu warrior most definitely should have be an animated series because
a a movie it like watch an old anime on acid . the movie just start out of nowhere
and people just fly around fight with metal wing and other stupid weapon until
this princess sacrifice herself for her lover on a cloud or something . whether
this princess be a god or an angel be beyond me but soon enough this fly wind
bad guy come in and kill her while the guy with the razor wing fight some other
mystical god or demon or wizard thing . the plot line be either not there or extremely
hard to follow you need to be insanely intelligent to get this movie . the plot
soon follow this chinese mortal who be call upon by this god to fight the evil
flying and princess kill bad guy and soon we have a very badly choreograph uwe
boll like fight scene complete with terrible martial art on a mountain or something
. even the visuals be weird some might say they be stun and colorful but i be
go to say they be blurry and acid trip like ( yes that a word . ) . i watch it
both dub and with subtitle and both be equally bad and hard to understand . who
be i kidding i do not understand it at all . it felt like i be watch episode 30
of some 1980 anime and completely miss how the story begin or like i start read
a comic series of 5 at number 4 because i have no clue how this thing start where
it be go or how it would end i be lose the entire time . i can honestly say this
be one of the bad film experience ever it be like watch inu yasha at episode 134
drunk . yeah that right you do not know what the hell be go on . don not waste
your brain try to figure this out .
sentences:
- This is a semantically positive review.
- This is a semantically negative review.
- This is a semantically positive review.
pipeline_tag: sentence-similarity
library_name: sentence-transformers
---
# SentenceTransformer based on jinaai/jina-embeddings-v3
This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [jinaai/jina-embeddings-v3](https://huggingface.co/jinaai/jina-embeddings-v3). It maps sentences & paragraphs to a 1024-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
## Model Details
### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [jinaai/jina-embeddings-v3](https://huggingface.co/jinaai/jina-embeddings-v3) <!-- at revision 30996fea06f69ecd8382ee4f11e29acaf6b5405e -->
- **Maximum Sequence Length:** 8194 tokens
- **Output Dimensionality:** 1024 dimensions
- **Similarity Function:** Cosine Similarity
<!-- - **Training Dataset:** Unknown -->
<!-- - **Language:** Unknown -->
<!-- - **License:** Unknown -->
### Model Sources
- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)
### Full Model Architecture
```
SentenceTransformer(
(transformer): Transformer(
(auto_model): XLMRobertaLoRA(
(roberta): XLMRobertaModel(
(embeddings): XLMRobertaEmbeddings(
(word_embeddings): ParametrizedEmbedding(
250002, 1024, padding_idx=1
(parametrizations): ModuleDict(
(weight): ParametrizationList(
(0): LoRAParametrization()
)
)
)
(token_type_embeddings): ParametrizedEmbedding(
1, 1024
(parametrizations): ModuleDict(
(weight): ParametrizationList(
(0): LoRAParametrization()
)
)
)
)
(emb_drop): Dropout(p=0.1, inplace=False)
(emb_ln): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
(encoder): XLMRobertaEncoder(
(layers): ModuleList(
(0-23): 24 x Block(
(mixer): MHA(
(rotary_emb): RotaryEmbedding()
(Wqkv): ParametrizedLinearResidual(
in_features=1024, out_features=3072, bias=True
(parametrizations): ModuleDict(
(weight): ParametrizationList(
(0): LoRAParametrization()
)
)
)
(inner_attn): FlashSelfAttention(
(drop): Dropout(p=0.1, inplace=False)
)
(inner_cross_attn): FlashCrossAttention(
(drop): Dropout(p=0.1, inplace=False)
)
(out_proj): ParametrizedLinear(
in_features=1024, out_features=1024, bias=True
(parametrizations): ModuleDict(
(weight): ParametrizationList(
(0): LoRAParametrization()
)
)
)
)
(dropout1): Dropout(p=0.1, inplace=False)
(drop_path1): StochasticDepth(p=0.0, mode=row)
(norm1): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
(mlp): Mlp(
(fc1): ParametrizedLinear(
in_features=1024, out_features=4096, bias=True
(parametrizations): ModuleDict(
(weight): ParametrizationList(
(0): LoRAParametrization()
)
)
)
(fc2): ParametrizedLinear(
in_features=4096, out_features=1024, bias=True
(parametrizations): ModuleDict(
(weight): ParametrizationList(
(0): LoRAParametrization()
)
)
)
)
(dropout2): Dropout(p=0.1, inplace=False)
(drop_path2): StochasticDepth(p=0.0, mode=row)
(norm2): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
)
)
)
(pooler): XLMRobertaPooler(
(dense): ParametrizedLinear(
in_features=1024, out_features=1024, bias=True
(parametrizations): ModuleDict(
(weight): ParametrizationList(
(0): LoRAParametrization()
)
)
)
(activation): Tanh()
)
)
)
)
(pooler): Pooling({'word_embedding_dimension': 1024, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(normalizer): Normalize()
)
```
## Usage
### Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
```bash
pip install -U sentence-transformers
```
Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer
# Download from the 🤗 Hub
model = SentenceTransformer("ELVISIO/jina_embeddings_v3_finetuned_online_contrastive_01", trust_remote_code=True, model_kwargs={'default_task': 'classification'})
# Run inference
sentences = [
'zu warrior most definitely should have be an animated series because a a movie it like watch an old anime on acid . the movie just start out of nowhere and people just fly around fight with metal wing and other stupid weapon until this princess sacrifice herself for her lover on a cloud or something . whether this princess be a god or an angel be beyond me but soon enough this fly wind bad guy come in and kill her while the guy with the razor wing fight some other mystical god or demon or wizard thing . the plot line be either not there or extremely hard to follow you need to be insanely intelligent to get this movie . the plot soon follow this chinese mortal who be call upon by this god to fight the evil flying and princess kill bad guy and soon we have a very badly choreograph uwe boll like fight scene complete with terrible martial art on a mountain or something . even the visuals be weird some might say they be stun and colorful but i be go to say they be blurry and acid trip like ( yes that a word . ) . i watch it both dub and with subtitle and both be equally bad and hard to understand . who be i kidding i do not understand it at all . it felt like i be watch episode 30 of some 1980 anime and completely miss how the story begin or like i start read a comic series of 5 at number 4 because i have no clue how this thing start where it be go or how it would end i be lose the entire time . i can honestly say this be one of the bad film experience ever it be like watch inu yasha at episode 134 drunk . yeah that right you do not know what the hell be go on . don not waste your brain try to figure this out .',
'This is a semantically negative review.',
'This is a semantically positive review.',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 1024]
# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```
<!--
### Direct Usage (Transformers)
<details><summary>Click to see the direct usage in Transformers</summary>
</details>
-->
<!--
### Downstream Usage (Sentence Transformers)
You can finetune this model on your own dataset.
<details><summary>Click to expand</summary>
</details>
-->
<!--
### Out-of-Scope Use
*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->
<!--
## Bias, Risks and Limitations
*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->
<!--
### Recommendations
*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->
## Training Details
### Training Dataset
#### Unnamed Dataset
* Size: 10000 training samples
* Columns: <code>sentence1</code>, <code>sentence2</code>, and <code>label</code>
* Approximate statistics based on the first 1000 samples:
| | sentence1 | sentence2 | label |
|:--------|:--------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|:--------------------------------------------------------------|
| type | string | string | float |
| details | <ul><li>min: 19 tokens</li><li>mean: 300.92 tokens</li><li>max: 1415 tokens</li></ul> | <ul><li>min: 11 tokens</li><li>mean: 11.0 tokens</li><li>max: 11 tokens</li></ul> | <ul><li>min: 0.0</li><li>mean: 0.5</li><li>max: 1.0</li></ul> |
* Samples:
| sentence1 | sentence2 | label |
|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:-----------------------------------------------------|:-----------------|
| <code>i rent i be curious yellow from my video store because of all the controversy that surround it when it be first release in 1967. i also hear that at first it be seize by u. s. custom if it ever try to enter this country and therefore be a fan of film consider controversial i really have to see this for myself . the plot be center around a young swedish drama student name lena who want to learn everything she can about life . in particular she want to focus her attention to make some sort of documentary on what the average swede think about certain political issue such a the vietnam war and race issue in the united state . in between ask politician and ordinary denizen of stockholm about their opinion on politics and she have sex with her drama teacher and classmate and and marry men . what kill me about i be curious yellow be that 40 year ago and this be consider pornographic . really and the sex and nudity scene be few and far between and even then it not shot like some cheaply make porno . while my countryman mind find it shock and in reality sex and nudity be a major staple in swedish cinema . even ingmar bergman and arguably their answer to good old boy john ford and have sex scene in his film . i do commend the filmmaker for the fact that any sex show in the film be show for artistic purpose rather than just to shock people and make money to be show in pornographic theater in america . i be curious yellow be a good film for anyone want to study the meat and potato ( no pun intend ) of swedish cinema . but really and this film doesn not have much of a plot .</code> | <code>This is a semantically negative review.</code> | <code>1.0</code> |
| <code>i rent i be curious yellow from my video store because of all the controversy that surround it when it be first release in 1967. i also hear that at first it be seize by u. s. custom if it ever try to enter this country and therefore be a fan of film consider controversial i really have to see this for myself . the plot be center around a young swedish drama student name lena who want to learn everything she can about life . in particular she want to focus her attention to make some sort of documentary on what the average swede think about certain political issue such a the vietnam war and race issue in the united state . in between ask politician and ordinary denizen of stockholm about their opinion on politics and she have sex with her drama teacher and classmate and and marry men . what kill me about i be curious yellow be that 40 year ago and this be consider pornographic . really and the sex and nudity scene be few and far between and even then it not shot like some cheaply make porno . while my countryman mind find it shock and in reality sex and nudity be a major staple in swedish cinema . even ingmar bergman and arguably their answer to good old boy john ford and have sex scene in his film . i do commend the filmmaker for the fact that any sex show in the film be show for artistic purpose rather than just to shock people and make money to be show in pornographic theater in america . i be curious yellow be a good film for anyone want to study the meat and potato ( no pun intend ) of swedish cinema . but really and this film doesn not have much of a plot .</code> | <code>This is a semantically positive review.</code> | <code>0.0</code> |
| <code>i be curious represent yellow be a risible and pretentious steam pile . it doesn not matter what one political view be because this film can hardly be take seriously on any level . a for the claim that frontal male nudity be an automatic nc 17 and that isn not true . i have see r rat film with male nudity . grant and they only offer some fleeting view and but where be the r rat film with gap vulva and flap labium . nowhere and because they do not exist . the same go for those crappy cable show represent schlongs swing in the breeze but not a clitoris in sight . and those pretentious indie movie like the brown bunny and in which be treat to the site of vincent gallo throb johnson and but not a trace of pink visible on chloe sevigny . before cry ( or imply ) double standard in matter of nudity and the mentally obtuse should take into account one unavoidably obvious anatomical difference between men and woman represent there be no genitals on display when actresses appear nude and and the same can not be say for a man . in fact and you generally would not see female genitals in an american film in anything short of porn or explicit erotica . this allege double standard be less a double standard than an admittedly depressing ability to come to term culturally with the inside of woman body .</code> | <code>This is a semantically negative review.</code> | <code>1.0</code> |
* Loss: [<code>OnlineContrastiveLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#onlinecontrastiveloss)
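For reference, a minimal sketch of how an `OnlineContrastiveLoss` run is typically set up with the sentence-transformers v3 trainer; the toy pairs and the absence of custom training arguments are assumptions, not the author's actual script.
```python
from datasets import Dataset
from sentence_transformers import SentenceTransformer, SentenceTransformerTrainer, losses

# Base model from this card; jina-embeddings-v3 needs trust_remote_code, as in the usage block above.
model = SentenceTransformer("jinaai/jina-embeddings-v3", trust_remote_code=True)

# Toy stand-in for the (sentence1, sentence2, label) pairs described above; label 1.0 = similar.
train_dataset = Dataset.from_dict({
    "sentence1": ["a terrific, moving film", "a dull, lifeless slog"],
    "sentence2": ["This is a semantically positive review.",
                  "This is a semantically positive review."],
    "label": [1.0, 0.0],
})

loss = losses.OnlineContrastiveLoss(model)
trainer = SentenceTransformerTrainer(model=model, train_dataset=train_dataset, loss=loss)
trainer.train()
```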
### Training Hyperparameters
#### Non-Default Hyperparameters
- `per_device_train_batch_size`: 64
- `per_device_eval_batch_size`: 64
#### All Hyperparameters
<details><summary>Click to expand</summary>
- `overwrite_output_dir`: False
- `do_predict`: False
- `eval_strategy`: no
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 64
- `per_device_eval_batch_size`: 64
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 1
- `eval_accumulation_steps`: None
- `torch_empty_cache_steps`: None
- `learning_rate`: 5e-05
- `weight_decay`: 0.0
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 1.0
- `num_train_epochs`: 3.0
- `max_steps`: -1
- `lr_scheduler_type`: linear
- `lr_scheduler_kwargs`: {}
- `warmup_ratio`: 0.0
- `warmup_steps`: 0
- `log_level`: passive
- `log_level_replica`: warning
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `save_safetensors`: True
- `save_on_each_node`: False
- `save_only_model`: False
- `restore_callback_states_from_checkpoint`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 42
- `data_seed`: None
- `jit_mode_eval`: False
- `use_ipex`: False
- `bf16`: False
- `fp16`: False
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: None
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: False
- `dataloader_num_workers`: 0
- `dataloader_prefetch_factor`: None
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: False
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `ddp_find_unused_parameters`: None
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: False
- `resume_from_checkpoint`: None
- `hub_model_id`: None
- `hub_strategy`: every_save
- `hub_private_repo`: False
- `hub_always_push`: False
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`:
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `dispatch_batches`: None
- `split_batches`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `eval_on_start`: False
- `use_liger_kernel`: False
- `eval_use_gather_object`: False
- `batch_sampler`: batch_sampler
- `multi_dataset_batch_sampler`: proportional
</details>
### Training Logs
| Epoch | Step | Training Loss |
|:------:|:----:|:-------------:|
| 0.6394 | 500 | 0.9485 |
| 1.2788 | 1000 | 0.6908 |
| 1.9182 | 1500 | 0.7048 |
| 2.5575 | 2000 | 0.6892 |
### Framework Versions
- Python: 3.10.12
- Sentence Transformers: 3.1.1
- Transformers: 4.45.2
- PyTorch: 2.5.1+cu121
- Accelerate: 1.1.1
- Datasets: 3.1.0
- Tokenizers: 0.20.3
## Citation
### BibTeX
#### Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "https://arxiv.org/abs/1908.10084",
}
```
<!--
## Glossary
*Clearly define terms in order to be accessible across audiences.*
-->
<!--
## Model Card Authors
*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
-->
<!--
## Model Card Contact
*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
-->
|
gokulsrinivasagan/bert_uncased_L-4_H-256_A-4_cola
|
gokulsrinivasagan
| 2024-12-04T16:37:44Z | 105 | 0 |
transformers
|
[
"transformers",
"tensorboard",
"safetensors",
"bert",
"text-classification",
"generated_from_trainer",
"en",
"dataset:glue",
"base_model:google/bert_uncased_L-4_H-256_A-4",
"base_model:finetune:google/bert_uncased_L-4_H-256_A-4",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-classification
| 2024-12-04T16:36:25Z |
---
library_name: transformers
language:
- en
license: apache-2.0
base_model: google/bert_uncased_L-4_H-256_A-4
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
- accuracy
model-index:
- name: bert_uncased_L-4_H-256_A-4_cola
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: GLUE COLA
type: glue
args: cola
metrics:
- name: Matthews Correlation
type: matthews_correlation
value: 0.2650812590803394
- name: Accuracy
type: accuracy
value: 0.7027804255485535
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert_uncased_L-4_H-256_A-4_cola
This model is a fine-tuned version of [google/bert_uncased_L-4_H-256_A-4](https://huggingface.co/google/bert_uncased_L-4_H-256_A-4) on the GLUE COLA dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5943
- Matthews Correlation: 0.2651
- Accuracy: 0.7028
## Model description
More information needed
## Intended uses & limitations
More information needed
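No usage example is provided by the author; the sketch below is a minimal, hedged starting point that assumes the checkpoint works with the standard 🤗 Transformers `text-classification` pipeline (the example sentences and the meaning of the returned labels are illustrative only and depend on the exported config):
```python
from transformers import pipeline

# Load the CoLA fine-tune as a text-classification pipeline.
classifier = pipeline(
    "text-classification",
    model="gokulsrinivasagan/bert_uncased_L-4_H-256_A-4_cola",
)

# CoLA is a binary acceptability task: the model scores whether a sentence
# is grammatically acceptable or not.
print(classifier("The book was written by the author."))
print(classifier("Book the was author written by the."))
```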
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 256
- eval_batch_size: 256
- seed: 10
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 50
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|:--------:|
| 0.6358 | 1.0 | 34 | 0.6182 | 0.0 | 0.6913 |
| 0.6077 | 2.0 | 68 | 0.6184 | 0.0 | 0.6913 |
| 0.5982 | 3.0 | 102 | 0.6035 | 0.0 | 0.6913 |
| 0.575 | 4.0 | 136 | 0.5997 | 0.1458 | 0.7009 |
| 0.5391 | 5.0 | 170 | 0.5992 | 0.2018 | 0.7028 |
| 0.4999 | 6.0 | 204 | 0.6159 | 0.2088 | 0.7085 |
| 0.4722 | 7.0 | 238 | 0.5974 | 0.2782 | 0.7248 |
| 0.4437 | 8.0 | 272 | 0.5943 | 0.2651 | 0.7028 |
| 0.4204 | 9.0 | 306 | 0.6239 | 0.2618 | 0.7210 |
| 0.3956 | 10.0 | 340 | 0.6360 | 0.2655 | 0.7191 |
| 0.3671 | 11.0 | 374 | 0.6876 | 0.2592 | 0.7200 |
| 0.3546 | 12.0 | 408 | 0.7041 | 0.2665 | 0.7239 |
| 0.333 | 13.0 | 442 | 0.6849 | 0.2891 | 0.7229 |
### Framework versions
- Transformers 4.46.3
- Pytorch 2.2.1+cu118
- Datasets 2.17.0
- Tokenizers 0.20.3
|
wavyduck/XLRS_FullDataset
|
wavyduck
| 2024-12-04T16:35:42Z | 109 | 0 |
transformers
|
[
"transformers",
"pytorch",
"tensorboard",
"wav2vec2",
"automatic-speech-recognition",
"generated_from_trainer",
"en",
"dataset:timit-asr/timit_asr",
"base_model:facebook/wav2vec2-base",
"base_model:finetune:facebook/wav2vec2-base",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
automatic-speech-recognition
| 2024-12-03T15:32:53Z |
---
license: apache-2.0
tags:
- generated_from_trainer
model-index:
- name: XLRS_FullDataset
results: []
datasets:
- timit-asr/timit_asr
language:
- en
base_model:
- facebook/wav2vec2-base
pipeline_tag: automatic-speech-recognition
metrics:
- wer
library_name: transformers
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# XLRS_FullDataset
This model is a fine-tuned version of [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on the TIMIT dataset ([timit-asr/timit_asr](https://huggingface.co/datasets/timit-asr/timit_asr)).
It achieves the following results on the evaluation set:
- Loss: 0.3057
- Wer: 0.2697
## Model description
More information needed
## Intended uses & limitations
More information needed
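No inference example is given; the following is a minimal sketch assuming the checkpoint loads through the standard 🤗 Transformers `automatic-speech-recognition` pipeline and that the input is 16 kHz mono audio (the file path is a placeholder):
```python
from transformers import pipeline

# Load the fine-tuned checkpoint through the standard ASR pipeline.
asr = pipeline("automatic-speech-recognition", model="wavyduck/XLRS_FullDataset")

# Transcribe a local audio file (placeholder path; 16 kHz mono is assumed).
result = asr("sample.wav")
print(result["text"])
```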
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1000
- num_epochs: 30
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:-----:|:---------------:|:------:|
| 4.5696 | 1.0 | 500 | 3.1546 | 1.0 |
| 2.491 | 2.01 | 1000 | 0.8309 | 0.7872 |
| 0.7519 | 3.01 | 1500 | 0.3648 | 0.4364 |
| 0.4704 | 4.02 | 2000 | 0.2998 | 0.3758 |
| 0.3385 | 5.02 | 2500 | 0.2639 | 0.3439 |
| 0.2837 | 6.02 | 3000 | 0.2604 | 0.3309 |
| 0.2233 | 7.03 | 3500 | 0.2734 | 0.3143 |
| 0.1997 | 8.03 | 4000 | 0.2676 | 0.3121 |
| 0.1717 | 9.04 | 4500 | 0.2489 | 0.2941 |
| 0.1558 | 10.04 | 5000 | 0.2777 | 0.2969 |
| 0.1497 | 11.04 | 5500 | 0.2693 | 0.2890 |
| 0.1326 | 12.05 | 6000 | 0.2844 | 0.2921 |
| 0.118 | 13.05 | 6500 | 0.2818 | 0.2969 |
| 0.119 | 14.06 | 7000 | 0.2798 | 0.2854 |
| 0.0991 | 15.06 | 7500 | 0.2765 | 0.2858 |
| 0.108 | 16.06 | 8000 | 0.2904 | 0.2794 |
| 0.0935 | 17.07 | 8500 | 0.2846 | 0.2773 |
| 0.0857 | 18.07 | 9000 | 0.3120 | 0.2812 |
| 0.0928 | 19.08 | 9500 | 0.3073 | 0.2820 |
| 0.0832 | 20.08 | 10000 | 0.2981 | 0.2808 |
| 0.0768 | 21.08 | 10500 | 0.3065 | 0.2807 |
| 0.0768 | 22.09 | 11000 | 0.2960 | 0.2766 |
| 0.0754 | 23.09 | 11500 | 0.3007 | 0.2783 |
| 0.063 | 24.1 | 12000 | 0.2918 | 0.2739 |
| 0.0614 | 25.1 | 12500 | 0.3144 | 0.2748 |
| 0.0628 | 26.1 | 13000 | 0.3074 | 0.2713 |
| 0.0595 | 27.11 | 13500 | 0.3103 | 0.2695 |
| 0.0616 | 28.11 | 14000 | 0.3108 | 0.2697 |
| 0.0587 | 29.12 | 14500 | 0.3057 | 0.2697 |
### Framework versions
- Transformers 4.17.0
- Pytorch 2.5.1+cu121
- Datasets 1.18.3
- Tokenizers 0.20.3
|
MayBashendy/ArabicNewSplits3_FineTuningAraBERT_run3_AugV5_k20_task1_organization
|
MayBashendy
| 2024-12-04T16:35:18Z | 163 | 0 |
transformers
|
[
"transformers",
"safetensors",
"bert",
"text-classification",
"generated_from_trainer",
"base_model:aubmindlab/bert-base-arabertv02",
"base_model:finetune:aubmindlab/bert-base-arabertv02",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-classification
| 2024-12-04T16:23:37Z |
---
library_name: transformers
base_model: aubmindlab/bert-base-arabertv02
tags:
- generated_from_trainer
model-index:
- name: ArabicNewSplits3_FineTuningAraBERT_run3_AugV5_k20_task1_organization
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# ArabicNewSplits3_FineTuningAraBERT_run3_AugV5_k20_task1_organization
This model is a fine-tuned version of [aubmindlab/bert-base-arabertv02](https://huggingface.co/aubmindlab/bert-base-arabertv02) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7283
- Qwk: 0.5487
- Mse: 0.7283
- Rmse: 0.8534
## Model description
More information needed
## Intended uses & limitations
More information needed
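The card does not say how the model is meant to be called. As a hedged starting point (assuming a standard sequence-classification head, since the repository is tagged `text-classification`; the interpretation of the returned label/score depends on how the organization scores were encoded during fine-tuning), it could be loaded like this:
```python
from transformers import pipeline

# Hypothetical usage: route the checkpoint through the text-classification pipeline.
scorer = pipeline(
    "text-classification",
    model="MayBashendy/ArabicNewSplits3_FineTuningAraBERT_run3_AugV5_k20_task1_organization",
)

# The input is expected to be Arabic text; this sentence is only a placeholder.
print(scorer("ضع هنا النص العربي المراد تقييم تنظيمه."))
```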
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|:-------------:|:------:|:----:|:---------------:|:-------:|:------:|:------:|
| No log | 0.0208 | 2 | 4.7881 | -0.0151 | 4.7881 | 2.1882 |
| No log | 0.0417 | 4 | 2.7562 | -0.0058 | 2.7562 | 1.6602 |
| No log | 0.0625 | 6 | 1.9863 | 0.0708 | 1.9863 | 1.4094 |
| No log | 0.0833 | 8 | 1.4735 | 0.0749 | 1.4735 | 1.2139 |
| No log | 0.1042 | 10 | 1.1256 | 0.2912 | 1.1256 | 1.0610 |
| No log | 0.125 | 12 | 1.0993 | 0.3915 | 1.0993 | 1.0485 |
| No log | 0.1458 | 14 | 1.0230 | 0.3266 | 1.0230 | 1.0114 |
| No log | 0.1667 | 16 | 1.0721 | 0.2474 | 1.0721 | 1.0354 |
| No log | 0.1875 | 18 | 1.1100 | 0.2474 | 1.1100 | 1.0536 |
| No log | 0.2083 | 20 | 1.1660 | 0.1761 | 1.1660 | 1.0798 |
| No log | 0.2292 | 22 | 1.2153 | 0.2612 | 1.2153 | 1.1024 |
| No log | 0.25 | 24 | 1.1355 | 0.2416 | 1.1355 | 1.0656 |
| No log | 0.2708 | 26 | 1.1988 | 0.3089 | 1.1988 | 1.0949 |
| No log | 0.2917 | 28 | 1.1352 | 0.1818 | 1.1352 | 1.0655 |
| No log | 0.3125 | 30 | 1.1160 | 0.3089 | 1.1160 | 1.0564 |
| No log | 0.3333 | 32 | 1.0678 | 0.2903 | 1.0678 | 1.0333 |
| No log | 0.3542 | 34 | 1.1708 | 0.0803 | 1.1708 | 1.0820 |
| No log | 0.375 | 36 | 1.1162 | 0.1910 | 1.1162 | 1.0565 |
| No log | 0.3958 | 38 | 1.0168 | 0.4096 | 1.0168 | 1.0084 |
| No log | 0.4167 | 40 | 1.0098 | 0.4276 | 1.0098 | 1.0049 |
| No log | 0.4375 | 42 | 1.0968 | 0.4454 | 1.0968 | 1.0473 |
| No log | 0.4583 | 44 | 1.0906 | 0.4454 | 1.0906 | 1.0443 |
| No log | 0.4792 | 46 | 0.9501 | 0.4454 | 0.9501 | 0.9747 |
| No log | 0.5 | 48 | 0.9508 | 0.3211 | 0.9508 | 0.9751 |
| No log | 0.5208 | 50 | 1.1001 | 0.2305 | 1.1001 | 1.0489 |
| No log | 0.5417 | 52 | 0.9371 | 0.4203 | 0.9371 | 0.9681 |
| No log | 0.5625 | 54 | 0.8986 | 0.4414 | 0.8986 | 0.9480 |
| No log | 0.5833 | 56 | 0.8932 | 0.535 | 0.8932 | 0.9451 |
| No log | 0.6042 | 58 | 0.8702 | 0.5758 | 0.8702 | 0.9329 |
| No log | 0.625 | 60 | 0.8444 | 0.5396 | 0.8444 | 0.9189 |
| No log | 0.6458 | 62 | 0.7809 | 0.6422 | 0.7809 | 0.8837 |
| No log | 0.6667 | 64 | 0.6907 | 0.6704 | 0.6907 | 0.8311 |
| No log | 0.6875 | 66 | 0.6904 | 0.5786 | 0.6904 | 0.8309 |
| No log | 0.7083 | 68 | 0.6392 | 0.6864 | 0.6392 | 0.7995 |
| No log | 0.7292 | 70 | 0.6485 | 0.7237 | 0.6485 | 0.8053 |
| No log | 0.75 | 72 | 0.6744 | 0.6892 | 0.6744 | 0.8212 |
| No log | 0.7708 | 74 | 0.6578 | 0.7097 | 0.6578 | 0.8111 |
| No log | 0.7917 | 76 | 0.7250 | 0.5666 | 0.7250 | 0.8515 |
| No log | 0.8125 | 78 | 0.7300 | 0.5909 | 0.7300 | 0.8544 |
| No log | 0.8333 | 80 | 0.6192 | 0.6447 | 0.6192 | 0.7869 |
| No log | 0.8542 | 82 | 0.6387 | 0.6447 | 0.6387 | 0.7992 |
| No log | 0.875 | 84 | 0.7998 | 0.5574 | 0.7998 | 0.8943 |
| No log | 0.8958 | 86 | 1.1013 | 0.4290 | 1.1013 | 1.0494 |
| No log | 0.9167 | 88 | 1.1100 | 0.4176 | 1.1100 | 1.0536 |
| No log | 0.9375 | 90 | 1.0181 | 0.3915 | 1.0181 | 1.0090 |
| No log | 0.9583 | 92 | 0.7709 | 0.4936 | 0.7709 | 0.8780 |
| No log | 0.9792 | 94 | 0.8041 | 0.5204 | 0.8041 | 0.8967 |
| No log | 1.0 | 96 | 0.9883 | 0.4038 | 0.9883 | 0.9941 |
| No log | 1.0208 | 98 | 0.9787 | 0.4320 | 0.9787 | 0.9893 |
| No log | 1.0417 | 100 | 0.7014 | 0.5359 | 0.7014 | 0.8375 |
| No log | 1.0625 | 102 | 0.6554 | 0.6212 | 0.6554 | 0.8096 |
| No log | 1.0833 | 104 | 0.7481 | 0.5330 | 0.7481 | 0.8649 |
| No log | 1.1042 | 106 | 0.8204 | 0.5789 | 0.8204 | 0.9057 |
| No log | 1.125 | 108 | 0.9833 | 0.5755 | 0.9833 | 0.9916 |
| No log | 1.1458 | 110 | 1.0799 | 0.5089 | 1.0799 | 1.0392 |
| No log | 1.1667 | 112 | 1.1292 | 0.5215 | 1.1292 | 1.0626 |
| No log | 1.1875 | 114 | 0.8704 | 0.5877 | 0.8704 | 0.9329 |
| No log | 1.2083 | 116 | 0.6763 | 0.6438 | 0.6763 | 0.8224 |
| No log | 1.2292 | 118 | 0.7456 | 0.6192 | 0.7456 | 0.8635 |
| No log | 1.25 | 120 | 1.0700 | 0.5590 | 1.0700 | 1.0344 |
| No log | 1.2708 | 122 | 1.1244 | 0.5206 | 1.1244 | 1.0604 |
| No log | 1.2917 | 124 | 0.8907 | 0.5877 | 0.8907 | 0.9438 |
| No log | 1.3125 | 126 | 0.7172 | 0.5112 | 0.7172 | 0.8469 |
| No log | 1.3333 | 128 | 0.7926 | 0.6050 | 0.7926 | 0.8903 |
| No log | 1.3542 | 130 | 0.7416 | 0.5237 | 0.7416 | 0.8612 |
| No log | 1.375 | 132 | 0.8042 | 0.5156 | 0.8042 | 0.8968 |
| No log | 1.3958 | 134 | 0.9929 | 0.4252 | 0.9929 | 0.9965 |
| No log | 1.4167 | 136 | 0.9129 | 0.4877 | 0.9129 | 0.9554 |
| No log | 1.4375 | 138 | 0.9181 | 0.5526 | 0.9181 | 0.9582 |
| No log | 1.4583 | 140 | 0.7750 | 0.5760 | 0.7750 | 0.8804 |
| No log | 1.4792 | 142 | 0.7910 | 0.5822 | 0.7910 | 0.8894 |
| No log | 1.5 | 144 | 0.7775 | 0.6026 | 0.7775 | 0.8818 |
| No log | 1.5208 | 146 | 0.9220 | 0.5777 | 0.9220 | 0.9602 |
| No log | 1.5417 | 148 | 1.0334 | 0.5104 | 1.0334 | 1.0166 |
| No log | 1.5625 | 150 | 0.8791 | 0.5645 | 0.8791 | 0.9376 |
| No log | 1.5833 | 152 | 0.8940 | 0.5270 | 0.8940 | 0.9455 |
| No log | 1.6042 | 154 | 1.0551 | 0.5357 | 1.0551 | 1.0272 |
| No log | 1.625 | 156 | 1.0206 | 0.5357 | 1.0206 | 1.0103 |
| No log | 1.6458 | 158 | 0.7377 | 0.5552 | 0.7377 | 0.8589 |
| No log | 1.6667 | 160 | 0.6507 | 0.6287 | 0.6507 | 0.8067 |
| No log | 1.6875 | 162 | 0.6498 | 0.6527 | 0.6498 | 0.8061 |
| No log | 1.7083 | 164 | 0.7715 | 0.5358 | 0.7715 | 0.8783 |
| No log | 1.7292 | 166 | 1.0440 | 0.5378 | 1.0440 | 1.0218 |
| No log | 1.75 | 168 | 0.9958 | 0.5357 | 0.9958 | 0.9979 |
| No log | 1.7708 | 170 | 0.7167 | 0.6217 | 0.7167 | 0.8466 |
| No log | 1.7917 | 172 | 0.6154 | 0.6417 | 0.6154 | 0.7845 |
| No log | 1.8125 | 174 | 0.6231 | 0.6336 | 0.6231 | 0.7894 |
| No log | 1.8333 | 176 | 0.6101 | 0.6550 | 0.6101 | 0.7811 |
| No log | 1.8542 | 178 | 0.6069 | 0.6825 | 0.6069 | 0.7790 |
| No log | 1.875 | 180 | 0.6355 | 0.6672 | 0.6355 | 0.7972 |
| No log | 1.8958 | 182 | 0.5888 | 0.7008 | 0.5888 | 0.7674 |
| No log | 1.9167 | 184 | 0.5870 | 0.6813 | 0.5870 | 0.7661 |
| No log | 1.9375 | 186 | 0.6171 | 0.6795 | 0.6171 | 0.7855 |
| No log | 1.9583 | 188 | 0.8046 | 0.5943 | 0.8046 | 0.8970 |
| No log | 1.9792 | 190 | 0.8988 | 0.5668 | 0.8988 | 0.9481 |
| No log | 2.0 | 192 | 0.7854 | 0.6114 | 0.7854 | 0.8862 |
| No log | 2.0208 | 194 | 0.6687 | 0.6777 | 0.6687 | 0.8178 |
| No log | 2.0417 | 196 | 0.6901 | 0.6291 | 0.6901 | 0.8307 |
| No log | 2.0625 | 198 | 0.7321 | 0.625 | 0.7321 | 0.8556 |
| No log | 2.0833 | 200 | 0.9630 | 0.5668 | 0.9630 | 0.9813 |
| No log | 2.1042 | 202 | 1.0300 | 0.5615 | 1.0300 | 1.0149 |
| No log | 2.125 | 204 | 0.9468 | 0.5668 | 0.9468 | 0.9730 |
| No log | 2.1458 | 206 | 0.8285 | 0.5584 | 0.8285 | 0.9102 |
| No log | 2.1667 | 208 | 0.7432 | 0.6139 | 0.7432 | 0.8621 |
| No log | 2.1875 | 210 | 0.7412 | 0.625 | 0.7412 | 0.8609 |
| No log | 2.2083 | 212 | 0.8342 | 0.58 | 0.8342 | 0.9133 |
| No log | 2.2292 | 214 | 0.8733 | 0.5909 | 0.8733 | 0.9345 |
| No log | 2.25 | 216 | 0.8472 | 0.5943 | 0.8472 | 0.9205 |
| No log | 2.2708 | 218 | 0.6503 | 0.6358 | 0.6503 | 0.8064 |
| No log | 2.2917 | 220 | 0.6147 | 0.6646 | 0.6147 | 0.7840 |
| No log | 2.3125 | 222 | 0.6293 | 0.6364 | 0.6293 | 0.7933 |
| No log | 2.3333 | 224 | 0.6729 | 0.5377 | 0.6729 | 0.8203 |
| No log | 2.3542 | 226 | 0.6703 | 0.6494 | 0.6703 | 0.8187 |
| No log | 2.375 | 228 | 0.6233 | 0.6045 | 0.6233 | 0.7895 |
| No log | 2.3958 | 230 | 0.6599 | 0.5802 | 0.6599 | 0.8123 |
| No log | 2.4167 | 232 | 0.6573 | 0.5802 | 0.6573 | 0.8108 |
| No log | 2.4375 | 234 | 0.6311 | 0.6186 | 0.6311 | 0.7944 |
| No log | 2.4583 | 236 | 0.6328 | 0.5968 | 0.6328 | 0.7955 |
| No log | 2.4792 | 238 | 0.6762 | 0.5989 | 0.6762 | 0.8223 |
| No log | 2.5 | 240 | 0.8398 | 0.5691 | 0.8398 | 0.9164 |
| No log | 2.5208 | 242 | 0.9586 | 0.5152 | 0.9586 | 0.9791 |
| No log | 2.5417 | 244 | 0.9076 | 0.4915 | 0.9076 | 0.9527 |
| No log | 2.5625 | 246 | 0.8812 | 0.4915 | 0.8812 | 0.9387 |
| No log | 2.5833 | 248 | 0.7892 | 0.5551 | 0.7892 | 0.8884 |
| No log | 2.6042 | 250 | 0.7055 | 0.6104 | 0.7055 | 0.8399 |
| No log | 2.625 | 252 | 0.7326 | 0.5811 | 0.7326 | 0.8559 |
| No log | 2.6458 | 254 | 0.9604 | 0.5119 | 0.9604 | 0.9800 |
| No log | 2.6667 | 256 | 1.1813 | 0.4860 | 1.1813 | 1.0869 |
| No log | 2.6875 | 258 | 1.1682 | 0.4870 | 1.1682 | 1.0808 |
| No log | 2.7083 | 260 | 0.9108 | 0.5840 | 0.9108 | 0.9543 |
| No log | 2.7292 | 262 | 0.7301 | 0.5909 | 0.7301 | 0.8544 |
| No log | 2.75 | 264 | 0.7045 | 0.5997 | 0.7045 | 0.8393 |
| No log | 2.7708 | 266 | 0.7349 | 0.5760 | 0.7349 | 0.8573 |
| No log | 2.7917 | 268 | 0.8936 | 0.5449 | 0.8936 | 0.9453 |
| No log | 2.8125 | 270 | 1.1077 | 0.5119 | 1.1077 | 1.0525 |
| No log | 2.8333 | 272 | 1.0901 | 0.5378 | 1.0901 | 1.0441 |
| No log | 2.8542 | 274 | 1.0415 | 0.5378 | 1.0415 | 1.0205 |
| No log | 2.875 | 276 | 0.8918 | 0.5402 | 0.8918 | 0.9443 |
| No log | 2.8958 | 278 | 0.7644 | 0.5355 | 0.7644 | 0.8743 |
| No log | 2.9167 | 280 | 0.7114 | 0.5804 | 0.7114 | 0.8434 |
| No log | 2.9375 | 282 | 0.6608 | 0.6182 | 0.6608 | 0.8129 |
| No log | 2.9583 | 284 | 0.6228 | 0.6569 | 0.6228 | 0.7892 |
| No log | 2.9792 | 286 | 0.6103 | 0.6577 | 0.6103 | 0.7812 |
| No log | 3.0 | 288 | 0.6057 | 0.6369 | 0.6057 | 0.7783 |
| No log | 3.0208 | 290 | 0.6255 | 0.6777 | 0.6255 | 0.7909 |
| No log | 3.0417 | 292 | 0.6686 | 0.6059 | 0.6686 | 0.8177 |
| No log | 3.0625 | 294 | 0.6515 | 0.6777 | 0.6515 | 0.8071 |
| No log | 3.0833 | 296 | 0.6196 | 0.7008 | 0.6196 | 0.7871 |
| No log | 3.1042 | 298 | 0.6294 | 0.7210 | 0.6294 | 0.7933 |
| No log | 3.125 | 300 | 0.6396 | 0.7210 | 0.6396 | 0.7998 |
| No log | 3.1458 | 302 | 0.6490 | 0.7008 | 0.6490 | 0.8056 |
| No log | 3.1667 | 304 | 0.6865 | 0.6914 | 0.6865 | 0.8286 |
| No log | 3.1875 | 306 | 0.7775 | 0.5822 | 0.7775 | 0.8818 |
| No log | 3.2083 | 308 | 0.9298 | 0.5535 | 0.9298 | 0.9642 |
| No log | 3.2292 | 310 | 0.8729 | 0.5535 | 0.8729 | 0.9343 |
| No log | 3.25 | 312 | 0.6645 | 0.6511 | 0.6645 | 0.8152 |
| No log | 3.2708 | 314 | 0.5727 | 0.7179 | 0.5727 | 0.7568 |
| No log | 3.2917 | 316 | 0.6571 | 0.6777 | 0.6571 | 0.8106 |
| No log | 3.3125 | 318 | 0.6440 | 0.6777 | 0.6440 | 0.8025 |
| No log | 3.3333 | 320 | 0.5814 | 0.7171 | 0.5814 | 0.7625 |
| No log | 3.3542 | 322 | 0.6139 | 0.64 | 0.6139 | 0.7835 |
| No log | 3.375 | 324 | 0.6737 | 0.5944 | 0.6737 | 0.8208 |
| No log | 3.3958 | 326 | 0.6486 | 0.6956 | 0.6486 | 0.8054 |
| No log | 3.4167 | 328 | 0.6117 | 0.6191 | 0.6117 | 0.7821 |
| No log | 3.4375 | 330 | 0.6405 | 0.6176 | 0.6405 | 0.8003 |
| No log | 3.4583 | 332 | 0.6662 | 0.6359 | 0.6662 | 0.8162 |
| No log | 3.4792 | 334 | 0.6405 | 0.6857 | 0.6405 | 0.8003 |
| No log | 3.5 | 336 | 0.6162 | 0.6550 | 0.6162 | 0.7850 |
| No log | 3.5208 | 338 | 0.6342 | 0.6099 | 0.6342 | 0.7964 |
| No log | 3.5417 | 340 | 0.7087 | 0.5358 | 0.7087 | 0.8418 |
| No log | 3.5625 | 342 | 0.6654 | 0.6599 | 0.6654 | 0.8157 |
| No log | 3.5833 | 344 | 0.6213 | 0.64 | 0.6213 | 0.7882 |
| No log | 3.6042 | 346 | 0.6210 | 0.6561 | 0.6210 | 0.7880 |
| No log | 3.625 | 348 | 0.6415 | 0.6297 | 0.6415 | 0.8009 |
| No log | 3.6458 | 350 | 0.6802 | 0.5831 | 0.6802 | 0.8247 |
| No log | 3.6667 | 352 | 0.6618 | 0.6297 | 0.6618 | 0.8135 |
| No log | 3.6875 | 354 | 0.6508 | 0.6297 | 0.6508 | 0.8067 |
| No log | 3.7083 | 356 | 0.6463 | 0.6528 | 0.6463 | 0.8039 |
| No log | 3.7292 | 358 | 0.6393 | 0.6528 | 0.6393 | 0.7996 |
| No log | 3.75 | 360 | 0.6540 | 0.6476 | 0.6540 | 0.8087 |
| No log | 3.7708 | 362 | 0.6748 | 0.5873 | 0.6748 | 0.8215 |
| No log | 3.7917 | 364 | 0.6828 | 0.6274 | 0.6828 | 0.8263 |
| No log | 3.8125 | 366 | 0.7140 | 0.5486 | 0.7140 | 0.8450 |
| No log | 3.8333 | 368 | 0.7643 | 0.5263 | 0.7643 | 0.8742 |
| No log | 3.8542 | 370 | 0.7335 | 0.5408 | 0.7335 | 0.8565 |
| No log | 3.875 | 372 | 0.6730 | 0.6273 | 0.6730 | 0.8203 |
| No log | 3.8958 | 374 | 0.6654 | 0.6273 | 0.6654 | 0.8157 |
| No log | 3.9167 | 376 | 0.6764 | 0.6297 | 0.6764 | 0.8224 |
| No log | 3.9375 | 378 | 0.7029 | 0.5359 | 0.7029 | 0.8384 |
| No log | 3.9583 | 380 | 0.7203 | 0.5433 | 0.7203 | 0.8487 |
| No log | 3.9792 | 382 | 0.6967 | 0.5610 | 0.6967 | 0.8347 |
| No log | 4.0 | 384 | 0.6610 | 0.6273 | 0.6610 | 0.8130 |
| No log | 4.0208 | 386 | 0.6551 | 0.6369 | 0.6551 | 0.8094 |
| No log | 4.0417 | 388 | 0.6583 | 0.625 | 0.6583 | 0.8113 |
| No log | 4.0625 | 390 | 0.6891 | 0.6197 | 0.6891 | 0.8301 |
| No log | 4.0833 | 392 | 0.7011 | 0.6197 | 0.7011 | 0.8373 |
| No log | 4.1042 | 394 | 0.6709 | 0.6143 | 0.6709 | 0.8191 |
| No log | 4.125 | 396 | 0.6597 | 0.6497 | 0.6597 | 0.8122 |
| No log | 4.1458 | 398 | 0.6495 | 0.6597 | 0.6495 | 0.8059 |
| No log | 4.1667 | 400 | 0.6472 | 0.6662 | 0.6472 | 0.8045 |
| No log | 4.1875 | 402 | 0.6566 | 0.6297 | 0.6566 | 0.8103 |
| No log | 4.2083 | 404 | 0.6832 | 0.6227 | 0.6832 | 0.8266 |
| No log | 4.2292 | 406 | 0.7245 | 0.6009 | 0.7245 | 0.8512 |
| No log | 4.25 | 408 | 0.7239 | 0.6009 | 0.7239 | 0.8508 |
| No log | 4.2708 | 410 | 0.6848 | 0.6355 | 0.6848 | 0.8276 |
| No log | 4.2917 | 412 | 0.6550 | 0.6378 | 0.6550 | 0.8093 |
| No log | 4.3125 | 414 | 0.6424 | 0.6351 | 0.6424 | 0.8015 |
| No log | 4.3333 | 416 | 0.6271 | 0.6949 | 0.6271 | 0.7919 |
| No log | 4.3542 | 418 | 0.6393 | 0.7324 | 0.6393 | 0.7996 |
| No log | 4.375 | 420 | 0.6732 | 0.7109 | 0.6732 | 0.8205 |
| No log | 4.3958 | 422 | 0.6643 | 0.7109 | 0.6643 | 0.8150 |
| No log | 4.4167 | 424 | 0.6842 | 0.6938 | 0.6842 | 0.8272 |
| No log | 4.4375 | 426 | 0.6719 | 0.6914 | 0.6719 | 0.8197 |
| No log | 4.4583 | 428 | 0.6407 | 0.6518 | 0.6407 | 0.8004 |
| No log | 4.4792 | 430 | 0.6428 | 0.6518 | 0.6428 | 0.8017 |
| No log | 4.5 | 432 | 0.6390 | 0.6749 | 0.6390 | 0.7994 |
| No log | 4.5208 | 434 | 0.6352 | 0.6888 | 0.6352 | 0.7970 |
| No log | 4.5417 | 436 | 0.6330 | 0.6767 | 0.6330 | 0.7956 |
| No log | 4.5625 | 438 | 0.6301 | 0.6767 | 0.6301 | 0.7938 |
| No log | 4.5833 | 440 | 0.6545 | 0.6538 | 0.6545 | 0.8090 |
| No log | 4.6042 | 442 | 0.6743 | 0.6606 | 0.6743 | 0.8212 |
| No log | 4.625 | 444 | 0.6475 | 0.6468 | 0.6475 | 0.8047 |
| No log | 4.6458 | 446 | 0.6290 | 0.6767 | 0.6290 | 0.7931 |
| No log | 4.6667 | 448 | 0.6405 | 0.6983 | 0.6405 | 0.8003 |
| No log | 4.6875 | 450 | 0.6263 | 0.6983 | 0.6263 | 0.7914 |
| No log | 4.7083 | 452 | 0.6469 | 0.6892 | 0.6469 | 0.8043 |
| No log | 4.7292 | 454 | 0.7488 | 0.6466 | 0.7488 | 0.8653 |
| No log | 4.75 | 456 | 0.7796 | 0.6654 | 0.7796 | 0.8830 |
| No log | 4.7708 | 458 | 0.7353 | 0.6828 | 0.7353 | 0.8575 |
| No log | 4.7917 | 460 | 0.6724 | 0.6916 | 0.6724 | 0.8200 |
| No log | 4.8125 | 462 | 0.6442 | 0.7157 | 0.6442 | 0.8026 |
| No log | 4.8333 | 464 | 0.6570 | 0.6949 | 0.6570 | 0.8105 |
| No log | 4.8542 | 466 | 0.7246 | 0.6240 | 0.7246 | 0.8513 |
| No log | 4.875 | 468 | 0.8446 | 0.556 | 0.8446 | 0.9190 |
| No log | 4.8958 | 470 | 0.8565 | 0.556 | 0.8565 | 0.9255 |
| No log | 4.9167 | 472 | 0.8245 | 0.5164 | 0.8245 | 0.9080 |
| No log | 4.9375 | 474 | 0.8630 | 0.5278 | 0.8630 | 0.9290 |
| No log | 4.9583 | 476 | 0.9134 | 0.5169 | 0.9134 | 0.9557 |
| No log | 4.9792 | 478 | 0.9709 | 0.5401 | 0.9709 | 0.9853 |
| No log | 5.0 | 480 | 1.0286 | 0.5119 | 1.0286 | 1.0142 |
| No log | 5.0208 | 482 | 1.0159 | 0.5119 | 1.0159 | 1.0079 |
| No log | 5.0417 | 484 | 0.9497 | 0.5401 | 0.9497 | 0.9745 |
| No log | 5.0625 | 486 | 0.8894 | 0.4823 | 0.8894 | 0.9431 |
| No log | 5.0833 | 488 | 0.8146 | 0.5164 | 0.8146 | 0.9026 |
| No log | 5.1042 | 490 | 0.8184 | 0.4621 | 0.8184 | 0.9046 |
| No log | 5.125 | 492 | 0.8758 | 0.4957 | 0.8758 | 0.9358 |
| No log | 5.1458 | 494 | 0.8886 | 0.4957 | 0.8886 | 0.9427 |
| No log | 5.1667 | 496 | 0.8379 | 0.4474 | 0.8379 | 0.9154 |
| No log | 5.1875 | 498 | 0.8445 | 0.4729 | 0.8445 | 0.9190 |
| 0.383 | 5.2083 | 500 | 0.7990 | 0.5183 | 0.7990 | 0.8938 |
| 0.383 | 5.2292 | 502 | 0.7386 | 0.6676 | 0.7386 | 0.8594 |
| 0.383 | 5.25 | 504 | 0.7259 | 0.6730 | 0.7259 | 0.8520 |
| 0.383 | 5.2708 | 506 | 0.7213 | 0.6730 | 0.7213 | 0.8493 |
| 0.383 | 5.2917 | 508 | 0.7517 | 0.6125 | 0.7517 | 0.8670 |
| 0.383 | 5.3125 | 510 | 0.7579 | 0.6332 | 0.7579 | 0.8706 |
| 0.383 | 5.3333 | 512 | 0.7307 | 0.6125 | 0.7307 | 0.8548 |
| 0.383 | 5.3542 | 514 | 0.7096 | 0.6165 | 0.7096 | 0.8424 |
| 0.383 | 5.375 | 516 | 0.6834 | 0.7157 | 0.6834 | 0.8267 |
| 0.383 | 5.3958 | 518 | 0.7075 | 0.6395 | 0.7075 | 0.8411 |
| 0.383 | 5.4167 | 520 | 0.7821 | 0.6719 | 0.7821 | 0.8844 |
| 0.383 | 5.4375 | 522 | 0.8247 | 0.5844 | 0.8247 | 0.9081 |
| 0.383 | 5.4583 | 524 | 0.7888 | 0.6199 | 0.7888 | 0.8881 |
| 0.383 | 5.4792 | 526 | 0.7814 | 0.6332 | 0.7814 | 0.8840 |
| 0.383 | 5.5 | 528 | 0.7686 | 0.6165 | 0.7686 | 0.8767 |
| 0.383 | 5.5208 | 530 | 0.7858 | 0.6375 | 0.7858 | 0.8864 |
| 0.383 | 5.5417 | 532 | 0.8385 | 0.5584 | 0.8385 | 0.9157 |
| 0.383 | 5.5625 | 534 | 0.9337 | 0.5276 | 0.9337 | 0.9663 |
| 0.383 | 5.5833 | 536 | 1.0495 | 0.5 | 1.0495 | 1.0245 |
| 0.383 | 5.6042 | 538 | 1.0694 | 0.5 | 1.0694 | 1.0341 |
| 0.383 | 5.625 | 540 | 0.9871 | 0.5013 | 0.9871 | 0.9935 |
| 0.383 | 5.6458 | 542 | 0.8844 | 0.5289 | 0.8844 | 0.9404 |
| 0.383 | 5.6667 | 544 | 0.7831 | 0.5897 | 0.7831 | 0.8849 |
| 0.383 | 5.6875 | 546 | 0.7220 | 0.6730 | 0.7220 | 0.8497 |
| 0.383 | 5.7083 | 548 | 0.6954 | 0.6730 | 0.6954 | 0.8339 |
| 0.383 | 5.7292 | 550 | 0.6961 | 0.6730 | 0.6961 | 0.8343 |
| 0.383 | 5.75 | 552 | 0.7373 | 0.6332 | 0.7373 | 0.8587 |
| 0.383 | 5.7708 | 554 | 0.8375 | 0.556 | 0.8375 | 0.9151 |
| 0.383 | 5.7917 | 556 | 0.8906 | 0.5426 | 0.8906 | 0.9437 |
| 0.383 | 5.8125 | 558 | 0.8446 | 0.5428 | 0.8446 | 0.9190 |
| 0.383 | 5.8333 | 560 | 0.7715 | 0.5789 | 0.7715 | 0.8784 |
| 0.383 | 5.8542 | 562 | 0.7153 | 0.6375 | 0.7153 | 0.8458 |
| 0.383 | 5.875 | 564 | 0.6738 | 0.6870 | 0.6738 | 0.8209 |
| 0.383 | 5.8958 | 566 | 0.6606 | 0.6662 | 0.6606 | 0.8128 |
| 0.383 | 5.9167 | 568 | 0.6580 | 0.6662 | 0.6580 | 0.8112 |
| 0.383 | 5.9375 | 570 | 0.6584 | 0.6870 | 0.6584 | 0.8114 |
| 0.383 | 5.9583 | 572 | 0.6609 | 0.6932 | 0.6609 | 0.8130 |
| 0.383 | 5.9792 | 574 | 0.6676 | 0.7125 | 0.6676 | 0.8171 |
| 0.383 | 6.0 | 576 | 0.6626 | 0.7125 | 0.6626 | 0.8140 |
| 0.383 | 6.0208 | 578 | 0.6382 | 0.7258 | 0.6382 | 0.7989 |
| 0.383 | 6.0417 | 580 | 0.6189 | 0.7258 | 0.6189 | 0.7867 |
| 0.383 | 6.0625 | 582 | 0.6140 | 0.6795 | 0.6140 | 0.7836 |
| 0.383 | 6.0833 | 584 | 0.6146 | 0.7085 | 0.6146 | 0.7839 |
| 0.383 | 6.1042 | 586 | 0.6184 | 0.7258 | 0.6184 | 0.7864 |
| 0.383 | 6.125 | 588 | 0.6534 | 0.7258 | 0.6534 | 0.8084 |
| 0.383 | 6.1458 | 590 | 0.7340 | 0.5773 | 0.7340 | 0.8567 |
| 0.383 | 6.1667 | 592 | 0.7652 | 0.5362 | 0.7652 | 0.8748 |
| 0.383 | 6.1875 | 594 | 0.7464 | 0.55 | 0.7464 | 0.8639 |
| 0.383 | 6.2083 | 596 | 0.7229 | 0.6379 | 0.7229 | 0.8502 |
| 0.383 | 6.2292 | 598 | 0.7036 | 0.5921 | 0.7036 | 0.8388 |
| 0.383 | 6.25 | 600 | 0.6964 | 0.6662 | 0.6964 | 0.8345 |
| 0.383 | 6.2708 | 602 | 0.6964 | 0.6662 | 0.6964 | 0.8345 |
| 0.383 | 6.2917 | 604 | 0.7011 | 0.6662 | 0.7011 | 0.8373 |
| 0.383 | 6.3125 | 606 | 0.7082 | 0.6662 | 0.7082 | 0.8416 |
| 0.383 | 6.3333 | 608 | 0.7198 | 0.6104 | 0.7198 | 0.8484 |
| 0.383 | 6.3542 | 610 | 0.7319 | 0.5846 | 0.7319 | 0.8555 |
| 0.383 | 6.375 | 612 | 0.7323 | 0.5846 | 0.7323 | 0.8558 |
| 0.383 | 6.3958 | 614 | 0.7432 | 0.5256 | 0.7432 | 0.8621 |
| 0.383 | 6.4167 | 616 | 0.7520 | 0.5204 | 0.7520 | 0.8672 |
| 0.383 | 6.4375 | 618 | 0.7537 | 0.5146 | 0.7537 | 0.8681 |
| 0.383 | 6.4583 | 620 | 0.7415 | 0.5797 | 0.7415 | 0.8611 |
| 0.383 | 6.4792 | 622 | 0.7273 | 0.6178 | 0.7273 | 0.8528 |
| 0.383 | 6.5 | 624 | 0.7231 | 0.6178 | 0.7231 | 0.8503 |
| 0.383 | 6.5208 | 626 | 0.7002 | 0.6777 | 0.7002 | 0.8368 |
| 0.383 | 6.5417 | 628 | 0.6883 | 0.6741 | 0.6883 | 0.8296 |
| 0.383 | 6.5625 | 630 | 0.6711 | 0.7125 | 0.6711 | 0.8192 |
| 0.383 | 6.5833 | 632 | 0.6583 | 0.7125 | 0.6583 | 0.8113 |
| 0.383 | 6.6042 | 634 | 0.6281 | 0.7068 | 0.6281 | 0.7925 |
| 0.383 | 6.625 | 636 | 0.6208 | 0.6662 | 0.6208 | 0.7879 |
| 0.383 | 6.6458 | 638 | 0.6226 | 0.6662 | 0.6226 | 0.7891 |
| 0.383 | 6.6667 | 640 | 0.6286 | 0.6662 | 0.6286 | 0.7928 |
| 0.383 | 6.6875 | 642 | 0.6497 | 0.7171 | 0.6497 | 0.8061 |
| 0.383 | 6.7083 | 644 | 0.6632 | 0.7171 | 0.6632 | 0.8144 |
| 0.383 | 6.7292 | 646 | 0.6578 | 0.7171 | 0.6578 | 0.8110 |
| 0.383 | 6.75 | 648 | 0.6434 | 0.6662 | 0.6434 | 0.8021 |
| 0.383 | 6.7708 | 650 | 0.6491 | 0.6662 | 0.6491 | 0.8057 |
| 0.383 | 6.7917 | 652 | 0.6707 | 0.6518 | 0.6707 | 0.8190 |
| 0.383 | 6.8125 | 654 | 0.7161 | 0.5794 | 0.7161 | 0.8463 |
| 0.383 | 6.8333 | 656 | 0.7601 | 0.5243 | 0.7601 | 0.8719 |
| 0.383 | 6.8542 | 658 | 0.8175 | 0.5336 | 0.8175 | 0.9041 |
| 0.383 | 6.875 | 660 | 0.8564 | 0.5537 | 0.8564 | 0.9254 |
| 0.383 | 6.8958 | 662 | 0.8351 | 0.5562 | 0.8351 | 0.9138 |
| 0.383 | 6.9167 | 664 | 0.7960 | 0.5336 | 0.7960 | 0.8922 |
| 0.383 | 6.9375 | 666 | 0.7463 | 0.5291 | 0.7463 | 0.8639 |
| 0.383 | 6.9583 | 668 | 0.7113 | 0.6099 | 0.7113 | 0.8434 |
| 0.383 | 6.9792 | 670 | 0.7029 | 0.6099 | 0.7029 | 0.8384 |
| 0.383 | 7.0 | 672 | 0.7233 | 0.5610 | 0.7233 | 0.8505 |
| 0.383 | 7.0208 | 674 | 0.7746 | 0.5100 | 0.7746 | 0.8801 |
| 0.383 | 7.0417 | 676 | 0.8338 | 0.5449 | 0.8338 | 0.9131 |
| 0.383 | 7.0625 | 678 | 0.8794 | 0.5537 | 0.8794 | 0.9378 |
| 0.383 | 7.0833 | 680 | 0.9210 | 0.5401 | 0.9210 | 0.9597 |
| 0.383 | 7.1042 | 682 | 0.9037 | 0.5537 | 0.9037 | 0.9507 |
| 0.383 | 7.125 | 684 | 0.8581 | 0.5310 | 0.8581 | 0.9263 |
| 0.383 | 7.1458 | 686 | 0.7853 | 0.4746 | 0.7853 | 0.8862 |
| 0.383 | 7.1667 | 688 | 0.7279 | 0.5934 | 0.7279 | 0.8532 |
| 0.383 | 7.1875 | 690 | 0.7093 | 0.6730 | 0.7093 | 0.8422 |
| 0.383 | 7.2083 | 692 | 0.7158 | 0.6497 | 0.7158 | 0.8460 |
| 0.383 | 7.2292 | 694 | 0.7323 | 0.5934 | 0.7323 | 0.8558 |
| 0.383 | 7.25 | 696 | 0.7594 | 0.4652 | 0.7594 | 0.8714 |
| 0.383 | 7.2708 | 698 | 0.7779 | 0.5139 | 0.7779 | 0.8820 |
| 0.383 | 7.2917 | 700 | 0.7879 | 0.5586 | 0.7879 | 0.8876 |
| 0.383 | 7.3125 | 702 | 0.7576 | 0.5193 | 0.7576 | 0.8704 |
| 0.383 | 7.3333 | 704 | 0.7271 | 0.5340 | 0.7271 | 0.8527 |
| 0.383 | 7.3542 | 706 | 0.7130 | 0.6143 | 0.7130 | 0.8444 |
| 0.383 | 7.375 | 708 | 0.7017 | 0.6676 | 0.7017 | 0.8377 |
| 0.383 | 7.3958 | 710 | 0.7025 | 0.6676 | 0.7025 | 0.8382 |
| 0.383 | 7.4167 | 712 | 0.7028 | 0.6676 | 0.7028 | 0.8383 |
| 0.383 | 7.4375 | 714 | 0.7160 | 0.6622 | 0.7160 | 0.8462 |
| 0.383 | 7.4583 | 716 | 0.7156 | 0.6935 | 0.7156 | 0.8459 |
| 0.383 | 7.4792 | 718 | 0.7085 | 0.6875 | 0.7085 | 0.8417 |
| 0.383 | 7.5 | 720 | 0.7158 | 0.6538 | 0.7158 | 0.8461 |
| 0.383 | 7.5208 | 722 | 0.7466 | 0.5433 | 0.7466 | 0.8641 |
| 0.383 | 7.5417 | 724 | 0.7743 | 0.5284 | 0.7743 | 0.8800 |
| 0.383 | 7.5625 | 726 | 0.8046 | 0.5526 | 0.8046 | 0.8970 |
| 0.383 | 7.5833 | 728 | 0.8111 | 0.5978 | 0.8111 | 0.9006 |
| 0.383 | 7.6042 | 730 | 0.8209 | 0.5451 | 0.8209 | 0.9060 |
| 0.383 | 7.625 | 732 | 0.8125 | 0.5451 | 0.8125 | 0.9014 |
| 0.383 | 7.6458 | 734 | 0.7744 | 0.5965 | 0.7744 | 0.8800 |
| 0.383 | 7.6667 | 736 | 0.7208 | 0.7065 | 0.7208 | 0.8490 |
| 0.383 | 7.6875 | 738 | 0.6737 | 0.6949 | 0.6737 | 0.8208 |
| 0.383 | 7.7083 | 740 | 0.6524 | 0.7085 | 0.6524 | 0.8077 |
| 0.383 | 7.7292 | 742 | 0.6514 | 0.7085 | 0.6514 | 0.8071 |
| 0.383 | 7.75 | 744 | 0.6575 | 0.7085 | 0.6575 | 0.8108 |
| 0.383 | 7.7708 | 746 | 0.6741 | 0.7141 | 0.6741 | 0.8210 |
| 0.383 | 7.7917 | 748 | 0.6859 | 0.7141 | 0.6859 | 0.8282 |
| 0.383 | 7.8125 | 750 | 0.6995 | 0.7125 | 0.6995 | 0.8364 |
| 0.383 | 7.8333 | 752 | 0.7289 | 0.6769 | 0.7289 | 0.8538 |
| 0.383 | 7.8542 | 754 | 0.7507 | 0.6425 | 0.7507 | 0.8664 |
| 0.383 | 7.875 | 756 | 0.7584 | 0.5781 | 0.7584 | 0.8709 |
| 0.383 | 7.8958 | 758 | 0.7494 | 0.6098 | 0.7494 | 0.8657 |
| 0.383 | 7.9167 | 760 | 0.7372 | 0.6711 | 0.7372 | 0.8586 |
| 0.383 | 7.9375 | 762 | 0.7227 | 0.6497 | 0.7227 | 0.8501 |
| 0.383 | 7.9583 | 764 | 0.7067 | 0.6730 | 0.7067 | 0.8406 |
| 0.383 | 7.9792 | 766 | 0.6926 | 0.6730 | 0.6926 | 0.8322 |
| 0.383 | 8.0 | 768 | 0.6878 | 0.6949 | 0.6878 | 0.8293 |
| 0.383 | 8.0208 | 770 | 0.6923 | 0.6949 | 0.6923 | 0.8320 |
| 0.383 | 8.0417 | 772 | 0.7022 | 0.7141 | 0.7022 | 0.8380 |
| 0.383 | 8.0625 | 774 | 0.7322 | 0.6218 | 0.7322 | 0.8557 |
| 0.383 | 8.0833 | 776 | 0.7751 | 0.5743 | 0.7751 | 0.8804 |
| 0.383 | 8.1042 | 778 | 0.8033 | 0.5931 | 0.8033 | 0.8963 |
| 0.383 | 8.125 | 780 | 0.8076 | 0.604 | 0.8076 | 0.8987 |
| 0.383 | 8.1458 | 782 | 0.7900 | 0.5965 | 0.7900 | 0.8888 |
| 0.383 | 8.1667 | 784 | 0.7625 | 0.5781 | 0.7625 | 0.8732 |
| 0.383 | 8.1875 | 786 | 0.7283 | 0.5934 | 0.7283 | 0.8534 |
| 0.383 | 8.2083 | 788 | 0.7051 | 0.6351 | 0.7051 | 0.8397 |
| 0.383 | 8.2292 | 790 | 0.6982 | 0.6351 | 0.6982 | 0.8356 |
| 0.383 | 8.25 | 792 | 0.6885 | 0.6730 | 0.6885 | 0.8298 |
| 0.383 | 8.2708 | 794 | 0.6935 | 0.6351 | 0.6935 | 0.8328 |
| 0.383 | 8.2917 | 796 | 0.6966 | 0.6378 | 0.6966 | 0.8346 |
| 0.383 | 8.3125 | 798 | 0.7065 | 0.5947 | 0.7065 | 0.8405 |
| 0.383 | 8.3333 | 800 | 0.7254 | 0.5204 | 0.7254 | 0.8517 |
| 0.383 | 8.3542 | 802 | 0.7368 | 0.5583 | 0.7368 | 0.8584 |
| 0.383 | 8.375 | 804 | 0.7339 | 0.5459 | 0.7339 | 0.8567 |
| 0.383 | 8.3958 | 806 | 0.7175 | 0.5359 | 0.7175 | 0.8471 |
| 0.383 | 8.4167 | 808 | 0.7007 | 0.6378 | 0.7007 | 0.8371 |
| 0.383 | 8.4375 | 810 | 0.6857 | 0.6378 | 0.6857 | 0.8281 |
| 0.383 | 8.4583 | 812 | 0.6834 | 0.6766 | 0.6834 | 0.8267 |
| 0.383 | 8.4792 | 814 | 0.6821 | 0.6766 | 0.6821 | 0.8259 |
| 0.383 | 8.5 | 816 | 0.6882 | 0.6622 | 0.6882 | 0.8296 |
| 0.383 | 8.5208 | 818 | 0.6972 | 0.6205 | 0.6972 | 0.8350 |
| 0.383 | 8.5417 | 820 | 0.7074 | 0.6205 | 0.7074 | 0.8411 |
| 0.383 | 8.5625 | 822 | 0.7171 | 0.5785 | 0.7171 | 0.8468 |
| 0.383 | 8.5833 | 824 | 0.7159 | 0.6205 | 0.7159 | 0.8461 |
| 0.383 | 8.6042 | 826 | 0.7049 | 0.6589 | 0.7049 | 0.8396 |
| 0.383 | 8.625 | 828 | 0.6907 | 0.6730 | 0.6907 | 0.8311 |
| 0.383 | 8.6458 | 830 | 0.6852 | 0.6730 | 0.6852 | 0.8278 |
| 0.383 | 8.6667 | 832 | 0.6831 | 0.6730 | 0.6831 | 0.8265 |
| 0.383 | 8.6875 | 834 | 0.6849 | 0.6730 | 0.6849 | 0.8276 |
| 0.383 | 8.7083 | 836 | 0.6861 | 0.6730 | 0.6861 | 0.8283 |
| 0.383 | 8.7292 | 838 | 0.6892 | 0.6730 | 0.6892 | 0.8302 |
| 0.383 | 8.75 | 840 | 0.6906 | 0.6730 | 0.6906 | 0.8311 |
| 0.383 | 8.7708 | 842 | 0.6993 | 0.6730 | 0.6993 | 0.8362 |
| 0.383 | 8.7917 | 844 | 0.7049 | 0.6730 | 0.7049 | 0.8396 |
| 0.383 | 8.8125 | 846 | 0.7064 | 0.6730 | 0.7064 | 0.8405 |
| 0.383 | 8.8333 | 848 | 0.7138 | 0.6622 | 0.7138 | 0.8449 |
| 0.383 | 8.8542 | 850 | 0.7237 | 0.6205 | 0.7237 | 0.8507 |
| 0.383 | 8.875 | 852 | 0.7294 | 0.5785 | 0.7294 | 0.8540 |
| 0.383 | 8.8958 | 854 | 0.7341 | 0.5636 | 0.7341 | 0.8568 |
| 0.383 | 8.9167 | 856 | 0.7364 | 0.5486 | 0.7364 | 0.8582 |
| 0.383 | 8.9375 | 858 | 0.7331 | 0.5486 | 0.7331 | 0.8562 |
| 0.383 | 8.9583 | 860 | 0.7240 | 0.5785 | 0.7240 | 0.8509 |
| 0.383 | 8.9792 | 862 | 0.7209 | 0.5785 | 0.7209 | 0.8490 |
| 0.383 | 9.0 | 864 | 0.7135 | 0.6622 | 0.7135 | 0.8447 |
| 0.383 | 9.0208 | 866 | 0.7050 | 0.6589 | 0.7050 | 0.8397 |
| 0.383 | 9.0417 | 868 | 0.7051 | 0.6589 | 0.7051 | 0.8397 |
| 0.383 | 9.0625 | 870 | 0.6968 | 0.6730 | 0.6968 | 0.8348 |
| 0.383 | 9.0833 | 872 | 0.6923 | 0.6730 | 0.6923 | 0.8321 |
| 0.383 | 9.1042 | 874 | 0.6880 | 0.6730 | 0.6880 | 0.8295 |
| 0.383 | 9.125 | 876 | 0.6902 | 0.6730 | 0.6902 | 0.8308 |
| 0.383 | 9.1458 | 878 | 0.6925 | 0.6730 | 0.6925 | 0.8321 |
| 0.383 | 9.1667 | 880 | 0.6969 | 0.6730 | 0.6969 | 0.8348 |
| 0.383 | 9.1875 | 882 | 0.7027 | 0.6184 | 0.7027 | 0.8383 |
| 0.383 | 9.2083 | 884 | 0.7089 | 0.6184 | 0.7089 | 0.8420 |
| 0.383 | 9.2292 | 886 | 0.7142 | 0.6041 | 0.7142 | 0.8451 |
| 0.383 | 9.25 | 888 | 0.7168 | 0.6041 | 0.7168 | 0.8466 |
| 0.383 | 9.2708 | 890 | 0.7236 | 0.5632 | 0.7236 | 0.8506 |
| 0.383 | 9.2917 | 892 | 0.7349 | 0.5486 | 0.7349 | 0.8573 |
| 0.383 | 9.3125 | 894 | 0.7478 | 0.5047 | 0.7478 | 0.8648 |
| 0.383 | 9.3333 | 896 | 0.7622 | 0.5047 | 0.7622 | 0.8731 |
| 0.383 | 9.3542 | 898 | 0.7804 | 0.5307 | 0.7804 | 0.8834 |
| 0.383 | 9.375 | 900 | 0.7900 | 0.5645 | 0.7900 | 0.8888 |
| 0.383 | 9.3958 | 902 | 0.7923 | 0.5874 | 0.7923 | 0.8901 |
| 0.383 | 9.4167 | 904 | 0.7910 | 0.5874 | 0.7910 | 0.8894 |
| 0.383 | 9.4375 | 906 | 0.7931 | 0.5874 | 0.7931 | 0.8906 |
| 0.383 | 9.4583 | 908 | 0.7982 | 0.5874 | 0.7982 | 0.8934 |
| 0.383 | 9.4792 | 910 | 0.7968 | 0.5874 | 0.7968 | 0.8927 |
| 0.383 | 9.5 | 912 | 0.7914 | 0.5874 | 0.7914 | 0.8896 |
| 0.383 | 9.5208 | 914 | 0.7882 | 0.5757 | 0.7882 | 0.8878 |
| 0.383 | 9.5417 | 916 | 0.7835 | 0.5757 | 0.7835 | 0.8851 |
| 0.383 | 9.5625 | 918 | 0.7788 | 0.5789 | 0.7788 | 0.8825 |
| 0.383 | 9.5833 | 920 | 0.7712 | 0.5701 | 0.7712 | 0.8782 |
| 0.383 | 9.6042 | 922 | 0.7626 | 0.5459 | 0.7626 | 0.8733 |
| 0.383 | 9.625 | 924 | 0.7594 | 0.5459 | 0.7594 | 0.8714 |
| 0.383 | 9.6458 | 926 | 0.7568 | 0.5204 | 0.7568 | 0.8700 |
| 0.383 | 9.6667 | 928 | 0.7528 | 0.5204 | 0.7528 | 0.8676 |
| 0.383 | 9.6875 | 930 | 0.7495 | 0.5204 | 0.7495 | 0.8657 |
| 0.383 | 9.7083 | 932 | 0.7448 | 0.5204 | 0.7448 | 0.8630 |
| 0.383 | 9.7292 | 934 | 0.7402 | 0.5486 | 0.7402 | 0.8603 |
| 0.383 | 9.75 | 936 | 0.7368 | 0.5487 | 0.7368 | 0.8584 |
| 0.383 | 9.7708 | 938 | 0.7337 | 0.5487 | 0.7337 | 0.8565 |
| 0.383 | 9.7917 | 940 | 0.7319 | 0.5487 | 0.7319 | 0.8555 |
| 0.383 | 9.8125 | 942 | 0.7316 | 0.5487 | 0.7316 | 0.8553 |
| 0.383 | 9.8333 | 944 | 0.7295 | 0.5487 | 0.7295 | 0.8541 |
| 0.383 | 9.8542 | 946 | 0.7272 | 0.5487 | 0.7272 | 0.8528 |
| 0.383 | 9.875 | 948 | 0.7259 | 0.5487 | 0.7259 | 0.8520 |
| 0.383 | 9.8958 | 950 | 0.7258 | 0.5487 | 0.7258 | 0.8519 |
| 0.383 | 9.9167 | 952 | 0.7260 | 0.5487 | 0.7260 | 0.8521 |
| 0.383 | 9.9375 | 954 | 0.7267 | 0.5487 | 0.7267 | 0.8525 |
| 0.383 | 9.9583 | 956 | 0.7273 | 0.5487 | 0.7273 | 0.8528 |
| 0.383 | 9.9792 | 958 | 0.7280 | 0.5487 | 0.7280 | 0.8532 |
| 0.383 | 10.0 | 960 | 0.7283 | 0.5487 | 0.7283 | 0.8534 |
### Framework versions
- Transformers 4.44.2
- Pytorch 2.4.0+cu118
- Datasets 2.21.0
- Tokenizers 0.19.1
|
Ryenhails/w2v-bert-2.0-geo-all-train_withSpec
|
Ryenhails
| 2024-12-04T16:35:11Z | 75 | 0 |
transformers
|
[
"transformers",
"safetensors",
"wav2vec2-bert",
"automatic-speech-recognition",
"arxiv:1910.09700",
"endpoints_compatible",
"region:us"
] |
automatic-speech-recognition
| 2024-12-04T16:33:27Z |
---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
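In the absence of author-provided instructions, a minimal hedged sketch (assuming the checkpoint is a CTC-style `wav2vec2-bert` ASR model, as the repository tags suggest, and that a processor/tokenizer is bundled with it):
```python
from transformers import pipeline

# Hypothetical starter: route the checkpoint through the standard ASR pipeline.
asr = pipeline(
    "automatic-speech-recognition",
    model="Ryenhails/w2v-bert-2.0-geo-all-train_withSpec",
)

# The audio path is a placeholder; 16 kHz mono input is assumed.
print(asr("example_audio.wav")["text"])
```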
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
|
gokulsrinivasagan/distilbert_base_lda_cola
|
gokulsrinivasagan
| 2024-12-04T16:35:06Z | 122 | 0 |
transformers
|
[
"transformers",
"tensorboard",
"safetensors",
"distilbert",
"text-classification",
"generated_from_trainer",
"en",
"dataset:glue",
"base_model:gokulsrinivasagan/distilbert_base_lda",
"base_model:finetune:gokulsrinivasagan/distilbert_base_lda",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-classification
| 2024-11-22T04:49:54Z |
---
library_name: transformers
language:
- en
base_model: gokulsrinivasagan/distilbert_base_lda
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
- accuracy
model-index:
- name: distilbert_base_lda_cola
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: GLUE COLA
type: glue
args: cola
metrics:
- name: Matthews Correlation
type: matthews_correlation
value: 0.06369722017462824
- name: Accuracy
type: accuracy
value: 0.6874400973320007
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert_base_lda_cola
This model is a fine-tuned version of [gokulsrinivasagan/distilbert_base_lda](https://huggingface.co/gokulsrinivasagan/distilbert_base_lda) on the GLUE COLA dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6122
- Matthews Correlation: 0.0637
- Accuracy: 0.6874
## Model description
More information needed
## Intended uses & limitations
More information needed
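No evaluation or usage code is given. The sketch below is a hedged illustration (assumptions: the tokenizer is bundled with the checkpoint and the GLUE CoLA validation split is the evaluation set) of how the reported Matthews correlation could be reproduced approximately:
```python
import torch
from datasets import load_dataset
from sklearn.metrics import matthews_corrcoef
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "gokulsrinivasagan/distilbert_base_lda_cola"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

# GLUE CoLA validation split: sentences with 0/1 acceptability labels.
cola = load_dataset("glue", "cola", split="validation")

preds = []
with torch.no_grad():
    for start in range(0, len(cola), 32):
        batch = cola[start : start + 32]
        enc = tokenizer(batch["sentence"], padding=True, truncation=True, return_tensors="pt")
        preds.extend(model(**enc).logits.argmax(dim=-1).tolist())

print("Matthews correlation:", matthews_corrcoef(cola["label"], preds))
```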
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 256
- eval_batch_size: 256
- seed: 10
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 50
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|:--------:|
| 0.612 | 1.0 | 34 | 0.6140 | 0.0663 | 0.6932 |
| 0.5907 | 2.0 | 68 | 0.6122 | 0.0637 | 0.6874 |
| 0.5383 | 3.0 | 102 | 0.6135 | 0.1260 | 0.6635 |
| 0.4644 | 4.0 | 136 | 0.6588 | 0.1918 | 0.6779 |
| 0.3799 | 5.0 | 170 | 0.6821 | 0.1670 | 0.6769 |
| 0.3082 | 6.0 | 204 | 0.8649 | 0.1628 | 0.6769 |
| 0.2435 | 7.0 | 238 | 0.9477 | 0.1554 | 0.6510 |
### Framework versions
- Transformers 4.46.3
- Pytorch 2.2.1+cu118
- Datasets 2.17.0
- Tokenizers 0.20.3
|
plaguss/Mistral-7B-v0.1-Math-Shepherd-PRM-0.1
|
plaguss
| 2024-12-04T16:29:35Z | 14 | 0 |
transformers
|
[
"transformers",
"safetensors",
"mistral",
"token-classification",
"generated_from_trainer",
"trl",
"stepwise-reward-trainer",
"dataset:trl-lib/math_shepherd",
"arxiv:2211.14275",
"base_model:mistralai/Mistral-7B-v0.1",
"base_model:finetune:mistralai/Mistral-7B-v0.1",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
token-classification
| 2024-12-04T07:26:59Z |
---
base_model: mistralai/Mistral-7B-v0.1
datasets: trl-lib/math_shepherd
library_name: transformers
model_name: Mistral-7B-v0.1-Math-Shepherd-PRM-0.1
tags:
- generated_from_trainer
- trl
- stepwise-reward-trainer
licence: license
---
# Model Card for Mistral-7B-v0.1-Math-Shepherd-PRM-0.1
This model is a fine-tuned version of [mistralai/Mistral-7B-v0.1](https://huggingface.co/mistralai/Mistral-7B-v0.1) on the [trl-lib/math_shepherd](https://huggingface.co/datasets/trl-lib/math_shepherd) dataset.
It has been trained using [TRL](https://github.com/huggingface/trl).
## Quick start
Example 1
```python
from datasets import load_dataset
from transformers import pipeline
pipe = pipeline("token-classification", model="plaguss/Mistral-7B-v0.1-Math-Shepherd-PRM-0.1")
dataset = load_dataset("trl-lib/math_shepherd")
example = dataset["test"][10]
print("\n".join((example["prompt"], *example["completions"])))
for idx in range(1, len(example["completions"])+1):
    text = "\n".join((example["prompt"], *example["completions"][0:idx])) + "\n"
    score = float(pipe(text)[-1]["score"])
    print(f"Step {idx}\tScore: {score:.4f}\tLabel: {example['labels'][idx-1]}")
# Asking to truncate to max_length but no maximum length is provided and the model has no predefined maximum length. Default to no truncation.
# Step 1 Score: 1.00 Label: True
# Step 2 Score: 1.00 Label: True
# Step 3 Score: 1.00 Label: True
# Step 4 Score: 0.96 Label: True
# Step 5 Score: 0.95 Label: True
# Step 6 Score: 0.88 Label: False
# Step 7 Score: 0.73 Label: False
# Step 8 Score: 0.86 Label: False
# Step 9 Score: 0.96 Label: False
```
Original case from the Math-Shepherd paper
```python
from datasets import load_dataset
from transformers import pipeline
pipe = pipeline("token-classification", model="plaguss/Mistral-7B-v0.1-Math-Shepherd-PRM-0.1", device="cuda")
examples = [
{
"prompt": "Janet\u2019s ducks lay 16 eggs per day. She eats three for breakfast every morning and bakes muffins for her friends every day with four. She sells the remainder at the farmers' market daily for $2 per fresh duck egg. How much in dollars does she make every day at the farmers' market?",
"completions": [
"Step 1: Janet's ducks lay 16 eggs per day.",
'Step 2: She eats three for breakfast every morning, so she has 16 - 3 = 13 eggs left.',
'Step 3: She bakes muffins for her friends every day with four eggs, so she has 13 - 4 = 9 eggs left.',
"Step 4: She sells the remainder at the farmers' market daily for $2 per fresh duck egg, so she makes 9 * $2 = $18 every day at the farmers' market. The answer is: 18"
],
"labels": [True, True, True, True]
},
{
"prompt": "Janet\u2019s ducks lay 16 eggs per day. She eats three for breakfast every morning and bakes muffins for her friends every day with four. She sells the remainder at the farmers' market daily for $2 per fresh duck egg. How much in dollars does she make every day at the farmers' market?",
"completions": [
"Step 1: Janet's ducks lay 16 eggs per day.",
'Step 2: She eats three for breakfast every morning, so she has 16 - 3 = 13 eggs left.',
'Step 3: She bakes muffins for her friends every day with four eggs, so she has 13 - 4 = 9 eggs left.',
"Step 4: She sells the remainder at the farmers' market daily for $2 per fresh duck egg, so she makes 9 * $2 = $18 every day at the farmers' market. The answer is: 17"
],
"labels": [True, True, True, False]
},
]
for i, example in enumerate(examples):
    print(f"- Example {i}:")
    for idx in range(1, len(example["completions"])+1):
        text = "\n".join((example["prompt"], *example["completions"][0:idx])) + "\n"
        score = float(pipe(text)[-1]["score"])
        print(f"Step {idx}\tScore: {score:.2f}\tLabel: {example['labels'][idx-1]}")
# - Example 0:
# Step 1 Score: 1.00 Label: True
# Step 2 Score: 1.00 Label: True
# Step 3 Score: 1.00 Label: True
# Step 4 Score: 1.00 Label: True
# - Example 1:
# Step 1 Score: 1.00 Label: True
# Step 2 Score: 1.00 Label: True
# Step 3 Score: 1.00 Label: True
# Step 4 Score: 0.98 Label: False
```
## Training procedure
[<img src="https://raw.githubusercontent.com/wandb/assets/main/wandb-github-badge-28.svg" alt="Visualize in Weights & Biases" width="150" height="24"/>](https://wandb.ai/plaguss/huggingface/runs/lnkexnro)
This model was trained with Stepwise Reward.
### Framework versions
- TRL: 0.13.0.dev0
- Transformers: 4.46.0.dev0
- Pytorch: 2.4.1
- Datasets: 3.0.1
- Tokenizers: 0.20.1
## Citations
Cite Stepwise Reward as:
```bibtex
@article{uesato2022solving,
title = {Solving Math Word Problems With Process- and Outcome-Based Feedback},
author = {Uesato, Jonathan and Kushman, Nate and Kumar, Ramana and Song, Francis and Siegel, Noah and Wang, Lisa and Creswell, Antonia and Irving, Geoffrey and Higgins, Irina},
year = 2022,
journal = {arXiv preprint arXiv:2211.14275}
}
```
Cite TRL as:
```bibtex
@misc{vonwerra2022trl,
title = {{TRL: Transformer Reinforcement Learning}},
author = {Leandro von Werra and Younes Belkada and Lewis Tunstall and Edward Beeching and Tristan Thrush and Nathan Lambert and Shengyi Huang and Kashif Rasul and Quentin Gallouédec},
year = 2020,
journal = {GitHub repository},
publisher = {GitHub},
howpublished = {\url{https://github.com/huggingface/trl}}
}
```
|
MayBashendy/ArabicNewSplits2_FineTuningAraBERT_run1_AugV4_k70_task1_organization
|
MayBashendy
| 2024-12-04T16:23:50Z | 182 | 0 |
transformers
|
[
"transformers",
"safetensors",
"bert",
"text-classification",
"generated_from_trainer",
"base_model:aubmindlab/bert-base-arabertv02",
"base_model:finetune:aubmindlab/bert-base-arabertv02",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-classification
| 2024-12-04T15:21:11Z |
---
library_name: transformers
base_model: aubmindlab/bert-base-arabertv02
tags:
- generated_from_trainer
model-index:
- name: ArabicNewSplits2_FineTuningAraBERT_run1_AugV4_k70_task1_organization
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# ArabicNewSplits2_FineTuningAraBERT_run1_AugV4_k70_task1_organization
This model is a fine-tuned version of [aubmindlab/bert-base-arabertv02](https://huggingface.co/aubmindlab/bert-base-arabertv02) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.9107
- Qwk: 0.4931
- Mse: 0.9107
- Rmse: 0.9543
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|:-------------:|:------:|:----:|:---------------:|:-------:|:------:|:------:|
| No log | 0.0034 | 2 | 4.9532 | -0.0322 | 4.9532 | 2.2256 |
| No log | 0.0069 | 4 | 3.0359 | 0.0674 | 3.0359 | 1.7424 |
| No log | 0.0103 | 6 | 1.9072 | 0.0972 | 1.9072 | 1.3810 |
| No log | 0.0138 | 8 | 1.2139 | 0.1833 | 1.2139 | 1.1018 |
| No log | 0.0172 | 10 | 1.1173 | 0.2198 | 1.1173 | 1.0570 |
| No log | 0.0207 | 12 | 1.3641 | 0.1954 | 1.3641 | 1.1679 |
| No log | 0.0241 | 14 | 1.8158 | 0.2259 | 1.8158 | 1.3475 |
| No log | 0.0275 | 16 | 1.9003 | -0.0289 | 1.9003 | 1.3785 |
| No log | 0.0310 | 18 | 2.2217 | 0.1674 | 2.2217 | 1.4905 |
| No log | 0.0344 | 20 | 2.3755 | 0.1056 | 2.3755 | 1.5413 |
| No log | 0.0379 | 22 | 2.5879 | 0.0982 | 2.5879 | 1.6087 |
| No log | 0.0413 | 24 | 2.6498 | 0.0982 | 2.6498 | 1.6278 |
| No log | 0.0448 | 26 | 2.5044 | 0.0680 | 2.5044 | 1.5825 |
| No log | 0.0482 | 28 | 2.6112 | 0.0680 | 2.6112 | 1.6159 |
| No log | 0.0516 | 30 | 2.2750 | 0.0749 | 2.2750 | 1.5083 |
| No log | 0.0551 | 32 | 1.9070 | 0.1694 | 1.9070 | 1.3810 |
| No log | 0.0585 | 34 | 1.9180 | 0.1052 | 1.9180 | 1.3849 |
| No log | 0.0620 | 36 | 2.0971 | 0.1290 | 2.0971 | 1.4482 |
| No log | 0.0654 | 38 | 2.6041 | 0.0503 | 2.6041 | 1.6137 |
| No log | 0.0688 | 40 | 2.8958 | 0.0277 | 2.8958 | 1.7017 |
| No log | 0.0723 | 42 | 2.8565 | 0.0277 | 2.8565 | 1.6901 |
| No log | 0.0757 | 44 | 2.4540 | 0.0942 | 2.4540 | 1.5665 |
| No log | 0.0792 | 46 | 2.5919 | 0.0621 | 2.5919 | 1.6099 |
| No log | 0.0826 | 48 | 3.1619 | 0.0780 | 3.1619 | 1.7782 |
| No log | 0.0861 | 50 | 3.3076 | 0.0780 | 3.3076 | 1.8187 |
| No log | 0.0895 | 52 | 3.2954 | 0.0984 | 3.2954 | 1.8153 |
| No log | 0.0929 | 54 | 2.7327 | 0.0503 | 2.7327 | 1.6531 |
| No log | 0.0964 | 56 | 2.3254 | 0.1520 | 2.3254 | 1.5249 |
| No log | 0.0998 | 58 | 2.3115 | 0.1379 | 2.3115 | 1.5204 |
| No log | 0.1033 | 60 | 2.1962 | 0.1471 | 2.1962 | 1.4820 |
| No log | 0.1067 | 62 | 2.5026 | 0.1056 | 2.5026 | 1.5820 |
| No log | 0.1102 | 64 | 2.9234 | 0.0277 | 2.9234 | 1.7098 |
| No log | 0.1136 | 66 | 2.7962 | 0.0277 | 2.7962 | 1.6722 |
| No log | 0.1170 | 68 | 2.6829 | 0.0802 | 2.6829 | 1.6380 |
| No log | 0.1205 | 70 | 2.1787 | 0.1619 | 2.1787 | 1.4760 |
| No log | 0.1239 | 72 | 1.8859 | 0.2139 | 1.8859 | 1.3733 |
| No log | 0.1274 | 74 | 1.9206 | 0.1737 | 1.9206 | 1.3859 |
| No log | 0.1308 | 76 | 2.3491 | 0.2424 | 2.3491 | 1.5327 |
| No log | 0.1343 | 78 | 2.4381 | 0.2275 | 2.4381 | 1.5614 |
| No log | 0.1377 | 80 | 1.9809 | 0.2792 | 1.9810 | 1.4075 |
| No log | 0.1411 | 82 | 1.6162 | 0.3623 | 1.6162 | 1.2713 |
| No log | 0.1446 | 84 | 1.4155 | 0.3641 | 1.4155 | 1.1897 |
| No log | 0.1480 | 86 | 1.7479 | 0.3235 | 1.7479 | 1.3221 |
| No log | 0.1515 | 88 | 2.6459 | 0.2235 | 2.6459 | 1.6266 |
| No log | 0.1549 | 90 | 3.0568 | 0.1496 | 3.0568 | 1.7484 |
| No log | 0.1583 | 92 | 2.8222 | 0.1242 | 2.8222 | 1.6800 |
| No log | 0.1618 | 94 | 2.2701 | 0.1827 | 2.2701 | 1.5067 |
| No log | 0.1652 | 96 | 1.8733 | 0.2537 | 1.8733 | 1.3687 |
| No log | 0.1687 | 98 | 1.7527 | 0.2727 | 1.7527 | 1.3239 |
| No log | 0.1721 | 100 | 1.6284 | 0.3157 | 1.6284 | 1.2761 |
| No log | 0.1756 | 102 | 1.4281 | 0.2973 | 1.4281 | 1.1950 |
| No log | 0.1790 | 104 | 1.3380 | 0.2246 | 1.3380 | 1.1567 |
| No log | 0.1824 | 106 | 1.3581 | 0.3268 | 1.3581 | 1.1654 |
| No log | 0.1859 | 108 | 1.6216 | 0.3539 | 1.6216 | 1.2734 |
| No log | 0.1893 | 110 | 1.7263 | 0.3539 | 1.7263 | 1.3139 |
| No log | 0.1928 | 112 | 2.2106 | 0.2713 | 2.2106 | 1.4868 |
| No log | 0.1962 | 114 | 2.3469 | 0.2867 | 2.3469 | 1.5320 |
| No log | 0.1997 | 116 | 2.3057 | 0.2867 | 2.3057 | 1.5185 |
| No log | 0.2031 | 118 | 2.1113 | 0.2817 | 2.1113 | 1.4530 |
| No log | 0.2065 | 120 | 2.2383 | 0.2817 | 2.2383 | 1.4961 |
| No log | 0.2100 | 122 | 2.2800 | 0.2843 | 2.2800 | 1.5100 |
| No log | 0.2134 | 124 | 1.9925 | 0.3304 | 1.9925 | 1.4116 |
| No log | 0.2169 | 126 | 1.5880 | 0.4159 | 1.5880 | 1.2601 |
| No log | 0.2203 | 128 | 1.7665 | 0.3698 | 1.7665 | 1.3291 |
| No log | 0.2238 | 130 | 1.8871 | 0.3374 | 1.8871 | 1.3737 |
| No log | 0.2272 | 132 | 2.0904 | 0.3110 | 2.0904 | 1.4458 |
| No log | 0.2306 | 134 | 2.1743 | 0.3077 | 2.1743 | 1.4746 |
| No log | 0.2341 | 136 | 2.0983 | 0.3406 | 2.0983 | 1.4485 |
| No log | 0.2375 | 138 | 2.6150 | 0.2206 | 2.6150 | 1.6171 |
| No log | 0.2410 | 140 | 2.9793 | 0.1873 | 2.9793 | 1.7261 |
| No log | 0.2444 | 142 | 2.9671 | 0.1873 | 2.9671 | 1.7225 |
| No log | 0.2478 | 144 | 2.8415 | 0.1868 | 2.8415 | 1.6857 |
| No log | 0.2513 | 146 | 2.7863 | 0.1719 | 2.7863 | 1.6692 |
| No log | 0.2547 | 148 | 2.6756 | 0.1994 | 2.6756 | 1.6357 |
| No log | 0.2582 | 150 | 2.1297 | 0.3410 | 2.1297 | 1.4594 |
| No log | 0.2616 | 152 | 1.7682 | 0.2889 | 1.7682 | 1.3297 |
| No log | 0.2651 | 154 | 1.8568 | 0.3557 | 1.8568 | 1.3627 |
| No log | 0.2685 | 156 | 2.0878 | 0.3571 | 2.0878 | 1.4449 |
| No log | 0.2719 | 158 | 2.5276 | 0.2588 | 2.5276 | 1.5899 |
| No log | 0.2754 | 160 | 2.7889 | 0.2414 | 2.7889 | 1.6700 |
| No log | 0.2788 | 162 | 3.0111 | 0.2429 | 3.0111 | 1.7352 |
| No log | 0.2823 | 164 | 2.4902 | 0.2400 | 2.4902 | 1.5780 |
| No log | 0.2857 | 166 | 2.4689 | 0.2206 | 2.4689 | 1.5713 |
| No log | 0.2892 | 168 | 2.7096 | 0.2446 | 2.7096 | 1.6461 |
| No log | 0.2926 | 170 | 2.6838 | 0.2696 | 2.6838 | 1.6382 |
| No log | 0.2960 | 172 | 2.3723 | 0.2841 | 2.3723 | 1.5402 |
| No log | 0.2995 | 174 | 1.9362 | 0.2987 | 1.9362 | 1.3915 |
| No log | 0.3029 | 176 | 1.7476 | 0.3440 | 1.7476 | 1.3220 |
| No log | 0.3064 | 178 | 1.7811 | 0.3150 | 1.7811 | 1.3346 |
| No log | 0.3098 | 180 | 2.2615 | 0.2793 | 2.2615 | 1.5038 |
| No log | 0.3133 | 182 | 2.3796 | 0.2771 | 2.3796 | 1.5426 |
| No log | 0.3167 | 184 | 2.0942 | 0.2745 | 2.0942 | 1.4471 |
| No log | 0.3201 | 186 | 2.2800 | 0.2723 | 2.2800 | 1.5100 |
| No log | 0.3236 | 188 | 2.0412 | 0.3294 | 2.0412 | 1.4287 |
| No log | 0.3270 | 190 | 1.6894 | 0.3105 | 1.6894 | 1.2998 |
| No log | 0.3305 | 192 | 1.7076 | 0.3105 | 1.7076 | 1.3067 |
| No log | 0.3339 | 194 | 2.0356 | 0.3312 | 2.0356 | 1.4267 |
| No log | 0.3373 | 196 | 2.1710 | 0.3070 | 2.1710 | 1.4734 |
| No log | 0.3408 | 198 | 2.4053 | 0.2794 | 2.4053 | 1.5509 |
| No log | 0.3442 | 200 | 2.1154 | 0.3070 | 2.1154 | 1.4545 |
| No log | 0.3477 | 202 | 1.7370 | 0.3527 | 1.7370 | 1.3180 |
| No log | 0.3511 | 204 | 1.9232 | 0.3359 | 1.9232 | 1.3868 |
| No log | 0.3546 | 206 | 2.0638 | 0.2987 | 2.0638 | 1.4366 |
| No log | 0.3580 | 208 | 2.2056 | 0.2870 | 2.2056 | 1.4851 |
| No log | 0.3614 | 210 | 2.0524 | 0.3359 | 2.0524 | 1.4326 |
| No log | 0.3649 | 212 | 2.0940 | 0.2708 | 2.0940 | 1.4471 |
| No log | 0.3683 | 214 | 2.2624 | 0.2870 | 2.2624 | 1.5041 |
| No log | 0.3718 | 216 | 2.4150 | 0.2093 | 2.4150 | 1.5540 |
| No log | 0.3752 | 218 | 2.2586 | 0.2644 | 2.2586 | 1.5029 |
| No log | 0.3787 | 220 | 1.8588 | 0.3359 | 1.8588 | 1.3634 |
| No log | 0.3821 | 222 | 2.0419 | 0.2969 | 2.0419 | 1.4289 |
| No log | 0.3855 | 224 | 2.7439 | 0.2256 | 2.7439 | 1.6565 |
| No log | 0.3890 | 226 | 2.8497 | 0.2410 | 2.8497 | 1.6881 |
| No log | 0.3924 | 228 | 2.2876 | 0.2817 | 2.2876 | 1.5125 |
| No log | 0.3959 | 230 | 1.7035 | 0.3859 | 1.7035 | 1.3052 |
| No log | 0.3993 | 232 | 1.6196 | 0.2823 | 1.6196 | 1.2726 |
| No log | 0.4028 | 234 | 1.9294 | 0.2664 | 1.9294 | 1.3890 |
| No log | 0.4062 | 236 | 2.3813 | 0.2224 | 2.3813 | 1.5431 |
| No log | 0.4096 | 238 | 2.8086 | 0.2395 | 2.8086 | 1.6759 |
| No log | 0.4131 | 240 | 2.6978 | 0.2398 | 2.6978 | 1.6425 |
| No log | 0.4165 | 242 | 2.1492 | 0.3347 | 2.1492 | 1.4660 |
| No log | 0.4200 | 244 | 1.6078 | 0.2795 | 1.6078 | 1.2680 |
| No log | 0.4234 | 246 | 1.5253 | 0.2795 | 1.5253 | 1.2350 |
| No log | 0.4269 | 248 | 1.7395 | 0.2983 | 1.7395 | 1.3189 |
| No log | 0.4303 | 250 | 2.0592 | 0.2307 | 2.0592 | 1.4350 |
| No log | 0.4337 | 252 | 2.1575 | 0.1931 | 2.1575 | 1.4689 |
| No log | 0.4372 | 254 | 2.1722 | 0.2345 | 2.1722 | 1.4738 |
| No log | 0.4406 | 256 | 2.1686 | 0.2817 | 2.1686 | 1.4726 |
| No log | 0.4441 | 258 | 2.2999 | 0.2817 | 2.2999 | 1.5165 |
| No log | 0.4475 | 260 | 2.6142 | 0.25 | 2.6142 | 1.6169 |
| No log | 0.4509 | 262 | 2.6809 | 0.2794 | 2.6809 | 1.6374 |
| No log | 0.4544 | 264 | 2.5426 | 0.2961 | 2.5426 | 1.5945 |
| No log | 0.4578 | 266 | 1.9857 | 0.2969 | 1.9857 | 1.4091 |
| No log | 0.4613 | 268 | 1.8400 | 0.2764 | 1.8400 | 1.3565 |
| No log | 0.4647 | 270 | 1.9456 | 0.3142 | 1.9456 | 1.3948 |
| No log | 0.4682 | 272 | 2.5192 | 0.3258 | 2.5192 | 1.5872 |
| No log | 0.4716 | 274 | 2.6759 | 0.2927 | 2.6759 | 1.6358 |
| No log | 0.4750 | 276 | 2.3693 | 0.3128 | 2.3693 | 1.5392 |
| No log | 0.4785 | 278 | 2.1770 | 0.2817 | 2.1770 | 1.4755 |
| No log | 0.4819 | 280 | 1.8499 | 0.3255 | 1.8499 | 1.3601 |
| No log | 0.4854 | 282 | 1.7655 | 0.3005 | 1.7655 | 1.3287 |
| No log | 0.4888 | 284 | 2.1795 | 0.2969 | 2.1795 | 1.4763 |
| No log | 0.4923 | 286 | 2.7589 | 0.2551 | 2.7589 | 1.6610 |
| No log | 0.4957 | 288 | 2.5904 | 0.2360 | 2.5904 | 1.6095 |
| No log | 0.4991 | 290 | 2.1812 | 0.2657 | 2.1812 | 1.4769 |
| No log | 0.5026 | 292 | 1.8935 | 0.2393 | 1.8935 | 1.3761 |
| No log | 0.5060 | 294 | 1.7355 | 0.2282 | 1.7355 | 1.3174 |
| No log | 0.5095 | 296 | 1.5363 | 0.2761 | 1.5363 | 1.2395 |
| No log | 0.5129 | 298 | 1.6008 | 0.3137 | 1.6008 | 1.2652 |
| No log | 0.5164 | 300 | 1.9595 | 0.3313 | 1.9595 | 1.3998 |
| No log | 0.5198 | 302 | 2.3996 | 0.3380 | 2.3996 | 1.5491 |
| No log | 0.5232 | 304 | 2.5672 | 0.3101 | 2.5672 | 1.6022 |
| No log | 0.5267 | 306 | 2.3328 | 0.2670 | 2.3328 | 1.5273 |
| No log | 0.5301 | 308 | 1.8711 | 0.2445 | 1.8711 | 1.3679 |
| No log | 0.5336 | 310 | 1.7083 | 0.2760 | 1.7083 | 1.3070 |
| No log | 0.5370 | 312 | 1.7169 | 0.2760 | 1.7169 | 1.3103 |
| No log | 0.5404 | 314 | 1.9480 | 0.2307 | 1.9480 | 1.3957 |
| No log | 0.5439 | 316 | 2.2159 | 0.2817 | 2.2159 | 1.4886 |
| No log | 0.5473 | 318 | 2.5236 | 0.2927 | 2.5236 | 1.5886 |
| No log | 0.5508 | 320 | 2.2190 | 0.3110 | 2.2190 | 1.4896 |
| No log | 0.5542 | 322 | 1.7829 | 0.3417 | 1.7829 | 1.3352 |
| No log | 0.5577 | 324 | 1.9497 | 0.3559 | 1.9497 | 1.3963 |
| No log | 0.5611 | 326 | 2.4383 | 0.3157 | 2.4383 | 1.5615 |
| No log | 0.5645 | 328 | 2.2803 | 0.3336 | 2.2803 | 1.5101 |
| No log | 0.5680 | 330 | 1.6275 | 0.3639 | 1.6275 | 1.2758 |
| No log | 0.5714 | 332 | 1.2882 | 0.3152 | 1.2882 | 1.1350 |
| No log | 0.5749 | 334 | 1.4237 | 0.3971 | 1.4237 | 1.1932 |
| No log | 0.5783 | 336 | 1.8259 | 0.3855 | 1.8259 | 1.3513 |
| No log | 0.5818 | 338 | 2.1515 | 0.3421 | 2.1515 | 1.4668 |
| No log | 0.5852 | 340 | 2.1293 | 0.3822 | 2.1293 | 1.4592 |
| No log | 0.5886 | 342 | 1.7096 | 0.3010 | 1.7096 | 1.3075 |
| No log | 0.5921 | 344 | 1.2817 | 0.3289 | 1.2817 | 1.1321 |
| No log | 0.5955 | 346 | 1.1801 | 0.2597 | 1.1801 | 1.0863 |
| No log | 0.5990 | 348 | 1.3485 | 0.2901 | 1.3485 | 1.1613 |
| No log | 0.6024 | 350 | 1.8704 | 0.3359 | 1.8704 | 1.3676 |
| No log | 0.6059 | 352 | 2.4414 | 0.3147 | 2.4414 | 1.5625 |
| No log | 0.6093 | 354 | 2.5878 | 0.3147 | 2.5878 | 1.6087 |
| No log | 0.6127 | 356 | 2.1855 | 0.3392 | 2.1855 | 1.4783 |
| No log | 0.6162 | 358 | 1.8918 | 0.3762 | 1.8918 | 1.3754 |
| No log | 0.6196 | 360 | 1.8358 | 0.3926 | 1.8358 | 1.3549 |
| No log | 0.6231 | 362 | 1.6891 | 0.2791 | 1.6891 | 1.2996 |
| No log | 0.6265 | 364 | 1.6238 | 0.2445 | 1.6238 | 1.2743 |
| No log | 0.6299 | 366 | 1.6275 | 0.2445 | 1.6275 | 1.2757 |
| No log | 0.6334 | 368 | 1.9186 | 0.3364 | 1.9186 | 1.3851 |
| No log | 0.6368 | 370 | 2.2531 | 0.3110 | 2.2531 | 1.5010 |
| No log | 0.6403 | 372 | 2.1360 | 0.3110 | 2.1360 | 1.4615 |
| No log | 0.6437 | 374 | 1.8123 | 0.3364 | 1.8123 | 1.3462 |
| No log | 0.6472 | 376 | 1.7528 | 0.2820 | 1.7528 | 1.3239 |
| No log | 0.6506 | 378 | 1.7675 | 0.3286 | 1.7675 | 1.3295 |
| No log | 0.6540 | 380 | 1.6564 | 0.2465 | 1.6564 | 1.2870 |
| No log | 0.6575 | 382 | 1.8730 | 0.3263 | 1.8730 | 1.3686 |
| No log | 0.6609 | 384 | 1.8322 | 0.3263 | 1.8322 | 1.3536 |
| No log | 0.6644 | 386 | 1.5849 | 0.3544 | 1.5849 | 1.2589 |
| No log | 0.6678 | 388 | 1.6607 | 0.3571 | 1.6607 | 1.2887 |
| No log | 0.6713 | 390 | 1.9503 | 0.3346 | 1.9503 | 1.3965 |
| No log | 0.6747 | 392 | 2.2303 | 0.3096 | 2.2303 | 1.4934 |
| No log | 0.6781 | 394 | 1.9927 | 0.3190 | 1.9927 | 1.4116 |
| No log | 0.6816 | 396 | 1.6455 | 0.3627 | 1.6455 | 1.2828 |
| No log | 0.6850 | 398 | 1.6255 | 0.3627 | 1.6255 | 1.2749 |
| No log | 0.6885 | 400 | 1.6347 | 0.3627 | 1.6347 | 1.2786 |
| No log | 0.6919 | 402 | 1.6287 | 0.3093 | 1.6287 | 1.2762 |
| No log | 0.6954 | 404 | 1.8661 | 0.3125 | 1.8661 | 1.3661 |
| No log | 0.6988 | 406 | 2.1142 | 0.3110 | 2.1142 | 1.4540 |
| No log | 0.7022 | 408 | 1.9534 | 0.3364 | 1.9534 | 1.3976 |
| No log | 0.7057 | 410 | 1.5844 | 0.2367 | 1.5844 | 1.2587 |
| No log | 0.7091 | 412 | 1.5245 | 0.2520 | 1.5245 | 1.2347 |
| No log | 0.7126 | 414 | 1.6325 | 0.2850 | 1.6325 | 1.2777 |
| No log | 0.7160 | 416 | 1.8600 | 0.3364 | 1.8600 | 1.3638 |
| No log | 0.7194 | 418 | 1.9904 | 0.2793 | 1.9904 | 1.4108 |
| No log | 0.7229 | 420 | 1.8127 | 0.3508 | 1.8127 | 1.3464 |
| No log | 0.7263 | 422 | 1.5302 | 0.36 | 1.5302 | 1.2370 |
| No log | 0.7298 | 424 | 1.4558 | 0.3969 | 1.4558 | 1.2066 |
| No log | 0.7332 | 426 | 1.5762 | 0.4084 | 1.5762 | 1.2555 |
| No log | 0.7367 | 428 | 1.5618 | 0.4842 | 1.5618 | 1.2497 |
| No log | 0.7401 | 430 | 1.5308 | 0.4842 | 1.5308 | 1.2373 |
| No log | 0.7435 | 432 | 1.2554 | 0.4989 | 1.2554 | 1.1204 |
| No log | 0.7470 | 434 | 1.1263 | 0.5897 | 1.1263 | 1.0613 |
| No log | 0.7504 | 436 | 1.2339 | 0.4405 | 1.2339 | 1.1108 |
| No log | 0.7539 | 438 | 1.5229 | 0.3346 | 1.5229 | 1.2340 |
| No log | 0.7573 | 440 | 1.8585 | 0.3209 | 1.8585 | 1.3633 |
| No log | 0.7608 | 442 | 2.0492 | 0.2769 | 2.0492 | 1.4315 |
| No log | 0.7642 | 444 | 2.0638 | 0.2769 | 2.0638 | 1.4366 |
| No log | 0.7676 | 446 | 1.7495 | 0.2573 | 1.7495 | 1.3227 |
| No log | 0.7711 | 448 | 1.3177 | 0.3852 | 1.3177 | 1.1479 |
| No log | 0.7745 | 450 | 1.0818 | 0.4087 | 1.0818 | 1.0401 |
| No log | 0.7780 | 452 | 1.0347 | 0.4249 | 1.0347 | 1.0172 |
| No log | 0.7814 | 454 | 1.0909 | 0.4928 | 1.0909 | 1.0445 |
| No log | 0.7849 | 456 | 1.1828 | 0.5099 | 1.1828 | 1.0875 |
| No log | 0.7883 | 458 | 1.2750 | 0.4758 | 1.2750 | 1.1292 |
| No log | 0.7917 | 460 | 1.2022 | 0.4663 | 1.2022 | 1.0964 |
| No log | 0.7952 | 462 | 1.1787 | 0.4988 | 1.1787 | 1.0857 |
| No log | 0.7986 | 464 | 1.2584 | 0.4964 | 1.2584 | 1.1218 |
| No log | 0.8021 | 466 | 1.4060 | 0.4604 | 1.4060 | 1.1858 |
| No log | 0.8055 | 468 | 1.5645 | 0.4398 | 1.5645 | 1.2508 |
| No log | 0.8090 | 470 | 1.4719 | 0.4290 | 1.4719 | 1.2132 |
| No log | 0.8124 | 472 | 1.3267 | 0.4737 | 1.3267 | 1.1518 |
| No log | 0.8158 | 474 | 1.3999 | 0.4921 | 1.3999 | 1.1832 |
| No log | 0.8193 | 476 | 1.6839 | 0.3964 | 1.6839 | 1.2977 |
| No log | 0.8227 | 478 | 1.8703 | 0.3609 | 1.8703 | 1.3676 |
| No log | 0.8262 | 480 | 1.9009 | 0.3621 | 1.9009 | 1.3787 |
| No log | 0.8296 | 482 | 1.7456 | 0.3758 | 1.7456 | 1.3212 |
| No log | 0.8330 | 484 | 1.4884 | 0.4154 | 1.4884 | 1.2200 |
| No log | 0.8365 | 486 | 1.4027 | 0.4015 | 1.4027 | 1.1844 |
| No log | 0.8399 | 488 | 1.2739 | 0.4403 | 1.2739 | 1.1287 |
| No log | 0.8434 | 490 | 1.2096 | 0.4302 | 1.2096 | 1.0998 |
| No log | 0.8468 | 492 | 1.2662 | 0.4520 | 1.2662 | 1.1252 |
| No log | 0.8503 | 494 | 1.4308 | 0.4528 | 1.4308 | 1.1961 |
| No log | 0.8537 | 496 | 1.8566 | 0.4342 | 1.8566 | 1.3626 |
| No log | 0.8571 | 498 | 2.1939 | 0.3517 | 2.1939 | 1.4812 |
| 0.4688 | 0.8606 | 500 | 1.9529 | 0.3417 | 1.9529 | 1.3975 |
| 0.4688 | 0.8640 | 502 | 1.5336 | 0.4808 | 1.5336 | 1.2384 |
| 0.4688 | 0.8675 | 504 | 1.1322 | 0.3625 | 1.1322 | 1.0641 |
| 0.4688 | 0.8709 | 506 | 1.0754 | 0.4261 | 1.0754 | 1.0370 |
| 0.4688 | 0.8744 | 508 | 1.2328 | 0.3886 | 1.2328 | 1.1103 |
| 0.4688 | 0.8778 | 510 | 1.5741 | 0.48 | 1.5741 | 1.2546 |
| 0.4688 | 0.8812 | 512 | 1.6813 | 0.4421 | 1.6813 | 1.2966 |
| 0.4688 | 0.8847 | 514 | 1.6334 | 0.4523 | 1.6334 | 1.2780 |
| 0.4688 | 0.8881 | 516 | 1.3132 | 0.4941 | 1.3132 | 1.1460 |
| 0.4688 | 0.8916 | 518 | 0.9364 | 0.5931 | 0.9364 | 0.9677 |
| 0.4688 | 0.8950 | 520 | 0.8444 | 0.5459 | 0.8444 | 0.9189 |
| 0.4688 | 0.8985 | 522 | 0.9431 | 0.5931 | 0.9431 | 0.9711 |
| 0.4688 | 0.9019 | 524 | 1.3420 | 0.4634 | 1.3420 | 1.1585 |
| 0.4688 | 0.9053 | 526 | 1.5949 | 0.4341 | 1.5949 | 1.2629 |
| 0.4688 | 0.9088 | 528 | 1.4213 | 0.4512 | 1.4213 | 1.1922 |
| 0.4688 | 0.9122 | 530 | 1.2847 | 0.4824 | 1.2847 | 1.1334 |
| 0.4688 | 0.9157 | 532 | 1.3513 | 0.4816 | 1.3513 | 1.1624 |
| 0.4688 | 0.9191 | 534 | 1.3507 | 0.4512 | 1.3507 | 1.1622 |
| 0.4688 | 0.9225 | 536 | 1.5477 | 0.4238 | 1.5477 | 1.2441 |
| 0.4688 | 0.9260 | 538 | 1.7203 | 0.3895 | 1.7203 | 1.3116 |
| 0.4688 | 0.9294 | 540 | 1.8990 | 0.3759 | 1.8990 | 1.3780 |
| 0.4688 | 0.9329 | 542 | 1.7199 | 0.3903 | 1.7199 | 1.3114 |
| 0.4688 | 0.9363 | 544 | 1.7159 | 0.3903 | 1.7159 | 1.3099 |
| 0.4688 | 0.9398 | 546 | 1.5639 | 0.4238 | 1.5639 | 1.2506 |
| 0.4688 | 0.9432 | 548 | 1.4740 | 0.3979 | 1.4740 | 1.2141 |
| 0.4688 | 0.9466 | 550 | 1.5753 | 0.4065 | 1.5753 | 1.2551 |
| 0.4688 | 0.9501 | 552 | 1.8731 | 0.375 | 1.8731 | 1.3686 |
| 0.4688 | 0.9535 | 554 | 2.1090 | 0.3457 | 2.1090 | 1.4522 |
| 0.4688 | 0.9570 | 556 | 2.1962 | 0.3206 | 2.1962 | 1.4820 |
| 0.4688 | 0.9604 | 558 | 2.1616 | 0.3206 | 2.1616 | 1.4702 |
| 0.4688 | 0.9639 | 560 | 1.8757 | 0.3417 | 1.8757 | 1.3695 |
| 0.4688 | 0.9673 | 562 | 1.7337 | 0.2969 | 1.7337 | 1.3167 |
| 0.4688 | 0.9707 | 564 | 1.6335 | 0.3386 | 1.6335 | 1.2781 |
| 0.4688 | 0.9742 | 566 | 1.5915 | 0.3726 | 1.5915 | 1.2615 |
| 0.4688 | 0.9776 | 568 | 1.6363 | 0.3810 | 1.6363 | 1.2792 |
| 0.4688 | 0.9811 | 570 | 1.5811 | 0.4050 | 1.5811 | 1.2574 |
| 0.4688 | 0.9845 | 572 | 1.8175 | 0.3837 | 1.8175 | 1.3482 |
| 0.4688 | 0.9880 | 574 | 1.7621 | 0.4000 | 1.7621 | 1.3274 |
| 0.4688 | 0.9914 | 576 | 1.9475 | 0.3837 | 1.9475 | 1.3955 |
| 0.4688 | 0.9948 | 578 | 1.9045 | 0.3837 | 1.9045 | 1.3801 |
| 0.4688 | 0.9983 | 580 | 1.5508 | 0.4206 | 1.5508 | 1.2453 |
| 0.4688 | 1.0017 | 582 | 1.2107 | 0.4136 | 1.2107 | 1.1003 |
| 0.4688 | 1.0052 | 584 | 0.9530 | 0.5733 | 0.9530 | 0.9762 |
| 0.4688 | 1.0086 | 586 | 0.8999 | 0.55 | 0.8999 | 0.9487 |
| 0.4688 | 1.0120 | 588 | 1.0234 | 0.5733 | 1.0234 | 1.0116 |
| 0.4688 | 1.0155 | 590 | 1.4276 | 0.48 | 1.4276 | 1.1948 |
| 0.4688 | 1.0189 | 592 | 2.0474 | 0.3286 | 2.0474 | 1.4309 |
| 0.4688 | 1.0224 | 594 | 2.1982 | 0.3058 | 2.1982 | 1.4826 |
| 0.4688 | 1.0258 | 596 | 1.9235 | 0.3346 | 1.9235 | 1.3869 |
| 0.4688 | 1.0293 | 598 | 1.3645 | 0.4952 | 1.3645 | 1.1681 |
| 0.4688 | 1.0327 | 600 | 1.1775 | 0.5013 | 1.1775 | 1.0851 |
| 0.4688 | 1.0361 | 602 | 1.1849 | 0.5123 | 1.1849 | 1.0885 |
| 0.4688 | 1.0396 | 604 | 1.2860 | 0.4964 | 1.2860 | 1.1340 |
| 0.4688 | 1.0430 | 606 | 1.2395 | 0.4632 | 1.2395 | 1.1133 |
| 0.4688 | 1.0465 | 608 | 1.2580 | 0.4632 | 1.2580 | 1.1216 |
| 0.4688 | 1.0499 | 610 | 1.3838 | 0.4824 | 1.3838 | 1.1764 |
| 0.4688 | 1.0534 | 612 | 1.3487 | 0.5071 | 1.3487 | 1.1613 |
| 0.4688 | 1.0568 | 614 | 1.4260 | 0.5046 | 1.4260 | 1.1941 |
| 0.4688 | 1.0602 | 616 | 1.5603 | 0.4342 | 1.5603 | 1.2491 |
| 0.4688 | 1.0637 | 618 | 1.4399 | 0.5303 | 1.4399 | 1.2000 |
| 0.4688 | 1.0671 | 620 | 1.4308 | 0.5303 | 1.4308 | 1.1961 |
| 0.4688 | 1.0706 | 622 | 1.6467 | 0.4618 | 1.6467 | 1.2832 |
| 0.4688 | 1.0740 | 624 | 1.7846 | 0.3689 | 1.7846 | 1.3359 |
| 0.4688 | 1.0775 | 626 | 1.7537 | 0.3895 | 1.7537 | 1.3243 |
| 0.4688 | 1.0809 | 628 | 1.8000 | 0.3508 | 1.8000 | 1.3416 |
| 0.4688 | 1.0843 | 630 | 1.6521 | 0.4082 | 1.6521 | 1.2853 |
| 0.4688 | 1.0878 | 632 | 1.4026 | 0.4150 | 1.4026 | 1.1843 |
| 0.4688 | 1.0912 | 634 | 1.2383 | 0.4162 | 1.2383 | 1.1128 |
| 0.4688 | 1.0947 | 636 | 1.2379 | 0.4302 | 1.2379 | 1.1126 |
| 0.4688 | 1.0981 | 638 | 1.4804 | 0.4931 | 1.4804 | 1.2167 |
| 0.4688 | 1.1015 | 640 | 1.9837 | 0.3077 | 1.9837 | 1.4084 |
| 0.4688 | 1.1050 | 642 | 2.3142 | 0.3111 | 2.3142 | 1.5213 |
| 0.4688 | 1.1084 | 644 | 2.2089 | 0.3111 | 2.2089 | 1.4862 |
| 0.4688 | 1.1119 | 646 | 1.7429 | 0.4441 | 1.7429 | 1.3202 |
| 0.4688 | 1.1153 | 648 | 1.3058 | 0.4867 | 1.3058 | 1.1427 |
| 0.4688 | 1.1188 | 650 | 1.1453 | 0.4050 | 1.1453 | 1.0702 |
| 0.4688 | 1.1222 | 652 | 1.1783 | 0.3660 | 1.1783 | 1.0855 |
| 0.4688 | 1.1256 | 654 | 1.4268 | 0.4499 | 1.4268 | 1.1945 |
| 0.4688 | 1.1291 | 656 | 1.8679 | 0.3263 | 1.8679 | 1.3667 |
| 0.4688 | 1.1325 | 658 | 2.0844 | 0.3077 | 2.0844 | 1.4438 |
| 0.4688 | 1.1360 | 660 | 1.9268 | 0.3263 | 1.9268 | 1.3881 |
| 0.4688 | 1.1394 | 662 | 1.6032 | 0.4318 | 1.6032 | 1.2662 |
| 0.4688 | 1.1429 | 664 | 1.2319 | 0.4653 | 1.2319 | 1.1099 |
| 0.4688 | 1.1463 | 666 | 1.0912 | 0.4699 | 1.0912 | 1.0446 |
| 0.4688 | 1.1497 | 668 | 1.1504 | 0.4794 | 1.1504 | 1.0726 |
| 0.4688 | 1.1532 | 670 | 1.3568 | 0.4613 | 1.3568 | 1.1648 |
| 0.4688 | 1.1566 | 672 | 1.6972 | 0.3966 | 1.6972 | 1.3028 |
| 0.4688 | 1.1601 | 674 | 1.8359 | 0.3286 | 1.8359 | 1.3549 |
| 0.4688 | 1.1635 | 676 | 1.7075 | 0.3495 | 1.7075 | 1.3067 |
| 0.4688 | 1.1670 | 678 | 1.4275 | 0.3423 | 1.4275 | 1.1948 |
| 0.4688 | 1.1704 | 680 | 1.3039 | 0.3589 | 1.3039 | 1.1419 |
| 0.4688 | 1.1738 | 682 | 1.3535 | 0.3423 | 1.3535 | 1.1634 |
| 0.4688 | 1.1773 | 684 | 1.4162 | 0.3423 | 1.4162 | 1.1900 |
| 0.4688 | 1.1807 | 686 | 1.4481 | 0.3587 | 1.4481 | 1.2034 |
| 0.4688 | 1.1842 | 688 | 1.6275 | 0.3710 | 1.6275 | 1.2758 |
| 0.4688 | 1.1876 | 690 | 2.0199 | 0.3128 | 2.0199 | 1.4212 |
| 0.4688 | 1.1910 | 692 | 2.1541 | 0.2974 | 2.1541 | 1.4677 |
| 0.4688 | 1.1945 | 694 | 1.9390 | 0.3488 | 1.9390 | 1.3925 |
| 0.4688 | 1.1979 | 696 | 1.7317 | 0.3508 | 1.7317 | 1.3159 |
| 0.4688 | 1.2014 | 698 | 1.6720 | 0.3651 | 1.6720 | 1.2930 |
| 0.4688 | 1.2048 | 700 | 1.4833 | 0.4277 | 1.4833 | 1.2179 |
| 0.4688 | 1.2083 | 702 | 1.2812 | 0.3886 | 1.2812 | 1.1319 |
| 0.4688 | 1.2117 | 704 | 1.3087 | 0.3886 | 1.3087 | 1.1440 |
| 0.4688 | 1.2151 | 706 | 1.5113 | 0.4499 | 1.5113 | 1.2293 |
| 0.4688 | 1.2186 | 708 | 1.5872 | 0.4315 | 1.5872 | 1.2598 |
| 0.4688 | 1.2220 | 710 | 1.5091 | 0.4424 | 1.5091 | 1.2285 |
| 0.4688 | 1.2255 | 712 | 1.3816 | 0.4731 | 1.3816 | 1.1754 |
| 0.4688 | 1.2289 | 714 | 1.3766 | 0.4307 | 1.3766 | 1.1733 |
| 0.4688 | 1.2324 | 716 | 1.5279 | 0.4341 | 1.5279 | 1.2361 |
| 0.4688 | 1.2358 | 718 | 1.6499 | 0.3888 | 1.6499 | 1.2845 |
| 0.4688 | 1.2392 | 720 | 1.7388 | 0.3402 | 1.7388 | 1.3186 |
| 0.4688 | 1.2427 | 722 | 1.8172 | 0.3154 | 1.8172 | 1.3480 |
| 0.4688 | 1.2461 | 724 | 1.5898 | 0.3631 | 1.5898 | 1.2609 |
| 0.4688 | 1.2496 | 726 | 1.4515 | 0.3142 | 1.4515 | 1.2048 |
| 0.4688 | 1.2530 | 728 | 1.5786 | 0.2918 | 1.5786 | 1.2564 |
| 0.4688 | 1.2565 | 730 | 1.9029 | 0.2766 | 1.9029 | 1.3795 |
| 0.4688 | 1.2599 | 732 | 2.1169 | 0.2933 | 2.1169 | 1.4549 |
| 0.4688 | 1.2633 | 734 | 2.0363 | 0.3321 | 2.0363 | 1.4270 |
| 0.4688 | 1.2668 | 736 | 1.8432 | 0.3028 | 1.8432 | 1.3576 |
| 0.4688 | 1.2702 | 738 | 1.6426 | 0.3647 | 1.6426 | 1.2816 |
| 0.4688 | 1.2737 | 740 | 1.4735 | 0.4008 | 1.4735 | 1.2139 |
| 0.4688 | 1.2771 | 742 | 1.4606 | 0.4008 | 1.4606 | 1.2086 |
| 0.4688 | 1.2806 | 744 | 1.5299 | 0.4008 | 1.5299 | 1.2369 |
| 0.4688 | 1.2840 | 746 | 1.6040 | 0.3726 | 1.6040 | 1.2665 |
| 0.4688 | 1.2874 | 748 | 1.5435 | 0.4375 | 1.5435 | 1.2424 |
| 0.4688 | 1.2909 | 750 | 1.5961 | 0.3950 | 1.5961 | 1.2633 |
| 0.4688 | 1.2943 | 752 | 1.7206 | 0.37 | 1.7206 | 1.3117 |
| 0.4688 | 1.2978 | 754 | 1.7856 | 0.37 | 1.7856 | 1.3363 |
| 0.4688 | 1.3012 | 756 | 1.6573 | 0.4007 | 1.6573 | 1.2873 |
| 0.4688 | 1.3046 | 758 | 1.4766 | 0.3925 | 1.4766 | 1.2151 |
| 0.4688 | 1.3081 | 760 | 1.2982 | 0.3747 | 1.2982 | 1.1394 |
| 0.4688 | 1.3115 | 762 | 1.3676 | 0.3906 | 1.3676 | 1.1695 |
| 0.4688 | 1.3150 | 764 | 1.4551 | 0.3925 | 1.4551 | 1.2063 |
| 0.4688 | 1.3184 | 766 | 1.6262 | 0.3868 | 1.6262 | 1.2752 |
| 0.4688 | 1.3219 | 768 | 1.6860 | 0.4055 | 1.6860 | 1.2985 |
| 0.4688 | 1.3253 | 770 | 1.5930 | 0.4044 | 1.5930 | 1.2621 |
| 0.4688 | 1.3287 | 772 | 1.6544 | 0.4236 | 1.6544 | 1.2862 |
| 0.4688 | 1.3322 | 774 | 1.6105 | 0.3868 | 1.6105 | 1.2691 |
| 0.4688 | 1.3356 | 776 | 1.5908 | 0.3148 | 1.5908 | 1.2613 |
| 0.4688 | 1.3391 | 778 | 1.7170 | 0.3116 | 1.7170 | 1.3104 |
| 0.4688 | 1.3425 | 780 | 1.6077 | 0.3239 | 1.6077 | 1.2679 |
| 0.4688 | 1.3460 | 782 | 1.3641 | 0.4141 | 1.3641 | 1.1679 |
| 0.4688 | 1.3494 | 784 | 1.2339 | 0.4017 | 1.2339 | 1.1108 |
| 0.4688 | 1.3528 | 786 | 1.2820 | 0.4307 | 1.2820 | 1.1323 |
| 0.4688 | 1.3563 | 788 | 1.5548 | 0.4528 | 1.5548 | 1.2469 |
| 0.4688 | 1.3597 | 790 | 1.6727 | 0.4422 | 1.6727 | 1.2933 |
| 0.4688 | 1.3632 | 792 | 1.5600 | 0.4620 | 1.5600 | 1.2490 |
| 0.4688 | 1.3666 | 794 | 1.4885 | 0.4425 | 1.4885 | 1.2201 |
| 0.4688 | 1.3701 | 796 | 1.5612 | 0.4620 | 1.5612 | 1.2495 |
| 0.4688 | 1.3735 | 798 | 1.7740 | 0.406 | 1.7740 | 1.3319 |
| 0.4688 | 1.3769 | 800 | 1.7458 | 0.4165 | 1.7458 | 1.3213 |
| 0.4688 | 1.3804 | 802 | 1.5169 | 0.3971 | 1.5169 | 1.2316 |
| 0.4688 | 1.3838 | 804 | 1.2853 | 0.4271 | 1.2853 | 1.1337 |
| 0.4688 | 1.3873 | 806 | 1.2290 | 0.4017 | 1.2290 | 1.1086 |
| 0.4688 | 1.3907 | 808 | 1.3119 | 0.4046 | 1.3119 | 1.1454 |
| 0.4688 | 1.3941 | 810 | 1.5154 | 0.4071 | 1.5154 | 1.2310 |
| 0.4688 | 1.3976 | 812 | 1.4445 | 0.4190 | 1.4445 | 1.2019 |
| 0.4688 | 1.4010 | 814 | 1.3094 | 0.3947 | 1.3094 | 1.1443 |
| 0.4688 | 1.4045 | 816 | 1.2719 | 0.4167 | 1.2719 | 1.1278 |
| 0.4688 | 1.4079 | 818 | 1.3285 | 0.4755 | 1.3285 | 1.1526 |
| 0.4688 | 1.4114 | 820 | 1.4153 | 0.4014 | 1.4153 | 1.1897 |
| 0.4688 | 1.4148 | 822 | 1.4857 | 0.3962 | 1.4857 | 1.2189 |
| 0.4688 | 1.4182 | 824 | 1.6678 | 0.4162 | 1.6678 | 1.2914 |
| 0.4688 | 1.4217 | 826 | 1.6712 | 0.3864 | 1.6712 | 1.2928 |
| 0.4688 | 1.4251 | 828 | 1.5529 | 0.3962 | 1.5529 | 1.2462 |
| 0.4688 | 1.4286 | 830 | 1.3749 | 0.4523 | 1.3749 | 1.1726 |
| 0.4688 | 1.4320 | 832 | 1.3660 | 0.4039 | 1.3660 | 1.1688 |
| 0.4688 | 1.4355 | 834 | 1.4155 | 0.4046 | 1.4155 | 1.1897 |
| 0.4688 | 1.4389 | 836 | 1.4620 | 0.4520 | 1.4620 | 1.2091 |
| 0.4688 | 1.4423 | 838 | 1.5321 | 0.4261 | 1.5321 | 1.2378 |
| 0.4688 | 1.4458 | 840 | 1.3330 | 0.4762 | 1.3330 | 1.1546 |
| 0.4688 | 1.4492 | 842 | 1.3025 | 0.4661 | 1.3025 | 1.1413 |
| 0.4688 | 1.4527 | 844 | 1.5564 | 0.4722 | 1.5564 | 1.2476 |
| 0.4688 | 1.4561 | 846 | 2.0247 | 0.3698 | 2.0247 | 1.4229 |
| 0.4688 | 1.4596 | 848 | 2.1273 | 0.3708 | 2.1273 | 1.4585 |
| 0.4688 | 1.4630 | 850 | 1.9054 | 0.3689 | 1.9054 | 1.3804 |
| 0.4688 | 1.4664 | 852 | 1.5710 | 0.4261 | 1.5710 | 1.2534 |
| 0.4688 | 1.4699 | 854 | 1.4262 | 0.4402 | 1.4262 | 1.1943 |
| 0.4688 | 1.4733 | 856 | 1.2588 | 0.4404 | 1.2588 | 1.1220 |
| 0.4688 | 1.4768 | 858 | 1.2607 | 0.4404 | 1.2607 | 1.1228 |
| 0.4688 | 1.4802 | 860 | 1.3998 | 0.4183 | 1.3998 | 1.1831 |
| 0.4688 | 1.4836 | 862 | 1.5787 | 0.4543 | 1.5787 | 1.2565 |
| 0.4688 | 1.4871 | 864 | 1.5420 | 0.4543 | 1.5420 | 1.2418 |
| 0.4688 | 1.4905 | 866 | 1.3129 | 0.4836 | 1.3129 | 1.1458 |
| 0.4688 | 1.4940 | 868 | 1.2073 | 0.4762 | 1.2073 | 1.0988 |
| 0.4688 | 1.4974 | 870 | 1.2710 | 0.4431 | 1.2710 | 1.1274 |
| 0.4688 | 1.5009 | 872 | 1.5552 | 0.4424 | 1.5552 | 1.2471 |
| 0.4688 | 1.5043 | 874 | 1.6955 | 0.4214 | 1.6955 | 1.3021 |
| 0.4688 | 1.5077 | 876 | 1.7835 | 0.3482 | 1.7835 | 1.3355 |
| 0.4688 | 1.5112 | 878 | 1.7047 | 0.2792 | 1.7047 | 1.3056 |
| 0.4688 | 1.5146 | 880 | 1.5356 | 0.2482 | 1.5356 | 1.2392 |
| 0.4688 | 1.5181 | 882 | 1.3337 | 0.4268 | 1.3337 | 1.1549 |
| 0.4688 | 1.5215 | 884 | 1.3063 | 0.4268 | 1.3063 | 1.1429 |
| 0.4688 | 1.5250 | 886 | 1.2376 | 0.4665 | 1.2376 | 1.1125 |
| 0.4688 | 1.5284 | 888 | 1.3012 | 0.4302 | 1.3012 | 1.1407 |
| 0.4688 | 1.5318 | 890 | 1.2634 | 0.4302 | 1.2634 | 1.1240 |
| 0.4688 | 1.5353 | 892 | 1.2458 | 0.4302 | 1.2458 | 1.1162 |
| 0.4688 | 1.5387 | 894 | 1.2916 | 0.4404 | 1.2916 | 1.1365 |
| 0.4688 | 1.5422 | 896 | 1.3603 | 0.4273 | 1.3603 | 1.1663 |
| 0.4688 | 1.5456 | 898 | 1.4278 | 0.3095 | 1.4278 | 1.1949 |
| 0.4688 | 1.5491 | 900 | 1.4739 | 0.3738 | 1.4739 | 1.2140 |
| 0.4688 | 1.5525 | 902 | 1.5234 | 0.4314 | 1.5234 | 1.2343 |
| 0.4688 | 1.5559 | 904 | 1.5670 | 0.4084 | 1.5670 | 1.2518 |
| 0.4688 | 1.5594 | 906 | 1.5390 | 0.4364 | 1.5390 | 1.2406 |
| 0.4688 | 1.5628 | 908 | 1.4714 | 0.4545 | 1.4714 | 1.2130 |
| 0.4688 | 1.5663 | 910 | 1.2508 | 0.4302 | 1.2508 | 1.1184 |
| 0.4688 | 1.5697 | 912 | 1.1210 | 0.4677 | 1.1210 | 1.0588 |
| 0.4688 | 1.5731 | 914 | 1.1658 | 0.4302 | 1.1658 | 1.0797 |
| 0.4688 | 1.5766 | 916 | 1.2676 | 0.4627 | 1.2676 | 1.1259 |
| 0.4688 | 1.5800 | 918 | 1.3724 | 0.4183 | 1.3724 | 1.1715 |
| 0.4688 | 1.5835 | 920 | 1.6045 | 0.3371 | 1.6045 | 1.2667 |
| 0.4688 | 1.5869 | 922 | 1.6333 | 0.3371 | 1.6333 | 1.2780 |
| 0.4688 | 1.5904 | 924 | 1.5342 | 0.3386 | 1.5342 | 1.2386 |
| 0.4688 | 1.5938 | 926 | 1.2998 | 0.3916 | 1.2998 | 1.1401 |
| 0.4688 | 1.5972 | 928 | 1.0874 | 0.4522 | 1.0874 | 1.0428 |
| 0.4688 | 1.6007 | 930 | 1.1251 | 0.4870 | 1.1251 | 1.0607 |
| 0.4688 | 1.6041 | 932 | 1.4330 | 0.4718 | 1.4330 | 1.1971 |
| 0.4688 | 1.6076 | 934 | 1.8385 | 0.3945 | 1.8385 | 1.3559 |
| 0.4688 | 1.6110 | 936 | 2.0176 | 0.3945 | 2.0176 | 1.4204 |
| 0.4688 | 1.6145 | 938 | 1.8903 | 0.3945 | 1.8903 | 1.3749 |
| 0.4688 | 1.6179 | 940 | 1.5940 | 0.3661 | 1.5940 | 1.2625 |
| 0.4688 | 1.6213 | 942 | 1.2513 | 0.4130 | 1.2513 | 1.1186 |
| 0.4688 | 1.6248 | 944 | 1.0603 | 0.4802 | 1.0603 | 1.0297 |
| 0.4688 | 1.6282 | 946 | 1.0268 | 0.5093 | 1.0268 | 1.0133 |
| 0.4688 | 1.6317 | 948 | 1.0309 | 0.5093 | 1.0309 | 1.0153 |
| 0.4688 | 1.6351 | 950 | 1.1086 | 0.4911 | 1.1086 | 1.0529 |
| 0.4688 | 1.6386 | 952 | 1.2897 | 0.4375 | 1.2897 | 1.1356 |
| 0.4688 | 1.6420 | 954 | 1.6755 | 0.3422 | 1.6755 | 1.2944 |
| 0.4688 | 1.6454 | 956 | 1.8763 | 0.3971 | 1.8763 | 1.3698 |
| 0.4688 | 1.6489 | 958 | 1.7734 | 0.3837 | 1.7734 | 1.3317 |
| 0.4688 | 1.6523 | 960 | 1.4072 | 0.4915 | 1.4072 | 1.1862 |
| 0.4688 | 1.6558 | 962 | 1.0570 | 0.4643 | 1.0570 | 1.0281 |
| 0.4688 | 1.6592 | 964 | 0.9462 | 0.55 | 0.9462 | 0.9727 |
| 0.4688 | 1.6627 | 966 | 0.9891 | 0.4941 | 0.9891 | 0.9945 |
| 0.4688 | 1.6661 | 968 | 1.2026 | 0.4632 | 1.2026 | 1.0967 |
| 0.4688 | 1.6695 | 970 | 1.4644 | 0.4065 | 1.4644 | 1.2101 |
| 0.4688 | 1.6730 | 972 | 1.5232 | 0.4071 | 1.5232 | 1.2342 |
| 0.4688 | 1.6764 | 974 | 1.3752 | 0.4504 | 1.3752 | 1.1727 |
| 0.4688 | 1.6799 | 976 | 1.2080 | 0.4515 | 1.2080 | 1.0991 |
| 0.4688 | 1.6833 | 978 | 1.2493 | 0.4756 | 1.2493 | 1.1177 |
| 0.4688 | 1.6867 | 980 | 1.3882 | 0.4627 | 1.3882 | 1.1782 |
| 0.4688 | 1.6902 | 982 | 1.5413 | 0.4071 | 1.5413 | 1.2415 |
| 0.4688 | 1.6936 | 984 | 1.5163 | 0.4071 | 1.5163 | 1.2314 |
| 0.4688 | 1.6971 | 986 | 1.5206 | 0.4496 | 1.5206 | 1.2331 |
| 0.4688 | 1.7005 | 988 | 1.4576 | 0.4617 | 1.4576 | 1.2073 |
| 0.4688 | 1.7040 | 990 | 1.3039 | 0.4860 | 1.3039 | 1.1419 |
| 0.4688 | 1.7074 | 992 | 1.3377 | 0.5182 | 1.3377 | 1.1566 |
| 0.4688 | 1.7108 | 994 | 1.5609 | 0.4716 | 1.5609 | 1.2494 |
| 0.4688 | 1.7143 | 996 | 1.7178 | 0.3939 | 1.7178 | 1.3106 |
| 0.4688 | 1.7177 | 998 | 1.7034 | 0.3939 | 1.7034 | 1.3051 |
| 0.149 | 1.7212 | 1000 | 1.4455 | 0.4931 | 1.4455 | 1.2023 |
| 0.149 | 1.7246 | 1002 | 1.1984 | 0.4405 | 1.1984 | 1.0947 |
| 0.149 | 1.7281 | 1004 | 1.0897 | 0.4147 | 1.0897 | 1.0439 |
| 0.149 | 1.7315 | 1006 | 1.1296 | 0.4263 | 1.1296 | 1.0628 |
| 0.149 | 1.7349 | 1008 | 1.1369 | 0.4263 | 1.1369 | 1.0663 |
| 0.149 | 1.7384 | 1010 | 1.2488 | 0.4405 | 1.2488 | 1.1175 |
| 0.149 | 1.7418 | 1012 | 1.3514 | 0.5182 | 1.3514 | 1.1625 |
| 0.149 | 1.7453 | 1014 | 1.2686 | 0.4767 | 1.2686 | 1.1263 |
| 0.149 | 1.7487 | 1016 | 1.2867 | 0.4405 | 1.2867 | 1.1343 |
| 0.149 | 1.7522 | 1018 | 1.4202 | 0.4824 | 1.4202 | 1.1917 |
| 0.149 | 1.7556 | 1020 | 1.4963 | 0.4494 | 1.4963 | 1.2233 |
| 0.149 | 1.7590 | 1022 | 1.4385 | 0.4499 | 1.4385 | 1.1994 |
| 0.149 | 1.7625 | 1024 | 1.2353 | 0.4017 | 1.2353 | 1.1114 |
| 0.149 | 1.7659 | 1026 | 1.0180 | 0.5248 | 1.0180 | 1.0090 |
| 0.149 | 1.7694 | 1028 | 0.9676 | 0.5124 | 0.9676 | 0.9836 |
| 0.149 | 1.7728 | 1030 | 1.0248 | 0.4955 | 1.0248 | 1.0123 |
| 0.149 | 1.7762 | 1032 | 1.2177 | 0.4632 | 1.2177 | 1.1035 |
| 0.149 | 1.7797 | 1034 | 1.5036 | 0.4400 | 1.5036 | 1.2262 |
| 0.149 | 1.7831 | 1036 | 1.5322 | 0.3737 | 1.5322 | 1.2378 |
| 0.149 | 1.7866 | 1038 | 1.3371 | 0.4964 | 1.3371 | 1.1563 |
| 0.149 | 1.7900 | 1040 | 1.0728 | 0.4435 | 1.0728 | 1.0358 |
| 0.149 | 1.7935 | 1042 | 0.9916 | 0.4691 | 0.9916 | 0.9958 |
| 0.149 | 1.7969 | 1044 | 0.9994 | 0.4691 | 0.9994 | 0.9997 |
| 0.149 | 1.8003 | 1046 | 1.0302 | 0.4296 | 1.0302 | 1.0150 |
| 0.149 | 1.8038 | 1048 | 1.0848 | 0.4671 | 1.0848 | 1.0415 |
| 0.149 | 1.8072 | 1050 | 1.2206 | 0.4632 | 1.2206 | 1.1048 |
| 0.149 | 1.8107 | 1052 | 1.1909 | 0.5 | 1.1909 | 1.0913 |
| 0.149 | 1.8141 | 1054 | 1.2360 | 0.4632 | 1.2360 | 1.1117 |
| 0.149 | 1.8176 | 1056 | 1.4200 | 0.4071 | 1.4200 | 1.1916 |
| 0.149 | 1.8210 | 1058 | 1.4350 | 0.4281 | 1.4350 | 1.1979 |
| 0.149 | 1.8244 | 1060 | 1.4023 | 0.4504 | 1.4023 | 1.1842 |
| 0.149 | 1.8279 | 1062 | 1.2843 | 0.4620 | 1.2843 | 1.1333 |
| 0.149 | 1.8313 | 1064 | 1.3306 | 0.4620 | 1.3306 | 1.1535 |
| 0.149 | 1.8348 | 1066 | 1.3281 | 0.4740 | 1.3281 | 1.1525 |
| 0.149 | 1.8382 | 1068 | 1.2999 | 0.4740 | 1.2999 | 1.1401 |
| 0.149 | 1.8417 | 1070 | 1.4658 | 0.4150 | 1.4658 | 1.2107 |
| 0.149 | 1.8451 | 1072 | 1.5228 | 0.3661 | 1.5228 | 1.2340 |
| 0.149 | 1.8485 | 1074 | 1.3926 | 0.4733 | 1.3926 | 1.1801 |
| 0.149 | 1.8520 | 1076 | 1.1624 | 0.4870 | 1.1624 | 1.0781 |
| 0.149 | 1.8554 | 1078 | 0.9644 | 0.5124 | 0.9644 | 0.9820 |
| 0.149 | 1.8589 | 1080 | 0.9331 | 0.5124 | 0.9331 | 0.9660 |
| 0.149 | 1.8623 | 1082 | 1.0385 | 0.5152 | 1.0385 | 1.0191 |
| 0.149 | 1.8657 | 1084 | 1.1443 | 0.4756 | 1.1443 | 1.0697 |
| 0.149 | 1.8692 | 1086 | 1.2724 | 0.4504 | 1.2724 | 1.1280 |
| 0.149 | 1.8726 | 1088 | 1.2321 | 0.4506 | 1.2321 | 1.1100 |
| 0.149 | 1.8761 | 1090 | 1.2764 | 0.4504 | 1.2764 | 1.1298 |
| 0.149 | 1.8795 | 1092 | 1.2129 | 0.4509 | 1.2129 | 1.1013 |
| 0.149 | 1.8830 | 1094 | 1.1622 | 0.4509 | 1.1622 | 1.0780 |
| 0.149 | 1.8864 | 1096 | 1.1102 | 0.4509 | 1.1102 | 1.0536 |
| 0.149 | 1.8898 | 1098 | 1.0626 | 0.4756 | 1.0626 | 1.0308 |
| 0.149 | 1.8933 | 1100 | 0.9408 | 0.5798 | 0.9408 | 0.9700 |
| 0.149 | 1.8967 | 1102 | 0.9134 | 0.5556 | 0.9134 | 0.9557 |
| 0.149 | 1.9002 | 1104 | 0.9998 | 0.5207 | 0.9998 | 0.9999 |
| 0.149 | 1.9036 | 1106 | 1.1679 | 0.4509 | 1.1679 | 1.0807 |
| 0.149 | 1.9071 | 1108 | 1.3642 | 0.4150 | 1.3642 | 1.1680 |
| 0.149 | 1.9105 | 1110 | 1.3913 | 0.4496 | 1.3913 | 1.1795 |
| 0.149 | 1.9139 | 1112 | 1.2435 | 0.4504 | 1.2435 | 1.1151 |
| 0.149 | 1.9174 | 1114 | 1.0253 | 0.4792 | 1.0253 | 1.0126 |
| 0.149 | 1.9208 | 1116 | 0.9623 | 0.5773 | 0.9623 | 0.9810 |
| 0.149 | 1.9243 | 1118 | 0.9784 | 0.5773 | 0.9784 | 0.9891 |
| 0.149 | 1.9277 | 1120 | 1.0394 | 0.4522 | 1.0394 | 1.0195 |
| 0.149 | 1.9312 | 1122 | 1.0517 | 0.4928 | 1.0517 | 1.0255 |
| 0.149 | 1.9346 | 1124 | 1.0337 | 0.4813 | 1.0337 | 1.0167 |
| 0.149 | 1.9380 | 1126 | 1.1180 | 0.4756 | 1.1180 | 1.0573 |
| 0.149 | 1.9415 | 1128 | 1.2006 | 0.5286 | 1.2006 | 1.0957 |
| 0.149 | 1.9449 | 1130 | 1.1814 | 0.5286 | 1.1814 | 1.0869 |
| 0.149 | 1.9484 | 1132 | 1.1420 | 0.4987 | 1.1420 | 1.0686 |
| 0.149 | 1.9518 | 1134 | 1.1066 | 0.4515 | 1.1066 | 1.0519 |
| 0.149 | 1.9552 | 1136 | 1.1037 | 0.4515 | 1.1037 | 1.0506 |
| 0.149 | 1.9587 | 1138 | 1.0419 | 0.4661 | 1.0419 | 1.0207 |
| 0.149 | 1.9621 | 1140 | 0.9225 | 0.5921 | 0.9225 | 0.9605 |
| 0.149 | 1.9656 | 1142 | 0.8651 | 0.5985 | 0.8651 | 0.9301 |
| 0.149 | 1.9690 | 1144 | 0.9150 | 0.6069 | 0.9150 | 0.9566 |
| 0.149 | 1.9725 | 1146 | 1.0752 | 0.4263 | 1.0752 | 1.0369 |
| 0.149 | 1.9759 | 1148 | 1.1912 | 0.5199 | 1.1912 | 1.0914 |
| 0.149 | 1.9793 | 1150 | 1.1552 | 0.5199 | 1.1552 | 1.0748 |
| 0.149 | 1.9828 | 1152 | 1.1442 | 0.5199 | 1.1442 | 1.0697 |
| 0.149 | 1.9862 | 1154 | 1.0659 | 0.5251 | 1.0659 | 1.0324 |
| 0.149 | 1.9897 | 1156 | 1.0146 | 0.5056 | 1.0146 | 1.0073 |
| 0.149 | 1.9931 | 1158 | 0.9603 | 0.5449 | 0.9603 | 0.9799 |
| 0.149 | 1.9966 | 1160 | 1.0351 | 0.5152 | 1.0351 | 1.0174 |
| 0.149 | 2.0 | 1162 | 1.1805 | 0.4748 | 1.1805 | 1.0865 |
| 0.149 | 2.0034 | 1164 | 1.1368 | 0.4748 | 1.1368 | 1.0662 |
| 0.149 | 2.0069 | 1166 | 1.1003 | 0.5014 | 1.1003 | 1.0489 |
| 0.149 | 2.0103 | 1168 | 1.1500 | 0.4620 | 1.1500 | 1.0724 |
| 0.149 | 2.0138 | 1170 | 1.3085 | 0.4501 | 1.3085 | 1.1439 |
| 0.149 | 2.0172 | 1172 | 1.4351 | 0.4494 | 1.4351 | 1.1980 |
| 0.149 | 2.0207 | 1174 | 1.4169 | 0.4824 | 1.4169 | 1.1903 |
| 0.149 | 2.0241 | 1176 | 1.2419 | 0.5075 | 1.2419 | 1.1144 |
| 0.149 | 2.0275 | 1178 | 1.2083 | 0.5075 | 1.2083 | 1.0992 |
| 0.149 | 2.0310 | 1180 | 1.2611 | 0.5075 | 1.2611 | 1.1230 |
| 0.149 | 2.0344 | 1182 | 1.2930 | 0.4964 | 1.2930 | 1.1371 |
| 0.149 | 2.0379 | 1184 | 1.2053 | 0.5199 | 1.2053 | 1.0979 |
| 0.149 | 2.0413 | 1186 | 1.0269 | 0.5100 | 1.0269 | 1.0133 |
| 0.149 | 2.0448 | 1188 | 0.9732 | 0.4986 | 0.9732 | 0.9865 |
| 0.149 | 2.0482 | 1190 | 1.0421 | 0.5197 | 1.0421 | 1.0208 |
| 0.149 | 2.0516 | 1192 | 1.0538 | 0.4931 | 1.0538 | 1.0265 |
| 0.149 | 2.0551 | 1194 | 1.1390 | 0.4756 | 1.1390 | 1.0672 |
| 0.149 | 2.0585 | 1196 | 1.2590 | 0.4375 | 1.2590 | 1.1221 |
| 0.149 | 2.0620 | 1198 | 1.4331 | 0.3916 | 1.4331 | 1.1971 |
| 0.149 | 2.0654 | 1200 | 1.3959 | 0.3916 | 1.3959 | 1.1815 |
| 0.149 | 2.0688 | 1202 | 1.2478 | 0.4620 | 1.2478 | 1.1171 |
| 0.149 | 2.0723 | 1204 | 1.0817 | 0.4522 | 1.0817 | 1.0401 |
| 0.149 | 2.0757 | 1206 | 1.0756 | 0.4522 | 1.0756 | 1.0371 |
| 0.149 | 2.0792 | 1208 | 1.2166 | 0.4375 | 1.2166 | 1.1030 |
| 0.149 | 2.0826 | 1210 | 1.3472 | 0.5236 | 1.3472 | 1.1607 |
| 0.149 | 2.0861 | 1212 | 1.3089 | 0.5331 | 1.3089 | 1.1441 |
| 0.149 | 2.0895 | 1214 | 1.2542 | 0.5269 | 1.2542 | 1.1199 |
| 0.149 | 2.0929 | 1216 | 1.1488 | 0.5385 | 1.1488 | 1.0718 |
| 0.149 | 2.0964 | 1218 | 1.2158 | 0.5269 | 1.2158 | 1.1026 |
| 0.149 | 2.0998 | 1220 | 1.4308 | 0.4815 | 1.4308 | 1.1961 |
| 0.149 | 2.1033 | 1222 | 1.5556 | 0.4421 | 1.5556 | 1.2472 |
| 0.149 | 2.1067 | 1224 | 1.4976 | 0.48 | 1.4976 | 1.2238 |
| 0.149 | 2.1102 | 1226 | 1.4120 | 0.4150 | 1.4120 | 1.1883 |
| 0.149 | 2.1136 | 1228 | 1.3182 | 0.4136 | 1.3182 | 1.1481 |
| 0.149 | 2.1170 | 1230 | 1.2598 | 0.4136 | 1.2598 | 1.1224 |
| 0.149 | 2.1205 | 1232 | 1.1900 | 0.4504 | 1.1900 | 1.0909 |
| 0.149 | 2.1239 | 1234 | 1.1444 | 0.4136 | 1.1444 | 1.0697 |
| 0.149 | 2.1274 | 1236 | 1.1452 | 0.4136 | 1.1452 | 1.0702 |
| 0.149 | 2.1308 | 1238 | 1.1691 | 0.4136 | 1.1691 | 1.0812 |
| 0.149 | 2.1343 | 1240 | 1.2096 | 0.4136 | 1.2096 | 1.0998 |
| 0.149 | 2.1377 | 1242 | 1.3218 | 0.4375 | 1.3218 | 1.1497 |
| 0.149 | 2.1411 | 1244 | 1.3846 | 0.4150 | 1.3846 | 1.1767 |
| 0.149 | 2.1446 | 1246 | 1.2854 | 0.4375 | 1.2854 | 1.1338 |
| 0.149 | 2.1480 | 1248 | 1.1384 | 0.4375 | 1.1384 | 1.0669 |
| 0.149 | 2.1515 | 1250 | 1.0218 | 0.5093 | 1.0218 | 1.0109 |
| 0.149 | 2.1549 | 1252 | 1.0397 | 0.5207 | 1.0397 | 1.0197 |
| 0.149 | 2.1583 | 1254 | 1.1637 | 0.4509 | 1.1637 | 1.0788 |
| 0.149 | 2.1618 | 1256 | 1.1589 | 0.4509 | 1.1589 | 1.0765 |
| 0.149 | 2.1652 | 1258 | 1.1203 | 0.4740 | 1.1203 | 1.0585 |
| 0.149 | 2.1687 | 1260 | 1.0749 | 0.4740 | 1.0749 | 1.0368 |
| 0.149 | 2.1721 | 1262 | 1.0416 | 0.5014 | 1.0416 | 1.0206 |
| 0.149 | 2.1756 | 1264 | 1.1115 | 0.4740 | 1.1115 | 1.0543 |
| 0.149 | 2.1790 | 1266 | 1.0534 | 0.5119 | 1.0534 | 1.0264 |
| 0.149 | 2.1824 | 1268 | 1.0073 | 0.5014 | 1.0073 | 1.0036 |
| 0.149 | 2.1859 | 1270 | 1.0242 | 0.4515 | 1.0242 | 1.0120 |
| 0.149 | 2.1893 | 1272 | 1.0694 | 0.4643 | 1.0694 | 1.0341 |
| 0.149 | 2.1928 | 1274 | 0.9595 | 0.5702 | 0.9595 | 0.9796 |
| 0.149 | 2.1962 | 1276 | 0.9567 | 0.5702 | 0.9567 | 0.9781 |
| 0.149 | 2.1997 | 1278 | 1.1036 | 0.5013 | 1.1036 | 1.0505 |
| 0.149 | 2.2031 | 1280 | 1.3413 | 0.5119 | 1.3413 | 1.1581 |
| 0.149 | 2.2065 | 1282 | 1.3669 | 0.4624 | 1.3669 | 1.1691 |
| 0.149 | 2.2100 | 1284 | 1.2167 | 0.4637 | 1.2167 | 1.1031 |
| 0.149 | 2.2134 | 1286 | 1.0361 | 0.5041 | 1.0361 | 1.0179 |
| 0.149 | 2.2169 | 1288 | 0.9313 | 0.5041 | 0.9313 | 0.9650 |
| 0.149 | 2.2203 | 1290 | 0.8948 | 0.5978 | 0.8948 | 0.9459 |
| 0.149 | 2.2238 | 1292 | 0.9036 | 0.5511 | 0.9036 | 0.9506 |
| 0.149 | 2.2272 | 1294 | 0.8373 | 0.6992 | 0.8373 | 0.9150 |
| 0.149 | 2.2306 | 1296 | 0.9096 | 0.5281 | 0.9096 | 0.9537 |
| 0.149 | 2.2341 | 1298 | 1.0759 | 0.5013 | 1.0759 | 1.0373 |
| 0.149 | 2.2375 | 1300 | 1.1805 | 0.5013 | 1.1805 | 1.0865 |
| 0.149 | 2.2410 | 1302 | 1.1570 | 0.4903 | 1.1570 | 1.0756 |
| 0.149 | 2.2444 | 1304 | 1.1094 | 0.5473 | 1.1094 | 1.0533 |
| 0.149 | 2.2478 | 1306 | 1.1955 | 0.4631 | 1.1955 | 1.0934 |
| 0.149 | 2.2513 | 1308 | 1.3248 | 0.3983 | 1.3248 | 1.1510 |
| 0.149 | 2.2547 | 1310 | 1.4355 | 0.4036 | 1.4355 | 1.1981 |
| 0.149 | 2.2582 | 1312 | 1.3694 | 0.4504 | 1.3694 | 1.1702 |
| 0.149 | 2.2616 | 1314 | 1.1692 | 0.5593 | 1.1692 | 1.0813 |
| 0.149 | 2.2651 | 1316 | 0.9831 | 0.5556 | 0.9831 | 0.9915 |
| 0.149 | 2.2685 | 1318 | 0.9315 | 0.5457 | 0.9315 | 0.9651 |
| 0.149 | 2.2719 | 1320 | 0.9895 | 0.5358 | 0.9895 | 0.9947 |
| 0.149 | 2.2754 | 1322 | 1.1380 | 0.5615 | 1.1380 | 1.0668 |
| 0.149 | 2.2788 | 1324 | 1.1908 | 0.5709 | 1.1908 | 1.0913 |
| 0.149 | 2.2823 | 1326 | 1.1373 | 0.4793 | 1.1373 | 1.0664 |
| 0.149 | 2.2857 | 1328 | 1.0282 | 0.4813 | 1.0282 | 1.0140 |
| 0.149 | 2.2892 | 1330 | 0.9061 | 0.5645 | 0.9061 | 0.9519 |
| 0.149 | 2.2926 | 1332 | 0.8910 | 0.5645 | 0.8910 | 0.9439 |
| 0.149 | 2.2960 | 1334 | 0.9668 | 0.4957 | 0.9668 | 0.9832 |
| 0.149 | 2.2995 | 1336 | 0.9913 | 0.4957 | 0.9913 | 0.9957 |
| 0.149 | 2.3029 | 1338 | 1.0946 | 0.4793 | 1.0946 | 1.0462 |
| 0.149 | 2.3064 | 1340 | 1.1141 | 0.4793 | 1.1141 | 1.0555 |
| 0.149 | 2.3098 | 1342 | 1.0485 | 0.4813 | 1.0485 | 1.0240 |
| 0.149 | 2.3133 | 1344 | 0.9811 | 0.5354 | 0.9811 | 0.9905 |
| 0.149 | 2.3167 | 1346 | 0.9735 | 0.5354 | 0.9735 | 0.9867 |
| 0.149 | 2.3201 | 1348 | 1.0722 | 0.4813 | 1.0722 | 1.0355 |
| 0.149 | 2.3236 | 1350 | 1.3163 | 0.4032 | 1.3163 | 1.1473 |
| 0.149 | 2.3270 | 1352 | 1.4526 | 0.3587 | 1.4526 | 1.2052 |
| 0.149 | 2.3305 | 1354 | 1.4107 | 0.4015 | 1.4107 | 1.1877 |
| 0.149 | 2.3339 | 1356 | 1.2559 | 0.4130 | 1.2559 | 1.1206 |
| 0.149 | 2.3373 | 1358 | 1.1665 | 0.4903 | 1.1665 | 1.0800 |
| 0.149 | 2.3408 | 1360 | 1.1948 | 0.4903 | 1.1948 | 1.0931 |
| 0.149 | 2.3442 | 1362 | 1.2402 | 0.4756 | 1.2402 | 1.1137 |
| 0.149 | 2.3477 | 1364 | 1.2458 | 0.4756 | 1.2458 | 1.1162 |
| 0.149 | 2.3511 | 1366 | 1.2275 | 0.4756 | 1.2275 | 1.1079 |
| 0.149 | 2.3546 | 1368 | 1.1474 | 0.4661 | 1.1474 | 1.0712 |
| 0.149 | 2.3580 | 1370 | 1.1309 | 0.4661 | 1.1309 | 1.0634 |
| 0.149 | 2.3614 | 1372 | 1.1953 | 0.4509 | 1.1953 | 1.0933 |
| 0.149 | 2.3649 | 1374 | 1.1891 | 0.4509 | 1.1891 | 1.0904 |
| 0.149 | 2.3683 | 1376 | 1.1519 | 0.4649 | 1.1519 | 1.0733 |
| 0.149 | 2.3718 | 1378 | 1.0600 | 0.4813 | 1.0600 | 1.0296 |
| 0.149 | 2.3752 | 1380 | 1.1344 | 0.4793 | 1.1344 | 1.0651 |
| 0.149 | 2.3787 | 1382 | 1.2631 | 0.4532 | 1.2631 | 1.1239 |
| 0.149 | 2.3821 | 1384 | 1.2033 | 0.4532 | 1.2033 | 1.0970 |
| 0.149 | 2.3855 | 1386 | 1.0814 | 0.5056 | 1.0814 | 1.0399 |
| 0.149 | 2.3890 | 1388 | 1.0148 | 0.4813 | 1.0148 | 1.0074 |
| 0.149 | 2.3924 | 1390 | 0.9861 | 0.4682 | 0.9861 | 0.9930 |
| 0.149 | 2.3959 | 1392 | 1.1018 | 0.4661 | 1.1018 | 1.0497 |
| 0.149 | 2.3993 | 1394 | 1.1972 | 0.4512 | 1.1972 | 1.0941 |
| 0.149 | 2.4028 | 1396 | 1.1890 | 0.4767 | 1.1890 | 1.0904 |
| 0.149 | 2.4062 | 1398 | 1.0322 | 0.4784 | 1.0322 | 1.0160 |
| 0.149 | 2.4096 | 1400 | 0.9195 | 0.5381 | 0.9195 | 0.9589 |
| 0.149 | 2.4131 | 1402 | 0.8438 | 0.5610 | 0.8438 | 0.9186 |
| 0.149 | 2.4165 | 1404 | 0.8181 | 0.6003 | 0.8181 | 0.9045 |
| 0.149 | 2.4200 | 1406 | 0.9022 | 0.5671 | 0.9022 | 0.9499 |
| 0.149 | 2.4234 | 1408 | 1.0799 | 0.4759 | 1.0799 | 1.0392 |
| 0.149 | 2.4269 | 1410 | 1.3504 | 0.4824 | 1.3504 | 1.1621 |
| 0.149 | 2.4303 | 1412 | 1.3621 | 0.4499 | 1.3621 | 1.1671 |
| 0.149 | 2.4337 | 1414 | 1.1990 | 0.4614 | 1.1990 | 1.0950 |
| 0.149 | 2.4372 | 1416 | 1.0188 | 0.4969 | 1.0188 | 1.0093 |
| 0.149 | 2.4406 | 1418 | 0.9338 | 0.4969 | 0.9338 | 0.9663 |
| 0.149 | 2.4441 | 1420 | 0.9690 | 0.4969 | 0.9690 | 0.9844 |
| 0.149 | 2.4475 | 1422 | 1.1163 | 0.4620 | 1.1163 | 1.0566 |
| 0.149 | 2.4509 | 1424 | 1.3678 | 0.4499 | 1.3678 | 1.1695 |
| 0.149 | 2.4544 | 1426 | 1.5025 | 0.4238 | 1.5025 | 1.2258 |
| 0.149 | 2.4578 | 1428 | 1.4254 | 0.4401 | 1.4254 | 1.1939 |
| 0.149 | 2.4613 | 1430 | 1.1857 | 0.4860 | 1.1857 | 1.0889 |
| 0.149 | 2.4647 | 1432 | 0.9534 | 0.4845 | 0.9534 | 0.9764 |
| 0.149 | 2.4682 | 1434 | 0.9188 | 0.5431 | 0.9188 | 0.9585 |
| 0.149 | 2.4716 | 1436 | 1.0109 | 0.4698 | 1.0109 | 1.0054 |
| 0.149 | 2.4750 | 1438 | 1.2010 | 0.5167 | 1.2010 | 1.0959 |
| 0.149 | 2.4785 | 1440 | 1.3560 | 0.5094 | 1.3560 | 1.1645 |
| 0.149 | 2.4819 | 1442 | 1.3984 | 0.4921 | 1.3984 | 1.1826 |
| 0.149 | 2.4854 | 1444 | 1.2735 | 0.4499 | 1.2735 | 1.1285 |
| 0.149 | 2.4888 | 1446 | 1.1865 | 0.4620 | 1.1865 | 1.0893 |
| 0.149 | 2.4923 | 1448 | 1.1769 | 0.4620 | 1.1769 | 1.0848 |
| 0.149 | 2.4957 | 1450 | 1.2425 | 0.4375 | 1.2425 | 1.1147 |
| 0.149 | 2.4991 | 1452 | 1.2798 | 0.4375 | 1.2798 | 1.1313 |
| 0.149 | 2.5026 | 1454 | 1.2077 | 0.4375 | 1.2077 | 1.0989 |
| 0.149 | 2.5060 | 1456 | 1.1211 | 0.4748 | 1.1211 | 1.0588 |
| 0.149 | 2.5095 | 1458 | 1.1909 | 0.4504 | 1.1909 | 1.0913 |
| 0.149 | 2.5129 | 1460 | 1.2451 | 0.4622 | 1.2451 | 1.1158 |
| 0.149 | 2.5164 | 1462 | 1.3794 | 0.4281 | 1.3794 | 1.1745 |
| 0.149 | 2.5198 | 1464 | 1.4526 | 0.3673 | 1.4526 | 1.2052 |
| 0.149 | 2.5232 | 1466 | 1.4332 | 0.3859 | 1.4332 | 1.1971 |
| 0.149 | 2.5267 | 1468 | 1.5134 | 0.3526 | 1.5134 | 1.2302 |
| 0.149 | 2.5301 | 1470 | 1.4810 | 0.3713 | 1.4810 | 1.2169 |
| 0.149 | 2.5336 | 1472 | 1.5012 | 0.3587 | 1.5012 | 1.2252 |
| 0.149 | 2.5370 | 1474 | 1.4480 | 0.3587 | 1.4480 | 1.2033 |
| 0.149 | 2.5404 | 1476 | 1.3743 | 0.3916 | 1.3743 | 1.1723 |
| 0.149 | 2.5439 | 1478 | 1.2327 | 0.4375 | 1.2327 | 1.1103 |
| 0.149 | 2.5473 | 1480 | 1.1515 | 0.4375 | 1.1515 | 1.0731 |
| 0.149 | 2.5508 | 1482 | 0.9774 | 0.4440 | 0.9774 | 0.9886 |
| 0.149 | 2.5542 | 1484 | 0.9198 | 0.5556 | 0.9198 | 0.9591 |
| 0.149 | 2.5577 | 1486 | 0.9935 | 0.4691 | 0.9935 | 0.9968 |
| 0.149 | 2.5611 | 1488 | 1.1181 | 0.4506 | 1.1181 | 1.0574 |
| 0.149 | 2.5645 | 1490 | 1.1978 | 0.4375 | 1.1978 | 1.0944 |
| 0.149 | 2.5680 | 1492 | 1.1113 | 0.4271 | 1.1113 | 1.0542 |
| 0.149 | 2.5714 | 1494 | 1.0902 | 0.4271 | 1.0902 | 1.0441 |
| 0.149 | 2.5749 | 1496 | 1.0674 | 0.4405 | 1.0674 | 1.0332 |
| 0.149 | 2.5783 | 1498 | 0.9632 | 0.5072 | 0.9632 | 0.9814 |
| 0.1033 | 2.5818 | 1500 | 0.8536 | 0.5734 | 0.8536 | 0.9239 |
| 0.1033 | 2.5852 | 1502 | 0.8437 | 0.5734 | 0.8437 | 0.9185 |
| 0.1033 | 2.5886 | 1504 | 0.9248 | 0.5124 | 0.9248 | 0.9617 |
| 0.1033 | 2.5921 | 1506 | 1.0186 | 0.4773 | 1.0186 | 1.0093 |
| 0.1033 | 2.5955 | 1508 | 1.1388 | 0.4130 | 1.1388 | 1.0671 |
| 0.1033 | 2.5990 | 1510 | 1.2539 | 0.4375 | 1.2539 | 1.1198 |
| 0.1033 | 2.6024 | 1512 | 1.2792 | 0.4375 | 1.2792 | 1.1310 |
| 0.1033 | 2.6059 | 1514 | 1.2226 | 0.4375 | 1.2226 | 1.1057 |
| 0.1033 | 2.6093 | 1516 | 1.2535 | 0.4375 | 1.2535 | 1.1196 |
| 0.1033 | 2.6127 | 1518 | 1.3668 | 0.4496 | 1.3668 | 1.1691 |
| 0.1033 | 2.6162 | 1520 | 1.3909 | 0.4282 | 1.3909 | 1.1794 |
| 0.1033 | 2.6196 | 1522 | 1.2536 | 0.4375 | 1.2536 | 1.1197 |
| 0.1033 | 2.6231 | 1524 | 1.0973 | 0.5014 | 1.0973 | 1.0475 |
| 0.1033 | 2.6265 | 1526 | 0.9922 | 0.4408 | 0.9922 | 0.9961 |
| 0.1033 | 2.6299 | 1528 | 1.0357 | 0.4408 | 1.0357 | 1.0177 |
| 0.1033 | 2.6334 | 1530 | 1.2147 | 0.4627 | 1.2147 | 1.1021 |
| 0.1033 | 2.6368 | 1532 | 1.3750 | 0.4076 | 1.3750 | 1.1726 |
| 0.1033 | 2.6403 | 1534 | 1.3758 | 0.4076 | 1.3758 | 1.1730 |
| 0.1033 | 2.6437 | 1536 | 1.3997 | 0.3942 | 1.3997 | 1.1831 |
| 0.1033 | 2.6472 | 1538 | 1.3109 | 0.4154 | 1.3109 | 1.1449 |
| 0.1033 | 2.6506 | 1540 | 1.2992 | 0.4154 | 1.2992 | 1.1398 |
| 0.1033 | 2.6540 | 1542 | 1.3161 | 0.4154 | 1.3161 | 1.1472 |
| 0.1033 | 2.6575 | 1544 | 1.2925 | 0.4154 | 1.2925 | 1.1369 |
| 0.1033 | 2.6609 | 1546 | 1.2469 | 0.4277 | 1.2469 | 1.1166 |
| 0.1033 | 2.6644 | 1548 | 1.2440 | 0.4499 | 1.2440 | 1.1153 |
| 0.1033 | 2.6678 | 1550 | 1.2647 | 0.4499 | 1.2647 | 1.1246 |
| 0.1033 | 2.6713 | 1552 | 1.2186 | 0.4726 | 1.2186 | 1.1039 |
| 0.1033 | 2.6747 | 1554 | 1.1844 | 0.4748 | 1.1844 | 1.0883 |
| 0.1033 | 2.6781 | 1556 | 1.1706 | 0.5014 | 1.1706 | 1.0819 |
| 0.1033 | 2.6816 | 1558 | 1.1108 | 0.5014 | 1.1108 | 1.0540 |
| 0.1033 | 2.6850 | 1560 | 1.0942 | 0.4773 | 1.0942 | 1.0460 |
| 0.1033 | 2.6885 | 1562 | 1.0795 | 0.4773 | 1.0795 | 1.0390 |
| 0.1033 | 2.6919 | 1564 | 1.1301 | 0.5119 | 1.1301 | 1.0631 |
| 0.1033 | 2.6954 | 1566 | 1.1120 | 0.5014 | 1.1120 | 1.0545 |
| 0.1033 | 2.6988 | 1568 | 1.0804 | 0.4773 | 1.0804 | 1.0394 |
| 0.1033 | 2.7022 | 1570 | 1.0654 | 0.4773 | 1.0654 | 1.0322 |
| 0.1033 | 2.7057 | 1572 | 1.0473 | 0.5043 | 1.0473 | 1.0234 |
| 0.1033 | 2.7091 | 1574 | 1.0060 | 0.5350 | 1.0060 | 1.0030 |
| 0.1033 | 2.7126 | 1576 | 1.0276 | 0.4792 | 1.0276 | 1.0137 |
| 0.1033 | 2.7160 | 1578 | 1.0629 | 0.4515 | 1.0629 | 1.0310 |
| 0.1033 | 2.7194 | 1580 | 1.0477 | 0.4515 | 1.0477 | 1.0236 |
| 0.1033 | 2.7229 | 1582 | 1.1148 | 0.4870 | 1.1148 | 1.0559 |
| 0.1033 | 2.7263 | 1584 | 1.2411 | 0.4952 | 1.2411 | 1.1141 |
| 0.1033 | 2.7298 | 1586 | 1.2866 | 0.4729 | 1.2866 | 1.1343 |
| 0.1033 | 2.7332 | 1588 | 1.1827 | 0.5291 | 1.1827 | 1.0875 |
| 0.1033 | 2.7367 | 1590 | 0.9838 | 0.5742 | 0.9838 | 0.9919 |
| 0.1033 | 2.7401 | 1592 | 0.9113 | 0.5714 | 0.9113 | 0.9546 |
| 0.1033 | 2.7435 | 1594 | 0.9249 | 0.5535 | 0.9249 | 0.9617 |
| 0.1033 | 2.7470 | 1596 | 1.0001 | 0.5742 | 1.0001 | 1.0001 |
| 0.1033 | 2.7504 | 1598 | 1.1198 | 0.5543 | 1.1198 | 1.0582 |
| 0.1033 | 2.7539 | 1600 | 1.0556 | 0.5663 | 1.0556 | 1.0274 |
| 0.1033 | 2.7573 | 1602 | 0.9453 | 0.5742 | 0.9453 | 0.9723 |
| 0.1033 | 2.7608 | 1604 | 0.9663 | 0.5489 | 0.9663 | 0.9830 |
| 0.1033 | 2.7642 | 1606 | 0.9483 | 0.5535 | 0.9483 | 0.9738 |
| 0.1033 | 2.7676 | 1608 | 1.0421 | 0.5444 | 1.0421 | 1.0208 |
| 0.1033 | 2.7711 | 1610 | 1.0516 | 0.5372 | 1.0516 | 1.0255 |
| 0.1033 | 2.7745 | 1612 | 0.9964 | 0.5843 | 0.9964 | 0.9982 |
| 0.1033 | 2.7780 | 1614 | 0.9867 | 0.5843 | 0.9867 | 0.9933 |
| 0.1033 | 2.7814 | 1616 | 0.8813 | 0.6591 | 0.8813 | 0.9388 |
| 0.1033 | 2.7849 | 1618 | 0.8543 | 0.6319 | 0.8543 | 0.9243 |
| 0.1033 | 2.7883 | 1620 | 0.8288 | 0.6568 | 0.8288 | 0.9104 |
| 0.1033 | 2.7917 | 1622 | 0.8095 | 0.6568 | 0.8095 | 0.8997 |
| 0.1033 | 2.7952 | 1624 | 0.8573 | 0.6568 | 0.8573 | 0.9259 |
| 0.1033 | 2.7986 | 1626 | 1.0046 | 0.6058 | 1.0046 | 1.0023 |
| 0.1033 | 2.8021 | 1628 | 1.0539 | 0.5953 | 1.0539 | 1.0266 |
| 0.1033 | 2.8055 | 1630 | 1.0595 | 0.5953 | 1.0595 | 1.0293 |
| 0.1033 | 2.8090 | 1632 | 1.0165 | 0.5700 | 1.0165 | 1.0082 |
| 0.1033 | 2.8124 | 1634 | 0.9537 | 0.5641 | 0.9537 | 0.9766 |
| 0.1033 | 2.8158 | 1636 | 0.8282 | 0.6176 | 0.8282 | 0.9101 |
| 0.1033 | 2.8193 | 1638 | 0.7960 | 0.6358 | 0.7960 | 0.8922 |
| 0.1033 | 2.8227 | 1640 | 0.8002 | 0.6587 | 0.8002 | 0.8945 |
| 0.1033 | 2.8262 | 1642 | 0.8207 | 0.5957 | 0.8207 | 0.9059 |
| 0.1033 | 2.8296 | 1644 | 0.8362 | 0.5431 | 0.8362 | 0.9145 |
| 0.1033 | 2.8330 | 1646 | 0.8746 | 0.5381 | 0.8746 | 0.9352 |
| 0.1033 | 2.8365 | 1648 | 0.9888 | 0.5161 | 0.9888 | 0.9944 |
| 0.1033 | 2.8399 | 1650 | 1.0411 | 0.4759 | 1.0411 | 1.0203 |
| 0.1033 | 2.8434 | 1652 | 0.9762 | 0.5257 | 0.9762 | 0.9880 |
| 0.1033 | 2.8468 | 1654 | 0.9244 | 0.55 | 0.9244 | 0.9614 |
| 0.1033 | 2.8503 | 1656 | 0.8932 | 0.5381 | 0.8932 | 0.9451 |
| 0.1033 | 2.8537 | 1658 | 0.9756 | 0.5152 | 0.9756 | 0.9877 |
| 0.1033 | 2.8571 | 1660 | 1.0374 | 0.4509 | 1.0374 | 1.0185 |
| 0.1033 | 2.8606 | 1662 | 1.0318 | 0.4509 | 1.0318 | 1.0158 |
| 0.1033 | 2.8640 | 1664 | 1.1132 | 0.4509 | 1.1132 | 1.0551 |
| 0.1033 | 2.8675 | 1666 | 1.1972 | 0.4506 | 1.1972 | 1.0942 |
| 0.1033 | 2.8709 | 1668 | 1.3543 | 0.4402 | 1.3543 | 1.1637 |
| 0.1033 | 2.8744 | 1670 | 1.3665 | 0.4517 | 1.3665 | 1.1690 |
| 0.1033 | 2.8778 | 1672 | 1.2826 | 0.4627 | 1.2826 | 1.1325 |
| 0.1033 | 2.8812 | 1674 | 1.1555 | 0.4627 | 1.1555 | 1.0750 |
| 0.1033 | 2.8847 | 1676 | 1.0061 | 0.4515 | 1.0061 | 1.0030 |
| 0.1033 | 2.8881 | 1678 | 1.0126 | 0.4515 | 1.0126 | 1.0063 |
| 0.1033 | 2.8916 | 1680 | 1.0278 | 0.4515 | 1.0278 | 1.0138 |
| 0.1033 | 2.8950 | 1682 | 0.9842 | 0.4515 | 0.9842 | 0.9921 |
| 0.1033 | 2.8985 | 1684 | 0.9636 | 0.4522 | 0.9636 | 0.9816 |
| 0.1033 | 2.9019 | 1686 | 0.9177 | 0.55 | 0.9177 | 0.9580 |
| 0.1033 | 2.9053 | 1688 | 0.9345 | 0.4957 | 0.9345 | 0.9667 |
| 0.1033 | 2.9088 | 1690 | 0.9447 | 0.5068 | 0.9447 | 0.9720 |
| 0.1033 | 2.9122 | 1692 | 1.0556 | 0.4987 | 1.0556 | 1.0274 |
| 0.1033 | 2.9157 | 1694 | 1.3094 | 0.4545 | 1.3094 | 1.1443 |
| 0.1033 | 2.9191 | 1696 | 1.4154 | 0.4801 | 1.4154 | 1.1897 |
| 0.1033 | 2.9225 | 1698 | 1.3657 | 0.4808 | 1.3657 | 1.1686 |
| 0.1033 | 2.9260 | 1700 | 1.2180 | 0.4918 | 1.2180 | 1.1037 |
| 0.1033 | 2.9294 | 1702 | 1.0349 | 0.5085 | 1.0349 | 1.0173 |
| 0.1033 | 2.9329 | 1704 | 0.9360 | 0.4957 | 0.9360 | 0.9675 |
| 0.1033 | 2.9363 | 1706 | 0.9089 | 0.55 | 0.9089 | 0.9534 |
| 0.1033 | 2.9398 | 1708 | 0.9613 | 0.4668 | 0.9613 | 0.9805 |
| 0.1033 | 2.9432 | 1710 | 1.0693 | 0.4509 | 1.0693 | 1.0341 |
| 0.1033 | 2.9466 | 1712 | 1.1583 | 0.4964 | 1.1583 | 1.0763 |
| 0.1033 | 2.9501 | 1714 | 1.1347 | 0.4964 | 1.1347 | 1.0652 |
| 0.1033 | 2.9535 | 1716 | 1.1204 | 0.4964 | 1.1204 | 1.0585 |
| 0.1033 | 2.9570 | 1718 | 1.0022 | 0.4870 | 1.0022 | 1.0011 |
| 0.1033 | 2.9604 | 1720 | 0.8623 | 0.5985 | 0.8623 | 0.9286 |
| 0.1033 | 2.9639 | 1722 | 0.7352 | 0.6018 | 0.7352 | 0.8574 |
| 0.1033 | 2.9673 | 1724 | 0.6942 | 0.6184 | 0.6942 | 0.8332 |
| 0.1033 | 2.9707 | 1726 | 0.7069 | 0.6143 | 0.7069 | 0.8408 |
| 0.1033 | 2.9742 | 1728 | 0.7887 | 0.6702 | 0.7887 | 0.8881 |
| 0.1033 | 2.9776 | 1730 | 0.9490 | 0.6831 | 0.9490 | 0.9742 |
| 0.1033 | 2.9811 | 1732 | 1.1386 | 0.5132 | 1.1386 | 1.0671 |
| 0.1033 | 2.9845 | 1734 | 1.1797 | 0.5236 | 1.1797 | 1.0861 |
| 0.1033 | 2.9880 | 1736 | 1.0937 | 0.4632 | 1.0937 | 1.0458 |
| 0.1033 | 2.9914 | 1738 | 0.9621 | 0.5448 | 0.9621 | 0.9808 |
| 0.1033 | 2.9948 | 1740 | 0.8125 | 0.5765 | 0.8125 | 0.9014 |
| 0.1033 | 2.9983 | 1742 | 0.7603 | 0.5994 | 0.7603 | 0.8720 |
| 0.1033 | 3.0017 | 1744 | 0.7793 | 0.5823 | 0.7793 | 0.8828 |
| 0.1033 | 3.0052 | 1746 | 0.8640 | 0.55 | 0.8640 | 0.9295 |
| 0.1033 | 3.0086 | 1748 | 1.0199 | 0.5488 | 1.0199 | 1.0099 |
| 0.1033 | 3.0120 | 1750 | 1.1878 | 0.4941 | 1.1878 | 1.0898 |
| 0.1033 | 3.0155 | 1752 | 1.1880 | 0.5119 | 1.1880 | 1.0900 |
| 0.1033 | 3.0189 | 1754 | 1.0612 | 0.5168 | 1.0612 | 1.0301 |
| 0.1033 | 3.0224 | 1756 | 0.8860 | 0.6473 | 0.8860 | 0.9413 |
| 0.1033 | 3.0258 | 1758 | 0.7876 | 0.6358 | 0.7876 | 0.8875 |
| 0.1033 | 3.0293 | 1760 | 0.7671 | 0.6239 | 0.7671 | 0.8758 |
| 0.1033 | 3.0327 | 1762 | 0.8239 | 0.5970 | 0.8239 | 0.9077 |
| 0.1033 | 3.0361 | 1764 | 0.9766 | 0.5678 | 0.9766 | 0.9882 |
| 0.1033 | 3.0396 | 1766 | 1.1462 | 0.4851 | 1.1462 | 1.0706 |
| 0.1033 | 3.0430 | 1768 | 1.1778 | 0.4851 | 1.1778 | 1.0853 |
| 0.1033 | 3.0465 | 1770 | 1.0997 | 0.4851 | 1.0997 | 1.0487 |
| 0.1033 | 3.0499 | 1772 | 1.0087 | 0.5119 | 1.0087 | 1.0043 |
| 0.1033 | 3.0534 | 1774 | 0.8883 | 0.5885 | 0.8883 | 0.9425 |
| 0.1033 | 3.0568 | 1776 | 0.8288 | 0.5859 | 0.8288 | 0.9104 |
| 0.1033 | 3.0602 | 1778 | 0.8603 | 0.6585 | 0.8603 | 0.9275 |
| 0.1033 | 3.0637 | 1780 | 0.9329 | 0.5512 | 0.9329 | 0.9659 |
| 0.1033 | 3.0671 | 1782 | 0.9575 | 0.5512 | 0.9575 | 0.9785 |
| 0.1033 | 3.0706 | 1784 | 0.9556 | 0.5809 | 0.9556 | 0.9776 |
| 0.1033 | 3.0740 | 1786 | 0.9581 | 0.5257 | 0.9581 | 0.9788 |
| 0.1033 | 3.0775 | 1788 | 0.8838 | 0.628 | 0.8838 | 0.9401 |
| 0.1033 | 3.0809 | 1790 | 0.7836 | 0.5859 | 0.7836 | 0.8852 |
| 0.1033 | 3.0843 | 1792 | 0.7436 | 0.5613 | 0.7436 | 0.8623 |
| 0.1033 | 3.0878 | 1794 | 0.7580 | 0.5613 | 0.7580 | 0.8707 |
| 0.1033 | 3.0912 | 1796 | 0.8375 | 0.5708 | 0.8375 | 0.9151 |
| 0.1033 | 3.0947 | 1798 | 0.9788 | 0.5401 | 0.9788 | 0.9894 |
| 0.1033 | 3.0981 | 1800 | 1.0471 | 0.5014 | 1.0471 | 1.0233 |
| 0.1033 | 3.1015 | 1802 | 1.0298 | 0.5014 | 1.0298 | 1.0148 |
| 0.1033 | 3.1050 | 1804 | 1.0492 | 0.5014 | 1.0492 | 1.0243 |
| 0.1033 | 3.1084 | 1806 | 1.0353 | 0.5014 | 1.0353 | 1.0175 |
| 0.1033 | 3.1119 | 1808 | 0.9093 | 0.5227 | 0.9093 | 0.9536 |
| 0.1033 | 3.1153 | 1810 | 0.8580 | 0.5154 | 0.8580 | 0.9263 |
| 0.1033 | 3.1188 | 1812 | 0.8508 | 0.5405 | 0.8508 | 0.9224 |
| 0.1033 | 3.1222 | 1814 | 0.9009 | 0.5588 | 0.9009 | 0.9491 |
| 0.1033 | 3.1256 | 1816 | 0.9373 | 0.5512 | 0.9373 | 0.9681 |
| 0.1033 | 3.1291 | 1818 | 0.9255 | 0.5998 | 0.9255 | 0.9620 |
| 0.1033 | 3.1325 | 1820 | 0.8674 | 0.6170 | 0.8674 | 0.9313 |
| 0.1033 | 3.1360 | 1822 | 0.8533 | 0.5864 | 0.8533 | 0.9238 |
| 0.1033 | 3.1394 | 1824 | 0.7864 | 0.5307 | 0.7864 | 0.8868 |
| 0.1033 | 3.1429 | 1826 | 0.8076 | 0.5307 | 0.8076 | 0.8987 |
| 0.1033 | 3.1463 | 1828 | 0.8921 | 0.5354 | 0.8921 | 0.9445 |
| 0.1033 | 3.1497 | 1830 | 1.0775 | 0.5014 | 1.0775 | 1.0380 |
| 0.1033 | 3.1532 | 1832 | 1.3391 | 0.4015 | 1.3391 | 1.1572 |
| 0.1033 | 3.1566 | 1834 | 1.4934 | 0.3461 | 1.4934 | 1.2220 |
| 0.1033 | 3.1601 | 1836 | 1.4887 | 0.3461 | 1.4887 | 1.2201 |
| 0.1033 | 3.1635 | 1838 | 1.3744 | 0.4015 | 1.3744 | 1.1724 |
| 0.1033 | 3.1670 | 1840 | 1.1818 | 0.4773 | 1.1818 | 1.0871 |
| 0.1033 | 3.1704 | 1842 | 1.0167 | 0.5093 | 1.0167 | 1.0083 |
| 0.1033 | 3.1738 | 1844 | 0.9557 | 0.5528 | 0.9557 | 0.9776 |
| 0.1033 | 3.1773 | 1846 | 0.9550 | 0.5681 | 0.9550 | 0.9773 |
| 0.1033 | 3.1807 | 1848 | 1.0562 | 0.5310 | 1.0562 | 1.0277 |
| 0.1033 | 3.1842 | 1850 | 1.2543 | 0.4737 | 1.2543 | 1.1200 |
| 0.1033 | 3.1876 | 1852 | 1.3284 | 0.4727 | 1.3284 | 1.1525 |
| 0.1033 | 3.1910 | 1854 | 1.3672 | 0.4727 | 1.3672 | 1.1693 |
| 0.1033 | 3.1945 | 1856 | 1.2573 | 0.4737 | 1.2573 | 1.1213 |
| 0.1033 | 3.1979 | 1858 | 1.0670 | 0.5673 | 1.0670 | 1.0330 |
| 0.1033 | 3.2014 | 1860 | 0.9530 | 0.5681 | 0.9530 | 0.9762 |
| 0.1033 | 3.2048 | 1862 | 0.9401 | 0.5681 | 0.9401 | 0.9696 |
| 0.1033 | 3.2083 | 1864 | 0.9856 | 0.5591 | 0.9856 | 0.9928 |
| 0.1033 | 3.2117 | 1866 | 1.1012 | 0.4756 | 1.1012 | 1.0494 |
| 0.1033 | 3.2151 | 1868 | 1.2482 | 0.4851 | 1.2482 | 1.1172 |
| 0.1033 | 3.2186 | 1870 | 1.2552 | 0.4851 | 1.2552 | 1.1203 |
| 0.1033 | 3.2220 | 1872 | 1.2275 | 0.4860 | 1.2275 | 1.1079 |
| 0.1033 | 3.2255 | 1874 | 1.1123 | 0.5233 | 1.1123 | 1.0547 |
| 0.1033 | 3.2289 | 1876 | 1.0033 | 0.5072 | 1.0033 | 1.0017 |
| 0.1033 | 3.2324 | 1878 | 0.9396 | 0.5708 | 0.9396 | 0.9693 |
| 0.1033 | 3.2358 | 1880 | 0.9812 | 0.5921 | 0.9812 | 0.9906 |
| 0.1033 | 3.2392 | 1882 | 1.1224 | 0.4756 | 1.1224 | 1.0594 |
| 0.1033 | 3.2427 | 1884 | 1.2533 | 0.4860 | 1.2533 | 1.1195 |
| 0.1033 | 3.2461 | 1886 | 1.2257 | 0.4860 | 1.2257 | 1.1071 |
| 0.1033 | 3.2496 | 1888 | 1.1951 | 0.4870 | 1.1951 | 1.0932 |
| 0.1033 | 3.2530 | 1890 | 1.0865 | 0.5538 | 1.0865 | 1.0424 |
| 0.1033 | 3.2565 | 1892 | 0.9763 | 0.5350 | 0.9763 | 0.9881 |
| 0.1033 | 3.2599 | 1894 | 0.9354 | 0.5528 | 0.9354 | 0.9672 |
| 0.1033 | 3.2633 | 1896 | 0.9705 | 0.5350 | 0.9705 | 0.9852 |
| 0.1033 | 3.2668 | 1898 | 1.0805 | 0.5826 | 1.0805 | 1.0395 |
| 0.1033 | 3.2702 | 1900 | 1.1661 | 0.4870 | 1.1661 | 1.0798 |
| 0.1033 | 3.2737 | 1902 | 1.1458 | 0.4870 | 1.1458 | 1.0704 |
| 0.1033 | 3.2771 | 1904 | 1.0881 | 0.5622 | 1.0881 | 1.0431 |
| 0.1033 | 3.2806 | 1906 | 0.9959 | 0.5850 | 0.9959 | 0.9980 |
| 0.1033 | 3.2840 | 1908 | 0.9474 | 0.5991 | 0.9474 | 0.9733 |
| 0.1033 | 3.2874 | 1910 | 0.9980 | 0.5850 | 0.9980 | 0.9990 |
| 0.1033 | 3.2909 | 1912 | 1.1159 | 0.4870 | 1.1159 | 1.0564 |
| 0.1033 | 3.2943 | 1914 | 1.1640 | 0.4870 | 1.1640 | 1.0789 |
| 0.1033 | 3.2978 | 1916 | 1.1581 | 0.4870 | 1.1581 | 1.0762 |
| 0.1033 | 3.3012 | 1918 | 1.0516 | 0.5593 | 1.0516 | 1.0255 |
| 0.1033 | 3.3046 | 1920 | 0.9218 | 0.5528 | 0.9218 | 0.9601 |
| 0.1033 | 3.3081 | 1922 | 0.8325 | 0.5585 | 0.8325 | 0.9124 |
| 0.1033 | 3.3115 | 1924 | 0.8145 | 0.5457 | 0.8145 | 0.9025 |
| 0.1033 | 3.3150 | 1926 | 0.8418 | 0.5457 | 0.8418 | 0.9175 |
| 0.1033 | 3.3184 | 1928 | 0.9170 | 0.5681 | 0.9170 | 0.9576 |
| 0.1033 | 3.3219 | 1930 | 1.0800 | 0.5357 | 1.0800 | 1.0392 |
| 0.1033 | 3.3253 | 1932 | 1.1683 | 0.4632 | 1.1683 | 1.0809 |
| 0.1033 | 3.3287 | 1934 | 1.1104 | 0.4632 | 1.1104 | 1.0537 |
| 0.1033 | 3.3322 | 1936 | 0.9774 | 0.5921 | 0.9774 | 0.9886 |
| 0.1033 | 3.3356 | 1938 | 0.9005 | 0.5798 | 0.9005 | 0.9490 |
| 0.1033 | 3.3391 | 1940 | 0.8885 | 0.5798 | 0.8885 | 0.9426 |
| 0.1033 | 3.3425 | 1942 | 0.9294 | 0.5921 | 0.9294 | 0.9641 |
| 0.1033 | 3.3460 | 1944 | 0.9460 | 0.5773 | 0.9460 | 0.9726 |
| 0.1033 | 3.3494 | 1946 | 0.9835 | 0.5773 | 0.9835 | 0.9917 |
| 0.1033 | 3.3528 | 1948 | 0.9877 | 0.5350 | 0.9877 | 0.9938 |
| 0.1033 | 3.3563 | 1950 | 0.9905 | 0.5350 | 0.9905 | 0.9953 |
| 0.1033 | 3.3597 | 1952 | 1.0221 | 0.5350 | 1.0221 | 1.0110 |
| 0.1033 | 3.3632 | 1954 | 1.0649 | 0.4522 | 1.0649 | 1.0319 |
| 0.1033 | 3.3666 | 1956 | 1.0021 | 0.5350 | 1.0021 | 1.0011 |
| 0.1033 | 3.3701 | 1958 | 0.9492 | 0.5921 | 0.9492 | 0.9742 |
| 0.1033 | 3.3735 | 1960 | 0.8821 | 0.5556 | 0.8821 | 0.9392 |
| 0.1033 | 3.3769 | 1962 | 0.8674 | 0.5457 | 0.8674 | 0.9313 |
| 0.1033 | 3.3804 | 1964 | 0.9063 | 0.5429 | 0.9063 | 0.9520 |
| 0.1033 | 3.3838 | 1966 | 1.0080 | 0.5207 | 1.0080 | 1.0040 |
| 0.1033 | 3.3873 | 1968 | 1.1308 | 0.4649 | 1.1308 | 1.0634 |
| 0.1033 | 3.3907 | 1970 | 1.1997 | 0.5104 | 1.1997 | 1.0953 |
| 0.1033 | 3.3941 | 1972 | 1.2196 | 0.4733 | 1.2196 | 1.1043 |
| 0.1033 | 3.3976 | 1974 | 1.1876 | 0.4974 | 1.1876 | 1.0898 |
| 0.1033 | 3.4010 | 1976 | 1.0766 | 0.4881 | 1.0766 | 1.0376 |
| 0.1033 | 3.4045 | 1978 | 0.9846 | 0.5350 | 0.9846 | 0.9923 |
| 0.1033 | 3.4079 | 1980 | 0.9002 | 0.5681 | 0.9002 | 0.9488 |
| 0.1033 | 3.4114 | 1982 | 0.9014 | 0.5528 | 0.9014 | 0.9494 |
| 0.1033 | 3.4148 | 1984 | 0.9379 | 0.5093 | 0.9379 | 0.9685 |
| 0.1033 | 3.4182 | 1986 | 1.0209 | 0.5448 | 1.0209 | 1.0104 |
| 0.1033 | 3.4217 | 1988 | 1.0931 | 0.5401 | 1.0931 | 1.0455 |
| 0.1033 | 3.4251 | 1990 | 1.1190 | 0.4892 | 1.1190 | 1.0578 |
| 0.1033 | 3.4286 | 1992 | 1.0829 | 0.5152 | 1.0829 | 1.0406 |
| 0.1033 | 3.4320 | 1994 | 1.0241 | 0.5448 | 1.0241 | 1.0120 |
| 0.1033 | 3.4355 | 1996 | 1.0594 | 0.4915 | 1.0594 | 1.0293 |
| 0.1033 | 3.4389 | 1998 | 1.0476 | 0.4792 | 1.0476 | 1.0235 |
| 0.0835 | 3.4423 | 2000 | 1.0039 | 0.5350 | 1.0039 | 1.0020 |
| 0.0835 | 3.4458 | 2002 | 1.0005 | 0.5350 | 1.0005 | 1.0002 |
| 0.0835 | 3.4492 | 2004 | 1.0611 | 0.4522 | 1.0611 | 1.0301 |
| 0.0835 | 3.4527 | 2006 | 1.0933 | 0.5014 | 1.0933 | 1.0456 |
| 0.0835 | 3.4561 | 2008 | 1.1978 | 0.4748 | 1.1978 | 1.0945 |
| 0.0835 | 3.4596 | 2010 | 1.2562 | 0.4627 | 1.2562 | 1.1208 |
| 0.0835 | 3.4630 | 2012 | 1.2355 | 0.4627 | 1.2355 | 1.1115 |
| 0.0835 | 3.4664 | 2014 | 1.1960 | 0.4627 | 1.1960 | 1.0936 |
| 0.0835 | 3.4699 | 2016 | 1.0791 | 0.4522 | 1.0791 | 1.0388 |
| 0.0835 | 3.4733 | 2018 | 0.9500 | 0.5773 | 0.9500 | 0.9747 |
| 0.0835 | 3.4768 | 2020 | 0.8877 | 0.5556 | 0.8877 | 0.9422 |
| 0.0835 | 3.4802 | 2022 | 0.9148 | 0.5773 | 0.9148 | 0.9565 |
| 0.0835 | 3.4836 | 2024 | 1.0088 | 0.5773 | 1.0088 | 1.0044 |
| 0.0835 | 3.4871 | 2026 | 1.0801 | 0.5059 | 1.0801 | 1.0393 |
| 0.0835 | 3.4905 | 2028 | 1.0829 | 0.5473 | 1.0829 | 1.0406 |
| 0.0835 | 3.4940 | 2030 | 1.0494 | 0.5473 | 1.0494 | 1.0244 |
| 0.0835 | 3.4974 | 2032 | 1.0224 | 0.5773 | 1.0224 | 1.0112 |
| 0.0835 | 3.5009 | 2034 | 0.9939 | 0.5921 | 0.9939 | 0.9970 |
| 0.0835 | 3.5043 | 2036 | 1.0531 | 0.5773 | 1.0531 | 1.0262 |
| 0.0835 | 3.5077 | 2038 | 1.1557 | 0.4404 | 1.1557 | 1.0750 |
| 0.0835 | 3.5112 | 2040 | 1.2240 | 0.4404 | 1.2240 | 1.1063 |
| 0.0835 | 3.5146 | 2042 | 1.2175 | 0.4404 | 1.2175 | 1.1034 |
| 0.0835 | 3.5181 | 2044 | 1.1529 | 0.4271 | 1.1529 | 1.0737 |
| 0.0835 | 3.5215 | 2046 | 1.0838 | 0.5169 | 1.0838 | 1.0410 |
| 0.0835 | 3.5250 | 2048 | 1.0693 | 0.5188 | 1.0693 | 1.0341 |
| 0.0835 | 3.5284 | 2050 | 1.0912 | 0.4643 | 1.0912 | 1.0446 |
| 0.0835 | 3.5318 | 2052 | 1.0848 | 0.4649 | 1.0848 | 1.0415 |
| 0.0835 | 3.5353 | 2054 | 1.1239 | 0.4881 | 1.1239 | 1.0601 |
| 0.0835 | 3.5387 | 2056 | 1.0968 | 0.4881 | 1.0968 | 1.0473 |
| 0.0835 | 3.5422 | 2058 | 1.0038 | 0.5188 | 1.0038 | 1.0019 |
| 0.0835 | 3.5456 | 2060 | 0.9637 | 0.55 | 0.9637 | 0.9817 |
| 0.0835 | 3.5491 | 2062 | 0.9897 | 0.5449 | 0.9897 | 0.9948 |
| 0.0835 | 3.5525 | 2064 | 1.0788 | 0.4881 | 1.0788 | 1.0387 |
| 0.0835 | 3.5559 | 2066 | 1.0904 | 0.4881 | 1.0904 | 1.0442 |
| 0.0835 | 3.5594 | 2068 | 1.0000 | 0.5068 | 1.0000 | 1.0000 |
| 0.0835 | 3.5628 | 2070 | 0.8781 | 0.5789 | 0.8781 | 0.9371 |
| 0.0835 | 3.5663 | 2072 | 0.7809 | 0.5859 | 0.7809 | 0.8837 |
| 0.0835 | 3.5697 | 2074 | 0.7714 | 0.5859 | 0.7714 | 0.8783 |
| 0.0835 | 3.5731 | 2076 | 0.8277 | 0.5859 | 0.8277 | 0.9098 |
| 0.0835 | 3.5766 | 2078 | 0.9665 | 0.5216 | 0.9665 | 0.9831 |
| 0.0835 | 3.5800 | 2080 | 1.1782 | 0.4627 | 1.1782 | 1.0854 |
| 0.0835 | 3.5835 | 2082 | 1.2832 | 0.4627 | 1.2832 | 1.1328 |
| 0.0835 | 3.5869 | 2084 | 1.2363 | 0.4627 | 1.2363 | 1.1119 |
| 0.0835 | 3.5904 | 2086 | 1.0860 | 0.5233 | 1.0860 | 1.0421 |
| 0.0835 | 3.5938 | 2088 | 0.9475 | 0.5765 | 0.9475 | 0.9734 |
| 0.0835 | 3.5972 | 2090 | 0.8900 | 0.6093 | 0.8900 | 0.9434 |
| 0.0835 | 3.6007 | 2092 | 0.9278 | 0.5789 | 0.9278 | 0.9632 |
| 0.0835 | 3.6041 | 2094 | 1.0722 | 0.5114 | 1.0722 | 1.0355 |
| 0.0835 | 3.6076 | 2096 | 1.1633 | 0.5286 | 1.1633 | 1.0785 |
| 0.0835 | 3.6110 | 2098 | 1.1976 | 0.4952 | 1.1976 | 1.0943 |
| 0.0835 | 3.6145 | 2100 | 1.1455 | 0.4767 | 1.1455 | 1.0703 |
| 0.0835 | 3.6179 | 2102 | 1.1570 | 0.4404 | 1.1570 | 1.0756 |
| 0.0835 | 3.6213 | 2104 | 1.1594 | 0.4506 | 1.1594 | 1.0768 |
| 0.0835 | 3.6248 | 2106 | 1.2098 | 0.4375 | 1.2098 | 1.0999 |
| 0.0835 | 3.6282 | 2108 | 1.2694 | 0.4145 | 1.2694 | 1.1267 |
| 0.0835 | 3.6317 | 2110 | 1.2617 | 0.4375 | 1.2617 | 1.1233 |
| 0.0835 | 3.6351 | 2112 | 1.2404 | 0.4375 | 1.2404 | 1.1137 |
| 0.0835 | 3.6386 | 2114 | 1.2185 | 0.4375 | 1.2185 | 1.1038 |
| 0.0835 | 3.6420 | 2116 | 1.1569 | 0.4870 | 1.1569 | 1.0756 |
| 0.0835 | 3.6454 | 2118 | 1.0989 | 0.4643 | 1.0989 | 1.0483 |
| 0.0835 | 3.6489 | 2120 | 1.0747 | 0.5341 | 1.0747 | 1.0367 |
| 0.0835 | 3.6523 | 2122 | 1.0564 | 0.5593 | 1.0564 | 1.0278 |
| 0.0835 | 3.6558 | 2124 | 1.0302 | 0.5855 | 1.0302 | 1.0150 |
| 0.0835 | 3.6592 | 2126 | 1.0387 | 0.5825 | 1.0387 | 1.0191 |
| 0.0835 | 3.6627 | 2128 | 1.0457 | 0.5512 | 1.0457 | 1.0226 |
| 0.0835 | 3.6661 | 2130 | 1.1013 | 0.4860 | 1.1013 | 1.0494 |
| 0.0835 | 3.6695 | 2132 | 1.1586 | 0.4860 | 1.1586 | 1.0764 |
| 0.0835 | 3.6730 | 2134 | 1.1460 | 0.4860 | 1.1460 | 1.0705 |
| 0.0835 | 3.6764 | 2136 | 1.0916 | 0.5028 | 1.0916 | 1.0448 |
| 0.0835 | 3.6799 | 2138 | 1.0107 | 0.5593 | 1.0107 | 1.0053 |
| 0.0835 | 3.6833 | 2140 | 0.9996 | 0.5593 | 0.9996 | 0.9998 |
| 0.0835 | 3.6867 | 2142 | 1.0597 | 0.5028 | 1.0597 | 1.0294 |
| 0.0835 | 3.6902 | 2144 | 1.1952 | 0.4627 | 1.1952 | 1.0933 |
| 0.0835 | 3.6936 | 2146 | 1.2942 | 0.4499 | 1.2942 | 1.1376 |
| 0.0835 | 3.6971 | 2148 | 1.3017 | 0.4402 | 1.3017 | 1.1409 |
| 0.0835 | 3.7005 | 2150 | 1.1981 | 0.4627 | 1.1981 | 1.0946 |
| 0.0835 | 3.7040 | 2152 | 1.0679 | 0.5119 | 1.0679 | 1.0334 |
| 0.0835 | 3.7074 | 2154 | 1.0266 | 0.4892 | 1.0266 | 1.0132 |
| 0.0835 | 3.7108 | 2156 | 1.0321 | 0.4892 | 1.0321 | 1.0159 |
| 0.0835 | 3.7143 | 2158 | 1.1014 | 0.5119 | 1.1014 | 1.0495 |
| 0.0835 | 3.7177 | 2160 | 1.1994 | 0.4627 | 1.1994 | 1.0952 |
| 0.0835 | 3.7212 | 2162 | 1.1897 | 0.4627 | 1.1897 | 1.0907 |
| 0.0835 | 3.7246 | 2164 | 1.1020 | 0.5014 | 1.1020 | 1.0497 |
| 0.0835 | 3.7281 | 2166 | 1.0087 | 0.5968 | 1.0087 | 1.0043 |
| 0.0835 | 3.7315 | 2168 | 1.0322 | 0.5968 | 1.0322 | 1.0160 |
| 0.0835 | 3.7349 | 2170 | 1.0322 | 0.5968 | 1.0322 | 1.0160 |
| 0.0835 | 3.7384 | 2172 | 1.1267 | 0.5244 | 1.1267 | 1.0614 |
| 0.0835 | 3.7418 | 2174 | 1.2347 | 0.4952 | 1.2347 | 1.1112 |
| 0.0835 | 3.7453 | 2176 | 1.2559 | 0.4952 | 1.2559 | 1.1207 |
| 0.0835 | 3.7487 | 2178 | 1.1403 | 0.4952 | 1.1403 | 1.0679 |
| 0.0835 | 3.7522 | 2180 | 0.9935 | 0.5968 | 0.9935 | 0.9968 |
| 0.0835 | 3.7556 | 2182 | 0.9836 | 0.5968 | 0.9836 | 0.9918 |
| 0.0835 | 3.7590 | 2184 | 1.0367 | 0.5565 | 1.0367 | 1.0182 |
| 0.0835 | 3.7625 | 2186 | 1.0575 | 0.5565 | 1.0575 | 1.0283 |
| 0.0835 | 3.7659 | 2188 | 1.0297 | 0.5326 | 1.0297 | 1.0147 |
| 0.0835 | 3.7694 | 2190 | 0.9819 | 0.5773 | 0.9819 | 0.9909 |
| 0.0835 | 3.7728 | 2192 | 0.9874 | 0.5773 | 0.9874 | 0.9937 |
| 0.0835 | 3.7762 | 2194 | 1.0117 | 0.5773 | 1.0117 | 1.0058 |
| 0.0835 | 3.7797 | 2196 | 1.1003 | 0.5565 | 1.1003 | 1.0489 |
| 0.0835 | 3.7831 | 2198 | 1.1140 | 0.5565 | 1.1140 | 1.0555 |
| 0.0835 | 3.7866 | 2200 | 1.0479 | 0.5350 | 1.0479 | 1.0237 |
| 0.0835 | 3.7900 | 2202 | 0.9418 | 0.5773 | 0.9418 | 0.9705 |
| 0.0835 | 3.7935 | 2204 | 0.9178 | 0.5921 | 0.9178 | 0.9580 |
| 0.0835 | 3.7969 | 2206 | 0.9461 | 0.5921 | 0.9461 | 0.9727 |
| 0.0835 | 3.8003 | 2208 | 1.0455 | 0.5424 | 1.0455 | 1.0225 |
| 0.0835 | 3.8038 | 2210 | 1.2017 | 0.4851 | 1.2017 | 1.0962 |
| 0.0835 | 3.8072 | 2212 | 1.3917 | 0.4613 | 1.3917 | 1.1797 |
| 0.0835 | 3.8107 | 2214 | 1.4334 | 0.4608 | 1.4334 | 1.1972 |
| 0.0835 | 3.8141 | 2216 | 1.3263 | 0.4726 | 1.3263 | 1.1517 |
| 0.0835 | 3.8176 | 2218 | 1.1287 | 0.4756 | 1.1287 | 1.0624 |
| 0.0835 | 3.8210 | 2220 | 0.9471 | 0.5528 | 0.9471 | 0.9732 |
| 0.0835 | 3.8244 | 2222 | 0.8806 | 0.5681 | 0.8806 | 0.9384 |
| 0.0835 | 3.8279 | 2224 | 0.8958 | 0.5681 | 0.8958 | 0.9465 |
| 0.0835 | 3.8313 | 2226 | 0.9822 | 0.6006 | 0.9822 | 0.9910 |
| 0.0835 | 3.8348 | 2228 | 1.1261 | 0.4637 | 1.1261 | 1.0612 |
| 0.0835 | 3.8382 | 2230 | 1.1817 | 0.4833 | 1.1817 | 1.0871 |
| 0.0835 | 3.8417 | 2232 | 1.1658 | 0.4833 | 1.1658 | 1.0797 |
| 0.0835 | 3.8451 | 2234 | 1.1156 | 0.5075 | 1.1156 | 1.0562 |
| 0.0835 | 3.8485 | 2236 | 1.0301 | 0.5593 | 1.0301 | 1.0150 |
| 0.0835 | 3.8520 | 2238 | 0.9287 | 0.6006 | 0.9287 | 0.9637 |
| 0.0835 | 3.8554 | 2240 | 0.9369 | 0.5593 | 0.9369 | 0.9679 |
| 0.0835 | 3.8589 | 2242 | 1.0231 | 0.5538 | 1.0231 | 1.0115 |
| 0.0835 | 3.8623 | 2244 | 1.0768 | 0.5595 | 1.0768 | 1.0377 |
| 0.0835 | 3.8657 | 2246 | 1.1918 | 0.4833 | 1.1918 | 1.0917 |
| 0.0835 | 3.8692 | 2248 | 1.2048 | 0.4833 | 1.2048 | 1.0976 |
| 0.0835 | 3.8726 | 2250 | 1.0950 | 0.5182 | 1.0950 | 1.0464 |
| 0.0835 | 3.8761 | 2252 | 0.9213 | 0.6461 | 0.9213 | 0.9598 |
| 0.0835 | 3.8795 | 2254 | 0.8136 | 0.5921 | 0.8136 | 0.9020 |
| 0.0835 | 3.8830 | 2256 | 0.7701 | 0.5681 | 0.7701 | 0.8776 |
| 0.0835 | 3.8864 | 2258 | 0.7751 | 0.5681 | 0.7751 | 0.8804 |
| 0.0835 | 3.8898 | 2260 | 0.8164 | 0.5921 | 0.8164 | 0.9035 |
| 0.0835 | 3.8933 | 2262 | 0.8409 | 0.5921 | 0.8409 | 0.9170 |
| 0.0835 | 3.8967 | 2264 | 0.8713 | 0.5921 | 0.8713 | 0.9335 |
| 0.0835 | 3.9002 | 2266 | 0.9234 | 0.5304 | 0.9234 | 0.9609 |
| 0.0835 | 3.9036 | 2268 | 0.9923 | 0.5283 | 0.9923 | 0.9961 |
| 0.0835 | 3.9071 | 2270 | 0.9865 | 0.5014 | 0.9865 | 0.9933 |
| 0.0835 | 3.9105 | 2272 | 0.9433 | 0.5708 | 0.9433 | 0.9713 |
| 0.0835 | 3.9139 | 2274 | 0.9364 | 0.5708 | 0.9364 | 0.9677 |
| 0.0835 | 3.9174 | 2276 | 0.9210 | 0.5708 | 0.9210 | 0.9597 |
| 0.0835 | 3.9208 | 2278 | 0.8608 | 0.5921 | 0.8608 | 0.9278 |
| 0.0835 | 3.9243 | 2280 | 0.7710 | 0.6173 | 0.7710 | 0.8781 |
| 0.0835 | 3.9277 | 2282 | 0.7554 | 0.6053 | 0.7554 | 0.8691 |
| 0.0835 | 3.9312 | 2284 | 0.7947 | 0.5921 | 0.7947 | 0.8915 |
| 0.0835 | 3.9346 | 2286 | 0.8855 | 0.6301 | 0.8855 | 0.9410 |
| 0.0835 | 3.9380 | 2288 | 1.0215 | 0.625 | 1.0215 | 1.0107 |
| 0.0835 | 3.9415 | 2290 | 1.0931 | 0.5182 | 1.0931 | 1.0455 |
| 0.0835 | 3.9449 | 2292 | 1.1286 | 0.5182 | 1.1286 | 1.0624 |
| 0.0835 | 3.9484 | 2294 | 1.0478 | 0.5059 | 1.0478 | 1.0236 |
| 0.0835 | 3.9518 | 2296 | 0.9320 | 0.5350 | 0.9320 | 0.9654 |
| 0.0835 | 3.9552 | 2298 | 0.8887 | 0.5773 | 0.8887 | 0.9427 |
| 0.0835 | 3.9587 | 2300 | 0.9165 | 0.5350 | 0.9165 | 0.9573 |
| 0.0835 | 3.9621 | 2302 | 0.9961 | 0.5350 | 0.9961 | 0.9981 |
| 0.0835 | 3.9656 | 2304 | 1.0647 | 0.5283 | 1.0647 | 1.0319 |
| 0.0835 | 3.9690 | 2306 | 1.1374 | 0.5422 | 1.1374 | 1.0665 |
| 0.0835 | 3.9725 | 2308 | 1.1635 | 0.4952 | 1.1635 | 1.0787 |
| 0.0835 | 3.9759 | 2310 | 1.0867 | 0.5215 | 1.0867 | 1.0424 |
| 0.0835 | 3.9793 | 2312 | 1.0136 | 0.5565 | 1.0136 | 1.0068 |
| 0.0835 | 3.9828 | 2314 | 1.0450 | 0.4765 | 1.0450 | 1.0223 |
| 0.0835 | 3.9862 | 2316 | 1.0277 | 0.5028 | 1.0277 | 1.0137 |
| 0.0835 | 3.9897 | 2318 | 1.0486 | 0.4765 | 1.0486 | 1.0240 |
| 0.0835 | 3.9931 | 2320 | 0.9906 | 0.5028 | 0.9906 | 0.9953 |
| 0.0835 | 3.9966 | 2322 | 0.8749 | 0.5921 | 0.8749 | 0.9354 |
| 0.0835 | 4.0 | 2324 | 0.8487 | 0.6069 | 0.8487 | 0.9212 |
| 0.0835 | 4.0034 | 2326 | 0.8828 | 0.6390 | 0.8828 | 0.9396 |
| 0.0835 | 4.0069 | 2328 | 0.9899 | 0.6023 | 0.9899 | 0.9950 |
| 0.0835 | 4.0103 | 2330 | 1.0489 | 0.5303 | 1.0489 | 1.0242 |
| 0.0835 | 4.0138 | 2332 | 1.1327 | 0.4848 | 1.1327 | 1.0643 |
| 0.0835 | 4.0172 | 2334 | 1.1297 | 0.4848 | 1.1297 | 1.0629 |
| 0.0835 | 4.0207 | 2336 | 1.0491 | 0.5898 | 1.0491 | 1.0242 |
| 0.0835 | 4.0241 | 2338 | 0.9764 | 0.5326 | 0.9764 | 0.9881 |
| 0.0835 | 4.0275 | 2340 | 0.9395 | 0.55 | 0.9395 | 0.9693 |
| 0.0835 | 4.0310 | 2342 | 0.9447 | 0.5473 | 0.9447 | 0.9720 |
| 0.0835 | 4.0344 | 2344 | 0.9451 | 0.5473 | 0.9451 | 0.9722 |
| 0.0835 | 4.0379 | 2346 | 0.8773 | 0.5921 | 0.8773 | 0.9366 |
| 0.0835 | 4.0413 | 2348 | 0.7700 | 0.6337 | 0.7700 | 0.8775 |
| 0.0835 | 4.0448 | 2350 | 0.7434 | 0.6451 | 0.7434 | 0.8622 |
| 0.0835 | 4.0482 | 2352 | 0.7969 | 0.7032 | 0.7969 | 0.8927 |
| 0.0835 | 4.0516 | 2354 | 0.9068 | 0.6897 | 0.9068 | 0.9522 |
| 0.0835 | 4.0551 | 2356 | 0.9512 | 0.6538 | 0.9512 | 0.9753 |
| 0.0835 | 4.0585 | 2358 | 0.9159 | 0.6836 | 0.9159 | 0.9570 |
| 0.0835 | 4.0620 | 2360 | 0.8380 | 0.6365 | 0.8380 | 0.9154 |
| 0.0835 | 4.0654 | 2362 | 0.7916 | 0.6410 | 0.7916 | 0.8897 |
| 0.0835 | 4.0688 | 2364 | 0.7905 | 0.6410 | 0.7905 | 0.8891 |
| 0.0835 | 4.0723 | 2366 | 0.8253 | 0.5921 | 0.8253 | 0.9084 |
| 0.0835 | 4.0757 | 2368 | 0.8654 | 0.5326 | 0.8654 | 0.9303 |
| 0.0835 | 4.0792 | 2370 | 0.8621 | 0.5773 | 0.8621 | 0.9285 |
| 0.0835 | 4.0826 | 2372 | 0.8771 | 0.5773 | 0.8771 | 0.9365 |
| 0.0835 | 4.0861 | 2374 | 0.8474 | 0.5773 | 0.8474 | 0.9206 |
| 0.0835 | 4.0895 | 2376 | 0.8528 | 0.5773 | 0.8528 | 0.9235 |
| 0.0835 | 4.0929 | 2378 | 0.8634 | 0.5773 | 0.8634 | 0.9292 |
| 0.0835 | 4.0964 | 2380 | 0.8403 | 0.6069 | 0.8403 | 0.9167 |
| 0.0835 | 4.0998 | 2382 | 0.8863 | 0.5773 | 0.8863 | 0.9415 |
| 0.0835 | 4.1033 | 2384 | 0.9427 | 0.5350 | 0.9427 | 0.9709 |
| 0.0835 | 4.1067 | 2386 | 0.8954 | 0.5921 | 0.8954 | 0.9463 |
| 0.0835 | 4.1102 | 2388 | 0.8755 | 0.5921 | 0.8755 | 0.9357 |
| 0.0835 | 4.1136 | 2390 | 0.7999 | 0.6093 | 0.7999 | 0.8944 |
| 0.0835 | 4.1170 | 2392 | 0.7173 | 0.6139 | 0.7173 | 0.8469 |
| 0.0835 | 4.1205 | 2394 | 0.7070 | 0.6139 | 0.7070 | 0.8408 |
| 0.0835 | 4.1239 | 2396 | 0.7549 | 0.6358 | 0.7549 | 0.8689 |
| 0.0835 | 4.1274 | 2398 | 0.8436 | 0.6316 | 0.8436 | 0.9185 |
| 0.0835 | 4.1308 | 2400 | 0.9263 | 0.5179 | 0.9263 | 0.9625 |
| 0.0835 | 4.1343 | 2402 | 0.9493 | 0.5170 | 0.9493 | 0.9743 |
| 0.0835 | 4.1377 | 2404 | 0.8904 | 0.5965 | 0.8904 | 0.9436 |
| 0.0835 | 4.1411 | 2406 | 0.7931 | 0.6568 | 0.7931 | 0.8905 |
| 0.0835 | 4.1446 | 2408 | 0.7510 | 0.6358 | 0.7510 | 0.8666 |
| 0.0835 | 4.1480 | 2410 | 0.7286 | 0.6587 | 0.7286 | 0.8536 |
| 0.0835 | 4.1515 | 2412 | 0.7238 | 0.6587 | 0.7238 | 0.8507 |
| 0.0835 | 4.1549 | 2414 | 0.7775 | 0.6568 | 0.7775 | 0.8818 |
| 0.0835 | 4.1583 | 2416 | 0.8240 | 0.6548 | 0.8240 | 0.9078 |
| 0.0835 | 4.1618 | 2418 | 0.8382 | 0.6316 | 0.8382 | 0.9155 |
| 0.0835 | 4.1652 | 2420 | 0.8273 | 0.6069 | 0.8273 | 0.9096 |
| 0.0835 | 4.1687 | 2422 | 0.8320 | 0.6069 | 0.8320 | 0.9121 |
| 0.0835 | 4.1721 | 2424 | 0.7981 | 0.6195 | 0.7981 | 0.8933 |
| 0.0835 | 4.1756 | 2426 | 0.7838 | 0.6337 | 0.7838 | 0.8853 |
| 0.0835 | 4.1790 | 2428 | 0.7640 | 0.6116 | 0.7640 | 0.8741 |
| 0.0835 | 4.1824 | 2430 | 0.7816 | 0.6116 | 0.7816 | 0.8841 |
| 0.0835 | 4.1859 | 2432 | 0.8442 | 0.6132 | 0.8442 | 0.9188 |
| 0.0835 | 4.1893 | 2434 | 0.9618 | 0.5013 | 0.9618 | 0.9807 |
| 0.0835 | 4.1928 | 2436 | 1.0488 | 0.4637 | 1.0488 | 1.0241 |
| 0.0835 | 4.1962 | 2438 | 1.0560 | 0.4512 | 1.0560 | 1.0276 |
| 0.0835 | 4.1997 | 2440 | 1.0378 | 0.4765 | 1.0378 | 1.0187 |
| 0.0835 | 4.2031 | 2442 | 1.0019 | 0.4765 | 1.0019 | 1.0010 |
| 0.0835 | 4.2065 | 2444 | 0.9573 | 0.5283 | 0.9573 | 0.9784 |
| 0.0835 | 4.2100 | 2446 | 0.9050 | 0.5740 | 0.9050 | 0.9513 |
| 0.0835 | 4.2134 | 2448 | 0.8869 | 0.5740 | 0.8869 | 0.9418 |
| 0.0835 | 4.2169 | 2450 | 0.8663 | 0.5921 | 0.8663 | 0.9307 |
| 0.0835 | 4.2203 | 2452 | 0.8314 | 0.5833 | 0.8314 | 0.9118 |
| 0.0835 | 4.2238 | 2454 | 0.8810 | 0.6150 | 0.8810 | 0.9386 |
| 0.0835 | 4.2272 | 2456 | 0.9827 | 0.5215 | 0.9827 | 0.9913 |
| 0.0835 | 4.2306 | 2458 | 1.0074 | 0.5215 | 1.0074 | 1.0037 |
| 0.0835 | 4.2341 | 2460 | 1.0557 | 0.5215 | 1.0557 | 1.0275 |
| 0.0835 | 4.2375 | 2462 | 1.0631 | 0.5119 | 1.0631 | 1.0311 |
| 0.0835 | 4.2410 | 2464 | 0.9937 | 0.4765 | 0.9937 | 0.9968 |
| 0.0835 | 4.2444 | 2466 | 0.9075 | 0.6006 | 0.9075 | 0.9526 |
| 0.0835 | 4.2478 | 2468 | 0.9055 | 0.6006 | 0.9055 | 0.9516 |
| 0.0835 | 4.2513 | 2470 | 0.9307 | 0.6006 | 0.9307 | 0.9647 |
| 0.0835 | 4.2547 | 2472 | 1.0198 | 0.4765 | 1.0198 | 1.0099 |
| 0.0835 | 4.2582 | 2474 | 1.1586 | 0.4622 | 1.1586 | 1.0764 |
| 0.0835 | 4.2616 | 2476 | 1.2068 | 0.4622 | 1.2068 | 1.0985 |
| 0.0835 | 4.2651 | 2478 | 1.2943 | 0.4613 | 1.2943 | 1.1377 |
| 0.0835 | 4.2685 | 2480 | 1.3356 | 0.4613 | 1.3356 | 1.1557 |
| 0.0835 | 4.2719 | 2482 | 1.2674 | 0.4622 | 1.2674 | 1.1258 |
| 0.0835 | 4.2754 | 2484 | 1.1395 | 0.4631 | 1.1395 | 1.0675 |
| 0.0835 | 4.2788 | 2486 | 1.0479 | 0.5446 | 1.0479 | 1.0237 |
| 0.0835 | 4.2823 | 2488 | 1.0041 | 0.5593 | 1.0041 | 1.0020 |
| 0.0835 | 4.2857 | 2490 | 1.0263 | 0.5593 | 1.0263 | 1.0131 |
| 0.0835 | 4.2892 | 2492 | 1.1201 | 0.5215 | 1.1201 | 1.0583 |
| 0.0835 | 4.2926 | 2494 | 1.2063 | 0.4622 | 1.2063 | 1.0983 |
| 0.0835 | 4.2960 | 2496 | 1.1878 | 0.4745 | 1.1878 | 1.0899 |
| 0.0835 | 4.2995 | 2498 | 1.1214 | 0.4975 | 1.1214 | 1.0590 |
| 0.0739 | 4.3029 | 2500 | 1.0153 | 0.6371 | 1.0153 | 1.0076 |
| 0.0739 | 4.3064 | 2502 | 0.9611 | 0.6371 | 0.9611 | 0.9803 |
| 0.0739 | 4.3098 | 2504 | 0.9462 | 0.6070 | 0.9462 | 0.9727 |
| 0.0739 | 4.3133 | 2506 | 0.9172 | 0.6150 | 0.9172 | 0.9577 |
| 0.0739 | 4.3167 | 2508 | 0.8875 | 0.6150 | 0.8875 | 0.9421 |
| 0.0739 | 4.3201 | 2510 | 0.8945 | 0.6150 | 0.8945 | 0.9458 |
| 0.0739 | 4.3236 | 2512 | 0.8827 | 0.6150 | 0.8827 | 0.9395 |
| 0.0739 | 4.3270 | 2514 | 0.8879 | 0.6150 | 0.8879 | 0.9423 |
| 0.0739 | 4.3305 | 2516 | 0.9044 | 0.6150 | 0.9044 | 0.9510 |
| 0.0739 | 4.3339 | 2518 | 0.9424 | 0.6006 | 0.9424 | 0.9708 |
| 0.0739 | 4.3373 | 2520 | 0.9846 | 0.5708 | 0.9846 | 0.9923 |
| 0.0739 | 4.3408 | 2522 | 0.9553 | 0.6006 | 0.9553 | 0.9774 |
| 0.0739 | 4.3442 | 2524 | 0.8802 | 0.6150 | 0.8802 | 0.9382 |
| 0.0739 | 4.3477 | 2526 | 0.8446 | 0.6150 | 0.8446 | 0.9190 |
| 0.0739 | 4.3511 | 2528 | 0.8310 | 0.6029 | 0.8310 | 0.9116 |
| 0.0739 | 4.3546 | 2530 | 0.8532 | 0.6150 | 0.8532 | 0.9237 |
| 0.0739 | 4.3580 | 2532 | 0.8308 | 0.6294 | 0.8308 | 0.9115 |
| 0.0739 | 4.3614 | 2534 | 0.8307 | 0.6294 | 0.8307 | 0.9114 |
| 0.0739 | 4.3649 | 2536 | 0.8876 | 0.5968 | 0.8876 | 0.9421 |
| 0.0739 | 4.3683 | 2538 | 0.9723 | 0.5028 | 0.9723 | 0.9861 |
| 0.0739 | 4.3718 | 2540 | 1.0050 | 0.4765 | 1.0050 | 1.0025 |
| 0.0739 | 4.3752 | 2542 | 1.0097 | 0.4881 | 1.0097 | 1.0048 |
| 0.0739 | 4.3787 | 2544 | 0.9831 | 0.4881 | 0.9831 | 0.9915 |
| 0.0739 | 4.3821 | 2546 | 0.9338 | 0.5028 | 0.9338 | 0.9664 |
| 0.0739 | 4.3855 | 2548 | 0.9064 | 0.5563 | 0.9064 | 0.9520 |
| 0.0739 | 4.3890 | 2550 | 0.9235 | 0.5310 | 0.9235 | 0.9610 |
| 0.0739 | 4.3924 | 2552 | 0.9182 | 0.5310 | 0.9182 | 0.9582 |
| 0.0739 | 4.3959 | 2554 | 0.9439 | 0.5537 | 0.9439 | 0.9715 |
| 0.0739 | 4.3993 | 2556 | 0.9206 | 0.5310 | 0.9206 | 0.9595 |
| 0.0739 | 4.4028 | 2558 | 0.8754 | 0.5840 | 0.8754 | 0.9356 |
| 0.0739 | 4.4062 | 2560 | 0.8810 | 0.5840 | 0.8810 | 0.9386 |
| 0.0739 | 4.4096 | 2562 | 0.9414 | 0.5909 | 0.9414 | 0.9703 |
| 0.0739 | 4.4131 | 2564 | 1.0254 | 0.5303 | 1.0254 | 1.0126 |
| 0.0739 | 4.4165 | 2566 | 1.0365 | 0.5182 | 1.0365 | 1.0181 |
| 0.0739 | 4.4200 | 2568 | 1.0086 | 0.5303 | 1.0086 | 1.0043 |
| 0.0739 | 4.4234 | 2570 | 1.0075 | 0.4987 | 1.0075 | 1.0037 |
| 0.0739 | 4.4269 | 2572 | 0.9398 | 0.5742 | 0.9398 | 0.9694 |
| 0.0739 | 4.4303 | 2574 | 0.8707 | 0.6076 | 0.8707 | 0.9331 |
| 0.0739 | 4.4337 | 2576 | 0.8653 | 0.6410 | 0.8653 | 0.9302 |
| 0.0739 | 4.4372 | 2578 | 0.9074 | 0.5808 | 0.9074 | 0.9526 |
| 0.0739 | 4.4406 | 2580 | 0.9129 | 0.5808 | 0.9129 | 0.9554 |
| 0.0739 | 4.4441 | 2582 | 0.9288 | 0.5877 | 0.9288 | 0.9637 |
| 0.0739 | 4.4475 | 2584 | 0.9231 | 0.6390 | 0.9231 | 0.9608 |
| 0.0739 | 4.4509 | 2586 | 0.9712 | 0.5257 | 0.9712 | 0.9855 |
| 0.0739 | 4.4544 | 2588 | 0.9953 | 0.4860 | 0.9953 | 0.9976 |
| 0.0739 | 4.4578 | 2590 | 0.9451 | 0.5257 | 0.9451 | 0.9722 |
| 0.0739 | 4.4613 | 2592 | 0.8695 | 0.6173 | 0.8696 | 0.9325 |
| 0.0739 | 4.4647 | 2594 | 0.8409 | 0.6173 | 0.8409 | 0.9170 |
| 0.0739 | 4.4682 | 2596 | 0.8484 | 0.6173 | 0.8484 | 0.9211 |
| 0.0739 | 4.4716 | 2598 | 0.9072 | 0.5350 | 0.9072 | 0.9525 |
| 0.0739 | 4.4750 | 2600 | 1.0102 | 0.5 | 1.0102 | 1.0051 |
| 0.0739 | 4.4785 | 2602 | 1.0882 | 0.4375 | 1.0882 | 1.0432 |
| 0.0739 | 4.4819 | 2604 | 1.1137 | 0.4501 | 1.1137 | 1.0553 |
| 0.0739 | 4.4854 | 2606 | 1.0587 | 0.4627 | 1.0587 | 1.0289 |
| 0.0739 | 4.4888 | 2608 | 0.9496 | 0.5027 | 0.9496 | 0.9745 |
| 0.0739 | 4.4923 | 2610 | 0.8589 | 0.6316 | 0.8589 | 0.9268 |
| 0.0739 | 4.4957 | 2612 | 0.7973 | 0.6337 | 0.7973 | 0.8929 |
| 0.0739 | 4.4991 | 2614 | 0.7711 | 0.6786 | 0.7711 | 0.8781 |
| 0.0739 | 4.5026 | 2616 | 0.7720 | 0.6786 | 0.7720 | 0.8786 |
| 0.0739 | 4.5060 | 2618 | 0.8200 | 0.625 | 0.8200 | 0.9056 |
| 0.0739 | 4.5095 | 2620 | 0.8730 | 0.6434 | 0.8730 | 0.9343 |
| 0.0739 | 4.5129 | 2622 | 0.8894 | 0.6092 | 0.8894 | 0.9431 |
| 0.0739 | 4.5164 | 2624 | 0.8744 | 0.625 | 0.8744 | 0.9351 |
| 0.0739 | 4.5198 | 2626 | 0.9079 | 0.5850 | 0.9079 | 0.9529 |
| 0.0739 | 4.5232 | 2628 | 0.9073 | 0.5593 | 0.9073 | 0.9525 |
| 0.0739 | 4.5267 | 2630 | 0.8906 | 0.5593 | 0.8906 | 0.9437 |
| 0.0739 | 4.5301 | 2632 | 0.9240 | 0.5593 | 0.9240 | 0.9612 |
| 0.0739 | 4.5336 | 2634 | 0.9200 | 0.5593 | 0.9200 | 0.9592 |
| 0.0739 | 4.5370 | 2636 | 0.9206 | 0.5593 | 0.9206 | 0.9595 |
| 0.0739 | 4.5404 | 2638 | 0.9181 | 0.5850 | 0.9181 | 0.9582 |
| 0.0739 | 4.5439 | 2640 | 0.9134 | 0.5850 | 0.9134 | 0.9557 |
| 0.0739 | 4.5473 | 2642 | 0.9325 | 0.5563 | 0.9325 | 0.9657 |
| 0.0739 | 4.5508 | 2644 | 0.9499 | 0.5537 | 0.9499 | 0.9746 |
| 0.0739 | 4.5542 | 2646 | 0.9468 | 0.5537 | 0.9468 | 0.9730 |
| 0.0739 | 4.5577 | 2648 | 1.0125 | 0.4776 | 1.0125 | 1.0062 |
| 0.0739 | 4.5611 | 2650 | 1.0436 | 0.4776 | 1.0436 | 1.0216 |
| 0.0739 | 4.5645 | 2652 | 1.0493 | 0.4375 | 1.0493 | 1.0244 |
| 0.0739 | 4.5680 | 2654 | 1.0363 | 0.4512 | 1.0363 | 1.0180 |
| 0.0739 | 4.5714 | 2656 | 0.9904 | 0.5283 | 0.9904 | 0.9952 |
| 0.0739 | 4.5749 | 2658 | 1.0065 | 0.5014 | 1.0065 | 1.0032 |
| 0.0739 | 4.5783 | 2660 | 1.0234 | 0.5014 | 1.0234 | 1.0116 |
| 0.0739 | 4.5818 | 2662 | 1.0102 | 0.5283 | 1.0102 | 1.0051 |
| 0.0739 | 4.5852 | 2664 | 1.0631 | 0.4512 | 1.0631 | 1.0311 |
| 0.0739 | 4.5886 | 2666 | 1.1287 | 0.4375 | 1.1287 | 1.0624 |
| 0.0739 | 4.5921 | 2668 | 1.1777 | 0.4614 | 1.1777 | 1.0852 |
| 0.0739 | 4.5955 | 2670 | 1.1621 | 0.4375 | 1.1621 | 1.0780 |
| 0.0739 | 4.5990 | 2672 | 1.0817 | 0.4512 | 1.0817 | 1.0400 |
| 0.0739 | 4.6024 | 2674 | 0.9838 | 0.5593 | 0.9838 | 0.9918 |
| 0.0739 | 4.6059 | 2676 | 0.9244 | 0.5740 | 0.9244 | 0.9615 |
| 0.0739 | 4.6093 | 2678 | 0.9288 | 0.5740 | 0.9288 | 0.9637 |
| 0.0739 | 4.6127 | 2680 | 0.9571 | 0.5740 | 0.9571 | 0.9783 |
| 0.0739 | 4.6162 | 2682 | 1.0181 | 0.5014 | 1.0181 | 1.0090 |
| 0.0739 | 4.6196 | 2684 | 1.0428 | 0.4637 | 1.0428 | 1.0212 |
| 0.0739 | 4.6231 | 2686 | 1.0379 | 0.5135 | 1.0379 | 1.0188 |
| 0.0739 | 4.6265 | 2688 | 1.0185 | 0.5678 | 1.0185 | 1.0092 |
| 0.0739 | 4.6299 | 2690 | 0.9697 | 0.5593 | 0.9697 | 0.9847 |
| 0.0739 | 4.6334 | 2692 | 0.9410 | 0.5350 | 0.9410 | 0.9701 |
| 0.0739 | 4.6368 | 2694 | 0.9049 | 0.5350 | 0.9049 | 0.9513 |
| 0.0739 | 4.6403 | 2696 | 0.9063 | 0.5350 | 0.9063 | 0.9520 |
| 0.0739 | 4.6437 | 2698 | 0.9500 | 0.5593 | 0.9500 | 0.9747 |
| 0.0739 | 4.6472 | 2700 | 0.9689 | 0.5304 | 0.9689 | 0.9843 |
| 0.0739 | 4.6506 | 2702 | 0.9205 | 0.5350 | 0.9205 | 0.9594 |
| 0.0739 | 4.6540 | 2704 | 0.8375 | 0.5556 | 0.8375 | 0.9151 |
| 0.0739 | 4.6575 | 2706 | 0.7813 | 0.5859 | 0.7813 | 0.8839 |
| 0.0739 | 4.6609 | 2708 | 0.7953 | 0.5708 | 0.7953 | 0.8918 |
| 0.0739 | 4.6644 | 2710 | 0.8608 | 0.5681 | 0.8608 | 0.9278 |
| 0.0739 | 4.6678 | 2712 | 1.0025 | 0.5465 | 1.0025 | 1.0013 |
| 0.0739 | 4.6713 | 2714 | 1.0931 | 0.5182 | 1.0931 | 1.0455 |
| 0.0739 | 4.6747 | 2716 | 1.1045 | 0.5182 | 1.1045 | 1.0510 |
| 0.0739 | 4.6781 | 2718 | 1.0630 | 0.4987 | 1.0630 | 1.0310 |
| 0.0739 | 4.6816 | 2720 | 0.9545 | 0.5350 | 0.9545 | 0.9770 |
| 0.0739 | 4.6850 | 2722 | 0.9142 | 0.55 | 0.9142 | 0.9561 |
| 0.0739 | 4.6885 | 2724 | 0.9434 | 0.5350 | 0.9434 | 0.9713 |
| 0.0739 | 4.6919 | 2726 | 0.9934 | 0.5350 | 0.9934 | 0.9967 |
| 0.0739 | 4.6954 | 2728 | 1.0337 | 0.5043 | 1.0337 | 1.0167 |
| 0.0739 | 4.6988 | 2730 | 1.0236 | 0.5043 | 1.0236 | 1.0118 |
| 0.0739 | 4.7022 | 2732 | 0.9484 | 0.5350 | 0.9484 | 0.9739 |
| 0.0739 | 4.7057 | 2734 | 0.8964 | 0.55 | 0.8964 | 0.9468 |
| 0.0739 | 4.7091 | 2736 | 0.8652 | 0.5248 | 0.8652 | 0.9301 |
| 0.0739 | 4.7126 | 2738 | 0.8600 | 0.5248 | 0.8600 | 0.9274 |
| 0.0739 | 4.7160 | 2740 | 0.9087 | 0.55 | 0.9087 | 0.9533 |
| 0.0739 | 4.7194 | 2742 | 0.9574 | 0.5591 | 0.9574 | 0.9785 |
| 0.0739 | 4.7229 | 2744 | 0.9902 | 0.5310 | 0.9902 | 0.9951 |
| 0.0739 | 4.7263 | 2746 | 0.9828 | 0.5310 | 0.9828 | 0.9914 |
| 0.0739 | 4.7298 | 2748 | 0.9491 | 0.55 | 0.9491 | 0.9742 |
| 0.0739 | 4.7332 | 2750 | 0.8705 | 0.5921 | 0.8705 | 0.9330 |
| 0.0739 | 4.7367 | 2752 | 0.8328 | 0.5946 | 0.8328 | 0.9126 |
| 0.0739 | 4.7401 | 2754 | 0.8583 | 0.6173 | 0.8583 | 0.9264 |
| 0.0739 | 4.7435 | 2756 | 0.9240 | 0.55 | 0.9240 | 0.9613 |
| 0.0739 | 4.7470 | 2758 | 1.0258 | 0.4538 | 1.0258 | 1.0128 |
| 0.0739 | 4.7504 | 2760 | 1.0891 | 0.4975 | 1.0891 | 1.0436 |
| 0.0739 | 4.7539 | 2762 | 1.0645 | 0.5099 | 1.0645 | 1.0318 |
| 0.0739 | 4.7573 | 2764 | 0.9655 | 0.5207 | 0.9655 | 0.9826 |
| 0.0739 | 4.7608 | 2766 | 0.8424 | 0.5681 | 0.8424 | 0.9178 |
| 0.0739 | 4.7642 | 2768 | 0.7891 | 0.5708 | 0.7891 | 0.8883 |
| 0.0739 | 4.7676 | 2770 | 0.8018 | 0.5946 | 0.8018 | 0.8955 |
| 0.0739 | 4.7711 | 2772 | 0.8736 | 0.5921 | 0.8736 | 0.9347 |
| 0.0739 | 4.7745 | 2774 | 1.0129 | 0.5207 | 1.0129 | 1.0065 |
| 0.0739 | 4.7780 | 2776 | 1.1187 | 0.4512 | 1.1187 | 1.0577 |
| 0.0739 | 4.7814 | 2778 | 1.1409 | 0.4512 | 1.1409 | 1.0681 |
| 0.0739 | 4.7849 | 2780 | 1.1034 | 0.4512 | 1.1034 | 1.0504 |
| 0.0739 | 4.7883 | 2782 | 1.0622 | 0.5350 | 1.0622 | 1.0306 |
| 0.0739 | 4.7917 | 2784 | 1.0339 | 0.5350 | 1.0339 | 1.0168 |
| 0.0739 | 4.7952 | 2786 | 0.9672 | 0.55 | 0.9672 | 0.9835 |
| 0.0739 | 4.7986 | 2788 | 0.9344 | 0.5248 | 0.9344 | 0.9667 |
| 0.0739 | 4.8021 | 2790 | 0.9422 | 0.55 | 0.9422 | 0.9707 |
| 0.0739 | 4.8055 | 2792 | 0.9720 | 0.5591 | 0.9720 | 0.9859 |
| 0.0739 | 4.8090 | 2794 | 0.9830 | 0.5920 | 0.9830 | 0.9915 |
| 0.0739 | 4.8124 | 2796 | 1.0121 | 0.5593 | 1.0121 | 1.0060 |
| 0.0739 | 4.8158 | 2798 | 1.0336 | 0.5593 | 1.0336 | 1.0167 |
| 0.0739 | 4.8193 | 2800 | 1.0086 | 0.5855 | 1.0086 | 1.0043 |
| 0.0739 | 4.8227 | 2802 | 0.9937 | 0.5740 | 0.9937 | 0.9969 |
| 0.0739 | 4.8262 | 2804 | 1.0588 | 0.5028 | 1.0588 | 1.0290 |
| 0.0739 | 4.8296 | 2806 | 1.0715 | 0.5028 | 1.0715 | 1.0351 |
| 0.0739 | 4.8330 | 2808 | 1.0619 | 0.5304 | 1.0619 | 1.0305 |
| 0.0739 | 4.8365 | 2810 | 1.0346 | 0.5593 | 1.0346 | 1.0171 |
| 0.0739 | 4.8399 | 2812 | 1.0638 | 0.5028 | 1.0638 | 1.0314 |
| 0.0739 | 4.8434 | 2814 | 1.0973 | 0.5014 | 1.0973 | 1.0475 |
| 0.0739 | 4.8468 | 2816 | 1.1362 | 0.4512 | 1.1362 | 1.0659 |
| 0.0739 | 4.8503 | 2818 | 1.1149 | 0.4512 | 1.1149 | 1.0559 |
| 0.0739 | 4.8537 | 2820 | 1.0997 | 0.5099 | 1.0997 | 1.0487 |
| 0.0739 | 4.8571 | 2822 | 1.0223 | 0.5568 | 1.0223 | 1.0111 |
| 0.0739 | 4.8606 | 2824 | 0.8925 | 0.5991 | 0.8925 | 0.9447 |
| 0.0739 | 4.8640 | 2826 | 0.8292 | 0.5946 | 0.8292 | 0.9106 |
| 0.0739 | 4.8675 | 2828 | 0.8254 | 0.6093 | 0.8254 | 0.9085 |
| 0.0739 | 4.8709 | 2830 | 0.8748 | 0.652 | 0.8748 | 0.9353 |
| 0.0739 | 4.8744 | 2832 | 0.9287 | 0.6501 | 0.9287 | 0.9637 |
| 0.0739 | 4.8778 | 2834 | 0.9608 | 0.5855 | 0.9608 | 0.9802 |
| 0.0739 | 4.8812 | 2836 | 0.9618 | 0.5537 | 0.9618 | 0.9807 |
| 0.0739 | 4.8847 | 2838 | 0.9658 | 0.5448 | 0.9658 | 0.9828 |
| 0.0739 | 4.8881 | 2840 | 0.9318 | 0.55 | 0.9318 | 0.9653 |
| 0.0739 | 4.8916 | 2842 | 0.9533 | 0.55 | 0.9533 | 0.9764 |
| 0.0739 | 4.8950 | 2844 | 1.0235 | 0.5304 | 1.0235 | 1.0117 |
| 0.0739 | 4.8985 | 2846 | 1.0969 | 0.5283 | 1.0969 | 1.0473 |
| 0.0739 | 4.9019 | 2848 | 1.1213 | 0.4765 | 1.1213 | 1.0589 |
| 0.0739 | 4.9053 | 2850 | 1.1080 | 0.4765 | 1.1080 | 1.0526 |
| 0.0739 | 4.9088 | 2852 | 1.0642 | 0.5304 | 1.0642 | 1.0316 |
| 0.0739 | 4.9122 | 2854 | 0.9664 | 0.5248 | 0.9664 | 0.9831 |
| 0.0739 | 4.9157 | 2856 | 0.8676 | 0.5681 | 0.8676 | 0.9315 |
| 0.0739 | 4.9191 | 2858 | 0.8278 | 0.5681 | 0.8278 | 0.9098 |
| 0.0739 | 4.9225 | 2860 | 0.8424 | 0.5681 | 0.8424 | 0.9178 |
| 0.0739 | 4.9260 | 2862 | 0.9169 | 0.5681 | 0.9169 | 0.9575 |
| 0.0739 | 4.9294 | 2864 | 1.0158 | 0.5645 | 1.0158 | 1.0079 |
| 0.0739 | 4.9329 | 2866 | 1.0797 | 0.5726 | 1.0797 | 1.0391 |
| 0.0739 | 4.9363 | 2868 | 1.0782 | 0.5401 | 1.0782 | 1.0384 |
| 0.0739 | 4.9398 | 2870 | 1.0365 | 0.5207 | 1.0365 | 1.0181 |
| 0.0739 | 4.9432 | 2872 | 1.0200 | 0.5207 | 1.0200 | 1.0100 |
| 0.0739 | 4.9466 | 2874 | 0.9864 | 0.4955 | 0.9864 | 0.9932 |
| 0.0739 | 4.9501 | 2876 | 0.9700 | 0.5248 | 0.9700 | 0.9849 |
| 0.0739 | 4.9535 | 2878 | 0.9650 | 0.5248 | 0.9650 | 0.9824 |
| 0.0739 | 4.9570 | 2880 | 0.9459 | 0.5681 | 0.9459 | 0.9726 |
| 0.0739 | 4.9604 | 2882 | 0.9327 | 0.5681 | 0.9327 | 0.9658 |
| 0.0739 | 4.9639 | 2884 | 0.9368 | 0.5681 | 0.9368 | 0.9679 |
| 0.0739 | 4.9673 | 2886 | 0.9292 | 0.5681 | 0.9292 | 0.9639 |
| 0.0739 | 4.9707 | 2888 | 0.9621 | 0.5765 | 0.9621 | 0.9809 |
| 0.0739 | 4.9742 | 2890 | 0.9789 | 0.5474 | 0.9789 | 0.9894 |
| 0.0739 | 4.9776 | 2892 | 0.9586 | 0.5681 | 0.9586 | 0.9791 |
| 0.0739 | 4.9811 | 2894 | 0.9491 | 0.5681 | 0.9491 | 0.9742 |
| 0.0739 | 4.9845 | 2896 | 0.9332 | 0.5681 | 0.9332 | 0.9660 |
| 0.0739 | 4.9880 | 2898 | 0.9472 | 0.5765 | 0.9472 | 0.9732 |
| 0.0739 | 4.9914 | 2900 | 0.9161 | 0.5765 | 0.9161 | 0.9571 |
| 0.0739 | 4.9948 | 2902 | 0.9186 | 0.5765 | 0.9186 | 0.9585 |
| 0.0739 | 4.9983 | 2904 | 0.9602 | 0.6092 | 0.9602 | 0.9799 |
| 0.0739 | 5.0017 | 2906 | 0.9628 | 0.6092 | 0.9628 | 0.9812 |
| 0.0739 | 5.0052 | 2908 | 0.9962 | 0.6301 | 0.9962 | 0.9981 |
| 0.0739 | 5.0086 | 2910 | 0.9842 | 0.6301 | 0.9842 | 0.9921 |
| 0.0739 | 5.0120 | 2912 | 0.9241 | 0.5765 | 0.9241 | 0.9613 |
| 0.0739 | 5.0155 | 2914 | 0.8540 | 0.5681 | 0.8540 | 0.9241 |
| 0.0739 | 5.0189 | 2916 | 0.8060 | 0.5833 | 0.8060 | 0.8978 |
| 0.0739 | 5.0224 | 2918 | 0.8269 | 0.5681 | 0.8269 | 0.9094 |
| 0.0739 | 5.0258 | 2920 | 0.8960 | 0.6092 | 0.8960 | 0.9466 |
| 0.0739 | 5.0293 | 2922 | 1.0046 | 0.5720 | 1.0046 | 1.0023 |
| 0.0739 | 5.0327 | 2924 | 1.0410 | 0.5761 | 1.0410 | 1.0203 |
| 0.0739 | 5.0361 | 2926 | 1.0007 | 0.5693 | 1.0007 | 1.0003 |
| 0.0739 | 5.0396 | 2928 | 0.9547 | 0.6019 | 0.9547 | 0.9771 |
| 0.0739 | 5.0430 | 2930 | 0.9173 | 0.6092 | 0.9173 | 0.9577 |
| 0.0739 | 5.0465 | 2932 | 0.9052 | 0.6092 | 0.9052 | 0.9514 |
| 0.0739 | 5.0499 | 2934 | 0.9321 | 0.6092 | 0.9321 | 0.9654 |
| 0.0739 | 5.0534 | 2936 | 0.9199 | 0.5765 | 0.9199 | 0.9591 |
| 0.0739 | 5.0568 | 2938 | 0.9420 | 0.6260 | 0.9420 | 0.9706 |
| 0.0739 | 5.0602 | 2940 | 0.9438 | 0.5748 | 0.9438 | 0.9715 |
| 0.0739 | 5.0637 | 2942 | 0.9333 | 0.6260 | 0.9333 | 0.9661 |
| 0.0739 | 5.0671 | 2944 | 0.9279 | 0.5955 | 0.9279 | 0.9633 |
| 0.0739 | 5.0706 | 2946 | 0.8910 | 0.5733 | 0.8910 | 0.9439 |
| 0.0739 | 5.0740 | 2948 | 0.8442 | 0.5833 | 0.8442 | 0.9188 |
| 0.0739 | 5.0775 | 2950 | 0.8291 | 0.5833 | 0.8291 | 0.9105 |
| 0.0739 | 5.0809 | 2952 | 0.8631 | 0.5909 | 0.8631 | 0.9290 |
| 0.0739 | 5.0843 | 2954 | 0.9110 | 0.5733 | 0.9110 | 0.9545 |
| 0.0739 | 5.0878 | 2956 | 0.9922 | 0.5444 | 0.9922 | 0.9961 |
| 0.0739 | 5.0912 | 2958 | 1.0170 | 0.5444 | 1.0170 | 1.0085 |
| 0.0739 | 5.0947 | 2960 | 0.9718 | 0.5693 | 0.9718 | 0.9858 |
| 0.0739 | 5.0981 | 2962 | 0.8949 | 0.5733 | 0.8949 | 0.9460 |
| 0.0739 | 5.1015 | 2964 | 0.8244 | 0.5833 | 0.8244 | 0.9080 |
| 0.0739 | 5.1050 | 2966 | 0.7893 | 0.5985 | 0.7893 | 0.8884 |
| 0.0739 | 5.1084 | 2968 | 0.7863 | 0.5859 | 0.7863 | 0.8867 |
| 0.0739 | 5.1119 | 2970 | 0.8178 | 0.5833 | 0.8178 | 0.9043 |
| 0.0739 | 5.1153 | 2972 | 0.8956 | 0.5733 | 0.8956 | 0.9463 |
| 0.0739 | 5.1188 | 2974 | 0.9998 | 0.5693 | 0.9998 | 0.9999 |
| 0.0739 | 5.1222 | 2976 | 1.0122 | 0.5642 | 1.0122 | 1.0061 |
| 0.0739 | 5.1256 | 2978 | 0.9483 | 0.5693 | 0.9483 | 0.9738 |
| 0.0739 | 5.1291 | 2980 | 0.8553 | 0.6092 | 0.8553 | 0.9248 |
| 0.0739 | 5.1325 | 2982 | 0.8139 | 0.6154 | 0.8139 | 0.9022 |
| 0.0739 | 5.1360 | 2984 | 0.7684 | 0.6116 | 0.7684 | 0.8766 |
| 0.0739 | 5.1394 | 2986 | 0.7679 | 0.6116 | 0.7679 | 0.8763 |
| 0.0739 | 5.1429 | 2988 | 0.8133 | 0.6053 | 0.8133 | 0.9018 |
| 0.0739 | 5.1463 | 2990 | 0.8937 | 0.5909 | 0.8937 | 0.9453 |
| 0.0739 | 5.1497 | 2992 | 0.9209 | 0.5488 | 0.9209 | 0.9596 |
| 0.0739 | 5.1532 | 2994 | 0.8820 | 0.6475 | 0.8820 | 0.9391 |
| 0.0739 | 5.1566 | 2996 | 0.8292 | 0.6322 | 0.8292 | 0.9106 |
| 0.0739 | 5.1601 | 2998 | 0.7885 | 0.6239 | 0.7885 | 0.8880 |
| 0.0609 | 5.1635 | 3000 | 0.7700 | 0.6116 | 0.7700 | 0.8775 |
| 0.0609 | 5.1670 | 3002 | 0.8104 | 0.6538 | 0.8104 | 0.9002 |
| 0.0609 | 5.1704 | 3004 | 0.9098 | 0.6457 | 0.9098 | 0.9538 |
| 0.0609 | 5.1738 | 3006 | 1.0112 | 0.5444 | 1.0112 | 1.0056 |
| 0.0609 | 5.1773 | 3008 | 1.0565 | 0.5303 | 1.0565 | 1.0279 |
| 0.0609 | 5.1807 | 3010 | 1.0521 | 0.5303 | 1.0521 | 1.0257 |
| 0.0609 | 5.1842 | 3012 | 0.9769 | 0.5512 | 0.9769 | 0.9884 |
| 0.0609 | 5.1876 | 3014 | 0.8823 | 0.5921 | 0.8823 | 0.9393 |
| 0.0609 | 5.1910 | 3016 | 0.8057 | 0.5833 | 0.8057 | 0.8976 |
| 0.0609 | 5.1945 | 3018 | 0.7885 | 0.5833 | 0.7885 | 0.8880 |
| 0.0609 | 5.1979 | 3020 | 0.7615 | 0.6239 | 0.7615 | 0.8726 |
| 0.0609 | 5.2014 | 3022 | 0.7515 | 0.6239 | 0.7515 | 0.8669 |
| 0.0609 | 5.2048 | 3024 | 0.7686 | 0.6239 | 0.7686 | 0.8767 |
| 0.0609 | 5.2083 | 3026 | 0.8324 | 0.5833 | 0.8324 | 0.9124 |
| 0.0609 | 5.2117 | 3028 | 0.9353 | 0.6260 | 0.9353 | 0.9671 |
| 0.0609 | 5.2151 | 3030 | 0.9778 | 0.5693 | 0.9778 | 0.9888 |
| 0.0609 | 5.2186 | 3032 | 0.9577 | 0.5645 | 0.9577 | 0.9786 |
| 0.0609 | 5.2220 | 3034 | 0.8953 | 0.5650 | 0.8953 | 0.9462 |
| 0.0609 | 5.2255 | 3036 | 0.8085 | 0.5833 | 0.8085 | 0.8992 |
| 0.0609 | 5.2289 | 3038 | 0.7729 | 0.5985 | 0.7729 | 0.8791 |
| 0.0609 | 5.2324 | 3040 | 0.7788 | 0.5985 | 0.7788 | 0.8825 |
| 0.0609 | 5.2358 | 3042 | 0.8087 | 0.5985 | 0.8087 | 0.8993 |
| 0.0609 | 5.2392 | 3044 | 0.8889 | 0.5733 | 0.8889 | 0.9428 |
| 0.0609 | 5.2427 | 3046 | 1.0452 | 0.5642 | 1.0452 | 1.0223 |
| 0.0609 | 5.2461 | 3048 | 1.1929 | 0.4954 | 1.1929 | 1.0922 |
| 0.0609 | 5.2496 | 3050 | 1.2643 | 0.5044 | 1.2643 | 1.1244 |
| 0.0609 | 5.2530 | 3052 | 1.2313 | 0.4954 | 1.2313 | 1.1096 |
| 0.0609 | 5.2565 | 3054 | 1.1229 | 0.5071 | 1.1229 | 1.0597 |
| 0.0609 | 5.2599 | 3056 | 0.9750 | 0.5382 | 0.9750 | 0.9874 |
| 0.0609 | 5.2633 | 3058 | 0.8637 | 0.5650 | 0.8637 | 0.9294 |
| 0.0609 | 5.2668 | 3060 | 0.8314 | 0.5985 | 0.8314 | 0.9118 |
| 0.0609 | 5.2702 | 3062 | 0.8522 | 0.5681 | 0.8522 | 0.9231 |
| 0.0609 | 5.2737 | 3064 | 0.8950 | 0.5650 | 0.8950 | 0.9460 |
| 0.0609 | 5.2771 | 3066 | 0.9183 | 0.5650 | 0.9183 | 0.9583 |
| 0.0609 | 5.2806 | 3068 | 0.9246 | 0.5885 | 0.9246 | 0.9615 |
| 0.0609 | 5.2840 | 3070 | 0.9425 | 0.5955 | 0.9425 | 0.9708 |
| 0.0609 | 5.2874 | 3072 | 0.9220 | 0.5955 | 0.9220 | 0.9602 |
| 0.0609 | 5.2909 | 3074 | 0.8844 | 0.5955 | 0.8844 | 0.9404 |
| 0.0609 | 5.2943 | 3076 | 0.8170 | 0.5833 | 0.8170 | 0.9039 |
| 0.0609 | 5.2978 | 3078 | 0.7981 | 0.5708 | 0.7981 | 0.8934 |
| 0.0609 | 5.3012 | 3080 | 0.8244 | 0.5833 | 0.8244 | 0.9080 |
| 0.0609 | 5.3046 | 3082 | 0.8666 | 0.5765 | 0.8666 | 0.9309 |
| 0.0609 | 5.3081 | 3084 | 0.9122 | 0.5955 | 0.9122 | 0.9551 |
| 0.0609 | 5.3115 | 3086 | 0.9872 | 0.5817 | 0.9872 | 0.9936 |
| 0.0609 | 5.3150 | 3088 | 1.0169 | 0.625 | 1.0169 | 1.0084 |
| 0.0609 | 5.3184 | 3090 | 1.0035 | 0.6645 | 1.0035 | 1.0018 |
| 0.0609 | 5.3219 | 3092 | 0.9373 | 0.5955 | 0.9373 | 0.9682 |
| 0.0609 | 5.3253 | 3094 | 0.8666 | 0.5765 | 0.8666 | 0.9309 |
| 0.0609 | 5.3287 | 3096 | 0.8167 | 0.5708 | 0.8167 | 0.9037 |
| 0.0609 | 5.3322 | 3098 | 0.8187 | 0.5708 | 0.8187 | 0.9048 |
| 0.0609 | 5.3356 | 3100 | 0.8222 | 0.5708 | 0.8222 | 0.9068 |
| 0.0609 | 5.3391 | 3102 | 0.8311 | 0.5708 | 0.8311 | 0.9116 |
| 0.0609 | 5.3425 | 3104 | 0.8233 | 0.5708 | 0.8233 | 0.9074 |
| 0.0609 | 5.3460 | 3106 | 0.8093 | 0.5708 | 0.8093 | 0.8996 |
| 0.0609 | 5.3494 | 3108 | 0.8382 | 0.5681 | 0.8382 | 0.9155 |
| 0.0609 | 5.3528 | 3110 | 0.8560 | 0.5681 | 0.8560 | 0.9252 |
| 0.0609 | 5.3563 | 3112 | 0.8746 | 0.5921 | 0.8746 | 0.9352 |
| 0.0609 | 5.3597 | 3114 | 0.8656 | 0.5885 | 0.8656 | 0.9304 |
| 0.0609 | 5.3632 | 3116 | 0.8772 | 0.6110 | 0.8772 | 0.9366 |
| 0.0609 | 5.3666 | 3118 | 0.8840 | 0.6168 | 0.8840 | 0.9402 |
| 0.0609 | 5.3701 | 3120 | 0.8686 | 0.5955 | 0.8686 | 0.9320 |
| 0.0609 | 5.3735 | 3122 | 0.8126 | 0.5946 | 0.8126 | 0.9014 |
| 0.0609 | 5.3769 | 3124 | 0.7591 | 0.5985 | 0.7591 | 0.8712 |
| 0.0609 | 5.3804 | 3126 | 0.7591 | 0.5985 | 0.7591 | 0.8713 |
| 0.0609 | 5.3838 | 3128 | 0.7956 | 0.5946 | 0.7956 | 0.8920 |
| 0.0609 | 5.3873 | 3130 | 0.8607 | 0.5874 | 0.8607 | 0.9278 |
| 0.0609 | 5.3907 | 3132 | 0.9345 | 0.5645 | 0.9345 | 0.9667 |
| 0.0609 | 5.3941 | 3134 | 0.9524 | 0.5855 | 0.9524 | 0.9759 |
| 0.0609 | 5.3976 | 3136 | 0.9330 | 0.5645 | 0.9330 | 0.9659 |
| 0.0609 | 5.4010 | 3138 | 0.8967 | 0.5650 | 0.8967 | 0.9469 |
| 0.0609 | 5.4045 | 3140 | 0.8502 | 0.5650 | 0.8502 | 0.9220 |
| 0.0609 | 5.4079 | 3142 | 0.8435 | 0.5650 | 0.8435 | 0.9184 |
| 0.0609 | 5.4114 | 3144 | 0.8517 | 0.5650 | 0.8517 | 0.9229 |
| 0.0609 | 5.4148 | 3146 | 0.8607 | 0.5650 | 0.8607 | 0.9277 |
| 0.0609 | 5.4182 | 3148 | 0.8530 | 0.5681 | 0.8530 | 0.9236 |
| 0.0609 | 5.4217 | 3150 | 0.8204 | 0.5681 | 0.8204 | 0.9058 |
| 0.0609 | 5.4251 | 3152 | 0.8234 | 0.5681 | 0.8234 | 0.9074 |
| 0.0609 | 5.4286 | 3154 | 0.8159 | 0.5985 | 0.8159 | 0.9033 |
| 0.0609 | 5.4320 | 3156 | 0.8322 | 0.5985 | 0.8322 | 0.9122 |
| 0.0609 | 5.4355 | 3158 | 0.8693 | 0.5650 | 0.8693 | 0.9324 |
| 0.0609 | 5.4389 | 3160 | 0.8898 | 0.5650 | 0.8898 | 0.9433 |
| 0.0609 | 5.4423 | 3162 | 0.8758 | 0.5650 | 0.8758 | 0.9358 |
| 0.0609 | 5.4458 | 3164 | 0.8678 | 0.5733 | 0.8678 | 0.9316 |
| 0.0609 | 5.4492 | 3166 | 0.8758 | 0.5733 | 0.8758 | 0.9358 |
| 0.0609 | 5.4527 | 3168 | 0.8361 | 0.6014 | 0.8361 | 0.9144 |
| 0.0609 | 5.4561 | 3170 | 0.8152 | 0.6014 | 0.8152 | 0.9029 |
| 0.0609 | 5.4596 | 3172 | 0.8056 | 0.5946 | 0.8056 | 0.8975 |
| 0.0609 | 5.4630 | 3174 | 0.7867 | 0.6195 | 0.7867 | 0.8870 |
| 0.0609 | 5.4664 | 3176 | 0.7846 | 0.6195 | 0.7846 | 0.8858 |
| 0.0609 | 5.4699 | 3178 | 0.7994 | 0.625 | 0.7994 | 0.8941 |
| 0.0609 | 5.4733 | 3180 | 0.7864 | 0.625 | 0.7864 | 0.8868 |
| 0.0609 | 5.4768 | 3182 | 0.7987 | 0.625 | 0.7987 | 0.8937 |
| 0.0609 | 5.4802 | 3184 | 0.8138 | 0.6454 | 0.8138 | 0.9021 |
| 0.0609 | 5.4836 | 3186 | 0.7857 | 0.625 | 0.7857 | 0.8864 |
| 0.0609 | 5.4871 | 3188 | 0.7306 | 0.6587 | 0.7306 | 0.8547 |
| 0.0609 | 5.4905 | 3190 | 0.7139 | 0.6587 | 0.7139 | 0.8449 |
| 0.0609 | 5.4940 | 3192 | 0.7162 | 0.6587 | 0.7162 | 0.8463 |
| 0.0609 | 5.4974 | 3194 | 0.7571 | 0.6076 | 0.7571 | 0.8701 |
| 0.0609 | 5.5009 | 3196 | 0.8161 | 0.625 | 0.8161 | 0.9034 |
| 0.0609 | 5.5043 | 3198 | 0.8810 | 0.6520 | 0.8810 | 0.9386 |
| 0.0609 | 5.5077 | 3200 | 0.8936 | 0.6390 | 0.8936 | 0.9453 |
| 0.0609 | 5.5112 | 3202 | 0.9161 | 0.6092 | 0.9161 | 0.9571 |
| 0.0609 | 5.5146 | 3204 | 0.9006 | 0.6029 | 0.9006 | 0.9490 |
| 0.0609 | 5.5181 | 3206 | 0.8616 | 0.6029 | 0.8616 | 0.9282 |
| 0.0609 | 5.5215 | 3208 | 0.8222 | 0.5946 | 0.8222 | 0.9067 |
| 0.0609 | 5.5250 | 3210 | 0.8107 | 0.5985 | 0.8107 | 0.9004 |
| 0.0609 | 5.5284 | 3212 | 0.8151 | 0.5946 | 0.8151 | 0.9028 |
| 0.0609 | 5.5318 | 3214 | 0.7892 | 0.5985 | 0.7892 | 0.8884 |
| 0.0609 | 5.5353 | 3216 | 0.7988 | 0.6053 | 0.7988 | 0.8938 |
| 0.0609 | 5.5387 | 3218 | 0.8313 | 0.6520 | 0.8313 | 0.9117 |
| 0.0609 | 5.5422 | 3220 | 0.8320 | 0.6520 | 0.8320 | 0.9121 |
| 0.0609 | 5.5456 | 3222 | 0.8448 | 0.6520 | 0.8448 | 0.9191 |
| 0.0609 | 5.5491 | 3224 | 0.8592 | 0.6520 | 0.8592 | 0.9269 |
| 0.0609 | 5.5525 | 3226 | 0.8482 | 0.6520 | 0.8482 | 0.9210 |
| 0.0609 | 5.5559 | 3228 | 0.8021 | 0.6271 | 0.8021 | 0.8956 |
| 0.0609 | 5.5594 | 3230 | 0.7545 | 0.5859 | 0.7545 | 0.8686 |
| 0.0609 | 5.5628 | 3232 | 0.7550 | 0.5859 | 0.7550 | 0.8689 |
| 0.0609 | 5.5663 | 3234 | 0.7873 | 0.6216 | 0.7873 | 0.8873 |
| 0.0609 | 5.5697 | 3236 | 0.8332 | 0.6069 | 0.8332 | 0.9128 |
| 0.0609 | 5.5731 | 3238 | 0.8480 | 0.6069 | 0.8480 | 0.9209 |
| 0.0609 | 5.5766 | 3240 | 0.8315 | 0.6069 | 0.8315 | 0.9119 |
| 0.0609 | 5.5800 | 3242 | 0.7829 | 0.5985 | 0.7829 | 0.8848 |
| 0.0609 | 5.5835 | 3244 | 0.7412 | 0.6116 | 0.7412 | 0.8609 |
| 0.0609 | 5.5869 | 3246 | 0.7090 | 0.6116 | 0.7090 | 0.8420 |
| 0.0609 | 5.5904 | 3248 | 0.7167 | 0.6116 | 0.7167 | 0.8466 |
| 0.0609 | 5.5938 | 3250 | 0.7604 | 0.6239 | 0.7604 | 0.8720 |
| 0.0609 | 5.5972 | 3252 | 0.8210 | 0.5985 | 0.8210 | 0.9061 |
| 0.0609 | 5.6007 | 3254 | 0.8922 | 0.6301 | 0.8922 | 0.9446 |
| 0.0609 | 5.6041 | 3256 | 0.8966 | 0.6301 | 0.8966 | 0.9469 |
| 0.0609 | 5.6076 | 3258 | 0.8568 | 0.5985 | 0.8568 | 0.9256 |
| 0.0609 | 5.6110 | 3260 | 0.8053 | 0.5985 | 0.8053 | 0.8974 |
| 0.0609 | 5.6145 | 3262 | 0.7976 | 0.5985 | 0.7976 | 0.8931 |
| 0.0609 | 5.6179 | 3264 | 0.7966 | 0.5985 | 0.7966 | 0.8925 |
| 0.0609 | 5.6213 | 3266 | 0.8004 | 0.5985 | 0.8004 | 0.8947 |
| 0.0609 | 5.6248 | 3268 | 0.8307 | 0.6053 | 0.8307 | 0.9114 |
| 0.0609 | 5.6282 | 3270 | 0.8401 | 0.6053 | 0.8401 | 0.9166 |
| 0.0609 | 5.6317 | 3272 | 0.8394 | 0.6271 | 0.8394 | 0.9162 |
| 0.0609 | 5.6351 | 3274 | 0.8380 | 0.65 | 0.8380 | 0.9154 |
| 0.0609 | 5.6386 | 3276 | 0.8145 | 0.65 | 0.8145 | 0.9025 |
| 0.0609 | 5.6420 | 3278 | 0.7895 | 0.6293 | 0.7895 | 0.8886 |
| 0.0609 | 5.6454 | 3280 | 0.7516 | 0.6116 | 0.7516 | 0.8669 |
| 0.0609 | 5.6489 | 3282 | 0.7247 | 0.6116 | 0.7247 | 0.8513 |
| 0.0609 | 5.6523 | 3284 | 0.7397 | 0.6116 | 0.7397 | 0.8601 |
| 0.0609 | 5.6558 | 3286 | 0.7933 | 0.6293 | 0.7933 | 0.8907 |
| 0.0609 | 5.6592 | 3288 | 0.8571 | 0.6132 | 0.8571 | 0.9258 |
| 0.0609 | 5.6627 | 3290 | 0.9517 | 0.5251 | 0.9517 | 0.9755 |
| 0.0609 | 5.6661 | 3292 | 0.9981 | 0.4637 | 0.9981 | 0.9990 |
| 0.0609 | 5.6695 | 3294 | 1.0236 | 0.4637 | 1.0236 | 1.0117 |
| 0.0609 | 5.6730 | 3296 | 1.0272 | 0.4512 | 1.0272 | 1.0135 |
| 0.0609 | 5.6764 | 3298 | 0.9961 | 0.4765 | 0.9961 | 0.9981 |
| 0.0609 | 5.6799 | 3300 | 0.9948 | 0.4765 | 0.9948 | 0.9974 |
| 0.0609 | 5.6833 | 3302 | 0.9639 | 0.5678 | 0.9639 | 0.9818 |
| 0.0609 | 5.6867 | 3304 | 0.8957 | 0.6069 | 0.8957 | 0.9464 |
| 0.0609 | 5.6902 | 3306 | 0.8378 | 0.5833 | 0.8378 | 0.9153 |
| 0.0609 | 5.6936 | 3308 | 0.7776 | 0.6239 | 0.7776 | 0.8818 |
| 0.0609 | 5.6971 | 3310 | 0.7679 | 0.6116 | 0.7679 | 0.8763 |
| 0.0609 | 5.7005 | 3312 | 0.8031 | 0.6053 | 0.8031 | 0.8962 |
| 0.0609 | 5.7040 | 3314 | 0.8747 | 0.5909 | 0.8747 | 0.9353 |
| 0.0609 | 5.7074 | 3316 | 0.9206 | 0.5991 | 0.9206 | 0.9595 |
| 0.0609 | 5.7108 | 3318 | 0.9192 | 0.5991 | 0.9192 | 0.9587 |
| 0.0609 | 5.7143 | 3320 | 0.8787 | 0.5765 | 0.8787 | 0.9374 |
| 0.0609 | 5.7177 | 3322 | 0.8231 | 0.5909 | 0.8231 | 0.9072 |
| 0.0609 | 5.7212 | 3324 | 0.8103 | 0.5833 | 0.8103 | 0.9002 |
| 0.0609 | 5.7246 | 3326 | 0.8160 | 0.5833 | 0.8160 | 0.9033 |
| 0.0609 | 5.7281 | 3328 | 0.8213 | 0.5833 | 0.8213 | 0.9062 |
| 0.0609 | 5.7315 | 3330 | 0.8413 | 0.5909 | 0.8413 | 0.9172 |
| 0.0609 | 5.7349 | 3332 | 0.8935 | 0.5991 | 0.8935 | 0.9452 |
| 0.0609 | 5.7384 | 3334 | 0.9533 | 0.5817 | 0.9533 | 0.9764 |
| 0.0609 | 5.7418 | 3336 | 0.9503 | 0.5817 | 0.9503 | 0.9749 |
| 0.0609 | 5.7453 | 3338 | 0.9488 | 0.5817 | 0.9488 | 0.9741 |
| 0.0609 | 5.7487 | 3340 | 0.9228 | 0.5354 | 0.9228 | 0.9606 |
| 0.0609 | 5.7522 | 3342 | 0.8810 | 0.5354 | 0.8810 | 0.9386 |
| 0.0609 | 5.7556 | 3344 | 0.8693 | 0.5354 | 0.8693 | 0.9324 |
| 0.0609 | 5.7590 | 3346 | 0.8880 | 0.5354 | 0.8880 | 0.9424 |
| 0.0609 | 5.7625 | 3348 | 0.9346 | 0.5354 | 0.9346 | 0.9667 |
| 0.0609 | 5.7659 | 3350 | 0.9430 | 0.5591 | 0.9430 | 0.9711 |
| 0.0609 | 5.7694 | 3352 | 0.9058 | 0.5354 | 0.9058 | 0.9517 |
| 0.0609 | 5.7728 | 3354 | 0.8335 | 0.5909 | 0.8335 | 0.9130 |
| 0.0609 | 5.7762 | 3356 | 0.7996 | 0.6053 | 0.7996 | 0.8942 |
| 0.0609 | 5.7797 | 3358 | 0.7916 | 0.6239 | 0.7916 | 0.8897 |
| 0.0609 | 5.7831 | 3360 | 0.8260 | 0.5909 | 0.8260 | 0.9089 |
| 0.0609 | 5.7866 | 3362 | 0.8638 | 0.5909 | 0.8638 | 0.9294 |
| 0.0609 | 5.7900 | 3364 | 0.9068 | 0.5354 | 0.9068 | 0.9523 |
| 0.0609 | 5.7935 | 3366 | 0.9386 | 0.5207 | 0.9386 | 0.9688 |
| 0.0609 | 5.7969 | 3368 | 0.9249 | 0.5093 | 0.9249 | 0.9617 |
| 0.0609 | 5.8003 | 3370 | 0.8873 | 0.5248 | 0.8873 | 0.9420 |
| 0.0609 | 5.8038 | 3372 | 0.8635 | 0.5248 | 0.8635 | 0.9292 |
| 0.0609 | 5.8072 | 3374 | 0.8661 | 0.5248 | 0.8661 | 0.9306 |
| 0.0609 | 5.8107 | 3376 | 0.8834 | 0.5093 | 0.8834 | 0.9399 |
| 0.0609 | 5.8141 | 3378 | 0.8658 | 0.5248 | 0.8658 | 0.9305 |
| 0.0609 | 5.8176 | 3380 | 0.8691 | 0.5248 | 0.8691 | 0.9323 |
| 0.0609 | 5.8210 | 3382 | 0.8956 | 0.5093 | 0.8956 | 0.9464 |
| 0.0609 | 5.8244 | 3384 | 0.9263 | 0.5207 | 0.9263 | 0.9624 |
| 0.0609 | 5.8279 | 3386 | 0.9990 | 0.5649 | 0.9990 | 0.9995 |
| 0.0609 | 5.8313 | 3388 | 1.0483 | 0.5357 | 1.0483 | 1.0238 |
| 0.0609 | 5.8348 | 3390 | 1.0251 | 0.5357 | 1.0251 | 1.0125 |
| 0.0609 | 5.8382 | 3392 | 0.9613 | 0.5424 | 0.9613 | 0.9805 |
| 0.0609 | 5.8417 | 3394 | 0.8924 | 0.55 | 0.8924 | 0.9446 |
| 0.0609 | 5.8451 | 3396 | 0.8425 | 0.5402 | 0.8425 | 0.9179 |
| 0.0609 | 5.8485 | 3398 | 0.8055 | 0.5402 | 0.8055 | 0.8975 |
| 0.0609 | 5.8520 | 3400 | 0.8103 | 0.5402 | 0.8103 | 0.9001 |
| 0.0609 | 5.8554 | 3402 | 0.8335 | 0.5402 | 0.8335 | 0.9130 |
| 0.0609 | 5.8589 | 3404 | 0.9008 | 0.5733 | 0.9008 | 0.9491 |
| 0.0609 | 5.8623 | 3406 | 0.9455 | 0.5448 | 0.9455 | 0.9723 |
| 0.0609 | 5.8657 | 3408 | 0.9538 | 0.5448 | 0.9538 | 0.9766 |
| 0.0609 | 5.8692 | 3410 | 0.9572 | 0.5448 | 0.9572 | 0.9784 |
| 0.0609 | 5.8726 | 3412 | 0.9087 | 0.55 | 0.9087 | 0.9532 |
| 0.0609 | 5.8761 | 3414 | 0.8582 | 0.5402 | 0.8582 | 0.9264 |
| 0.0609 | 5.8795 | 3416 | 0.8286 | 0.5402 | 0.8286 | 0.9103 |
| 0.0609 | 5.8830 | 3418 | 0.8472 | 0.5402 | 0.8472 | 0.9204 |
| 0.0609 | 5.8864 | 3420 | 0.9088 | 0.5650 | 0.9088 | 0.9533 |
| 0.0609 | 5.8898 | 3422 | 0.9700 | 0.5488 | 0.9700 | 0.9849 |
| 0.0609 | 5.8933 | 3424 | 1.0138 | 0.5357 | 1.0138 | 1.0069 |
| 0.0609 | 5.8967 | 3426 | 1.0122 | 0.5357 | 1.0122 | 1.0061 |
| 0.0609 | 5.9002 | 3428 | 0.9542 | 0.5270 | 0.9542 | 0.9768 |
| 0.0609 | 5.9036 | 3430 | 0.8754 | 0.5650 | 0.8754 | 0.9356 |
| 0.0609 | 5.9071 | 3432 | 0.8125 | 0.5833 | 0.8125 | 0.9014 |
| 0.0609 | 5.9105 | 3434 | 0.7968 | 0.5833 | 0.7968 | 0.8926 |
| 0.0609 | 5.9139 | 3436 | 0.8234 | 0.5833 | 0.8234 | 0.9074 |
| 0.0609 | 5.9174 | 3438 | 0.8625 | 0.5402 | 0.8625 | 0.9287 |
| 0.0609 | 5.9208 | 3440 | 0.9314 | 0.5473 | 0.9314 | 0.9651 |
| 0.0609 | 5.9243 | 3442 | 0.9664 | 0.5424 | 0.9664 | 0.9830 |
| 0.0609 | 5.9277 | 3444 | 0.9430 | 0.5326 | 0.9430 | 0.9711 |
| 0.0609 | 5.9312 | 3446 | 0.8805 | 0.5248 | 0.8805 | 0.9384 |
| 0.0609 | 5.9346 | 3448 | 0.8518 | 0.5402 | 0.8518 | 0.9229 |
| 0.0609 | 5.9380 | 3450 | 0.8299 | 0.5833 | 0.8299 | 0.9110 |
| 0.0609 | 5.9415 | 3452 | 0.8213 | 0.5985 | 0.8213 | 0.9063 |
| 0.0609 | 5.9449 | 3454 | 0.8533 | 0.5909 | 0.8533 | 0.9237 |
| 0.0609 | 5.9484 | 3456 | 0.9065 | 0.5563 | 0.9065 | 0.9521 |
| 0.0609 | 5.9518 | 3458 | 0.9136 | 0.5563 | 0.9136 | 0.9558 |
| 0.0609 | 5.9552 | 3460 | 0.8816 | 0.55 | 0.8816 | 0.9389 |
| 0.0609 | 5.9587 | 3462 | 0.8870 | 0.5227 | 0.8870 | 0.9418 |
| 0.0609 | 5.9621 | 3464 | 0.9148 | 0.5473 | 0.9148 | 0.9565 |
| 0.0609 | 5.9656 | 3466 | 0.9590 | 0.5473 | 0.9590 | 0.9793 |
| 0.0609 | 5.9690 | 3468 | 0.9710 | 0.5708 | 0.9710 | 0.9854 |
| 0.0609 | 5.9725 | 3470 | 0.9399 | 0.5473 | 0.9399 | 0.9695 |
| 0.0609 | 5.9759 | 3472 | 0.8702 | 0.5227 | 0.8702 | 0.9329 |
| 0.0609 | 5.9793 | 3474 | 0.8385 | 0.5402 | 0.8385 | 0.9157 |
| 0.0609 | 5.9828 | 3476 | 0.8540 | 0.5378 | 0.8540 | 0.9241 |
| 0.0609 | 5.9862 | 3478 | 0.8824 | 0.5331 | 0.8824 | 0.9394 |
| 0.0609 | 5.9897 | 3480 | 0.8613 | 0.5874 | 0.8613 | 0.9281 |
| 0.0609 | 5.9931 | 3482 | 0.8511 | 0.5874 | 0.8511 | 0.9226 |
| 0.0609 | 5.9966 | 3484 | 0.8553 | 0.5874 | 0.8553 | 0.9248 |
| 0.0609 | 6.0 | 3486 | 0.8894 | 0.5331 | 0.8894 | 0.9431 |
| 0.0609 | 6.0034 | 3488 | 0.8960 | 0.5227 | 0.8960 | 0.9466 |
| 0.0609 | 6.0069 | 3490 | 0.9109 | 0.5563 | 0.9109 | 0.9544 |
| 0.0609 | 6.0103 | 3492 | 0.9499 | 0.5785 | 0.9499 | 0.9746 |
| 0.0609 | 6.0138 | 3494 | 0.9583 | 0.5785 | 0.9583 | 0.9789 |
| 0.0609 | 6.0172 | 3496 | 0.9379 | 0.5473 | 0.9379 | 0.9684 |
| 0.0609 | 6.0207 | 3498 | 0.9252 | 0.5473 | 0.9252 | 0.9619 |
| 0.0557 | 6.0241 | 3500 | 0.9060 | 0.5473 | 0.9060 | 0.9519 |
| 0.0557 | 6.0275 | 3502 | 0.8553 | 0.5227 | 0.8553 | 0.9248 |
| 0.0557 | 6.0310 | 3504 | 0.8287 | 0.5527 | 0.8287 | 0.9103 |
| 0.0557 | 6.0344 | 3506 | 0.8342 | 0.5527 | 0.8342 | 0.9133 |
| 0.0557 | 6.0379 | 3508 | 0.8823 | 0.5563 | 0.8823 | 0.9393 |
| 0.0557 | 6.0413 | 3510 | 0.9159 | 0.5649 | 0.9159 | 0.9570 |
| 0.0557 | 6.0448 | 3512 | 0.9373 | 0.5649 | 0.9373 | 0.9681 |
| 0.0557 | 6.0482 | 3514 | 0.9248 | 0.5649 | 0.9248 | 0.9616 |
| 0.0557 | 6.0516 | 3516 | 0.9034 | 0.5424 | 0.9034 | 0.9505 |
| 0.0557 | 6.0551 | 3518 | 0.8822 | 0.5188 | 0.8822 | 0.9392 |
| 0.0557 | 6.0585 | 3520 | 0.8803 | 0.5188 | 0.8803 | 0.9382 |
| 0.0557 | 6.0620 | 3522 | 0.9283 | 0.5424 | 0.9283 | 0.9635 |
| 0.0557 | 6.0654 | 3524 | 0.9980 | 0.5135 | 0.9980 | 0.9990 |
| 0.0557 | 6.0688 | 3526 | 1.0095 | 0.5135 | 1.0095 | 1.0047 |
| 0.0557 | 6.0723 | 3528 | 0.9669 | 0.5649 | 0.9669 | 0.9833 |
| 0.0557 | 6.0757 | 3530 | 0.8929 | 0.5188 | 0.8929 | 0.9449 |
| 0.0557 | 6.0792 | 3532 | 0.8234 | 0.5833 | 0.8234 | 0.9074 |
| 0.0557 | 6.0826 | 3534 | 0.8089 | 0.5985 | 0.8089 | 0.8994 |
| 0.0557 | 6.0861 | 3536 | 0.8270 | 0.5833 | 0.8270 | 0.9094 |
| 0.0557 | 6.0895 | 3538 | 0.8418 | 0.5798 | 0.8418 | 0.9175 |
| 0.0557 | 6.0929 | 3540 | 0.8753 | 0.5874 | 0.8753 | 0.9356 |
| 0.0557 | 6.0964 | 3542 | 0.9198 | 0.5817 | 0.9198 | 0.9590 |
| 0.0557 | 6.0998 | 3544 | 0.9756 | 0.5424 | 0.9756 | 0.9877 |
| 0.0557 | 6.1033 | 3546 | 0.9787 | 0.5424 | 0.9787 | 0.9893 |
| 0.0557 | 6.1067 | 3548 | 0.9408 | 0.5424 | 0.9408 | 0.9699 |
| 0.0557 | 6.1102 | 3550 | 0.8974 | 0.5885 | 0.8974 | 0.9473 |
| 0.0557 | 6.1136 | 3552 | 0.8977 | 0.5885 | 0.8977 | 0.9475 |
| 0.0557 | 6.1170 | 3554 | 0.9179 | 0.5473 | 0.9179 | 0.9581 |
| 0.0557 | 6.1205 | 3556 | 0.9633 | 0.5424 | 0.9633 | 0.9815 |
| 0.0557 | 6.1239 | 3558 | 0.9626 | 0.5424 | 0.9626 | 0.9811 |
| 0.0557 | 6.1274 | 3560 | 0.9321 | 0.5563 | 0.9321 | 0.9654 |
| 0.0557 | 6.1308 | 3562 | 0.8969 | 0.5955 | 0.8969 | 0.9471 |
| 0.0557 | 6.1343 | 3564 | 0.8869 | 0.5955 | 0.8869 | 0.9417 |
| 0.0557 | 6.1377 | 3566 | 0.9248 | 0.5563 | 0.9248 | 0.9617 |
| 0.0557 | 6.1411 | 3568 | 0.9359 | 0.5563 | 0.9359 | 0.9674 |
| 0.0557 | 6.1446 | 3570 | 0.9244 | 0.5563 | 0.9244 | 0.9615 |
| 0.0557 | 6.1480 | 3572 | 0.9243 | 0.5563 | 0.9243 | 0.9614 |
| 0.0557 | 6.1515 | 3574 | 0.9173 | 0.55 | 0.9173 | 0.9578 |
| 0.0557 | 6.1549 | 3576 | 0.9668 | 0.5326 | 0.9668 | 0.9833 |
| 0.0557 | 6.1583 | 3578 | 1.0350 | 0.5135 | 1.0350 | 1.0173 |
| 0.0557 | 6.1618 | 3580 | 1.0441 | 0.5135 | 1.0441 | 1.0218 |
| 0.0557 | 6.1652 | 3582 | 0.9966 | 0.5424 | 0.9966 | 0.9983 |
| 0.0557 | 6.1687 | 3584 | 0.9926 | 0.5424 | 0.9926 | 0.9963 |
| 0.0557 | 6.1721 | 3586 | 0.9826 | 0.5424 | 0.9826 | 0.9912 |
| 0.0557 | 6.1756 | 3588 | 0.9566 | 0.5326 | 0.9566 | 0.9781 |
| 0.0557 | 6.1790 | 3590 | 0.9589 | 0.5326 | 0.9589 | 0.9792 |
| 0.0557 | 6.1824 | 3592 | 0.9835 | 0.4903 | 0.9835 | 0.9917 |
| 0.0557 | 6.1859 | 3594 | 1.0045 | 0.5135 | 1.0045 | 1.0023 |
| 0.0557 | 6.1893 | 3596 | 0.9744 | 0.5041 | 0.9744 | 0.9871 |
| 0.0557 | 6.1928 | 3598 | 0.9301 | 0.5885 | 0.9301 | 0.9644 |
| 0.0557 | 6.1962 | 3600 | 0.8732 | 0.5681 | 0.8732 | 0.9345 |
| 0.0557 | 6.1997 | 3602 | 0.8583 | 0.5681 | 0.8583 | 0.9264 |
| 0.0557 | 6.2031 | 3604 | 0.8919 | 0.5885 | 0.8919 | 0.9444 |
| 0.0557 | 6.2065 | 3606 | 0.9243 | 0.5885 | 0.9243 | 0.9614 |
| 0.0557 | 6.2100 | 3608 | 0.9867 | 0.5169 | 0.9867 | 0.9933 |
| 0.0557 | 6.2134 | 3610 | 1.0320 | 0.5104 | 1.0320 | 1.0159 |
| 0.0557 | 6.2169 | 3612 | 1.0250 | 0.5104 | 1.0250 | 1.0124 |
| 0.0557 | 6.2203 | 3614 | 0.9839 | 0.4903 | 0.9839 | 0.9919 |
| 0.0557 | 6.2238 | 3616 | 0.9452 | 0.5708 | 0.9452 | 0.9722 |
| 0.0557 | 6.2272 | 3618 | 0.9248 | 0.5885 | 0.9248 | 0.9616 |
| 0.0557 | 6.2306 | 3620 | 0.8901 | 0.5921 | 0.8901 | 0.9434 |
| 0.0557 | 6.2341 | 3622 | 0.8500 | 0.5833 | 0.8500 | 0.9220 |
| 0.0557 | 6.2375 | 3624 | 0.8499 | 0.5833 | 0.8499 | 0.9219 |
| 0.0557 | 6.2410 | 3626 | 0.8887 | 0.5681 | 0.8887 | 0.9427 |
| 0.0557 | 6.2444 | 3628 | 0.9331 | 0.5921 | 0.9331 | 0.9660 |
| 0.0557 | 6.2478 | 3630 | 0.9755 | 0.5188 | 0.9755 | 0.9877 |
| 0.0557 | 6.2513 | 3632 | 1.0378 | 0.4765 | 1.0378 | 1.0187 |
| 0.0557 | 6.2547 | 3634 | 1.0543 | 0.5 | 1.0543 | 1.0268 |
| 0.0557 | 6.2582 | 3636 | 1.0220 | 0.5283 | 1.0220 | 1.0110 |
| 0.0557 | 6.2616 | 3638 | 0.9809 | 0.5207 | 0.9809 | 0.9904 |
| 0.0557 | 6.2651 | 3640 | 0.9409 | 0.55 | 0.9409 | 0.9700 |
| 0.0557 | 6.2685 | 3642 | 0.8779 | 0.5681 | 0.8779 | 0.9370 |
| 0.0557 | 6.2719 | 3644 | 0.8309 | 0.5833 | 0.8309 | 0.9115 |
| 0.0557 | 6.2754 | 3646 | 0.8099 | 0.5833 | 0.8099 | 0.9000 |
| 0.0557 | 6.2788 | 3648 | 0.8024 | 0.5708 | 0.8024 | 0.8958 |
| 0.0557 | 6.2823 | 3650 | 0.7733 | 0.5708 | 0.7733 | 0.8794 |
| 0.0557 | 6.2857 | 3652 | 0.7587 | 0.5708 | 0.7587 | 0.8710 |
| 0.0557 | 6.2892 | 3654 | 0.7703 | 0.5789 | 0.7703 | 0.8777 |
| 0.0557 | 6.2926 | 3656 | 0.8219 | 0.5909 | 0.8219 | 0.9066 |
| 0.0557 | 6.2960 | 3658 | 0.8945 | 0.5874 | 0.8945 | 0.9458 |
| 0.0557 | 6.2995 | 3660 | 0.9527 | 0.5512 | 0.9527 | 0.9761 |
| 0.0557 | 6.3029 | 3662 | 0.9722 | 0.5382 | 0.9722 | 0.9860 |
| 0.0557 | 6.3064 | 3664 | 0.9391 | 0.5673 | 0.9391 | 0.9691 |
| 0.0557 | 6.3098 | 3666 | 0.9104 | 0.6029 | 0.9104 | 0.9541 |
| 0.0557 | 6.3133 | 3668 | 0.8809 | 0.5798 | 0.8809 | 0.9386 |
| 0.0557 | 6.3167 | 3670 | 0.8744 | 0.5798 | 0.8744 | 0.9351 |
| 0.0557 | 6.3201 | 3672 | 0.9028 | 0.5650 | 0.9028 | 0.9502 |
| 0.0557 | 6.3236 | 3674 | 0.9464 | 0.5227 | 0.9464 | 0.9728 |
| 0.0557 | 6.3270 | 3676 | 1.0060 | 0.4903 | 1.0060 | 1.0030 |
| 0.0557 | 6.3305 | 3678 | 1.0851 | 0.5104 | 1.0851 | 1.0417 |
| 0.0557 | 6.3339 | 3680 | 1.1271 | 0.4860 | 1.1271 | 1.0616 |
| 0.0557 | 6.3373 | 3682 | 1.1265 | 0.4860 | 1.1265 | 1.0614 |
| 0.0557 | 6.3408 | 3684 | 1.1122 | 0.4860 | 1.1122 | 1.0546 |
| 0.0557 | 6.3442 | 3686 | 1.0907 | 0.4881 | 1.0907 | 1.0444 |
| 0.0557 | 6.3477 | 3688 | 1.0214 | 0.5013 | 1.0214 | 1.0106 |
| 0.0557 | 6.3511 | 3690 | 0.9741 | 0.5382 | 0.9741 | 0.9870 |
| 0.0557 | 6.3546 | 3692 | 0.9252 | 0.5354 | 0.9252 | 0.9619 |
| 0.0557 | 6.3580 | 3694 | 0.9005 | 0.5378 | 0.9005 | 0.9489 |
| 0.0557 | 6.3614 | 3696 | 0.8768 | 0.5765 | 0.8768 | 0.9364 |
| 0.0557 | 6.3649 | 3698 | 0.8800 | 0.5765 | 0.8800 | 0.9381 |
| 0.0557 | 6.3683 | 3700 | 0.8908 | 0.5733 | 0.8908 | 0.9438 |
| 0.0557 | 6.3718 | 3702 | 0.9232 | 0.4931 | 0.9232 | 0.9608 |
| 0.0557 | 6.3752 | 3704 | 1.0095 | 0.5382 | 1.0095 | 1.0047 |
| 0.0557 | 6.3787 | 3706 | 1.0668 | 0.5233 | 1.0668 | 1.0328 |
| 0.0557 | 6.3821 | 3708 | 1.0820 | 0.5104 | 1.0820 | 1.0402 |
| 0.0557 | 6.3855 | 3710 | 1.0560 | 0.5104 | 1.0560 | 1.0276 |
| 0.0557 | 6.3890 | 3712 | 1.0100 | 0.4784 | 1.0100 | 1.0050 |
| 0.0557 | 6.3924 | 3714 | 0.9857 | 0.4784 | 0.9857 | 0.9928 |
| 0.0557 | 6.3959 | 3716 | 0.9673 | 0.5733 | 0.9673 | 0.9835 |
| 0.0557 | 6.3993 | 3718 | 0.9814 | 0.5161 | 0.9814 | 0.9907 |
| 0.0557 | 6.4028 | 3720 | 1.0192 | 0.4784 | 1.0192 | 1.0096 |
| 0.0557 | 6.4062 | 3722 | 1.0060 | 0.5161 | 1.0060 | 1.0030 |
| 0.0557 | 6.4096 | 3724 | 0.9849 | 0.5161 | 0.9849 | 0.9924 |
| 0.0557 | 6.4131 | 3726 | 0.9529 | 0.5733 | 0.9529 | 0.9762 |
| 0.0557 | 6.4165 | 3728 | 0.9250 | 0.5733 | 0.9250 | 0.9618 |
| 0.0557 | 6.4200 | 3730 | 0.9232 | 0.5733 | 0.9232 | 0.9608 |
| 0.0557 | 6.4234 | 3732 | 0.9203 | 0.5733 | 0.9203 | 0.9593 |
| 0.0557 | 6.4269 | 3734 | 0.9477 | 0.5733 | 0.9477 | 0.9735 |
| 0.0557 | 6.4303 | 3736 | 0.9695 | 0.5161 | 0.9695 | 0.9846 |
| 0.0557 | 6.4337 | 3738 | 0.9913 | 0.4784 | 0.9913 | 0.9956 |
| 0.0557 | 6.4372 | 3740 | 1.0158 | 0.4784 | 1.0158 | 1.0079 |
| 0.0557 | 6.4406 | 3742 | 1.0430 | 0.4649 | 1.0430 | 1.0213 |
| 0.0557 | 6.4441 | 3744 | 1.0495 | 0.4649 | 1.0495 | 1.0244 |
| 0.0557 | 6.4475 | 3746 | 1.0655 | 0.4881 | 1.0655 | 1.0322 |
| 0.0557 | 6.4509 | 3748 | 1.0727 | 0.4881 | 1.0727 | 1.0357 |
| 0.0557 | 6.4544 | 3750 | 1.0670 | 0.4881 | 1.0670 | 1.0329 |
| 0.0557 | 6.4578 | 3752 | 1.0584 | 0.4881 | 1.0584 | 1.0288 |
| 0.0557 | 6.4613 | 3754 | 1.0375 | 0.4881 | 1.0375 | 1.0186 |
| 0.0557 | 6.4647 | 3756 | 1.0425 | 0.4881 | 1.0425 | 1.0210 |
| 0.0557 | 6.4682 | 3758 | 1.0204 | 0.4881 | 1.0204 | 1.0101 |
| 0.0557 | 6.4716 | 3760 | 1.0145 | 0.4649 | 1.0145 | 1.0072 |
| 0.0557 | 6.4750 | 3762 | 0.9944 | 0.4408 | 0.9944 | 0.9972 |
| 0.0557 | 6.4785 | 3764 | 0.9736 | 0.4408 | 0.9736 | 0.9867 |
| 0.0557 | 6.4819 | 3766 | 0.9758 | 0.4408 | 0.9758 | 0.9878 |
| 0.0557 | 6.4854 | 3768 | 0.9533 | 0.4955 | 0.9533 | 0.9764 |
| 0.0557 | 6.4888 | 3770 | 0.9317 | 0.5378 | 0.9317 | 0.9652 |
| 0.0557 | 6.4923 | 3772 | 0.9204 | 0.5378 | 0.9204 | 0.9594 |
| 0.0557 | 6.4957 | 3774 | 0.9382 | 0.5378 | 0.9382 | 0.9686 |
| 0.0557 | 6.4991 | 3776 | 0.9524 | 0.5619 | 0.9524 | 0.9759 |
| 0.0557 | 6.5026 | 3778 | 0.9585 | 0.5179 | 0.9585 | 0.9790 |
| 0.0557 | 6.5060 | 3780 | 0.9641 | 0.5179 | 0.9641 | 0.9819 |
| 0.0557 | 6.5095 | 3782 | 0.9401 | 0.5179 | 0.9401 | 0.9696 |
| 0.0557 | 6.5129 | 3784 | 0.9319 | 0.5619 | 0.9319 | 0.9653 |
| 0.0557 | 6.5164 | 3786 | 0.9065 | 0.5619 | 0.9065 | 0.9521 |
| 0.0557 | 6.5198 | 3788 | 0.9016 | 0.5619 | 0.9016 | 0.9495 |
| 0.0557 | 6.5232 | 3790 | 0.8911 | 0.5921 | 0.8911 | 0.9440 |
| 0.0557 | 6.5267 | 3792 | 0.8956 | 0.5619 | 0.8956 | 0.9464 |
| 0.0557 | 6.5301 | 3794 | 0.8741 | 0.5921 | 0.8741 | 0.9349 |
| 0.0557 | 6.5336 | 3796 | 0.8450 | 0.5681 | 0.8450 | 0.9192 |
| 0.0557 | 6.5370 | 3798 | 0.8514 | 0.5681 | 0.8514 | 0.9227 |
| 0.0557 | 6.5404 | 3800 | 0.8904 | 0.5702 | 0.8904 | 0.9436 |
| 0.0557 | 6.5439 | 3802 | 0.9553 | 0.5382 | 0.9553 | 0.9774 |
| 0.0557 | 6.5473 | 3804 | 1.0309 | 0.5013 | 1.0309 | 1.0153 |
| 0.0557 | 6.5508 | 3806 | 1.0819 | 0.4881 | 1.0819 | 1.0402 |
| 0.0557 | 6.5542 | 3808 | 1.0833 | 0.4881 | 1.0833 | 1.0408 |
| 0.0557 | 6.5577 | 3810 | 1.0734 | 0.4881 | 1.0734 | 1.0360 |
| 0.0557 | 6.5611 | 3812 | 1.0638 | 0.4625 | 1.0638 | 1.0314 |
| 0.0557 | 6.5645 | 3814 | 1.0508 | 0.4625 | 1.0508 | 1.0251 |
| 0.0557 | 6.5680 | 3816 | 1.0262 | 0.4765 | 1.0262 | 1.0130 |
| 0.0557 | 6.5714 | 3818 | 0.9863 | 0.4518 | 0.9863 | 0.9931 |
| 0.0557 | 6.5749 | 3820 | 0.9624 | 0.4408 | 0.9624 | 0.9810 |
| 0.0557 | 6.5783 | 3822 | 0.9523 | 0.4813 | 0.9523 | 0.9759 |
| 0.0557 | 6.5818 | 3824 | 0.9711 | 0.4784 | 0.9711 | 0.9854 |
| 0.0557 | 6.5852 | 3826 | 0.9953 | 0.5013 | 0.9953 | 0.9976 |
| 0.0557 | 6.5886 | 3828 | 0.9830 | 0.5382 | 0.9830 | 0.9914 |
| 0.0557 | 6.5921 | 3830 | 0.9926 | 0.5382 | 0.9926 | 0.9963 |
| 0.0557 | 6.5955 | 3832 | 1.0089 | 0.5013 | 1.0089 | 1.0044 |
| 0.0557 | 6.5990 | 3834 | 0.9763 | 0.5382 | 0.9763 | 0.9881 |
| 0.0557 | 6.6024 | 3836 | 0.9176 | 0.4944 | 0.9176 | 0.9579 |
| 0.0557 | 6.6059 | 3838 | 0.8791 | 0.5946 | 0.8791 | 0.9376 |
| 0.0557 | 6.6093 | 3840 | 0.8859 | 0.5100 | 0.8859 | 0.9412 |
| 0.0557 | 6.6127 | 3842 | 0.9145 | 0.4823 | 0.9145 | 0.9563 |
| 0.0557 | 6.6162 | 3844 | 0.9531 | 0.4823 | 0.9531 | 0.9763 |
| 0.0557 | 6.6196 | 3846 | 1.0141 | 0.4903 | 1.0141 | 1.0070 |
| 0.0557 | 6.6231 | 3848 | 1.0354 | 0.4649 | 1.0354 | 1.0176 |
| 0.0557 | 6.6265 | 3850 | 1.0056 | 0.4915 | 1.0056 | 1.0028 |
| 0.0557 | 6.6299 | 3852 | 0.9371 | 0.5100 | 0.9371 | 0.9680 |
| 0.0557 | 6.6334 | 3854 | 0.8644 | 0.5946 | 0.8644 | 0.9297 |
| 0.0557 | 6.6368 | 3856 | 0.8038 | 0.6093 | 0.8038 | 0.8965 |
| 0.0557 | 6.6403 | 3858 | 0.7866 | 0.6093 | 0.7866 | 0.8869 |
| 0.0557 | 6.6437 | 3860 | 0.7907 | 0.6093 | 0.7907 | 0.8892 |
| 0.0557 | 6.6472 | 3862 | 0.8099 | 0.6093 | 0.8099 | 0.9000 |
| 0.0557 | 6.6506 | 3864 | 0.8582 | 0.5946 | 0.8582 | 0.9264 |
| 0.0557 | 6.6540 | 3866 | 0.9407 | 0.4698 | 0.9407 | 0.9699 |
| 0.0557 | 6.6575 | 3868 | 1.0351 | 0.5013 | 1.0351 | 1.0174 |
| 0.0557 | 6.6609 | 3870 | 1.1166 | 0.4532 | 1.1166 | 1.0567 |
| 0.0557 | 6.6644 | 3872 | 1.1760 | 0.4627 | 1.1760 | 1.0844 |
| 0.0557 | 6.6678 | 3874 | 1.1666 | 0.4627 | 1.1666 | 1.0801 |
| 0.0557 | 6.6713 | 3876 | 1.1141 | 0.4532 | 1.1141 | 1.0555 |
| 0.0557 | 6.6747 | 3878 | 1.0180 | 0.5027 | 1.0180 | 1.0090 |
| 0.0557 | 6.6781 | 3880 | 0.9050 | 0.4823 | 0.9050 | 0.9513 |
| 0.0557 | 6.6816 | 3882 | 0.8286 | 0.6093 | 0.8286 | 0.9103 |
| 0.0557 | 6.6850 | 3884 | 0.7997 | 0.6093 | 0.7997 | 0.8942 |
| 0.0557 | 6.6885 | 3886 | 0.7985 | 0.6093 | 0.7985 | 0.8936 |
| 0.0557 | 6.6919 | 3888 | 0.8192 | 0.6093 | 0.8192 | 0.9051 |
| 0.0557 | 6.6954 | 3890 | 0.8532 | 0.6093 | 0.8532 | 0.9237 |
| 0.0557 | 6.6988 | 3892 | 0.8928 | 0.5474 | 0.8928 | 0.9449 |
| 0.0557 | 6.7022 | 3894 | 0.9202 | 0.4549 | 0.9202 | 0.9593 |
| 0.0557 | 6.7057 | 3896 | 0.9663 | 0.5013 | 0.9663 | 0.9830 |
| 0.0557 | 6.7091 | 3898 | 0.9846 | 0.5013 | 0.9846 | 0.9923 |
| 0.0557 | 6.7126 | 3900 | 0.9678 | 0.5013 | 0.9678 | 0.9837 |
| 0.0557 | 6.7160 | 3902 | 0.9513 | 0.5013 | 0.9513 | 0.9754 |
| 0.0557 | 6.7194 | 3904 | 0.9482 | 0.5257 | 0.9482 | 0.9738 |
| 0.0557 | 6.7229 | 3906 | 0.9324 | 0.5257 | 0.9324 | 0.9656 |
| 0.0557 | 6.7263 | 3908 | 0.9015 | 0.5235 | 0.9015 | 0.9495 |
| 0.0557 | 6.7298 | 3910 | 0.9155 | 0.4955 | 0.9155 | 0.9568 |
| 0.0557 | 6.7332 | 3912 | 0.9432 | 0.4408 | 0.9432 | 0.9712 |
| 0.0557 | 6.7367 | 3914 | 0.9656 | 0.4903 | 0.9656 | 0.9826 |
| 0.0557 | 6.7401 | 3916 | 1.0030 | 0.4903 | 1.0030 | 1.0015 |
| 0.0557 | 6.7435 | 3918 | 1.0040 | 0.5013 | 1.0040 | 1.0020 |
| 0.0557 | 6.7470 | 3920 | 0.9744 | 0.5013 | 0.9744 | 0.9871 |
| 0.0557 | 6.7504 | 3922 | 0.9660 | 0.4784 | 0.9660 | 0.9829 |
| 0.0557 | 6.7539 | 3924 | 0.9354 | 0.5056 | 0.9354 | 0.9671 |
| 0.0557 | 6.7573 | 3926 | 0.9315 | 0.4941 | 0.9315 | 0.9652 |
| 0.0557 | 6.7608 | 3928 | 0.9697 | 0.4784 | 0.9697 | 0.9847 |
| 0.0557 | 6.7642 | 3930 | 1.0179 | 0.5013 | 1.0179 | 1.0089 |
| 0.0557 | 6.7676 | 3932 | 1.0316 | 0.5013 | 1.0316 | 1.0157 |
| 0.0557 | 6.7711 | 3934 | 1.0051 | 0.5013 | 1.0051 | 1.0026 |
| 0.0557 | 6.7745 | 3936 | 0.9810 | 0.4903 | 0.9810 | 0.9905 |
| 0.0557 | 6.7780 | 3938 | 0.9428 | 0.5473 | 0.9428 | 0.9710 |
| 0.0557 | 6.7814 | 3940 | 0.9191 | 0.5227 | 0.9191 | 0.9587 |
| 0.0557 | 6.7849 | 3942 | 0.9169 | 0.5227 | 0.9169 | 0.9575 |
| 0.0557 | 6.7883 | 3944 | 0.9151 | 0.5588 | 0.9151 | 0.9566 |
| 0.0557 | 6.7917 | 3946 | 0.9330 | 0.5257 | 0.9330 | 0.9659 |
| 0.0557 | 6.7952 | 3948 | 0.9539 | 0.5257 | 0.9539 | 0.9767 |
| 0.0557 | 6.7986 | 3950 | 0.9421 | 0.5257 | 0.9421 | 0.9706 |
| 0.0557 | 6.8021 | 3952 | 0.9214 | 0.5257 | 0.9214 | 0.9599 |
| 0.0557 | 6.8055 | 3954 | 0.8948 | 0.5449 | 0.8948 | 0.9459 |
| 0.0557 | 6.8090 | 3956 | 0.9014 | 0.5449 | 0.9014 | 0.9494 |
| 0.0557 | 6.8124 | 3958 | 0.9364 | 0.4931 | 0.9364 | 0.9677 |
| 0.0557 | 6.8158 | 3960 | 1.0000 | 0.4903 | 1.0000 | 1.0000 |
| 0.0557 | 6.8193 | 3962 | 1.0954 | 0.4860 | 1.0954 | 1.0466 |
| 0.0557 | 6.8227 | 3964 | 1.1805 | 0.4860 | 1.1805 | 1.0865 |
| 0.0557 | 6.8262 | 3966 | 1.2246 | 0.4627 | 1.2246 | 1.1066 |
| 0.0557 | 6.8296 | 3968 | 1.2038 | 0.4860 | 1.2038 | 1.0972 |
| 0.0557 | 6.8330 | 3970 | 1.1439 | 0.4748 | 1.1439 | 1.0695 |
| 0.0557 | 6.8365 | 3972 | 1.0827 | 0.4765 | 1.0827 | 1.0405 |
| 0.0557 | 6.8399 | 3974 | 1.0005 | 0.4903 | 1.0005 | 1.0003 |
| 0.0557 | 6.8434 | 3976 | 0.9085 | 0.5207 | 0.9085 | 0.9531 |
| 0.0557 | 6.8468 | 3978 | 0.8518 | 0.5946 | 0.8518 | 0.9229 |
| 0.0557 | 6.8503 | 3980 | 0.8270 | 0.5946 | 0.8270 | 0.9094 |
| 0.0557 | 6.8537 | 3982 | 0.8362 | 0.5946 | 0.8362 | 0.9145 |
| 0.0557 | 6.8571 | 3984 | 0.8836 | 0.5874 | 0.8836 | 0.9400 |
| 0.0557 | 6.8606 | 3986 | 0.9462 | 0.5276 | 0.9462 | 0.9727 |
| 0.0557 | 6.8640 | 3988 | 1.0287 | 0.5013 | 1.0287 | 1.0142 |
| 0.0557 | 6.8675 | 3990 | 1.0636 | 0.5013 | 1.0636 | 1.0313 |
| 0.0557 | 6.8709 | 3992 | 1.0654 | 0.4881 | 1.0654 | 1.0322 |
| 0.0557 | 6.8744 | 3994 | 1.0513 | 0.4881 | 1.0513 | 1.0253 |
| 0.0557 | 6.8778 | 3996 | 1.0321 | 0.4915 | 1.0321 | 1.0159 |
| 0.0557 | 6.8812 | 3998 | 1.0174 | 0.4915 | 1.0174 | 1.0087 |
| 0.0478 | 6.8847 | 4000 | 0.9877 | 0.5207 | 0.9877 | 0.9938 |
| 0.0478 | 6.8881 | 4002 | 0.9677 | 0.5207 | 0.9677 | 0.9837 |
| 0.0478 | 6.8916 | 4004 | 0.9592 | 0.5207 | 0.9592 | 0.9794 |
| 0.0478 | 6.8950 | 4006 | 0.9288 | 0.4955 | 0.9288 | 0.9638 |
| 0.0478 | 6.8985 | 4008 | 0.8993 | 0.5248 | 0.8993 | 0.9483 |
| 0.0478 | 6.9019 | 4010 | 0.8971 | 0.5248 | 0.8971 | 0.9471 |
| 0.0478 | 6.9053 | 4012 | 0.8829 | 0.5248 | 0.8829 | 0.9396 |
| 0.0478 | 6.9088 | 4014 | 0.8896 | 0.5248 | 0.8896 | 0.9432 |
| 0.0478 | 6.9122 | 4016 | 0.9182 | 0.5248 | 0.9182 | 0.9582 |
| 0.0478 | 6.9157 | 4018 | 0.9835 | 0.5289 | 0.9835 | 0.9917 |
| 0.0478 | 6.9191 | 4020 | 1.0516 | 0.5013 | 1.0516 | 1.0255 |
| 0.0478 | 6.9225 | 4022 | 1.0763 | 0.5013 | 1.0763 | 1.0375 |
| 0.0478 | 6.9260 | 4024 | 1.0692 | 0.5013 | 1.0692 | 1.0340 |
| 0.0478 | 6.9294 | 4026 | 1.0131 | 0.5013 | 1.0131 | 1.0065 |
| 0.0478 | 6.9329 | 4028 | 0.9762 | 0.4803 | 0.9762 | 0.9880 |
| 0.0478 | 6.9363 | 4030 | 0.9857 | 0.4803 | 0.9857 | 0.9928 |
| 0.0478 | 6.9398 | 4032 | 0.9800 | 0.4803 | 0.9800 | 0.9899 |
| 0.0478 | 6.9432 | 4034 | 0.9718 | 0.4803 | 0.9718 | 0.9858 |
| 0.0478 | 6.9466 | 4036 | 0.9533 | 0.5331 | 0.9533 | 0.9764 |
| 0.0478 | 6.9501 | 4038 | 0.9576 | 0.5331 | 0.9576 | 0.9786 |
| 0.0478 | 6.9535 | 4040 | 0.9993 | 0.4545 | 0.9993 | 0.9997 |
| 0.0478 | 6.9570 | 4042 | 1.0451 | 0.4784 | 1.0451 | 1.0223 |
| 0.0478 | 6.9604 | 4044 | 1.0398 | 0.4784 | 1.0398 | 1.0197 |
| 0.0478 | 6.9639 | 4046 | 1.0324 | 0.4784 | 1.0324 | 1.0161 |
| 0.0478 | 6.9673 | 4048 | 1.0066 | 0.4545 | 1.0066 | 1.0033 |
| 0.0478 | 6.9707 | 4050 | 0.9628 | 0.5248 | 0.9628 | 0.9812 |
| 0.0478 | 6.9742 | 4052 | 0.9093 | 0.5248 | 0.9093 | 0.9536 |
| 0.0478 | 6.9776 | 4054 | 0.8904 | 0.5248 | 0.8904 | 0.9436 |
| 0.0478 | 6.9811 | 4056 | 0.8940 | 0.5765 | 0.8940 | 0.9455 |
| 0.0478 | 6.9845 | 4058 | 0.9212 | 0.5765 | 0.9212 | 0.9598 |
| 0.0478 | 6.9880 | 4060 | 0.9303 | 0.5765 | 0.9303 | 0.9645 |
| 0.0478 | 6.9914 | 4062 | 0.9355 | 0.5474 | 0.9355 | 0.9672 |
| 0.0478 | 6.9948 | 4064 | 0.9643 | 0.5072 | 0.9643 | 0.9820 |
| 0.0478 | 6.9983 | 4066 | 0.9662 | 0.5072 | 0.9662 | 0.9830 |
| 0.0478 | 7.0017 | 4068 | 0.9597 | 0.5072 | 0.9597 | 0.9797 |
| 0.0478 | 7.0052 | 4070 | 0.9232 | 0.5072 | 0.9232 | 0.9608 |
| 0.0478 | 7.0086 | 4072 | 0.8916 | 0.5681 | 0.8916 | 0.9442 |
| 0.0478 | 7.0120 | 4074 | 0.8924 | 0.5681 | 0.8924 | 0.9447 |
| 0.0478 | 7.0155 | 4076 | 0.9228 | 0.4955 | 0.9228 | 0.9606 |
| 0.0478 | 7.0189 | 4078 | 0.9505 | 0.5072 | 0.9505 | 0.9749 |
| 0.0478 | 7.0224 | 4080 | 0.9545 | 0.5072 | 0.9545 | 0.9770 |
| 0.0478 | 7.0258 | 4082 | 0.9571 | 0.5072 | 0.9571 | 0.9783 |
| 0.0478 | 7.0293 | 4084 | 1.0011 | 0.4545 | 1.0011 | 1.0006 |
| 0.0478 | 7.0327 | 4086 | 1.0421 | 0.5013 | 1.0421 | 1.0208 |
| 0.0478 | 7.0361 | 4088 | 1.0904 | 0.4881 | 1.0904 | 1.0442 |
| 0.0478 | 7.0396 | 4090 | 1.0863 | 0.4881 | 1.0863 | 1.0423 |
| 0.0478 | 7.0430 | 4092 | 1.0395 | 0.5013 | 1.0395 | 1.0196 |
| 0.0478 | 7.0465 | 4094 | 0.9686 | 0.4941 | 0.9686 | 0.9842 |
| 0.0478 | 7.0499 | 4096 | 0.9205 | 0.5248 | 0.9205 | 0.9595 |
| 0.0478 | 7.0534 | 4098 | 0.8867 | 0.5527 | 0.8867 | 0.9417 |
| 0.0478 | 7.0568 | 4100 | 0.8796 | 0.5946 | 0.8796 | 0.9379 |
| 0.0478 | 7.0602 | 4102 | 0.8921 | 0.5527 | 0.8921 | 0.9445 |
| 0.0478 | 7.0637 | 4104 | 0.9297 | 0.5227 | 0.9297 | 0.9642 |
| 0.0478 | 7.0671 | 4106 | 0.9977 | 0.4784 | 0.9977 | 0.9989 |
| 0.0478 | 7.0706 | 4108 | 1.0693 | 0.4881 | 1.0693 | 1.0340 |
| 0.0478 | 7.0740 | 4110 | 1.1159 | 0.4881 | 1.1159 | 1.0563 |
| 0.0478 | 7.0775 | 4112 | 1.1308 | 0.4881 | 1.1308 | 1.0634 |
| 0.0478 | 7.0809 | 4114 | 1.1011 | 0.4881 | 1.1011 | 1.0493 |
| 0.0478 | 7.0843 | 4116 | 1.0541 | 0.5013 | 1.0541 | 1.0267 |
| 0.0478 | 7.0878 | 4118 | 0.9950 | 0.4784 | 0.9950 | 0.9975 |
| 0.0478 | 7.0912 | 4120 | 0.9341 | 0.5650 | 0.9341 | 0.9665 |
| 0.0478 | 7.0947 | 4122 | 0.8997 | 0.5650 | 0.8997 | 0.9485 |
| 0.0478 | 7.0981 | 4124 | 0.8892 | 0.5650 | 0.8892 | 0.9430 |
| 0.0478 | 7.1015 | 4126 | 0.8979 | 0.5650 | 0.8979 | 0.9476 |
| 0.0478 | 7.1050 | 4128 | 0.9154 | 0.5650 | 0.9154 | 0.9568 |
| 0.0478 | 7.1084 | 4130 | 0.9018 | 0.5650 | 0.9018 | 0.9496 |
| 0.0478 | 7.1119 | 4132 | 0.8784 | 0.5650 | 0.8784 | 0.9372 |
| 0.0478 | 7.1153 | 4134 | 0.8829 | 0.5733 | 0.8829 | 0.9396 |
| 0.0478 | 7.1188 | 4136 | 0.8905 | 0.5733 | 0.8905 | 0.9437 |
| 0.0478 | 7.1222 | 4138 | 0.9114 | 0.5197 | 0.9114 | 0.9547 |
| 0.0478 | 7.1256 | 4140 | 0.9286 | 0.5197 | 0.9286 | 0.9636 |
| 0.0478 | 7.1291 | 4142 | 0.9547 | 0.5426 | 0.9547 | 0.9771 |
| 0.0478 | 7.1325 | 4144 | 0.9883 | 0.5161 | 0.9883 | 0.9941 |
| 0.0478 | 7.1360 | 4146 | 1.0055 | 0.4784 | 1.0055 | 1.0027 |
| 0.0478 | 7.1394 | 4148 | 1.0396 | 0.5013 | 1.0396 | 1.0196 |
| 0.0478 | 7.1429 | 4150 | 1.0507 | 0.5013 | 1.0507 | 1.0250 |
| 0.0478 | 7.1463 | 4152 | 1.0667 | 0.5341 | 1.0667 | 1.0328 |
| 0.0478 | 7.1497 | 4154 | 1.0402 | 0.5013 | 1.0402 | 1.0199 |
| 0.0478 | 7.1532 | 4156 | 0.9780 | 0.4784 | 0.9780 | 0.9890 |
| 0.0478 | 7.1566 | 4158 | 0.9315 | 0.5197 | 0.9315 | 0.9651 |
| 0.0478 | 7.1601 | 4160 | 0.8740 | 0.5798 | 0.8740 | 0.9349 |
| 0.0478 | 7.1635 | 4162 | 0.8215 | 0.6053 | 0.8215 | 0.9063 |
| 0.0478 | 7.1670 | 4164 | 0.8042 | 0.6053 | 0.8042 | 0.8968 |
| 0.0478 | 7.1704 | 4166 | 0.8099 | 0.6093 | 0.8099 | 0.8999 |
| 0.0478 | 7.1738 | 4168 | 0.8395 | 0.5798 | 0.8395 | 0.9162 |
| 0.0478 | 7.1773 | 4170 | 0.8888 | 0.5650 | 0.8888 | 0.9428 |
| 0.0478 | 7.1807 | 4172 | 0.9230 | 0.5227 | 0.9230 | 0.9607 |
| 0.0478 | 7.1842 | 4174 | 0.9669 | 0.5188 | 0.9669 | 0.9833 |
| 0.0478 | 7.1876 | 4176 | 1.0055 | 0.4903 | 1.0055 | 1.0028 |
| 0.0478 | 7.1910 | 4178 | 1.0078 | 0.4903 | 1.0078 | 1.0039 |
| 0.0478 | 7.1945 | 4180 | 0.9906 | 0.4661 | 0.9906 | 0.9953 |
| 0.0478 | 7.1979 | 4182 | 0.9593 | 0.5188 | 0.9593 | 0.9794 |
| 0.0478 | 7.2014 | 4184 | 0.9526 | 0.4941 | 0.9526 | 0.9760 |
| 0.0478 | 7.2048 | 4186 | 0.9584 | 0.4661 | 0.9584 | 0.9790 |
| 0.0478 | 7.2083 | 4188 | 0.9669 | 0.4661 | 0.9669 | 0.9833 |
| 0.0478 | 7.2117 | 4190 | 0.9477 | 0.4813 | 0.9477 | 0.9735 |
| 0.0478 | 7.2151 | 4192 | 0.9064 | 0.5650 | 0.9064 | 0.9520 |
| 0.0478 | 7.2186 | 4194 | 0.8648 | 0.5798 | 0.8648 | 0.9300 |
| 0.0478 | 7.2220 | 4196 | 0.8505 | 0.5833 | 0.8505 | 0.9223 |
| 0.0478 | 7.2255 | 4198 | 0.8507 | 0.5833 | 0.8507 | 0.9223 |
| 0.0478 | 7.2289 | 4200 | 0.8538 | 0.5833 | 0.8538 | 0.9240 |
| 0.0478 | 7.2324 | 4202 | 0.8726 | 0.5650 | 0.8726 | 0.9341 |
| 0.0478 | 7.2358 | 4204 | 0.9060 | 0.5650 | 0.9060 | 0.9518 |
| 0.0478 | 7.2392 | 4206 | 0.9582 | 0.4813 | 0.9582 | 0.9789 |
| 0.0478 | 7.2427 | 4208 | 1.0014 | 0.4903 | 1.0014 | 1.0007 |
| 0.0478 | 7.2461 | 4210 | 1.0053 | 0.4903 | 1.0053 | 1.0026 |
| 0.0478 | 7.2496 | 4212 | 0.9985 | 0.4903 | 0.9985 | 0.9993 |
| 0.0478 | 7.2530 | 4214 | 0.9798 | 0.4408 | 0.9798 | 0.9899 |
| 0.0478 | 7.2565 | 4216 | 0.9573 | 0.4408 | 0.9573 | 0.9784 |
| 0.0478 | 7.2599 | 4218 | 0.9524 | 0.4408 | 0.9524 | 0.9759 |
| 0.0478 | 7.2633 | 4220 | 0.9598 | 0.4408 | 0.9598 | 0.9797 |
| 0.0478 | 7.2668 | 4222 | 0.9569 | 0.4408 | 0.9569 | 0.9782 |
| 0.0478 | 7.2702 | 4224 | 0.9658 | 0.4408 | 0.9658 | 0.9827 |
| 0.0478 | 7.2737 | 4226 | 0.9596 | 0.4813 | 0.9596 | 0.9796 |
| 0.0478 | 7.2771 | 4228 | 0.9468 | 0.4813 | 0.9468 | 0.9730 |
| 0.0478 | 7.2806 | 4230 | 0.9708 | 0.4813 | 0.9708 | 0.9853 |
| 0.0478 | 7.2840 | 4232 | 1.0143 | 0.5013 | 1.0143 | 1.0071 |
| 0.0478 | 7.2874 | 4234 | 1.0476 | 0.5013 | 1.0476 | 1.0235 |
| 0.0478 | 7.2909 | 4236 | 1.0525 | 0.5013 | 1.0525 | 1.0259 |
| 0.0478 | 7.2943 | 4238 | 1.0322 | 0.4903 | 1.0322 | 1.0160 |
| 0.0478 | 7.2978 | 4240 | 1.0101 | 0.4408 | 1.0101 | 1.0051 |
| 0.0478 | 7.3012 | 4242 | 1.0214 | 0.4903 | 1.0214 | 1.0106 |
| 0.0478 | 7.3046 | 4244 | 1.0169 | 0.4661 | 1.0169 | 1.0084 |
| 0.0478 | 7.3081 | 4246 | 0.9805 | 0.4408 | 0.9805 | 0.9902 |
| 0.0478 | 7.3115 | 4248 | 0.9373 | 0.4813 | 0.9373 | 0.9681 |
| 0.0478 | 7.3150 | 4250 | 0.9327 | 0.4813 | 0.9327 | 0.9658 |
| 0.0478 | 7.3184 | 4252 | 0.9285 | 0.4813 | 0.9285 | 0.9636 |
| 0.0478 | 7.3219 | 4254 | 0.9441 | 0.5056 | 0.9441 | 0.9716 |
| 0.0478 | 7.3253 | 4256 | 0.9638 | 0.4661 | 0.9638 | 0.9817 |
| 0.0478 | 7.3287 | 4258 | 0.9648 | 0.4661 | 0.9648 | 0.9823 |
| 0.0478 | 7.3322 | 4260 | 0.9529 | 0.4661 | 0.9529 | 0.9761 |
| 0.0478 | 7.3356 | 4262 | 0.9304 | 0.4661 | 0.9304 | 0.9646 |
| 0.0478 | 7.3391 | 4264 | 0.8965 | 0.5650 | 0.8965 | 0.9468 |
| 0.0478 | 7.3425 | 4266 | 0.8626 | 0.5833 | 0.8626 | 0.9288 |
| 0.0478 | 7.3460 | 4268 | 0.8645 | 0.5833 | 0.8645 | 0.9298 |
| 0.0478 | 7.3494 | 4270 | 0.8965 | 0.5650 | 0.8965 | 0.9469 |
| 0.0478 | 7.3528 | 4272 | 0.9470 | 0.5056 | 0.9470 | 0.9732 |
| 0.0478 | 7.3563 | 4274 | 0.9687 | 0.4661 | 0.9687 | 0.9842 |
| 0.0478 | 7.3597 | 4276 | 0.9536 | 0.5056 | 0.9536 | 0.9765 |
| 0.0478 | 7.3632 | 4278 | 0.9369 | 0.4813 | 0.9369 | 0.9679 |
| 0.0478 | 7.3666 | 4280 | 0.9350 | 0.4813 | 0.9350 | 0.9669 |
| 0.0478 | 7.3701 | 4282 | 0.9301 | 0.5650 | 0.9301 | 0.9644 |
| 0.0478 | 7.3735 | 4284 | 0.9401 | 0.5227 | 0.9401 | 0.9696 |
| 0.0478 | 7.3769 | 4286 | 0.9418 | 0.5227 | 0.9418 | 0.9705 |
| 0.0478 | 7.3804 | 4288 | 0.9585 | 0.4408 | 0.9585 | 0.9790 |
| 0.0478 | 7.3838 | 4290 | 0.9770 | 0.4408 | 0.9770 | 0.9884 |
| 0.0478 | 7.3873 | 4292 | 0.9914 | 0.4661 | 0.9914 | 0.9957 |
| 0.0478 | 7.3907 | 4294 | 0.9719 | 0.4408 | 0.9719 | 0.9859 |
| 0.0478 | 7.3941 | 4296 | 0.9283 | 0.5650 | 0.9283 | 0.9635 |
| 0.0478 | 7.3976 | 4298 | 0.8847 | 0.5681 | 0.8847 | 0.9406 |
| 0.0478 | 7.4010 | 4300 | 0.8545 | 0.5833 | 0.8545 | 0.9244 |
| 0.0478 | 7.4045 | 4302 | 0.8202 | 0.5833 | 0.8202 | 0.9056 |
| 0.0478 | 7.4079 | 4304 | 0.8145 | 0.5833 | 0.8145 | 0.9025 |
| 0.0478 | 7.4114 | 4306 | 0.8263 | 0.5833 | 0.8263 | 0.9090 |
| 0.0478 | 7.4148 | 4308 | 0.8558 | 0.5833 | 0.8558 | 0.9251 |
| 0.0478 | 7.4182 | 4310 | 0.9116 | 0.5650 | 0.9116 | 0.9548 |
| 0.0478 | 7.4217 | 4312 | 0.9904 | 0.4903 | 0.9904 | 0.9952 |
| 0.0478 | 7.4251 | 4314 | 1.0498 | 0.5013 | 1.0498 | 1.0246 |
| 0.0478 | 7.4286 | 4316 | 1.0775 | 0.5013 | 1.0775 | 1.0380 |
| 0.0478 | 7.4320 | 4318 | 1.0640 | 0.4903 | 1.0640 | 1.0315 |
| 0.0478 | 7.4355 | 4320 | 1.0581 | 0.4903 | 1.0581 | 1.0287 |
| 0.0478 | 7.4389 | 4322 | 1.0542 | 0.4903 | 1.0542 | 1.0267 |
| 0.0478 | 7.4423 | 4324 | 1.0317 | 0.4903 | 1.0317 | 1.0157 |
| 0.0478 | 7.4458 | 4326 | 0.9956 | 0.4903 | 0.9956 | 0.9978 |
| 0.0478 | 7.4492 | 4328 | 0.9479 | 0.5227 | 0.9479 | 0.9736 |
| 0.0478 | 7.4527 | 4330 | 0.9079 | 0.5650 | 0.9079 | 0.9528 |
| 0.0478 | 7.4561 | 4332 | 0.8813 | 0.5650 | 0.8813 | 0.9388 |
| 0.0478 | 7.4596 | 4334 | 0.8590 | 0.5681 | 0.8590 | 0.9268 |
| 0.0478 | 7.4630 | 4336 | 0.8577 | 0.5681 | 0.8577 | 0.9261 |
| 0.0478 | 7.4664 | 4338 | 0.8797 | 0.5650 | 0.8797 | 0.9379 |
| 0.0478 | 7.4699 | 4340 | 0.9272 | 0.5188 | 0.9272 | 0.9629 |
| 0.0478 | 7.4733 | 4342 | 0.9532 | 0.5188 | 0.9532 | 0.9763 |
| 0.0478 | 7.4768 | 4344 | 0.9525 | 0.5424 | 0.9525 | 0.9760 |
| 0.0478 | 7.4802 | 4346 | 0.9237 | 0.5188 | 0.9237 | 0.9611 |
| 0.0478 | 7.4836 | 4348 | 0.8740 | 0.5591 | 0.8740 | 0.9349 |
| 0.0478 | 7.4871 | 4350 | 0.8325 | 0.5833 | 0.8325 | 0.9124 |
| 0.0478 | 7.4905 | 4352 | 0.8234 | 0.5833 | 0.8234 | 0.9074 |
| 0.0478 | 7.4940 | 4354 | 0.8069 | 0.5833 | 0.8069 | 0.8983 |
| 0.0478 | 7.4974 | 4356 | 0.8080 | 0.5833 | 0.8080 | 0.8989 |
| 0.0478 | 7.5009 | 4358 | 0.8302 | 0.5833 | 0.8302 | 0.9111 |
| 0.0478 | 7.5043 | 4360 | 0.8612 | 0.5619 | 0.8612 | 0.9280 |
| 0.0478 | 7.5077 | 4362 | 0.8667 | 0.5619 | 0.8667 | 0.9310 |
| 0.0478 | 7.5112 | 4364 | 0.8650 | 0.5765 | 0.8650 | 0.9300 |
| 0.0478 | 7.5146 | 4366 | 0.8701 | 0.5733 | 0.8701 | 0.9328 |
| 0.0478 | 7.5181 | 4368 | 0.8520 | 0.5833 | 0.8520 | 0.9231 |
| 0.0478 | 7.5215 | 4370 | 0.8316 | 0.5833 | 0.8316 | 0.9119 |
| 0.0478 | 7.5250 | 4372 | 0.8249 | 0.5833 | 0.8249 | 0.9082 |
| 0.0478 | 7.5284 | 4374 | 0.8428 | 0.5833 | 0.8428 | 0.9180 |
| 0.0478 | 7.5318 | 4376 | 0.8685 | 0.55 | 0.8685 | 0.9319 |
| 0.0478 | 7.5353 | 4378 | 0.8759 | 0.55 | 0.8759 | 0.9359 |
| 0.0478 | 7.5387 | 4380 | 0.8671 | 0.5833 | 0.8671 | 0.9312 |
| 0.0478 | 7.5422 | 4382 | 0.8537 | 0.5833 | 0.8537 | 0.9240 |
| 0.0478 | 7.5456 | 4384 | 0.8492 | 0.5833 | 0.8492 | 0.9215 |
| 0.0478 | 7.5491 | 4386 | 0.8441 | 0.5833 | 0.8441 | 0.9188 |
| 0.0478 | 7.5525 | 4388 | 0.8377 | 0.5833 | 0.8377 | 0.9152 |
| 0.0478 | 7.5559 | 4390 | 0.8406 | 0.5833 | 0.8406 | 0.9168 |
| 0.0478 | 7.5594 | 4392 | 0.8594 | 0.5833 | 0.8594 | 0.9271 |
| 0.0478 | 7.5628 | 4394 | 0.8886 | 0.5227 | 0.8886 | 0.9426 |
| 0.0478 | 7.5663 | 4396 | 0.9316 | 0.4408 | 0.9316 | 0.9652 |
| 0.0478 | 7.5697 | 4398 | 0.9675 | 0.5013 | 0.9675 | 0.9836 |
| 0.0478 | 7.5731 | 4400 | 0.9967 | 0.5013 | 0.9967 | 0.9984 |
| 0.0478 | 7.5766 | 4402 | 0.9923 | 0.5013 | 0.9923 | 0.9961 |
| 0.0478 | 7.5800 | 4404 | 0.9626 | 0.5013 | 0.9626 | 0.9811 |
| 0.0478 | 7.5835 | 4406 | 0.9185 | 0.4408 | 0.9185 | 0.9584 |
| 0.0478 | 7.5869 | 4408 | 0.8974 | 0.5227 | 0.8974 | 0.9473 |
| 0.0478 | 7.5904 | 4410 | 0.8799 | 0.5378 | 0.8799 | 0.9380 |
| 0.0478 | 7.5938 | 4412 | 0.8765 | 0.5833 | 0.8765 | 0.9362 |
| 0.0478 | 7.5972 | 4414 | 0.8855 | 0.5833 | 0.8855 | 0.9410 |
| 0.0478 | 7.6007 | 4416 | 0.8876 | 0.5527 | 0.8876 | 0.9421 |
| 0.0478 | 7.6041 | 4418 | 0.9017 | 0.4941 | 0.9017 | 0.9496 |
| 0.0478 | 7.6076 | 4420 | 0.9125 | 0.4408 | 0.9125 | 0.9553 |
| 0.0478 | 7.6110 | 4422 | 0.9401 | 0.4661 | 0.9401 | 0.9696 |
| 0.0478 | 7.6145 | 4424 | 0.9449 | 0.4661 | 0.9449 | 0.9721 |
| 0.0478 | 7.6179 | 4426 | 0.9383 | 0.4661 | 0.9383 | 0.9687 |
| 0.0478 | 7.6213 | 4428 | 0.9019 | 0.4941 | 0.9019 | 0.9497 |
| 0.0478 | 7.6248 | 4430 | 0.8529 | 0.5833 | 0.8529 | 0.9235 |
| 0.0478 | 7.6282 | 4432 | 0.8325 | 0.5833 | 0.8325 | 0.9124 |
| 0.0478 | 7.6317 | 4434 | 0.8376 | 0.5833 | 0.8376 | 0.9152 |
| 0.0478 | 7.6351 | 4436 | 0.8561 | 0.5833 | 0.8561 | 0.9253 |
| 0.0478 | 7.6386 | 4438 | 0.8889 | 0.4955 | 0.8889 | 0.9428 |
| 0.0478 | 7.6420 | 4440 | 0.9390 | 0.4661 | 0.9390 | 0.9690 |
| 0.0478 | 7.6454 | 4442 | 0.9720 | 0.4661 | 0.9720 | 0.9859 |
| 0.0478 | 7.6489 | 4444 | 0.9761 | 0.4661 | 0.9761 | 0.9880 |
| 0.0478 | 7.6523 | 4446 | 0.9612 | 0.4661 | 0.9612 | 0.9804 |
| 0.0478 | 7.6558 | 4448 | 0.9280 | 0.5207 | 0.9280 | 0.9633 |
| 0.0478 | 7.6592 | 4450 | 0.8999 | 0.4955 | 0.8999 | 0.9486 |
| 0.0478 | 7.6627 | 4452 | 0.8638 | 0.5402 | 0.8638 | 0.9294 |
| 0.0478 | 7.6661 | 4454 | 0.8483 | 0.5833 | 0.8483 | 0.9210 |
| 0.0478 | 7.6695 | 4456 | 0.8531 | 0.5833 | 0.8531 | 0.9236 |
| 0.0478 | 7.6730 | 4458 | 0.8778 | 0.5248 | 0.8778 | 0.9369 |
| 0.0478 | 7.6764 | 4460 | 0.9169 | 0.4955 | 0.9169 | 0.9576 |
| 0.0478 | 7.6799 | 4462 | 0.9595 | 0.4661 | 0.9595 | 0.9795 |
| 0.0478 | 7.6833 | 4464 | 0.9673 | 0.4903 | 0.9673 | 0.9835 |
| 0.0478 | 7.6867 | 4466 | 0.9576 | 0.4903 | 0.9576 | 0.9786 |
| 0.0478 | 7.6902 | 4468 | 0.9349 | 0.4661 | 0.9349 | 0.9669 |
| 0.0478 | 7.6936 | 4470 | 0.9093 | 0.4668 | 0.9093 | 0.9536 |
| 0.0478 | 7.6971 | 4472 | 0.9115 | 0.4668 | 0.9115 | 0.9547 |
| 0.0478 | 7.7005 | 4474 | 0.9372 | 0.4903 | 0.9372 | 0.9681 |
| 0.0478 | 7.7040 | 4476 | 0.9438 | 0.4903 | 0.9438 | 0.9715 |
| 0.0478 | 7.7074 | 4478 | 0.9489 | 0.4903 | 0.9489 | 0.9741 |
| 0.0478 | 7.7108 | 4480 | 0.9395 | 0.4903 | 0.9395 | 0.9693 |
| 0.0478 | 7.7143 | 4482 | 0.9159 | 0.4668 | 0.9159 | 0.9570 |
| 0.0478 | 7.7177 | 4484 | 0.8876 | 0.5765 | 0.8876 | 0.9421 |
| 0.0478 | 7.7212 | 4486 | 0.8723 | 0.5765 | 0.8723 | 0.9340 |
| 0.0478 | 7.7246 | 4488 | 0.8676 | 0.5765 | 0.8676 | 0.9314 |
| 0.0478 | 7.7281 | 4490 | 0.8795 | 0.5354 | 0.8795 | 0.9378 |
| 0.0478 | 7.7315 | 4492 | 0.9031 | 0.5207 | 0.9031 | 0.9503 |
| 0.0478 | 7.7349 | 4494 | 0.9098 | 0.5207 | 0.9098 | 0.9538 |
| 0.0478 | 7.7384 | 4496 | 0.9157 | 0.5207 | 0.9157 | 0.9569 |
| 0.0478 | 7.7418 | 4498 | 0.9159 | 0.5207 | 0.9159 | 0.9570 |
| 0.0432 | 7.7453 | 4500 | 0.9052 | 0.5207 | 0.9052 | 0.9514 |
| 0.0432 | 7.7487 | 4502 | 0.9137 | 0.5188 | 0.9137 | 0.9559 |
| 0.0432 | 7.7522 | 4504 | 0.9078 | 0.5188 | 0.9078 | 0.9528 |
| 0.0432 | 7.7556 | 4506 | 0.8947 | 0.5207 | 0.8947 | 0.9459 |
| 0.0432 | 7.7590 | 4508 | 0.8972 | 0.5188 | 0.8972 | 0.9472 |
| 0.0432 | 7.7625 | 4510 | 0.9105 | 0.5188 | 0.9105 | 0.9542 |
| 0.0432 | 7.7659 | 4512 | 0.9372 | 0.4661 | 0.9372 | 0.9681 |
| 0.0432 | 7.7694 | 4514 | 0.9450 | 0.4903 | 0.9450 | 0.9721 |
| 0.0432 | 7.7728 | 4516 | 0.9367 | 0.4903 | 0.9367 | 0.9678 |
| 0.0432 | 7.7762 | 4518 | 0.9311 | 0.4661 | 0.9311 | 0.9649 |
| 0.0432 | 7.7797 | 4520 | 0.9048 | 0.5188 | 0.9048 | 0.9512 |
| 0.0432 | 7.7831 | 4522 | 0.9036 | 0.5188 | 0.9036 | 0.9506 |
| 0.0432 | 7.7866 | 4524 | 0.9091 | 0.4661 | 0.9091 | 0.9534 |
| 0.0432 | 7.7900 | 4526 | 0.9082 | 0.5331 | 0.9082 | 0.9530 |
| 0.0432 | 7.7935 | 4528 | 0.9242 | 0.4661 | 0.9242 | 0.9614 |
| 0.0432 | 7.7969 | 4530 | 0.9511 | 0.4661 | 0.9511 | 0.9753 |
| 0.0432 | 7.8003 | 4532 | 0.9821 | 0.4765 | 0.9821 | 0.9910 |
| 0.0432 | 7.8038 | 4534 | 0.9829 | 0.4765 | 0.9829 | 0.9914 |
| 0.0432 | 7.8072 | 4536 | 0.9554 | 0.4903 | 0.9554 | 0.9775 |
| 0.0432 | 7.8107 | 4538 | 0.9243 | 0.4661 | 0.9243 | 0.9614 |
| 0.0432 | 7.8141 | 4540 | 0.8869 | 0.5588 | 0.8869 | 0.9417 |
| 0.0432 | 7.8176 | 4542 | 0.8646 | 0.6053 | 0.8646 | 0.9299 |
| 0.0432 | 7.8210 | 4544 | 0.8711 | 0.5978 | 0.8711 | 0.9333 |
| 0.0432 | 7.8244 | 4546 | 0.9045 | 0.5068 | 0.9045 | 0.9511 |
| 0.0432 | 7.8279 | 4548 | 0.9343 | 0.4661 | 0.9343 | 0.9666 |
| 0.0432 | 7.8313 | 4550 | 0.9390 | 0.4931 | 0.9390 | 0.9690 |
| 0.0432 | 7.8348 | 4552 | 0.9455 | 0.4903 | 0.9455 | 0.9724 |
| 0.0432 | 7.8382 | 4554 | 0.9306 | 0.5068 | 0.9306 | 0.9647 |
| 0.0432 | 7.8417 | 4556 | 0.8953 | 0.5451 | 0.8953 | 0.9462 |
| 0.0432 | 7.8451 | 4558 | 0.8684 | 0.6053 | 0.8684 | 0.9319 |
| 0.0432 | 7.8485 | 4560 | 0.8527 | 0.6053 | 0.8527 | 0.9234 |
| 0.0432 | 7.8520 | 4562 | 0.8431 | 0.6093 | 0.8431 | 0.9182 |
| 0.0432 | 7.8554 | 4564 | 0.8518 | 0.6093 | 0.8518 | 0.9229 |
| 0.0432 | 7.8589 | 4566 | 0.8636 | 0.6053 | 0.8636 | 0.9293 |
| 0.0432 | 7.8623 | 4568 | 0.8868 | 0.55 | 0.8868 | 0.9417 |
| 0.0432 | 7.8657 | 4570 | 0.9170 | 0.5068 | 0.9170 | 0.9576 |
| 0.0432 | 7.8692 | 4572 | 0.9411 | 0.5068 | 0.9411 | 0.9701 |
| 0.0432 | 7.8726 | 4574 | 0.9623 | 0.5295 | 0.9623 | 0.9810 |
| 0.0432 | 7.8761 | 4576 | 0.9674 | 0.5295 | 0.9674 | 0.9836 |
| 0.0432 | 7.8795 | 4578 | 0.9490 | 0.5295 | 0.9490 | 0.9742 |
| 0.0432 | 7.8830 | 4580 | 0.9263 | 0.5451 | 0.9263 | 0.9624 |
| 0.0432 | 7.8864 | 4582 | 0.8946 | 0.5223 | 0.8946 | 0.9459 |
| 0.0432 | 7.8898 | 4584 | 0.8700 | 0.55 | 0.8700 | 0.9327 |
| 0.0432 | 7.8933 | 4586 | 0.8687 | 0.5526 | 0.8687 | 0.9320 |
| 0.0432 | 7.8967 | 4588 | 0.8720 | 0.5526 | 0.8720 | 0.9338 |
| 0.0432 | 7.9002 | 4590 | 0.8958 | 0.5223 | 0.8958 | 0.9465 |
| 0.0432 | 7.9036 | 4592 | 0.9382 | 0.5668 | 0.9382 | 0.9686 |
| 0.0432 | 7.9071 | 4594 | 0.9894 | 0.5295 | 0.9894 | 0.9947 |
| 0.0432 | 7.9105 | 4596 | 1.0223 | 0.5013 | 1.0223 | 1.0111 |
| 0.0432 | 7.9139 | 4598 | 1.0320 | 0.5013 | 1.0320 | 1.0159 |
| 0.0432 | 7.9174 | 4600 | 1.0257 | 0.4903 | 1.0257 | 1.0128 |
| 0.0432 | 7.9208 | 4602 | 0.9996 | 0.4903 | 0.9996 | 0.9998 |
| 0.0432 | 7.9243 | 4604 | 0.9540 | 0.5295 | 0.9540 | 0.9767 |
| 0.0432 | 7.9277 | 4606 | 0.9145 | 0.5243 | 0.9145 | 0.9563 |
| 0.0432 | 7.9312 | 4608 | 0.8949 | 0.5526 | 0.8949 | 0.9460 |
| 0.0432 | 7.9346 | 4610 | 0.8896 | 0.5526 | 0.8896 | 0.9432 |
| 0.0432 | 7.9380 | 4612 | 0.8901 | 0.5526 | 0.8901 | 0.9435 |
| 0.0432 | 7.9415 | 4614 | 0.8976 | 0.5526 | 0.8976 | 0.9474 |
| 0.0432 | 7.9449 | 4616 | 0.9057 | 0.5526 | 0.9057 | 0.9517 |
| 0.0432 | 7.9484 | 4618 | 0.9299 | 0.5451 | 0.9299 | 0.9643 |
| 0.0432 | 7.9518 | 4620 | 0.9714 | 0.5668 | 0.9714 | 0.9856 |
| 0.0432 | 7.9552 | 4622 | 1.0187 | 0.5013 | 1.0187 | 1.0093 |
| 0.0432 | 7.9587 | 4624 | 1.0476 | 0.5013 | 1.0476 | 1.0235 |
| 0.0432 | 7.9621 | 4626 | 1.0387 | 0.5257 | 1.0387 | 1.0192 |
| 0.0432 | 7.9656 | 4628 | 1.0022 | 0.5385 | 1.0022 | 1.0011 |
| 0.0432 | 7.9690 | 4630 | 0.9476 | 0.5668 | 0.9476 | 0.9734 |
| 0.0432 | 7.9725 | 4632 | 0.9182 | 0.5726 | 0.9182 | 0.9583 |
| 0.0432 | 7.9759 | 4634 | 0.8807 | 0.55 | 0.8807 | 0.9385 |
| 0.0432 | 7.9793 | 4636 | 0.8475 | 0.55 | 0.8475 | 0.9206 |
| 0.0432 | 7.9828 | 4638 | 0.8438 | 0.5526 | 0.8438 | 0.9186 |
| 0.0432 | 7.9862 | 4640 | 0.8588 | 0.55 | 0.8588 | 0.9267 |
| 0.0432 | 7.9897 | 4642 | 0.8712 | 0.55 | 0.8712 | 0.9334 |
| 0.0432 | 7.9931 | 4644 | 0.8999 | 0.55 | 0.8999 | 0.9486 |
| 0.0432 | 7.9966 | 4646 | 0.9454 | 0.5068 | 0.9454 | 0.9723 |
| 0.0432 | 8.0 | 4648 | 0.9732 | 0.5041 | 0.9732 | 0.9865 |
| 0.0432 | 8.0034 | 4650 | 0.9845 | 0.4903 | 0.9845 | 0.9922 |
| 0.0432 | 8.0069 | 4652 | 0.9855 | 0.4903 | 0.9855 | 0.9927 |
| 0.0432 | 8.0103 | 4654 | 0.9713 | 0.4661 | 0.9713 | 0.9855 |
| 0.0432 | 8.0138 | 4656 | 0.9567 | 0.5072 | 0.9567 | 0.9781 |
| 0.0432 | 8.0172 | 4658 | 0.9450 | 0.4823 | 0.9450 | 0.9721 |
| 0.0432 | 8.0207 | 4660 | 0.9472 | 0.4823 | 0.9472 | 0.9733 |
| 0.0432 | 8.0241 | 4662 | 0.9493 | 0.4823 | 0.9493 | 0.9743 |
| 0.0432 | 8.0275 | 4664 | 0.9391 | 0.4823 | 0.9391 | 0.9691 |
| 0.0432 | 8.0310 | 4666 | 0.9466 | 0.4823 | 0.9466 | 0.9729 |
| 0.0432 | 8.0344 | 4668 | 0.9446 | 0.4823 | 0.9446 | 0.9719 |
| 0.0432 | 8.0379 | 4670 | 0.9401 | 0.4823 | 0.9401 | 0.9696 |
| 0.0432 | 8.0413 | 4672 | 0.9189 | 0.4823 | 0.9189 | 0.9586 |
| 0.0432 | 8.0448 | 4674 | 0.8935 | 0.5235 | 0.8935 | 0.9453 |
| 0.0432 | 8.0482 | 4676 | 0.8886 | 0.55 | 0.8886 | 0.9426 |
| 0.0432 | 8.0516 | 4678 | 0.8928 | 0.55 | 0.8928 | 0.9449 |
| 0.0432 | 8.0551 | 4680 | 0.9031 | 0.55 | 0.9031 | 0.9503 |
| 0.0432 | 8.0585 | 4682 | 0.9175 | 0.55 | 0.9175 | 0.9579 |
| 0.0432 | 8.0620 | 4684 | 0.9272 | 0.5100 | 0.9272 | 0.9629 |
| 0.0432 | 8.0654 | 4686 | 0.9400 | 0.5100 | 0.9400 | 0.9695 |
| 0.0432 | 8.0688 | 4688 | 0.9557 | 0.5336 | 0.9557 | 0.9776 |
| 0.0432 | 8.0723 | 4690 | 0.9691 | 0.5161 | 0.9691 | 0.9844 |
| 0.0432 | 8.0757 | 4692 | 0.9701 | 0.5161 | 0.9701 | 0.9849 |
| 0.0432 | 8.0792 | 4694 | 0.9883 | 0.4903 | 0.9883 | 0.9941 |
| 0.0432 | 8.0826 | 4696 | 1.0091 | 0.4765 | 1.0091 | 1.0046 |
| 0.0432 | 8.0861 | 4698 | 1.0112 | 0.4765 | 1.0112 | 1.0056 |
| 0.0432 | 8.0895 | 4700 | 1.0111 | 0.4765 | 1.0111 | 1.0055 |
| 0.0432 | 8.0929 | 4702 | 1.0003 | 0.4765 | 1.0003 | 1.0001 |
| 0.0432 | 8.0964 | 4704 | 0.9822 | 0.4903 | 0.9822 | 0.9911 |
| 0.0432 | 8.0998 | 4706 | 0.9747 | 0.4931 | 0.9747 | 0.9873 |
| 0.0432 | 8.1033 | 4708 | 0.9614 | 0.4931 | 0.9614 | 0.9805 |
| 0.0432 | 8.1067 | 4710 | 0.9526 | 0.4931 | 0.9526 | 0.9760 |
| 0.0432 | 8.1102 | 4712 | 0.9449 | 0.5170 | 0.9449 | 0.9720 |
| 0.0432 | 8.1136 | 4714 | 0.9300 | 0.5535 | 0.9300 | 0.9643 |
| 0.0432 | 8.1170 | 4716 | 0.9133 | 0.5726 | 0.9133 | 0.9557 |
| 0.0432 | 8.1205 | 4718 | 0.9021 | 0.5726 | 0.9021 | 0.9498 |
| 0.0432 | 8.1239 | 4720 | 0.8962 | 0.5726 | 0.8962 | 0.9467 |
| 0.0432 | 8.1274 | 4722 | 0.8979 | 0.5726 | 0.8979 | 0.9476 |
| 0.0432 | 8.1308 | 4724 | 0.9043 | 0.5336 | 0.9043 | 0.9509 |
| 0.0432 | 8.1343 | 4726 | 0.9240 | 0.5336 | 0.9240 | 0.9612 |
| 0.0432 | 8.1377 | 4728 | 0.9509 | 0.4931 | 0.9509 | 0.9751 |
| 0.0432 | 8.1411 | 4730 | 0.9535 | 0.4931 | 0.9535 | 0.9765 |
| 0.0432 | 8.1446 | 4732 | 0.9333 | 0.5068 | 0.9333 | 0.9661 |
| 0.0432 | 8.1480 | 4734 | 0.8964 | 0.5336 | 0.8964 | 0.9468 |
| 0.0432 | 8.1515 | 4736 | 0.8750 | 0.6053 | 0.8750 | 0.9354 |
| 0.0432 | 8.1549 | 4738 | 0.8715 | 0.6053 | 0.8715 | 0.9335 |
| 0.0432 | 8.1583 | 4740 | 0.8721 | 0.6053 | 0.8721 | 0.9339 |
| 0.0432 | 8.1618 | 4742 | 0.8878 | 0.6053 | 0.8878 | 0.9422 |
| 0.0432 | 8.1652 | 4744 | 0.9206 | 0.5336 | 0.9206 | 0.9595 |
| 0.0432 | 8.1687 | 4746 | 0.9448 | 0.5068 | 0.9448 | 0.9720 |
| 0.0432 | 8.1721 | 4748 | 0.9546 | 0.5068 | 0.9546 | 0.9771 |
| 0.0432 | 8.1756 | 4750 | 0.9519 | 0.5068 | 0.9519 | 0.9757 |
| 0.0432 | 8.1790 | 4752 | 0.9400 | 0.5068 | 0.9400 | 0.9695 |
| 0.0432 | 8.1824 | 4754 | 0.9109 | 0.5336 | 0.9109 | 0.9544 |
| 0.0432 | 8.1859 | 4756 | 0.8881 | 0.55 | 0.8881 | 0.9424 |
| 0.0432 | 8.1893 | 4758 | 0.8753 | 0.6053 | 0.8753 | 0.9356 |
| 0.0432 | 8.1928 | 4760 | 0.8648 | 0.6053 | 0.8648 | 0.9299 |
| 0.0432 | 8.1962 | 4762 | 0.8674 | 0.6053 | 0.8674 | 0.9313 |
| 0.0432 | 8.1997 | 4764 | 0.8865 | 0.5100 | 0.8865 | 0.9415 |
| 0.0432 | 8.2031 | 4766 | 0.9120 | 0.5336 | 0.9120 | 0.9550 |
| 0.0432 | 8.2065 | 4768 | 0.9452 | 0.5336 | 0.9452 | 0.9722 |
| 0.0432 | 8.2100 | 4770 | 0.9839 | 0.5257 | 0.9839 | 0.9919 |
| 0.0432 | 8.2134 | 4772 | 0.9920 | 0.5129 | 0.9920 | 0.9960 |
| 0.0432 | 8.2169 | 4774 | 0.9800 | 0.5027 | 0.9800 | 0.9899 |
| 0.0432 | 8.2203 | 4776 | 0.9509 | 0.5562 | 0.9509 | 0.9752 |
| 0.0432 | 8.2238 | 4778 | 0.9140 | 0.5336 | 0.9140 | 0.9560 |
| 0.0432 | 8.2272 | 4780 | 0.9011 | 0.5336 | 0.9011 | 0.9493 |
| 0.0432 | 8.2306 | 4782 | 0.9042 | 0.5336 | 0.9042 | 0.9509 |
| 0.0432 | 8.2341 | 4784 | 0.9245 | 0.5336 | 0.9245 | 0.9615 |
| 0.0432 | 8.2375 | 4786 | 0.9430 | 0.5562 | 0.9430 | 0.9711 |
| 0.0432 | 8.2410 | 4788 | 0.9377 | 0.5562 | 0.9377 | 0.9683 |
| 0.0432 | 8.2444 | 4790 | 0.9244 | 0.5336 | 0.9244 | 0.9615 |
| 0.0432 | 8.2478 | 4792 | 0.9243 | 0.5336 | 0.9243 | 0.9614 |
| 0.0432 | 8.2513 | 4794 | 0.9209 | 0.5336 | 0.9209 | 0.9596 |
| 0.0432 | 8.2547 | 4796 | 0.9142 | 0.5336 | 0.9142 | 0.9562 |
| 0.0432 | 8.2582 | 4798 | 0.9103 | 0.5336 | 0.9103 | 0.9541 |
| 0.0432 | 8.2616 | 4800 | 0.9085 | 0.5336 | 0.9085 | 0.9532 |
| 0.0432 | 8.2651 | 4802 | 0.9091 | 0.5336 | 0.9091 | 0.9535 |
| 0.0432 | 8.2685 | 4804 | 0.8993 | 0.5726 | 0.8993 | 0.9483 |
| 0.0432 | 8.2719 | 4806 | 0.9045 | 0.5726 | 0.9045 | 0.9511 |
| 0.0432 | 8.2754 | 4808 | 0.9069 | 0.5336 | 0.9069 | 0.9523 |
| 0.0432 | 8.2788 | 4810 | 0.9024 | 0.5336 | 0.9024 | 0.9500 |
| 0.0432 | 8.2823 | 4812 | 0.9119 | 0.5336 | 0.9119 | 0.9549 |
| 0.0432 | 8.2857 | 4814 | 0.9359 | 0.5197 | 0.9359 | 0.9674 |
| 0.0432 | 8.2892 | 4816 | 0.9671 | 0.4903 | 0.9671 | 0.9834 |
| 0.0432 | 8.2926 | 4818 | 0.9961 | 0.4765 | 0.9961 | 0.9981 |
| 0.0432 | 8.2960 | 4820 | 1.0076 | 0.4765 | 1.0076 | 1.0038 |
| 0.0432 | 8.2995 | 4822 | 1.0042 | 0.4765 | 1.0042 | 1.0021 |
| 0.0432 | 8.3029 | 4824 | 1.0083 | 0.4765 | 1.0083 | 1.0042 |
| 0.0432 | 8.3064 | 4826 | 0.9954 | 0.4765 | 0.9954 | 0.9977 |
| 0.0432 | 8.3098 | 4828 | 0.9892 | 0.4903 | 0.9892 | 0.9946 |
| 0.0432 | 8.3133 | 4830 | 0.9900 | 0.4903 | 0.9900 | 0.9950 |
| 0.0432 | 8.3167 | 4832 | 0.9937 | 0.4903 | 0.9937 | 0.9968 |
| 0.0432 | 8.3201 | 4834 | 0.9741 | 0.5161 | 0.9741 | 0.9870 |
| 0.0432 | 8.3236 | 4836 | 0.9355 | 0.5562 | 0.9355 | 0.9672 |
| 0.0432 | 8.3270 | 4838 | 0.8893 | 0.55 | 0.8893 | 0.9430 |
| 0.0432 | 8.3305 | 4840 | 0.8643 | 0.6093 | 0.8643 | 0.9297 |
| 0.0432 | 8.3339 | 4842 | 0.8610 | 0.6093 | 0.8610 | 0.9279 |
| 0.0432 | 8.3373 | 4844 | 0.8730 | 0.6093 | 0.8730 | 0.9343 |
| 0.0432 | 8.3408 | 4846 | 0.8956 | 0.55 | 0.8956 | 0.9464 |
| 0.0432 | 8.3442 | 4848 | 0.9079 | 0.55 | 0.9079 | 0.9528 |
| 0.0432 | 8.3477 | 4850 | 0.9264 | 0.5562 | 0.9264 | 0.9625 |
| 0.0432 | 8.3511 | 4852 | 0.9293 | 0.5562 | 0.9293 | 0.9640 |
| 0.0432 | 8.3546 | 4854 | 0.9312 | 0.5426 | 0.9312 | 0.9650 |
| 0.0432 | 8.3580 | 4856 | 0.9395 | 0.5426 | 0.9395 | 0.9693 |
| 0.0432 | 8.3614 | 4858 | 0.9419 | 0.5161 | 0.9419 | 0.9705 |
| 0.0432 | 8.3649 | 4860 | 0.9555 | 0.5161 | 0.9555 | 0.9775 |
| 0.0432 | 8.3683 | 4862 | 0.9685 | 0.5161 | 0.9685 | 0.9841 |
| 0.0432 | 8.3718 | 4864 | 0.9754 | 0.4903 | 0.9754 | 0.9876 |
| 0.0432 | 8.3752 | 4866 | 0.9743 | 0.4903 | 0.9743 | 0.9870 |
| 0.0432 | 8.3787 | 4868 | 0.9714 | 0.4903 | 0.9714 | 0.9856 |
| 0.0432 | 8.3821 | 4870 | 0.9747 | 0.4903 | 0.9747 | 0.9873 |
| 0.0432 | 8.3855 | 4872 | 0.9701 | 0.4903 | 0.9701 | 0.9849 |
| 0.0432 | 8.3890 | 4874 | 0.9775 | 0.4903 | 0.9775 | 0.9887 |
| 0.0432 | 8.3924 | 4876 | 0.9916 | 0.4765 | 0.9916 | 0.9958 |
| 0.0432 | 8.3959 | 4878 | 1.0105 | 0.4765 | 1.0105 | 1.0053 |
| 0.0432 | 8.3993 | 4880 | 1.0132 | 0.4765 | 1.0132 | 1.0066 |
| 0.0432 | 8.4028 | 4882 | 0.9941 | 0.4765 | 0.9941 | 0.9970 |
| 0.0432 | 8.4062 | 4884 | 0.9829 | 0.4903 | 0.9829 | 0.9914 |
| 0.0432 | 8.4096 | 4886 | 0.9678 | 0.4903 | 0.9678 | 0.9837 |
| 0.0432 | 8.4131 | 4888 | 0.9627 | 0.4903 | 0.9627 | 0.9812 |
| 0.0432 | 8.4165 | 4890 | 0.9636 | 0.4903 | 0.9636 | 0.9816 |
| 0.0432 | 8.4200 | 4892 | 0.9801 | 0.4903 | 0.9801 | 0.9900 |
| 0.0432 | 8.4234 | 4894 | 1.0025 | 0.4903 | 1.0025 | 1.0013 |
| 0.0432 | 8.4269 | 4896 | 1.0020 | 0.4903 | 1.0020 | 1.0010 |
| 0.0432 | 8.4303 | 4898 | 0.9994 | 0.4903 | 0.9994 | 0.9997 |
| 0.0432 | 8.4337 | 4900 | 0.9788 | 0.4903 | 0.9788 | 0.9893 |
| 0.0432 | 8.4372 | 4902 | 0.9484 | 0.4928 | 0.9484 | 0.9739 |
| 0.0432 | 8.4406 | 4904 | 0.9250 | 0.5227 | 0.9250 | 0.9618 |
| 0.0432 | 8.4441 | 4906 | 0.9212 | 0.5227 | 0.9212 | 0.9598 |
| 0.0432 | 8.4475 | 4908 | 0.9345 | 0.5227 | 0.9345 | 0.9667 |
| 0.0432 | 8.4509 | 4910 | 0.9386 | 0.5227 | 0.9386 | 0.9688 |
| 0.0432 | 8.4544 | 4912 | 0.9555 | 0.4928 | 0.9555 | 0.9775 |
| 0.0432 | 8.4578 | 4914 | 0.9625 | 0.4903 | 0.9625 | 0.9810 |
| 0.0432 | 8.4613 | 4916 | 0.9711 | 0.4903 | 0.9711 | 0.9854 |
| 0.0432 | 8.4647 | 4918 | 0.9749 | 0.4903 | 0.9749 | 0.9874 |
| 0.0432 | 8.4682 | 4920 | 0.9738 | 0.4903 | 0.9738 | 0.9868 |
| 0.0432 | 8.4716 | 4922 | 0.9545 | 0.4903 | 0.9545 | 0.9770 |
| 0.0432 | 8.4750 | 4924 | 0.9241 | 0.5885 | 0.9241 | 0.9613 |
| 0.0432 | 8.4785 | 4926 | 0.8918 | 0.5650 | 0.8918 | 0.9444 |
| 0.0432 | 8.4819 | 4928 | 0.8754 | 0.5650 | 0.8754 | 0.9357 |
| 0.0432 | 8.4854 | 4930 | 0.8532 | 0.6093 | 0.8532 | 0.9237 |
| 0.0432 | 8.4888 | 4932 | 0.8362 | 0.6093 | 0.8362 | 0.9145 |
| 0.0432 | 8.4923 | 4934 | 0.8295 | 0.6093 | 0.8295 | 0.9108 |
| 0.0432 | 8.4957 | 4936 | 0.8254 | 0.6093 | 0.8254 | 0.9085 |
| 0.0432 | 8.4991 | 4938 | 0.8332 | 0.6093 | 0.8332 | 0.9128 |
| 0.0432 | 8.5026 | 4940 | 0.8511 | 0.6093 | 0.8511 | 0.9226 |
| 0.0432 | 8.5060 | 4942 | 0.8803 | 0.5909 | 0.8803 | 0.9382 |
| 0.0432 | 8.5095 | 4944 | 0.8950 | 0.6132 | 0.8950 | 0.9460 |
| 0.0432 | 8.5129 | 4946 | 0.9132 | 0.5588 | 0.9132 | 0.9556 |
| 0.0432 | 8.5164 | 4948 | 0.9351 | 0.5536 | 0.9351 | 0.9670 |
| 0.0432 | 8.5198 | 4950 | 0.9369 | 0.5536 | 0.9369 | 0.9679 |
| 0.0432 | 8.5232 | 4952 | 0.9238 | 0.5588 | 0.9238 | 0.9612 |
| 0.0432 | 8.5267 | 4954 | 0.9034 | 0.5588 | 0.9034 | 0.9505 |
| 0.0432 | 8.5301 | 4956 | 0.8738 | 0.5909 | 0.8738 | 0.9348 |
| 0.0432 | 8.5336 | 4958 | 0.8564 | 0.6053 | 0.8564 | 0.9254 |
| 0.0432 | 8.5370 | 4960 | 0.8590 | 0.6053 | 0.8590 | 0.9268 |
| 0.0432 | 8.5404 | 4962 | 0.8726 | 0.5909 | 0.8726 | 0.9341 |
| 0.0432 | 8.5439 | 4964 | 0.8926 | 0.5358 | 0.8926 | 0.9448 |
| 0.0432 | 8.5473 | 4966 | 0.9002 | 0.5588 | 0.9002 | 0.9488 |
| 0.0432 | 8.5508 | 4968 | 0.9019 | 0.5588 | 0.9019 | 0.9497 |
| 0.0432 | 8.5542 | 4970 | 0.9102 | 0.5588 | 0.9102 | 0.9541 |
| 0.0432 | 8.5577 | 4972 | 0.9236 | 0.5315 | 0.9236 | 0.9611 |
| 0.0432 | 8.5611 | 4974 | 0.9341 | 0.5161 | 0.9341 | 0.9665 |
| 0.0432 | 8.5645 | 4976 | 0.9366 | 0.5161 | 0.9366 | 0.9678 |
| 0.0432 | 8.5680 | 4978 | 0.9337 | 0.4931 | 0.9337 | 0.9663 |
| 0.0432 | 8.5714 | 4980 | 0.9194 | 0.5315 | 0.9194 | 0.9589 |
| 0.0432 | 8.5749 | 4982 | 0.9017 | 0.5588 | 0.9017 | 0.9496 |
| 0.0432 | 8.5783 | 4984 | 0.8866 | 0.5358 | 0.8866 | 0.9416 |
| 0.0432 | 8.5818 | 4986 | 0.8601 | 0.6053 | 0.8601 | 0.9274 |
| 0.0432 | 8.5852 | 4988 | 0.8495 | 0.6093 | 0.8495 | 0.9217 |
| 0.0432 | 8.5886 | 4990 | 0.8432 | 0.6093 | 0.8432 | 0.9182 |
| 0.0432 | 8.5921 | 4992 | 0.8287 | 0.6239 | 0.8287 | 0.9104 |
| 0.0432 | 8.5955 | 4994 | 0.8226 | 0.6239 | 0.8226 | 0.9070 |
| 0.0432 | 8.5990 | 4996 | 0.8259 | 0.6093 | 0.8259 | 0.9088 |
| 0.0432 | 8.6024 | 4998 | 0.8338 | 0.6093 | 0.8338 | 0.9131 |
| 0.0397 | 8.6059 | 5000 | 0.8552 | 0.5946 | 0.8552 | 0.9247 |
| 0.0397 | 8.6093 | 5002 | 0.8887 | 0.5909 | 0.8887 | 0.9427 |
| 0.0397 | 8.6127 | 5004 | 0.9104 | 0.5197 | 0.9104 | 0.9541 |
| 0.0397 | 8.6162 | 5006 | 0.9273 | 0.4661 | 0.9273 | 0.9630 |
| 0.0397 | 8.6196 | 5008 | 0.9256 | 0.4661 | 0.9256 | 0.9621 |
| 0.0397 | 8.6231 | 5010 | 0.9153 | 0.5473 | 0.9153 | 0.9567 |
| 0.0397 | 8.6265 | 5012 | 0.9021 | 0.5227 | 0.9021 | 0.9498 |
| 0.0397 | 8.6299 | 5014 | 0.8970 | 0.5227 | 0.8970 | 0.9471 |
| 0.0397 | 8.6334 | 5016 | 0.8805 | 0.5248 | 0.8805 | 0.9384 |
| 0.0397 | 8.6368 | 5018 | 0.8654 | 0.5946 | 0.8654 | 0.9303 |
| 0.0397 | 8.6403 | 5020 | 0.8486 | 0.5946 | 0.8486 | 0.9212 |
| 0.0397 | 8.6437 | 5022 | 0.8476 | 0.5946 | 0.8476 | 0.9207 |
| 0.0397 | 8.6472 | 5024 | 0.8599 | 0.5946 | 0.8599 | 0.9273 |
| 0.0397 | 8.6506 | 5026 | 0.8803 | 0.5248 | 0.8803 | 0.9382 |
| 0.0397 | 8.6540 | 5028 | 0.9117 | 0.5473 | 0.9117 | 0.9549 |
| 0.0397 | 8.6575 | 5030 | 0.9460 | 0.4661 | 0.9460 | 0.9726 |
| 0.0397 | 8.6609 | 5032 | 0.9580 | 0.4661 | 0.9580 | 0.9788 |
| 0.0397 | 8.6644 | 5034 | 0.9508 | 0.4661 | 0.9508 | 0.9751 |
| 0.0397 | 8.6678 | 5036 | 0.9330 | 0.4661 | 0.9330 | 0.9659 |
| 0.0397 | 8.6713 | 5038 | 0.9222 | 0.5473 | 0.9222 | 0.9603 |
| 0.0397 | 8.6747 | 5040 | 0.9078 | 0.5733 | 0.9078 | 0.9528 |
| 0.0397 | 8.6781 | 5042 | 0.8865 | 0.5946 | 0.8865 | 0.9415 |
| 0.0397 | 8.6816 | 5044 | 0.8574 | 0.5946 | 0.8574 | 0.9259 |
| 0.0397 | 8.6850 | 5046 | 0.8366 | 0.6093 | 0.8366 | 0.9146 |
| 0.0397 | 8.6885 | 5048 | 0.8257 | 0.6093 | 0.8257 | 0.9087 |
| 0.0397 | 8.6919 | 5050 | 0.8166 | 0.6093 | 0.8166 | 0.9037 |
| 0.0397 | 8.6954 | 5052 | 0.8200 | 0.6093 | 0.8200 | 0.9055 |
| 0.0397 | 8.6988 | 5054 | 0.8339 | 0.6093 | 0.8339 | 0.9132 |
| 0.0397 | 8.7022 | 5056 | 0.8572 | 0.6093 | 0.8572 | 0.9259 |
| 0.0397 | 8.7057 | 5058 | 0.8804 | 0.5946 | 0.8804 | 0.9383 |
| 0.0397 | 8.7091 | 5060 | 0.9040 | 0.6132 | 0.9040 | 0.9508 |
| 0.0397 | 8.7126 | 5062 | 0.9208 | 0.4931 | 0.9208 | 0.9596 |
| 0.0397 | 8.7160 | 5064 | 0.9249 | 0.4931 | 0.9249 | 0.9617 |
| 0.0397 | 8.7194 | 5066 | 0.9338 | 0.4661 | 0.9338 | 0.9663 |
| 0.0397 | 8.7229 | 5068 | 0.9286 | 0.4661 | 0.9286 | 0.9636 |
| 0.0397 | 8.7263 | 5070 | 0.9253 | 0.4661 | 0.9253 | 0.9619 |
| 0.0397 | 8.7298 | 5072 | 0.9157 | 0.5473 | 0.9157 | 0.9569 |
| 0.0397 | 8.7332 | 5074 | 0.9115 | 0.5227 | 0.9115 | 0.9547 |
| 0.0397 | 8.7367 | 5076 | 0.9082 | 0.5227 | 0.9082 | 0.9530 |
| 0.0397 | 8.7401 | 5078 | 0.9081 | 0.5227 | 0.9081 | 0.9529 |
| 0.0397 | 8.7435 | 5080 | 0.9008 | 0.5248 | 0.9008 | 0.9491 |
| 0.0397 | 8.7470 | 5082 | 0.8873 | 0.5527 | 0.8873 | 0.9420 |
| 0.0397 | 8.7504 | 5084 | 0.8742 | 0.5527 | 0.8742 | 0.9350 |
| 0.0397 | 8.7539 | 5086 | 0.8740 | 0.5527 | 0.8740 | 0.9349 |
| 0.0397 | 8.7573 | 5088 | 0.8890 | 0.5248 | 0.8890 | 0.9429 |
| 0.0397 | 8.7608 | 5090 | 0.9115 | 0.5227 | 0.9115 | 0.9547 |
| 0.0397 | 8.7642 | 5092 | 0.9219 | 0.5227 | 0.9219 | 0.9601 |
| 0.0397 | 8.7676 | 5094 | 0.9275 | 0.4675 | 0.9275 | 0.9631 |
| 0.0397 | 8.7711 | 5096 | 0.9375 | 0.4928 | 0.9375 | 0.9682 |
| 0.0397 | 8.7745 | 5098 | 0.9312 | 0.4675 | 0.9312 | 0.9650 |
| 0.0397 | 8.7780 | 5100 | 0.9227 | 0.4957 | 0.9227 | 0.9606 |
| 0.0397 | 8.7814 | 5102 | 0.9063 | 0.55 | 0.9063 | 0.9520 |
| 0.0397 | 8.7849 | 5104 | 0.8983 | 0.55 | 0.8983 | 0.9478 |
| 0.0397 | 8.7883 | 5106 | 0.8975 | 0.55 | 0.8975 | 0.9473 |
| 0.0397 | 8.7917 | 5108 | 0.8995 | 0.55 | 0.8995 | 0.9484 |
| 0.0397 | 8.7952 | 5110 | 0.8991 | 0.55 | 0.8991 | 0.9482 |
| 0.0397 | 8.7986 | 5112 | 0.9006 | 0.55 | 0.9006 | 0.9490 |
| 0.0397 | 8.8021 | 5114 | 0.8943 | 0.55 | 0.8943 | 0.9457 |
| 0.0397 | 8.8055 | 5116 | 0.8964 | 0.55 | 0.8964 | 0.9468 |
| 0.0397 | 8.8090 | 5118 | 0.9000 | 0.55 | 0.9000 | 0.9487 |
| 0.0397 | 8.8124 | 5120 | 0.8975 | 0.55 | 0.8975 | 0.9473 |
| 0.0397 | 8.8158 | 5122 | 0.9041 | 0.4957 | 0.9041 | 0.9508 |
| 0.0397 | 8.8193 | 5124 | 0.9139 | 0.4931 | 0.9139 | 0.9560 |
| 0.0397 | 8.8227 | 5126 | 0.9233 | 0.4931 | 0.9233 | 0.9609 |
| 0.0397 | 8.8262 | 5128 | 0.9235 | 0.4931 | 0.9235 | 0.9610 |
| 0.0397 | 8.8296 | 5130 | 0.9153 | 0.4931 | 0.9153 | 0.9567 |
| 0.0397 | 8.8330 | 5132 | 0.8985 | 0.55 | 0.8985 | 0.9479 |
| 0.0397 | 8.8365 | 5134 | 0.8776 | 0.5645 | 0.8776 | 0.9368 |
| 0.0397 | 8.8399 | 5136 | 0.8639 | 0.6053 | 0.8639 | 0.9295 |
| 0.0397 | 8.8434 | 5138 | 0.8620 | 0.6053 | 0.8620 | 0.9285 |
| 0.0397 | 8.8468 | 5140 | 0.8698 | 0.6053 | 0.8698 | 0.9326 |
| 0.0397 | 8.8503 | 5142 | 0.8782 | 0.5645 | 0.8782 | 0.9371 |
| 0.0397 | 8.8537 | 5144 | 0.8941 | 0.5227 | 0.8941 | 0.9456 |
| 0.0397 | 8.8571 | 5146 | 0.9180 | 0.4661 | 0.9180 | 0.9581 |
| 0.0397 | 8.8606 | 5148 | 0.9336 | 0.4661 | 0.9336 | 0.9662 |
| 0.0397 | 8.8640 | 5150 | 0.9400 | 0.4661 | 0.9400 | 0.9695 |
| 0.0397 | 8.8675 | 5152 | 0.9313 | 0.4661 | 0.9313 | 0.9650 |
| 0.0397 | 8.8709 | 5154 | 0.9101 | 0.5227 | 0.9101 | 0.9540 |
| 0.0397 | 8.8744 | 5156 | 0.8982 | 0.5227 | 0.8982 | 0.9478 |
| 0.0397 | 8.8778 | 5158 | 0.8828 | 0.5378 | 0.8828 | 0.9396 |
| 0.0397 | 8.8812 | 5160 | 0.8707 | 0.5378 | 0.8707 | 0.9331 |
| 0.0397 | 8.8847 | 5162 | 0.8542 | 0.6093 | 0.8542 | 0.9242 |
| 0.0397 | 8.8881 | 5164 | 0.8427 | 0.6093 | 0.8427 | 0.9180 |
| 0.0397 | 8.8916 | 5166 | 0.8315 | 0.6093 | 0.8315 | 0.9119 |
| 0.0397 | 8.8950 | 5168 | 0.8191 | 0.6093 | 0.8191 | 0.9051 |
| 0.0397 | 8.8985 | 5170 | 0.8147 | 0.6093 | 0.8147 | 0.9026 |
| 0.0397 | 8.9019 | 5172 | 0.8161 | 0.6093 | 0.8161 | 0.9034 |
| 0.0397 | 8.9053 | 5174 | 0.8105 | 0.6093 | 0.8105 | 0.9003 |
| 0.0397 | 8.9088 | 5176 | 0.8050 | 0.6239 | 0.8050 | 0.8972 |
| 0.0397 | 8.9122 | 5178 | 0.8077 | 0.6239 | 0.8077 | 0.8987 |
| 0.0397 | 8.9157 | 5180 | 0.8182 | 0.6093 | 0.8182 | 0.9045 |
| 0.0397 | 8.9191 | 5182 | 0.8327 | 0.6093 | 0.8327 | 0.9125 |
| 0.0397 | 8.9225 | 5184 | 0.8512 | 0.6093 | 0.8512 | 0.9226 |
| 0.0397 | 8.9260 | 5186 | 0.8648 | 0.6053 | 0.8648 | 0.9299 |
| 0.0397 | 8.9294 | 5188 | 0.8857 | 0.6271 | 0.8857 | 0.9411 |
| 0.0397 | 8.9329 | 5190 | 0.9074 | 0.5197 | 0.9074 | 0.9526 |
| 0.0397 | 8.9363 | 5192 | 0.9246 | 0.4661 | 0.9246 | 0.9616 |
| 0.0397 | 8.9398 | 5194 | 0.9242 | 0.4661 | 0.9242 | 0.9614 |
| 0.0397 | 8.9432 | 5196 | 0.9111 | 0.4661 | 0.9111 | 0.9545 |
| 0.0397 | 8.9466 | 5198 | 0.8923 | 0.6029 | 0.8923 | 0.9446 |
| 0.0397 | 8.9501 | 5200 | 0.8750 | 0.5798 | 0.8750 | 0.9354 |
| 0.0397 | 8.9535 | 5202 | 0.8639 | 0.5798 | 0.8639 | 0.9295 |
| 0.0397 | 8.9570 | 5204 | 0.8567 | 0.6093 | 0.8567 | 0.9256 |
| 0.0397 | 8.9604 | 5206 | 0.8500 | 0.6093 | 0.8500 | 0.9220 |
| 0.0397 | 8.9639 | 5208 | 0.8371 | 0.6093 | 0.8371 | 0.9149 |
| 0.0397 | 8.9673 | 5210 | 0.8309 | 0.6093 | 0.8309 | 0.9116 |
| 0.0397 | 8.9707 | 5212 | 0.8262 | 0.6093 | 0.8262 | 0.9090 |
| 0.0397 | 8.9742 | 5214 | 0.8279 | 0.6093 | 0.8279 | 0.9099 |
| 0.0397 | 8.9776 | 5216 | 0.8290 | 0.6093 | 0.8290 | 0.9105 |
| 0.0397 | 8.9811 | 5218 | 0.8386 | 0.6093 | 0.8386 | 0.9157 |
| 0.0397 | 8.9845 | 5220 | 0.8446 | 0.6093 | 0.8446 | 0.9190 |
| 0.0397 | 8.9880 | 5222 | 0.8527 | 0.6093 | 0.8527 | 0.9234 |
| 0.0397 | 8.9914 | 5224 | 0.8620 | 0.6053 | 0.8620 | 0.9285 |
| 0.0397 | 8.9948 | 5226 | 0.8751 | 0.6053 | 0.8751 | 0.9354 |
| 0.0397 | 8.9983 | 5228 | 0.8803 | 0.6053 | 0.8803 | 0.9382 |
| 0.0397 | 9.0017 | 5230 | 0.8780 | 0.6053 | 0.8780 | 0.9370 |
| 0.0397 | 9.0052 | 5232 | 0.8731 | 0.6053 | 0.8731 | 0.9344 |
| 0.0397 | 9.0086 | 5234 | 0.8603 | 0.6053 | 0.8603 | 0.9275 |
| 0.0397 | 9.0120 | 5236 | 0.8491 | 0.6053 | 0.8491 | 0.9214 |
| 0.0397 | 9.0155 | 5238 | 0.8349 | 0.6093 | 0.8349 | 0.9137 |
| 0.0397 | 9.0189 | 5240 | 0.8304 | 0.6093 | 0.8304 | 0.9113 |
| 0.0397 | 9.0224 | 5242 | 0.8361 | 0.6093 | 0.8361 | 0.9144 |
| 0.0397 | 9.0258 | 5244 | 0.8456 | 0.6093 | 0.8456 | 0.9196 |
| 0.0397 | 9.0293 | 5246 | 0.8439 | 0.6093 | 0.8439 | 0.9186 |
| 0.0397 | 9.0327 | 5248 | 0.8475 | 0.6093 | 0.8475 | 0.9206 |
| 0.0397 | 9.0361 | 5250 | 0.8514 | 0.6093 | 0.8514 | 0.9227 |
| 0.0397 | 9.0396 | 5252 | 0.8573 | 0.6093 | 0.8573 | 0.9259 |
| 0.0397 | 9.0430 | 5254 | 0.8691 | 0.6053 | 0.8691 | 0.9323 |
| 0.0397 | 9.0465 | 5256 | 0.8798 | 0.6053 | 0.8798 | 0.9380 |
| 0.0397 | 9.0499 | 5258 | 0.8975 | 0.5840 | 0.8975 | 0.9474 |
| 0.0397 | 9.0534 | 5260 | 0.9134 | 0.5591 | 0.9134 | 0.9557 |
| 0.0397 | 9.0568 | 5262 | 0.9153 | 0.5591 | 0.9153 | 0.9567 |
| 0.0397 | 9.0602 | 5264 | 0.9071 | 0.5840 | 0.9071 | 0.9524 |
| 0.0397 | 9.0637 | 5266 | 0.8947 | 0.5840 | 0.8947 | 0.9459 |
| 0.0397 | 9.0671 | 5268 | 0.8873 | 0.5909 | 0.8873 | 0.9420 |
| 0.0397 | 9.0706 | 5270 | 0.8745 | 0.5909 | 0.8745 | 0.9352 |
| 0.0397 | 9.0740 | 5272 | 0.8646 | 0.6093 | 0.8646 | 0.9299 |
| 0.0397 | 9.0775 | 5274 | 0.8579 | 0.6093 | 0.8579 | 0.9262 |
| 0.0397 | 9.0809 | 5276 | 0.8495 | 0.6093 | 0.8495 | 0.9217 |
| 0.0397 | 9.0843 | 5278 | 0.8471 | 0.6093 | 0.8471 | 0.9204 |
| 0.0397 | 9.0878 | 5280 | 0.8487 | 0.6093 | 0.8487 | 0.9213 |
| 0.0397 | 9.0912 | 5282 | 0.8540 | 0.6093 | 0.8540 | 0.9241 |
| 0.0397 | 9.0947 | 5284 | 0.8658 | 0.5946 | 0.8658 | 0.9305 |
| 0.0397 | 9.0981 | 5286 | 0.8804 | 0.5946 | 0.8804 | 0.9383 |
| 0.0397 | 9.1015 | 5288 | 0.8860 | 0.5946 | 0.8860 | 0.9413 |
| 0.0397 | 9.1050 | 5290 | 0.8936 | 0.5616 | 0.8936 | 0.9453 |
| 0.0397 | 9.1084 | 5292 | 0.8974 | 0.5616 | 0.8974 | 0.9473 |
| 0.0397 | 9.1119 | 5294 | 0.8960 | 0.5616 | 0.8960 | 0.9466 |
| 0.0397 | 9.1153 | 5296 | 0.8917 | 0.5616 | 0.8917 | 0.9443 |
| 0.0397 | 9.1188 | 5298 | 0.8891 | 0.5946 | 0.8891 | 0.9429 |
| 0.0397 | 9.1222 | 5300 | 0.8913 | 0.5650 | 0.8913 | 0.9441 |
| 0.0397 | 9.1256 | 5302 | 0.8855 | 0.5681 | 0.8855 | 0.9410 |
| 0.0397 | 9.1291 | 5304 | 0.8804 | 0.5946 | 0.8804 | 0.9383 |
| 0.0397 | 9.1325 | 5306 | 0.8761 | 0.5946 | 0.8761 | 0.9360 |
| 0.0397 | 9.1360 | 5308 | 0.8705 | 0.5946 | 0.8705 | 0.9330 |
| 0.0397 | 9.1394 | 5310 | 0.8673 | 0.5946 | 0.8673 | 0.9313 |
| 0.0397 | 9.1429 | 5312 | 0.8686 | 0.5946 | 0.8686 | 0.9320 |
| 0.0397 | 9.1463 | 5314 | 0.8731 | 0.5946 | 0.8731 | 0.9344 |
| 0.0397 | 9.1497 | 5316 | 0.8772 | 0.5681 | 0.8772 | 0.9366 |
| 0.0397 | 9.1532 | 5318 | 0.8845 | 0.5681 | 0.8845 | 0.9405 |
| 0.0397 | 9.1566 | 5320 | 0.8909 | 0.5227 | 0.8909 | 0.9439 |
| 0.0397 | 9.1601 | 5322 | 0.8978 | 0.5227 | 0.8978 | 0.9475 |
| 0.0397 | 9.1635 | 5324 | 0.9025 | 0.5227 | 0.9025 | 0.9500 |
| 0.0397 | 9.1670 | 5326 | 0.9051 | 0.5227 | 0.9051 | 0.9514 |
| 0.0397 | 9.1704 | 5328 | 0.9057 | 0.5227 | 0.9057 | 0.9517 |
| 0.0397 | 9.1738 | 5330 | 0.9107 | 0.5227 | 0.9107 | 0.9543 |
| 0.0397 | 9.1773 | 5332 | 0.9177 | 0.4941 | 0.9177 | 0.9580 |
| 0.0397 | 9.1807 | 5334 | 0.9251 | 0.4408 | 0.9251 | 0.9618 |
| 0.0397 | 9.1842 | 5336 | 0.9250 | 0.4408 | 0.9250 | 0.9617 |
| 0.0397 | 9.1876 | 5338 | 0.9215 | 0.4408 | 0.9215 | 0.9600 |
| 0.0397 | 9.1910 | 5340 | 0.9188 | 0.4408 | 0.9188 | 0.9585 |
| 0.0397 | 9.1945 | 5342 | 0.9204 | 0.4408 | 0.9204 | 0.9594 |
| 0.0397 | 9.1979 | 5344 | 0.9205 | 0.4408 | 0.9205 | 0.9594 |
| 0.0397 | 9.2014 | 5346 | 0.9148 | 0.4813 | 0.9148 | 0.9565 |
| 0.0397 | 9.2048 | 5348 | 0.9022 | 0.5650 | 0.9022 | 0.9498 |
| 0.0397 | 9.2083 | 5350 | 0.8889 | 0.6053 | 0.8889 | 0.9428 |
| 0.0397 | 9.2117 | 5352 | 0.8830 | 0.6053 | 0.8830 | 0.9397 |
| 0.0397 | 9.2151 | 5354 | 0.8767 | 0.6053 | 0.8767 | 0.9363 |
| 0.0397 | 9.2186 | 5356 | 0.8760 | 0.6053 | 0.8760 | 0.9359 |
| 0.0397 | 9.2220 | 5358 | 0.8726 | 0.6053 | 0.8726 | 0.9341 |
| 0.0397 | 9.2255 | 5360 | 0.8733 | 0.6053 | 0.8733 | 0.9345 |
| 0.0397 | 9.2289 | 5362 | 0.8786 | 0.6053 | 0.8786 | 0.9373 |
| 0.0397 | 9.2324 | 5364 | 0.8838 | 0.6053 | 0.8838 | 0.9401 |
| 0.0397 | 9.2358 | 5366 | 0.8896 | 0.6053 | 0.8896 | 0.9432 |
| 0.0397 | 9.2392 | 5368 | 0.8904 | 0.6053 | 0.8904 | 0.9436 |
| 0.0397 | 9.2427 | 5370 | 0.8890 | 0.6053 | 0.8890 | 0.9429 |
| 0.0397 | 9.2461 | 5372 | 0.8902 | 0.6053 | 0.8902 | 0.9435 |
| 0.0397 | 9.2496 | 5374 | 0.8844 | 0.6053 | 0.8844 | 0.9404 |
| 0.0397 | 9.2530 | 5376 | 0.8837 | 0.6053 | 0.8837 | 0.9400 |
| 0.0397 | 9.2565 | 5378 | 0.8911 | 0.6053 | 0.8911 | 0.9440 |
| 0.0397 | 9.2599 | 5380 | 0.8978 | 0.5223 | 0.8978 | 0.9475 |
| 0.0397 | 9.2633 | 5382 | 0.9018 | 0.5223 | 0.9018 | 0.9496 |
| 0.0397 | 9.2668 | 5384 | 0.9059 | 0.5223 | 0.9059 | 0.9518 |
| 0.0397 | 9.2702 | 5386 | 0.9080 | 0.4957 | 0.9080 | 0.9529 |
| 0.0397 | 9.2737 | 5388 | 0.9090 | 0.4813 | 0.9090 | 0.9534 |
| 0.0397 | 9.2771 | 5390 | 0.9094 | 0.4813 | 0.9094 | 0.9536 |
| 0.0397 | 9.2806 | 5392 | 0.9095 | 0.4813 | 0.9095 | 0.9537 |
| 0.0397 | 9.2840 | 5394 | 0.9082 | 0.4813 | 0.9082 | 0.9530 |
| 0.0397 | 9.2874 | 5396 | 0.9085 | 0.4813 | 0.9085 | 0.9532 |
| 0.0397 | 9.2909 | 5398 | 0.9076 | 0.4408 | 0.9076 | 0.9527 |
| 0.0397 | 9.2943 | 5400 | 0.9073 | 0.4941 | 0.9073 | 0.9525 |
| 0.0397 | 9.2978 | 5402 | 0.9025 | 0.5227 | 0.9025 | 0.9500 |
| 0.0397 | 9.3012 | 5404 | 0.8973 | 0.5227 | 0.8973 | 0.9472 |
| 0.0397 | 9.3046 | 5406 | 0.8966 | 0.5227 | 0.8966 | 0.9469 |
| 0.0397 | 9.3081 | 5408 | 0.9017 | 0.5227 | 0.9017 | 0.9496 |
| 0.0397 | 9.3115 | 5410 | 0.9052 | 0.5227 | 0.9052 | 0.9514 |
| 0.0397 | 9.3150 | 5412 | 0.9108 | 0.4941 | 0.9108 | 0.9544 |
| 0.0397 | 9.3184 | 5414 | 0.9216 | 0.4408 | 0.9216 | 0.9600 |
| 0.0397 | 9.3219 | 5416 | 0.9388 | 0.4661 | 0.9388 | 0.9689 |
| 0.0397 | 9.3253 | 5418 | 0.9561 | 0.4661 | 0.9561 | 0.9778 |
| 0.0397 | 9.3287 | 5420 | 0.9623 | 0.4661 | 0.9623 | 0.9810 |
| 0.0397 | 9.3322 | 5422 | 0.9624 | 0.4661 | 0.9624 | 0.9810 |
| 0.0397 | 9.3356 | 5424 | 0.9599 | 0.4661 | 0.9599 | 0.9797 |
| 0.0397 | 9.3391 | 5426 | 0.9633 | 0.4661 | 0.9633 | 0.9815 |
| 0.0397 | 9.3425 | 5428 | 0.9669 | 0.4661 | 0.9669 | 0.9833 |
| 0.0397 | 9.3460 | 5430 | 0.9670 | 0.4661 | 0.9670 | 0.9833 |
| 0.0397 | 9.3494 | 5432 | 0.9615 | 0.4661 | 0.9615 | 0.9806 |
| 0.0397 | 9.3528 | 5434 | 0.9552 | 0.4661 | 0.9552 | 0.9773 |
| 0.0397 | 9.3563 | 5436 | 0.9459 | 0.4661 | 0.9459 | 0.9726 |
| 0.0397 | 9.3597 | 5438 | 0.9353 | 0.4661 | 0.9353 | 0.9671 |
| 0.0397 | 9.3632 | 5440 | 0.9222 | 0.4941 | 0.9222 | 0.9603 |
| 0.0397 | 9.3666 | 5442 | 0.9074 | 0.5227 | 0.9074 | 0.9526 |
| 0.0397 | 9.3701 | 5444 | 0.8922 | 0.5227 | 0.8922 | 0.9446 |
| 0.0397 | 9.3735 | 5446 | 0.8770 | 0.5378 | 0.8770 | 0.9365 |
| 0.0397 | 9.3769 | 5448 | 0.8710 | 0.5378 | 0.8710 | 0.9332 |
| 0.0397 | 9.3804 | 5450 | 0.8640 | 0.5402 | 0.8640 | 0.9295 |
| 0.0397 | 9.3838 | 5452 | 0.8606 | 0.5402 | 0.8606 | 0.9277 |
| 0.0397 | 9.3873 | 5454 | 0.8611 | 0.5402 | 0.8611 | 0.9280 |
| 0.0397 | 9.3907 | 5456 | 0.8638 | 0.5402 | 0.8638 | 0.9294 |
| 0.0397 | 9.3941 | 5458 | 0.8649 | 0.5402 | 0.8649 | 0.9300 |
| 0.0397 | 9.3976 | 5460 | 0.8689 | 0.5378 | 0.8689 | 0.9322 |
| 0.0397 | 9.4010 | 5462 | 0.8733 | 0.5378 | 0.8733 | 0.9345 |
| 0.0397 | 9.4045 | 5464 | 0.8768 | 0.5378 | 0.8768 | 0.9364 |
| 0.0397 | 9.4079 | 5466 | 0.8860 | 0.5378 | 0.8860 | 0.9413 |
| 0.0397 | 9.4114 | 5468 | 0.8932 | 0.5378 | 0.8932 | 0.9451 |
| 0.0397 | 9.4148 | 5470 | 0.9021 | 0.4941 | 0.9021 | 0.9498 |
| 0.0397 | 9.4182 | 5472 | 0.9075 | 0.5188 | 0.9075 | 0.9526 |
| 0.0397 | 9.4217 | 5474 | 0.9072 | 0.5188 | 0.9072 | 0.9525 |
| 0.0397 | 9.4251 | 5476 | 0.9055 | 0.5088 | 0.9055 | 0.9516 |
| 0.0397 | 9.4286 | 5478 | 0.9042 | 0.5088 | 0.9042 | 0.9509 |
| 0.0397 | 9.4320 | 5480 | 0.9065 | 0.5331 | 0.9065 | 0.9521 |
| 0.0397 | 9.4355 | 5482 | 0.9106 | 0.4661 | 0.9106 | 0.9543 |
| 0.0397 | 9.4389 | 5484 | 0.9100 | 0.4661 | 0.9100 | 0.9540 |
| 0.0397 | 9.4423 | 5486 | 0.9065 | 0.5331 | 0.9065 | 0.9521 |
| 0.0397 | 9.4458 | 5488 | 0.9035 | 0.5331 | 0.9035 | 0.9505 |
| 0.0397 | 9.4492 | 5490 | 0.8946 | 0.5645 | 0.8946 | 0.9459 |
| 0.0397 | 9.4527 | 5492 | 0.8896 | 0.5645 | 0.8896 | 0.9432 |
| 0.0397 | 9.4561 | 5494 | 0.8842 | 0.5645 | 0.8842 | 0.9403 |
| 0.0397 | 9.4596 | 5496 | 0.8805 | 0.5645 | 0.8805 | 0.9384 |
| 0.0397 | 9.4630 | 5498 | 0.8710 | 0.5645 | 0.8710 | 0.9333 |
| 0.0379 | 9.4664 | 5500 | 0.8664 | 0.5645 | 0.8664 | 0.9308 |
| 0.0379 | 9.4699 | 5502 | 0.8638 | 0.5645 | 0.8638 | 0.9294 |
| 0.0379 | 9.4733 | 5504 | 0.8597 | 0.5645 | 0.8597 | 0.9272 |
| 0.0379 | 9.4768 | 5506 | 0.8587 | 0.5645 | 0.8587 | 0.9267 |
| 0.0379 | 9.4802 | 5508 | 0.8559 | 0.6053 | 0.8559 | 0.9251 |
| 0.0379 | 9.4836 | 5510 | 0.8548 | 0.6053 | 0.8548 | 0.9245 |
| 0.0379 | 9.4871 | 5512 | 0.8530 | 0.6053 | 0.8530 | 0.9236 |
| 0.0379 | 9.4905 | 5514 | 0.8554 | 0.5645 | 0.8554 | 0.9249 |
| 0.0379 | 9.4940 | 5516 | 0.8601 | 0.5645 | 0.8601 | 0.9274 |
| 0.0379 | 9.4974 | 5518 | 0.8693 | 0.5645 | 0.8693 | 0.9324 |
| 0.0379 | 9.5009 | 5520 | 0.8769 | 0.5645 | 0.8769 | 0.9364 |
| 0.0379 | 9.5043 | 5522 | 0.8844 | 0.5378 | 0.8844 | 0.9404 |
| 0.0379 | 9.5077 | 5524 | 0.8888 | 0.5378 | 0.8888 | 0.9427 |
| 0.0379 | 9.5112 | 5526 | 0.8952 | 0.5378 | 0.8952 | 0.9462 |
| 0.0379 | 9.5146 | 5528 | 0.9010 | 0.5227 | 0.9010 | 0.9492 |
| 0.0379 | 9.5181 | 5530 | 0.9048 | 0.5188 | 0.9048 | 0.9512 |
| 0.0379 | 9.5215 | 5532 | 0.9112 | 0.5188 | 0.9112 | 0.9546 |
| 0.0379 | 9.5250 | 5534 | 0.9216 | 0.4661 | 0.9216 | 0.9600 |
| 0.0379 | 9.5284 | 5536 | 0.9359 | 0.4661 | 0.9359 | 0.9674 |
| 0.0379 | 9.5318 | 5538 | 0.9516 | 0.4661 | 0.9516 | 0.9755 |
| 0.0379 | 9.5353 | 5540 | 0.9617 | 0.4661 | 0.9617 | 0.9807 |
| 0.0379 | 9.5387 | 5542 | 0.9651 | 0.4661 | 0.9651 | 0.9824 |
| 0.0379 | 9.5422 | 5544 | 0.9645 | 0.4661 | 0.9645 | 0.9821 |
| 0.0379 | 9.5456 | 5546 | 0.9600 | 0.4661 | 0.9600 | 0.9798 |
| 0.0379 | 9.5491 | 5548 | 0.9558 | 0.4661 | 0.9558 | 0.9776 |
| 0.0379 | 9.5525 | 5550 | 0.9464 | 0.4661 | 0.9464 | 0.9728 |
| 0.0379 | 9.5559 | 5552 | 0.9354 | 0.4661 | 0.9354 | 0.9672 |
| 0.0379 | 9.5594 | 5554 | 0.9243 | 0.4661 | 0.9243 | 0.9614 |
| 0.0379 | 9.5628 | 5556 | 0.9133 | 0.5188 | 0.9133 | 0.9556 |
| 0.0379 | 9.5663 | 5558 | 0.9054 | 0.5227 | 0.9054 | 0.9515 |
| 0.0379 | 9.5697 | 5560 | 0.8967 | 0.5227 | 0.8967 | 0.9469 |
| 0.0379 | 9.5731 | 5562 | 0.8924 | 0.5378 | 0.8924 | 0.9447 |
| 0.0379 | 9.5766 | 5564 | 0.8892 | 0.5378 | 0.8892 | 0.9430 |
| 0.0379 | 9.5800 | 5566 | 0.8860 | 0.5378 | 0.8860 | 0.9413 |
| 0.0379 | 9.5835 | 5568 | 0.8795 | 0.5378 | 0.8795 | 0.9378 |
| 0.0379 | 9.5869 | 5570 | 0.8744 | 0.5645 | 0.8744 | 0.9351 |
| 0.0379 | 9.5904 | 5572 | 0.8747 | 0.5645 | 0.8747 | 0.9353 |
| 0.0379 | 9.5938 | 5574 | 0.8758 | 0.5645 | 0.8758 | 0.9358 |
| 0.0379 | 9.5972 | 5576 | 0.8810 | 0.5645 | 0.8810 | 0.9386 |
| 0.0379 | 9.6007 | 5578 | 0.8832 | 0.5645 | 0.8832 | 0.9398 |
| 0.0379 | 9.6041 | 5580 | 0.8826 | 0.5645 | 0.8826 | 0.9395 |
| 0.0379 | 9.6076 | 5582 | 0.8833 | 0.5645 | 0.8833 | 0.9399 |
| 0.0379 | 9.6110 | 5584 | 0.8841 | 0.5645 | 0.8841 | 0.9402 |
| 0.0379 | 9.6145 | 5586 | 0.8803 | 0.5645 | 0.8803 | 0.9382 |
| 0.0379 | 9.6179 | 5588 | 0.8803 | 0.5645 | 0.8803 | 0.9382 |
| 0.0379 | 9.6213 | 5590 | 0.8819 | 0.5645 | 0.8819 | 0.9391 |
| 0.0379 | 9.6248 | 5592 | 0.8860 | 0.5645 | 0.8860 | 0.9413 |
| 0.0379 | 9.6282 | 5594 | 0.8893 | 0.5874 | 0.8893 | 0.9430 |
| 0.0379 | 9.6317 | 5596 | 0.8896 | 0.5874 | 0.8896 | 0.9432 |
| 0.0379 | 9.6351 | 5598 | 0.8870 | 0.5874 | 0.8870 | 0.9418 |
| 0.0379 | 9.6386 | 5600 | 0.8835 | 0.5645 | 0.8835 | 0.9400 |
| 0.0379 | 9.6420 | 5602 | 0.8803 | 0.5645 | 0.8803 | 0.9383 |
| 0.0379 | 9.6454 | 5604 | 0.8754 | 0.5645 | 0.8754 | 0.9356 |
| 0.0379 | 9.6489 | 5606 | 0.8722 | 0.5645 | 0.8722 | 0.9339 |
| 0.0379 | 9.6523 | 5608 | 0.8721 | 0.5645 | 0.8721 | 0.9339 |
| 0.0379 | 9.6558 | 5610 | 0.8748 | 0.5645 | 0.8748 | 0.9353 |
| 0.0379 | 9.6592 | 5612 | 0.8800 | 0.5645 | 0.8800 | 0.9381 |
| 0.0379 | 9.6627 | 5614 | 0.8873 | 0.5874 | 0.8873 | 0.9420 |
| 0.0379 | 9.6661 | 5616 | 0.8923 | 0.5588 | 0.8923 | 0.9446 |
| 0.0379 | 9.6695 | 5618 | 0.8976 | 0.5068 | 0.8976 | 0.9474 |
| 0.0379 | 9.6730 | 5620 | 0.9014 | 0.5068 | 0.9014 | 0.9494 |
| 0.0379 | 9.6764 | 5622 | 0.9016 | 0.5068 | 0.9016 | 0.9495 |
| 0.0379 | 9.6799 | 5624 | 0.9020 | 0.5068 | 0.9020 | 0.9497 |
| 0.0379 | 9.6833 | 5626 | 0.9025 | 0.5068 | 0.9025 | 0.9500 |
| 0.0379 | 9.6867 | 5628 | 0.9058 | 0.5068 | 0.9058 | 0.9517 |
| 0.0379 | 9.6902 | 5630 | 0.9100 | 0.4931 | 0.9100 | 0.9539 |
| 0.0379 | 9.6936 | 5632 | 0.9167 | 0.4931 | 0.9167 | 0.9574 |
| 0.0379 | 9.6971 | 5634 | 0.9226 | 0.4931 | 0.9226 | 0.9605 |
| 0.0379 | 9.7005 | 5636 | 0.9273 | 0.4931 | 0.9273 | 0.9630 |
| 0.0379 | 9.7040 | 5638 | 0.9329 | 0.4931 | 0.9329 | 0.9659 |
| 0.0379 | 9.7074 | 5640 | 0.9373 | 0.4931 | 0.9373 | 0.9681 |
| 0.0379 | 9.7108 | 5642 | 0.9387 | 0.4931 | 0.9387 | 0.9689 |
| 0.0379 | 9.7143 | 5644 | 0.9385 | 0.4931 | 0.9385 | 0.9688 |
| 0.0379 | 9.7177 | 5646 | 0.9406 | 0.4931 | 0.9406 | 0.9698 |
| 0.0379 | 9.7212 | 5648 | 0.9406 | 0.4931 | 0.9406 | 0.9699 |
| 0.0379 | 9.7246 | 5650 | 0.9393 | 0.4931 | 0.9393 | 0.9692 |
| 0.0379 | 9.7281 | 5652 | 0.9391 | 0.4931 | 0.9391 | 0.9691 |
| 0.0379 | 9.7315 | 5654 | 0.9389 | 0.4931 | 0.9389 | 0.9690 |
| 0.0379 | 9.7349 | 5656 | 0.9370 | 0.4931 | 0.9370 | 0.9680 |
| 0.0379 | 9.7384 | 5658 | 0.9363 | 0.4931 | 0.9363 | 0.9676 |
| 0.0379 | 9.7418 | 5660 | 0.9343 | 0.4931 | 0.9343 | 0.9666 |
| 0.0379 | 9.7453 | 5662 | 0.9298 | 0.4931 | 0.9298 | 0.9643 |
| 0.0379 | 9.7487 | 5664 | 0.9260 | 0.4931 | 0.9260 | 0.9623 |
| 0.0379 | 9.7522 | 5666 | 0.9229 | 0.4931 | 0.9229 | 0.9607 |
| 0.0379 | 9.7556 | 5668 | 0.9203 | 0.4931 | 0.9203 | 0.9593 |
| 0.0379 | 9.7590 | 5670 | 0.9160 | 0.4931 | 0.9160 | 0.9571 |
| 0.0379 | 9.7625 | 5672 | 0.9103 | 0.4931 | 0.9103 | 0.9541 |
| 0.0379 | 9.7659 | 5674 | 0.9058 | 0.5068 | 0.9058 | 0.9518 |
| 0.0379 | 9.7694 | 5676 | 0.9029 | 0.5068 | 0.9029 | 0.9502 |
| 0.0379 | 9.7728 | 5678 | 0.9017 | 0.5068 | 0.9017 | 0.9496 |
| 0.0379 | 9.7762 | 5680 | 0.9024 | 0.5068 | 0.9024 | 0.9500 |
| 0.0379 | 9.7797 | 5682 | 0.9036 | 0.5068 | 0.9036 | 0.9506 |
| 0.0379 | 9.7831 | 5684 | 0.9047 | 0.5068 | 0.9047 | 0.9512 |
| 0.0379 | 9.7866 | 5686 | 0.9062 | 0.4931 | 0.9062 | 0.9520 |
| 0.0379 | 9.7900 | 5688 | 0.9052 | 0.4931 | 0.9052 | 0.9514 |
| 0.0379 | 9.7935 | 5690 | 0.9042 | 0.5449 | 0.9042 | 0.9509 |
| 0.0379 | 9.7969 | 5692 | 0.9024 | 0.5449 | 0.9024 | 0.9499 |
| 0.0379 | 9.8003 | 5694 | 0.9018 | 0.5449 | 0.9018 | 0.9496 |
| 0.0379 | 9.8038 | 5696 | 0.9015 | 0.5449 | 0.9015 | 0.9495 |
| 0.0379 | 9.8072 | 5698 | 0.9023 | 0.5449 | 0.9023 | 0.9499 |
| 0.0379 | 9.8107 | 5700 | 0.9044 | 0.5449 | 0.9044 | 0.9510 |
| 0.0379 | 9.8141 | 5702 | 0.9072 | 0.5449 | 0.9072 | 0.9525 |
| 0.0379 | 9.8176 | 5704 | 0.9079 | 0.5449 | 0.9079 | 0.9528 |
| 0.0379 | 9.8210 | 5706 | 0.9064 | 0.5449 | 0.9064 | 0.9521 |
| 0.0379 | 9.8244 | 5708 | 0.9032 | 0.5449 | 0.9032 | 0.9504 |
| 0.0379 | 9.8279 | 5710 | 0.8994 | 0.5449 | 0.8994 | 0.9484 |
| 0.0379 | 9.8313 | 5712 | 0.8957 | 0.5588 | 0.8957 | 0.9464 |
| 0.0379 | 9.8348 | 5714 | 0.8939 | 0.5874 | 0.8939 | 0.9455 |
| 0.0379 | 9.8382 | 5716 | 0.8917 | 0.5874 | 0.8917 | 0.9443 |
| 0.0379 | 9.8417 | 5718 | 0.8884 | 0.5645 | 0.8884 | 0.9425 |
| 0.0379 | 9.8451 | 5720 | 0.8861 | 0.5645 | 0.8861 | 0.9413 |
| 0.0379 | 9.8485 | 5722 | 0.8858 | 0.5645 | 0.8858 | 0.9412 |
| 0.0379 | 9.8520 | 5724 | 0.8862 | 0.5645 | 0.8862 | 0.9414 |
| 0.0379 | 9.8554 | 5726 | 0.8869 | 0.5645 | 0.8869 | 0.9418 |
| 0.0379 | 9.8589 | 5728 | 0.8884 | 0.5874 | 0.8884 | 0.9426 |
| 0.0379 | 9.8623 | 5730 | 0.8906 | 0.5874 | 0.8906 | 0.9437 |
| 0.0379 | 9.8657 | 5732 | 0.8934 | 0.5588 | 0.8934 | 0.9452 |
| 0.0379 | 9.8692 | 5734 | 0.8955 | 0.5588 | 0.8955 | 0.9463 |
| 0.0379 | 9.8726 | 5736 | 0.8979 | 0.5588 | 0.8979 | 0.9476 |
| 0.0379 | 9.8761 | 5738 | 0.8987 | 0.5588 | 0.8987 | 0.9480 |
| 0.0379 | 9.8795 | 5740 | 0.8996 | 0.5588 | 0.8996 | 0.9485 |
| 0.0379 | 9.8830 | 5742 | 0.9008 | 0.5588 | 0.9008 | 0.9491 |
| 0.0379 | 9.8864 | 5744 | 0.9028 | 0.5588 | 0.9028 | 0.9502 |
| 0.0379 | 9.8898 | 5746 | 0.9052 | 0.5068 | 0.9052 | 0.9514 |
| 0.0379 | 9.8933 | 5748 | 0.9065 | 0.4931 | 0.9065 | 0.9521 |
| 0.0379 | 9.8967 | 5750 | 0.9078 | 0.4931 | 0.9078 | 0.9528 |
| 0.0379 | 9.9002 | 5752 | 0.9093 | 0.4931 | 0.9093 | 0.9536 |
| 0.0379 | 9.9036 | 5754 | 0.9101 | 0.4931 | 0.9101 | 0.9540 |
| 0.0379 | 9.9071 | 5756 | 0.9110 | 0.4931 | 0.9110 | 0.9545 |
| 0.0379 | 9.9105 | 5758 | 0.9112 | 0.4931 | 0.9112 | 0.9546 |
| 0.0379 | 9.9139 | 5760 | 0.9118 | 0.4931 | 0.9118 | 0.9549 |
| 0.0379 | 9.9174 | 5762 | 0.9130 | 0.4931 | 0.9130 | 0.9555 |
| 0.0379 | 9.9208 | 5764 | 0.9131 | 0.4931 | 0.9131 | 0.9556 |
| 0.0379 | 9.9243 | 5766 | 0.9131 | 0.4931 | 0.9131 | 0.9556 |
| 0.0379 | 9.9277 | 5768 | 0.9125 | 0.4931 | 0.9125 | 0.9553 |
| 0.0379 | 9.9312 | 5770 | 0.9116 | 0.4931 | 0.9116 | 0.9548 |
| 0.0379 | 9.9346 | 5772 | 0.9109 | 0.4931 | 0.9109 | 0.9544 |
| 0.0379 | 9.9380 | 5774 | 0.9105 | 0.4931 | 0.9105 | 0.9542 |
| 0.0379 | 9.9415 | 5776 | 0.9103 | 0.4931 | 0.9103 | 0.9541 |
| 0.0379 | 9.9449 | 5778 | 0.9102 | 0.4931 | 0.9102 | 0.9540 |
| 0.0379 | 9.9484 | 5780 | 0.9104 | 0.4931 | 0.9104 | 0.9541 |
| 0.0379 | 9.9518 | 5782 | 0.9103 | 0.4931 | 0.9103 | 0.9541 |
| 0.0379 | 9.9552 | 5784 | 0.9102 | 0.4931 | 0.9102 | 0.9540 |
| 0.0379 | 9.9587 | 5786 | 0.9099 | 0.4931 | 0.9099 | 0.9539 |
| 0.0379 | 9.9621 | 5788 | 0.9098 | 0.4931 | 0.9098 | 0.9538 |
| 0.0379 | 9.9656 | 5790 | 0.9097 | 0.4931 | 0.9097 | 0.9538 |
| 0.0379 | 9.9690 | 5792 | 0.9098 | 0.4931 | 0.9098 | 0.9538 |
| 0.0379 | 9.9725 | 5794 | 0.9101 | 0.4931 | 0.9101 | 0.9540 |
| 0.0379 | 9.9759 | 5796 | 0.9106 | 0.4931 | 0.9106 | 0.9542 |
| 0.0379 | 9.9793 | 5798 | 0.9108 | 0.4931 | 0.9108 | 0.9544 |
| 0.0379 | 9.9828 | 5800 | 0.9107 | 0.4931 | 0.9107 | 0.9543 |
| 0.0379 | 9.9862 | 5802 | 0.9108 | 0.4931 | 0.9108 | 0.9543 |
| 0.0379 | 9.9897 | 5804 | 0.9108 | 0.4931 | 0.9108 | 0.9544 |
| 0.0379 | 9.9931 | 5806 | 0.9107 | 0.4931 | 0.9107 | 0.9543 |
| 0.0379 | 9.9966 | 5808 | 0.9107 | 0.4931 | 0.9107 | 0.9543 |
| 0.0379 | 10.0 | 5810 | 0.9107 | 0.4931 | 0.9107 | 0.9543 |
### Framework versions
- Transformers 4.44.2
- Pytorch 2.4.0+cu118
- Datasets 2.21.0
- Tokenizers 0.19.1
|
MayBashendy/ArabicNewSplits3_FineTuningAraBERT_run3_AugV5_k10_task1_organization
|
MayBashendy
| 2024-12-04T16:23:12Z | 163 | 0 |
transformers
|
[
"transformers",
"safetensors",
"bert",
"text-classification",
"generated_from_trainer",
"base_model:aubmindlab/bert-base-arabertv02",
"base_model:finetune:aubmindlab/bert-base-arabertv02",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-classification
| 2024-12-04T16:16:36Z |
---
library_name: transformers
base_model: aubmindlab/bert-base-arabertv02
tags:
- generated_from_trainer
model-index:
- name: ArabicNewSplits3_FineTuningAraBERT_run3_AugV5_k10_task1_organization
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# ArabicNewSplits3_FineTuningAraBERT_run3_AugV5_k10_task1_organization
This model is a fine-tuned version of [aubmindlab/bert-base-arabertv02](https://huggingface.co/aubmindlab/bert-base-arabertv02) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7273
- Qwk: 0.6431
- Mse: 0.7273
- Rmse: 0.8528
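The `Qwk` column above is presumably the quadratic weighted Cohen's kappa between predicted and reference ordinal scores. A minimal sketch of reproducing these three metrics with scikit-learn (the score arrays are placeholders, not data from this run):
```python
# Sketch with placeholder labels; Qwk is assumed to be quadratic weighted Cohen's kappa.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = np.array([0, 1, 2, 2, 3])  # reference ordinal scores
y_pred = np.array([0, 1, 1, 2, 3])  # model predictions

qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")
mse = mean_squared_error(y_true, y_pred)
print(qwk, mse, mse ** 0.5)  # Qwk, Mse, Rmse
```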
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|:-------------:|:------:|:----:|:---------------:|:-------:|:------:|:------:|
| No log | 0.04 | 2 | 4.7660 | -0.0151 | 4.7660 | 2.1831 |
| No log | 0.08 | 4 | 2.7800 | 0.0315 | 2.7800 | 1.6673 |
| No log | 0.12 | 6 | 1.9801 | -0.0354 | 1.9801 | 1.4072 |
| No log | 0.16 | 8 | 2.4587 | -0.1636 | 2.4587 | 1.5680 |
| No log | 0.2 | 10 | 1.9055 | -0.0680 | 1.9055 | 1.3804 |
| No log | 0.24 | 12 | 1.4720 | 0.1236 | 1.4720 | 1.2133 |
| No log | 0.28 | 14 | 1.3428 | 0.0493 | 1.3428 | 1.1588 |
| No log | 0.32 | 16 | 1.5088 | 0.0502 | 1.5088 | 1.2283 |
| No log | 0.36 | 18 | 1.5366 | 0.0502 | 1.5366 | 1.2396 |
| No log | 0.4 | 20 | 1.3687 | 0.0502 | 1.3687 | 1.1699 |
| No log | 0.44 | 22 | 1.2149 | 0.1164 | 1.2149 | 1.1022 |
| No log | 0.48 | 24 | 1.0969 | 0.3793 | 1.0969 | 1.0473 |
| No log | 0.52 | 26 | 1.0160 | 0.3696 | 1.0160 | 1.0080 |
| No log | 0.56 | 28 | 1.0119 | 0.2980 | 1.0119 | 1.0059 |
| No log | 0.6 | 30 | 0.9662 | 0.3276 | 0.9662 | 0.9830 |
| No log | 0.64 | 32 | 0.8600 | 0.4438 | 0.8600 | 0.9274 |
| No log | 0.68 | 34 | 0.8412 | 0.5 | 0.8412 | 0.9172 |
| No log | 0.72 | 36 | 0.9916 | 0.4527 | 0.9916 | 0.9958 |
| No log | 0.76 | 38 | 1.1141 | 0.4137 | 1.1141 | 1.0555 |
| No log | 0.8 | 40 | 0.9422 | 0.4493 | 0.9422 | 0.9707 |
| No log | 0.84 | 42 | 0.7827 | 0.4852 | 0.7827 | 0.8847 |
| No log | 0.88 | 44 | 0.7817 | 0.5629 | 0.7817 | 0.8841 |
| No log | 0.92 | 46 | 0.7947 | 0.5037 | 0.7947 | 0.8914 |
| No log | 0.96 | 48 | 0.8103 | 0.5037 | 0.8103 | 0.9002 |
| No log | 1.0 | 50 | 0.7784 | 0.5484 | 0.7784 | 0.8823 |
| No log | 1.04 | 52 | 0.7824 | 0.5272 | 0.7824 | 0.8846 |
| No log | 1.08 | 54 | 0.7833 | 0.5629 | 0.7833 | 0.8850 |
| No log | 1.12 | 56 | 0.7761 | 0.5758 | 0.7761 | 0.8810 |
| No log | 1.16 | 58 | 0.7943 | 0.5758 | 0.7943 | 0.8912 |
| No log | 1.2 | 60 | 0.7965 | 0.4897 | 0.7965 | 0.8925 |
| No log | 1.24 | 62 | 0.7978 | 0.4897 | 0.7978 | 0.8932 |
| No log | 1.28 | 64 | 0.7500 | 0.5345 | 0.7500 | 0.8660 |
| No log | 1.32 | 66 | 0.7429 | 0.4875 | 0.7429 | 0.8619 |
| No log | 1.3600 | 68 | 0.7668 | 0.4897 | 0.7668 | 0.8756 |
| No log | 1.4 | 70 | 0.7577 | 0.5034 | 0.7577 | 0.8705 |
| No log | 1.44 | 72 | 0.8103 | 0.5083 | 0.8103 | 0.9002 |
| No log | 1.48 | 74 | 0.8455 | 0.4873 | 0.8455 | 0.9195 |
| No log | 1.52 | 76 | 0.7940 | 0.5286 | 0.7940 | 0.8910 |
| No log | 1.56 | 78 | 0.7399 | 0.6283 | 0.7399 | 0.8602 |
| No log | 1.6 | 80 | 0.7827 | 0.6419 | 0.7827 | 0.8847 |
| No log | 1.6400 | 82 | 0.7471 | 0.6322 | 0.7471 | 0.8644 |
| No log | 1.6800 | 84 | 0.7640 | 0.6801 | 0.7640 | 0.8741 |
| No log | 1.72 | 86 | 0.7978 | 0.6741 | 0.7978 | 0.8932 |
| No log | 1.76 | 88 | 0.7038 | 0.6959 | 0.7038 | 0.8389 |
| No log | 1.8 | 90 | 0.6945 | 0.6743 | 0.6945 | 0.8334 |
| No log | 1.8400 | 92 | 0.6982 | 0.6927 | 0.6982 | 0.8356 |
| No log | 1.88 | 94 | 0.6950 | 0.6835 | 0.6950 | 0.8336 |
| No log | 1.92 | 96 | 0.7052 | 0.7034 | 0.7052 | 0.8398 |
| No log | 1.96 | 98 | 0.6851 | 0.7130 | 0.6851 | 0.8277 |
| No log | 2.0 | 100 | 0.6280 | 0.6905 | 0.6280 | 0.7925 |
| No log | 2.04 | 102 | 0.6430 | 0.6918 | 0.6430 | 0.8019 |
| No log | 2.08 | 104 | 0.6034 | 0.7133 | 0.6034 | 0.7768 |
| No log | 2.12 | 106 | 0.6815 | 0.6937 | 0.6815 | 0.8255 |
| No log | 2.16 | 108 | 0.7411 | 0.5909 | 0.7411 | 0.8609 |
| No log | 2.2 | 110 | 0.6543 | 0.7132 | 0.6543 | 0.8089 |
| No log | 2.24 | 112 | 0.6511 | 0.6855 | 0.6511 | 0.8069 |
| No log | 2.2800 | 114 | 0.6789 | 0.7340 | 0.6789 | 0.8240 |
| No log | 2.32 | 116 | 0.7071 | 0.6671 | 0.7071 | 0.8409 |
| No log | 2.36 | 118 | 0.6994 | 0.6712 | 0.6994 | 0.8363 |
| No log | 2.4 | 120 | 0.6608 | 0.6671 | 0.6608 | 0.8129 |
| No log | 2.44 | 122 | 0.7031 | 0.6683 | 0.7031 | 0.8385 |
| No log | 2.48 | 124 | 0.5697 | 0.7165 | 0.5697 | 0.7548 |
| No log | 2.52 | 126 | 0.5652 | 0.7164 | 0.5652 | 0.7518 |
| No log | 2.56 | 128 | 0.6030 | 0.6375 | 0.6030 | 0.7765 |
| No log | 2.6 | 130 | 0.5643 | 0.7317 | 0.5643 | 0.7512 |
| No log | 2.64 | 132 | 0.6556 | 0.6548 | 0.6556 | 0.8097 |
| No log | 2.68 | 134 | 0.6535 | 0.6548 | 0.6535 | 0.8084 |
| No log | 2.7200 | 136 | 0.5613 | 0.7564 | 0.5613 | 0.7492 |
| No log | 2.76 | 138 | 0.5474 | 0.7515 | 0.5474 | 0.7399 |
| No log | 2.8 | 140 | 0.6650 | 0.6548 | 0.6650 | 0.8155 |
| No log | 2.84 | 142 | 0.7601 | 0.6271 | 0.7601 | 0.8718 |
| No log | 2.88 | 144 | 0.7676 | 0.6480 | 0.7676 | 0.8762 |
| No log | 2.92 | 146 | 0.6482 | 0.7164 | 0.6482 | 0.8051 |
| No log | 2.96 | 148 | 0.6089 | 0.7227 | 0.6089 | 0.7803 |
| No log | 3.0 | 150 | 0.7263 | 0.6548 | 0.7263 | 0.8522 |
| No log | 3.04 | 152 | 0.8609 | 0.5645 | 0.8609 | 0.9279 |
| No log | 3.08 | 154 | 0.7347 | 0.6686 | 0.7347 | 0.8571 |
| No log | 3.12 | 156 | 0.6637 | 0.6814 | 0.6637 | 0.8147 |
| No log | 3.16 | 158 | 0.6927 | 0.6758 | 0.6927 | 0.8323 |
| No log | 3.2 | 160 | 0.7841 | 0.5475 | 0.7841 | 0.8855 |
| No log | 3.24 | 162 | 0.7068 | 0.6686 | 0.7068 | 0.8407 |
| No log | 3.2800 | 164 | 0.7263 | 0.6548 | 0.7263 | 0.8522 |
| No log | 3.32 | 166 | 0.9497 | 0.5536 | 0.9497 | 0.9745 |
| No log | 3.36 | 168 | 1.0767 | 0.5276 | 1.0767 | 1.0376 |
| No log | 3.4 | 170 | 0.9387 | 0.5536 | 0.9387 | 0.9689 |
| No log | 3.44 | 172 | 0.8986 | 0.6055 | 0.8986 | 0.9479 |
| No log | 3.48 | 174 | 0.7807 | 0.7089 | 0.7807 | 0.8836 |
| No log | 3.52 | 176 | 0.6557 | 0.6228 | 0.6557 | 0.8097 |
| No log | 3.56 | 178 | 0.6231 | 0.6150 | 0.6231 | 0.7894 |
| No log | 3.6 | 180 | 0.6019 | 0.7036 | 0.6019 | 0.7758 |
| No log | 3.64 | 182 | 0.6548 | 0.6587 | 0.6548 | 0.8092 |
| No log | 3.68 | 184 | 0.7935 | 0.6271 | 0.7935 | 0.8908 |
| No log | 3.7200 | 186 | 0.7408 | 0.6195 | 0.7408 | 0.8607 |
| No log | 3.76 | 188 | 0.7020 | 0.6116 | 0.7020 | 0.8379 |
| No log | 3.8 | 190 | 0.6662 | 0.6653 | 0.6662 | 0.8162 |
| No log | 3.84 | 192 | 0.6191 | 0.7164 | 0.6191 | 0.7868 |
| No log | 3.88 | 194 | 0.6713 | 0.7089 | 0.6713 | 0.8194 |
| No log | 3.92 | 196 | 0.8613 | 0.5276 | 0.8613 | 0.9280 |
| No log | 3.96 | 198 | 1.0149 | 0.5382 | 1.0149 | 1.0074 |
| No log | 4.0 | 200 | 0.9587 | 0.5276 | 0.9587 | 0.9791 |
| No log | 4.04 | 202 | 0.8194 | 0.6410 | 0.8194 | 0.9052 |
| No log | 4.08 | 204 | 0.8184 | 0.628 | 0.8184 | 0.9046 |
| No log | 4.12 | 206 | 0.8315 | 0.628 | 0.8315 | 0.9119 |
| No log | 4.16 | 208 | 0.7777 | 0.6538 | 0.7777 | 0.8819 |
| No log | 4.2 | 210 | 0.7584 | 0.6829 | 0.7584 | 0.8708 |
| No log | 4.24 | 212 | 0.6917 | 0.7295 | 0.6917 | 0.8317 |
| No log | 4.28 | 214 | 0.6309 | 0.7008 | 0.6309 | 0.7943 |
| No log | 4.32 | 216 | 0.6109 | 0.6710 | 0.6109 | 0.7816 |
| No log | 4.36 | 218 | 0.6073 | 0.7123 | 0.6073 | 0.7793 |
| No log | 4.4 | 220 | 0.5952 | 0.7474 | 0.5952 | 0.7715 |
| No log | 4.44 | 222 | 0.5859 | 0.7474 | 0.5859 | 0.7654 |
| No log | 4.48 | 224 | 0.5857 | 0.7324 | 0.5857 | 0.7653 |
| No log | 4.52 | 226 | 0.5748 | 0.7210 | 0.5748 | 0.7581 |
| No log | 4.5600 | 228 | 0.5790 | 0.7210 | 0.5790 | 0.7609 |
| No log | 4.6 | 230 | 0.5892 | 0.7051 | 0.5892 | 0.7676 |
| No log | 4.64 | 232 | 0.6282 | 0.6840 | 0.6282 | 0.7926 |
| No log | 4.68 | 234 | 0.6040 | 0.6851 | 0.6040 | 0.7771 |
| No log | 4.72 | 236 | 0.5974 | 0.6851 | 0.5974 | 0.7729 |
| No log | 4.76 | 238 | 0.5714 | 0.6929 | 0.5714 | 0.7559 |
| No log | 4.8 | 240 | 0.5629 | 0.6579 | 0.5629 | 0.7503 |
| No log | 4.84 | 242 | 0.5739 | 0.6433 | 0.5739 | 0.7576 |
| No log | 4.88 | 244 | 0.5526 | 0.6480 | 0.5526 | 0.7434 |
| No log | 4.92 | 246 | 0.5793 | 0.6992 | 0.5793 | 0.7611 |
| No log | 4.96 | 248 | 0.6925 | 0.6431 | 0.6925 | 0.8322 |
| No log | 5.0 | 250 | 0.7460 | 0.65 | 0.7460 | 0.8637 |
| No log | 5.04 | 252 | 0.6890 | 0.6478 | 0.6890 | 0.8301 |
| No log | 5.08 | 254 | 0.6209 | 0.5982 | 0.6209 | 0.7880 |
| No log | 5.12 | 256 | 0.6214 | 0.5982 | 0.6214 | 0.7883 |
| No log | 5.16 | 258 | 0.6174 | 0.6468 | 0.6174 | 0.7857 |
| No log | 5.2 | 260 | 0.6088 | 0.6870 | 0.6088 | 0.7803 |
| No log | 5.24 | 262 | 0.6184 | 0.6875 | 0.6184 | 0.7864 |
| No log | 5.28 | 264 | 0.6172 | 0.6875 | 0.6172 | 0.7856 |
| No log | 5.32 | 266 | 0.6107 | 0.7258 | 0.6107 | 0.7815 |
| No log | 5.36 | 268 | 0.5841 | 0.6732 | 0.5841 | 0.7643 |
| No log | 5.4 | 270 | 0.5862 | 0.6616 | 0.5862 | 0.7656 |
| No log | 5.44 | 272 | 0.5902 | 0.6767 | 0.5902 | 0.7682 |
| No log | 5.48 | 274 | 0.6431 | 0.6261 | 0.6431 | 0.8019 |
| No log | 5.52 | 276 | 0.7174 | 0.6154 | 0.7174 | 0.8470 |
| No log | 5.5600 | 278 | 0.6994 | 0.6176 | 0.6994 | 0.8363 |
| No log | 5.6 | 280 | 0.6612 | 0.6471 | 0.6612 | 0.8132 |
| No log | 5.64 | 282 | 0.6587 | 0.6471 | 0.6587 | 0.8116 |
| No log | 5.68 | 284 | 0.6518 | 0.6471 | 0.6518 | 0.8074 |
| No log | 5.72 | 286 | 0.6918 | 0.6405 | 0.6918 | 0.8317 |
| No log | 5.76 | 288 | 0.7977 | 0.6229 | 0.7977 | 0.8931 |
| No log | 5.8 | 290 | 0.8901 | 0.5808 | 0.8901 | 0.9435 |
| No log | 5.84 | 292 | 0.8898 | 0.5808 | 0.8898 | 0.9433 |
| No log | 5.88 | 294 | 0.8565 | 0.5943 | 0.8565 | 0.9255 |
| No log | 5.92 | 296 | 0.7590 | 0.6473 | 0.7590 | 0.8712 |
| No log | 5.96 | 298 | 0.7121 | 0.6604 | 0.7121 | 0.8438 |
| No log | 6.0 | 300 | 0.6767 | 0.6821 | 0.6767 | 0.8226 |
| No log | 6.04 | 302 | 0.6990 | 0.6689 | 0.6990 | 0.8361 |
| No log | 6.08 | 304 | 0.7633 | 0.5986 | 0.7633 | 0.8736 |
| No log | 6.12 | 306 | 0.8722 | 0.5808 | 0.8722 | 0.9339 |
| No log | 6.16 | 308 | 0.8806 | 0.5808 | 0.8806 | 0.9384 |
| No log | 6.2 | 310 | 0.8080 | 0.6322 | 0.8080 | 0.8989 |
| No log | 6.24 | 312 | 0.6933 | 0.6751 | 0.6933 | 0.8327 |
| No log | 6.28 | 314 | 0.6190 | 0.6813 | 0.6190 | 0.7867 |
| No log | 6.32 | 316 | 0.6060 | 0.7117 | 0.6060 | 0.7784 |
| No log | 6.36 | 318 | 0.5976 | 0.6926 | 0.5976 | 0.7731 |
| No log | 6.4 | 320 | 0.6070 | 0.7195 | 0.6070 | 0.7791 |
| No log | 6.44 | 322 | 0.6634 | 0.7063 | 0.6634 | 0.8145 |
| No log | 6.48 | 324 | 0.7573 | 0.6634 | 0.7573 | 0.8702 |
| No log | 6.52 | 326 | 0.7539 | 0.65 | 0.7539 | 0.8683 |
| No log | 6.5600 | 328 | 0.7108 | 0.6768 | 0.7108 | 0.8431 |
| No log | 6.6 | 330 | 0.6769 | 0.6358 | 0.6769 | 0.8227 |
| No log | 6.64 | 332 | 0.6629 | 0.6358 | 0.6629 | 0.8142 |
| No log | 6.68 | 334 | 0.6982 | 0.6217 | 0.6982 | 0.8356 |
| No log | 6.72 | 336 | 0.7167 | 0.6217 | 0.7167 | 0.8466 |
| No log | 6.76 | 338 | 0.6958 | 0.6261 | 0.6958 | 0.8342 |
| No log | 6.8 | 340 | 0.6819 | 0.6405 | 0.6819 | 0.8258 |
| No log | 6.84 | 342 | 0.6588 | 0.6405 | 0.6588 | 0.8116 |
| No log | 6.88 | 344 | 0.6609 | 0.6405 | 0.6609 | 0.8130 |
| No log | 6.92 | 346 | 0.6955 | 0.6261 | 0.6955 | 0.8340 |
| No log | 6.96 | 348 | 0.7469 | 0.6053 | 0.7469 | 0.8643 |
| No log | 7.0 | 350 | 0.7467 | 0.5933 | 0.7467 | 0.8641 |
| No log | 7.04 | 352 | 0.7061 | 0.6261 | 0.7061 | 0.8403 |
| No log | 7.08 | 354 | 0.7138 | 0.5970 | 0.7138 | 0.8449 |
| No log | 7.12 | 356 | 0.7529 | 0.5970 | 0.7529 | 0.8677 |
| No log | 7.16 | 358 | 0.7571 | 0.5823 | 0.7571 | 0.8701 |
| No log | 7.2 | 360 | 0.7467 | 0.5823 | 0.7467 | 0.8641 |
| No log | 7.24 | 362 | 0.6954 | 0.6261 | 0.6954 | 0.8339 |
| No log | 7.28 | 364 | 0.6803 | 0.6261 | 0.6803 | 0.8248 |
| No log | 7.32 | 366 | 0.7005 | 0.6261 | 0.7005 | 0.8369 |
| No log | 7.36 | 368 | 0.7434 | 0.6014 | 0.7434 | 0.8622 |
| No log | 7.4 | 370 | 0.7748 | 0.6014 | 0.7748 | 0.8802 |
| No log | 7.44 | 372 | 0.7794 | 0.6014 | 0.7794 | 0.8828 |
| No log | 7.48 | 374 | 0.8323 | 0.6055 | 0.8323 | 0.9123 |
| No log | 7.52 | 376 | 0.8832 | 0.6055 | 0.8832 | 0.9398 |
| No log | 7.5600 | 378 | 0.8824 | 0.6055 | 0.8824 | 0.9393 |
| No log | 7.6 | 380 | 0.8168 | 0.5943 | 0.8168 | 0.9038 |
| No log | 7.64 | 382 | 0.7402 | 0.6293 | 0.7402 | 0.8604 |
| No log | 7.68 | 384 | 0.7071 | 0.6431 | 0.7071 | 0.8409 |
| No log | 7.72 | 386 | 0.7158 | 0.6431 | 0.7158 | 0.8460 |
| No log | 7.76 | 388 | 0.7231 | 0.6431 | 0.7231 | 0.8504 |
| No log | 7.8 | 390 | 0.7341 | 0.6431 | 0.7341 | 0.8568 |
| No log | 7.84 | 392 | 0.7243 | 0.6431 | 0.7243 | 0.8510 |
| No log | 7.88 | 394 | 0.7349 | 0.6293 | 0.7349 | 0.8573 |
| No log | 7.92 | 396 | 0.7463 | 0.6293 | 0.7463 | 0.8639 |
| No log | 7.96 | 398 | 0.7480 | 0.6293 | 0.7480 | 0.8649 |
| No log | 8.0 | 400 | 0.7290 | 0.6478 | 0.7290 | 0.8538 |
| No log | 8.04 | 402 | 0.6982 | 0.6478 | 0.6982 | 0.8356 |
| No log | 8.08 | 404 | 0.6923 | 0.6478 | 0.6923 | 0.8320 |
| No log | 8.12 | 406 | 0.6999 | 0.6478 | 0.6999 | 0.8366 |
| No log | 8.16 | 408 | 0.6788 | 0.6499 | 0.6788 | 0.8239 |
| No log | 8.2 | 410 | 0.6675 | 0.6379 | 0.6675 | 0.8170 |
| No log | 8.24 | 412 | 0.6456 | 0.6519 | 0.6456 | 0.8035 |
| No log | 8.28 | 414 | 0.6464 | 0.6519 | 0.6464 | 0.8040 |
| No log | 8.32 | 416 | 0.6444 | 0.6519 | 0.6444 | 0.8027 |
| No log | 8.36 | 418 | 0.6487 | 0.6519 | 0.6487 | 0.8054 |
| No log | 8.4 | 420 | 0.6559 | 0.6283 | 0.6559 | 0.8098 |
| No log | 8.44 | 422 | 0.6846 | 0.6261 | 0.6846 | 0.8274 |
| No log | 8.48 | 424 | 0.7055 | 0.6478 | 0.7055 | 0.8399 |
| No log | 8.52 | 426 | 0.7070 | 0.6478 | 0.7070 | 0.8409 |
| No log | 8.56 | 428 | 0.6968 | 0.6478 | 0.6968 | 0.8348 |
| No log | 8.6 | 430 | 0.6670 | 0.6283 | 0.6670 | 0.8167 |
| No log | 8.64 | 432 | 0.6433 | 0.6519 | 0.6433 | 0.8020 |
| No log | 8.68 | 434 | 0.6455 | 0.6519 | 0.6455 | 0.8035 |
| No log | 8.72 | 436 | 0.6651 | 0.6283 | 0.6651 | 0.8155 |
| No log | 8.76 | 438 | 0.6870 | 0.6261 | 0.6870 | 0.8288 |
| No log | 8.8 | 440 | 0.6996 | 0.6478 | 0.6996 | 0.8364 |
| No log | 8.84 | 442 | 0.6999 | 0.6261 | 0.6999 | 0.8366 |
| No log | 8.88 | 444 | 0.7098 | 0.6261 | 0.7098 | 0.8425 |
| No log | 8.92 | 446 | 0.7263 | 0.6431 | 0.7263 | 0.8522 |
| No log | 8.96 | 448 | 0.7246 | 0.6478 | 0.7246 | 0.8512 |
| No log | 9.0 | 450 | 0.7314 | 0.6431 | 0.7314 | 0.8552 |
| No log | 9.04 | 452 | 0.7268 | 0.6261 | 0.7268 | 0.8525 |
| No log | 9.08 | 454 | 0.7244 | 0.6261 | 0.7244 | 0.8511 |
| No log | 9.12 | 456 | 0.7135 | 0.6261 | 0.7135 | 0.8447 |
| No log | 9.16 | 458 | 0.7088 | 0.6261 | 0.7088 | 0.8419 |
| No log | 9.2 | 460 | 0.7176 | 0.6261 | 0.7176 | 0.8471 |
| No log | 9.24 | 462 | 0.7237 | 0.6217 | 0.7237 | 0.8507 |
| No log | 9.28 | 464 | 0.7308 | 0.6431 | 0.7308 | 0.8549 |
| No log | 9.32 | 466 | 0.7368 | 0.6293 | 0.7368 | 0.8584 |
| No log | 9.36 | 468 | 0.7496 | 0.6293 | 0.7496 | 0.8658 |
| No log | 9.4 | 470 | 0.7583 | 0.6154 | 0.7583 | 0.8708 |
| No log | 9.44 | 472 | 0.7712 | 0.6154 | 0.7712 | 0.8782 |
| No log | 9.48 | 474 | 0.7728 | 0.6154 | 0.7728 | 0.8791 |
| No log | 9.52 | 476 | 0.7697 | 0.6154 | 0.7697 | 0.8773 |
| No log | 9.56 | 478 | 0.7638 | 0.6154 | 0.7638 | 0.8740 |
| No log | 9.6 | 480 | 0.7514 | 0.6293 | 0.7514 | 0.8668 |
| No log | 9.64 | 482 | 0.7414 | 0.6431 | 0.7414 | 0.8611 |
| No log | 9.68 | 484 | 0.7309 | 0.6431 | 0.7309 | 0.8549 |
| No log | 9.72 | 486 | 0.7235 | 0.6431 | 0.7235 | 0.8506 |
| No log | 9.76 | 488 | 0.7169 | 0.6431 | 0.7169 | 0.8467 |
| No log | 9.8 | 490 | 0.7171 | 0.6431 | 0.7171 | 0.8468 |
| No log | 9.84 | 492 | 0.7207 | 0.6431 | 0.7207 | 0.8489 |
| No log | 9.88 | 494 | 0.7221 | 0.6431 | 0.7221 | 0.8497 |
| No log | 9.92 | 496 | 0.7242 | 0.6431 | 0.7242 | 0.8510 |
| No log | 9.96 | 498 | 0.7262 | 0.6431 | 0.7262 | 0.8522 |
| 0.353 | 10.0 | 500 | 0.7273 | 0.6431 | 0.7273 | 0.8528 |
### Framework versions
- Transformers 4.44.2
- Pytorch 2.4.0+cu118
- Datasets 2.21.0
- Tokenizers 0.19.1
|
dada22231/e93b15dc-a886-41d4-82be-188d20cae77e
|
dada22231
| 2024-12-04T16:21:34Z | 5 | 0 |
peft
|
[
"peft",
"safetensors",
"qwen2",
"axolotl",
"generated_from_trainer",
"base_model:unsloth/Qwen2.5-3B",
"base_model:adapter:unsloth/Qwen2.5-3B",
"license:other",
"region:us"
] | null | 2024-12-04T16:04:54Z |
---
library_name: peft
license: other
base_model: unsloth/Qwen2.5-3B
tags:
- axolotl
- generated_from_trainer
model-index:
- name: e93b15dc-a886-41d4-82be-188d20cae77e
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
[<img src="https://raw.githubusercontent.com/axolotl-ai-cloud/axolotl/main/image/axolotl-badge-web.png" alt="Built with Axolotl" width="200" height="32"/>](https://github.com/axolotl-ai-cloud/axolotl)
<details><summary>See axolotl config</summary>
axolotl version: `0.4.1`
```yaml
adapter: lora
base_model: unsloth/Qwen2.5-3B
bf16: auto
chat_template: llama3
cosine_min_lr_ratio: 0.1
data_processes: 16
dataset_prepared_path: null
datasets:
- data_files:
- acef9caccbb2bf36_train_data.json
ds_type: json
format: custom
path: /workspace/input_data/acef9caccbb2bf36_train_data.json
type:
field_input: seq
field_instruction: id
field_output: labels_str
format: '{instruction} {input}'
no_input_format: '{instruction}'
system_format: '{system}'
system_prompt: ''
debug: null
deepspeed: null
device_map: auto
do_eval: true
early_stopping_patience: 1
eval_batch_size: 1
eval_sample_packing: false
eval_steps: 25
evaluation_strategy: steps
flash_attention: false
fp16: null
fsdp: null
fsdp_config: null
gradient_accumulation_steps: 32
gradient_checkpointing: true
group_by_length: true
hub_model_id: dada22231/e93b15dc-a886-41d4-82be-188d20cae77e
hub_strategy: checkpoint
hub_token: null
learning_rate: 0.0001
load_in_4bit: false
load_in_8bit: false
local_rank: null
logging_steps: 1
lora_alpha: 64
lora_dropout: 0.05
lora_fan_in_fan_out: null
lora_model_dir: null
lora_r: 32
lora_target_linear: true
lora_target_modules:
- q_proj
- v_proj
lr_scheduler: cosine
max_grad_norm: 1.0
max_memory:
0: 70GiB
1: 70GiB
2: 70GiB
3: 70GiB
max_steps: 25
micro_batch_size: 1
mlflow_experiment_name: /tmp/acef9caccbb2bf36_train_data.json
model_type: AutoModelForCausalLM
num_epochs: 3
optim_args:
adam_beta1: 0.9
adam_beta2: 0.95
adam_epsilon: 1e-5
optimizer: adamw_torch
output_dir: miner_id_24
pad_to_sequence_len: true
resume_from_checkpoint: null
s2_attention: null
sample_packing: false
save_steps: 25
save_strategy: steps
sequence_len: 2048
strict: false
tf32: false
tokenizer_type: AutoTokenizer
torch_compile: false
train_on_inputs: false
trust_remote_code: true
val_set_size: 50
wandb_entity: null
wandb_mode: online
wandb_name: e93b15dc-a886-41d4-82be-188d20cae77e
wandb_project: Public_TuningSN
wandb_runid: e93b15dc-a886-41d4-82be-188d20cae77e
warmup_ratio: 0.04
weight_decay: 0.01
xformers_attention: null
```
</details><br>
# e93b15dc-a886-41d4-82be-188d20cae77e
This model is a fine-tuned version of [unsloth/Qwen2.5-3B](https://huggingface.co/unsloth/Qwen2.5-3B) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: nan
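Since this repository contains a LoRA adapter rather than full model weights, a minimal loading sketch with `peft` might look as follows (not part of the original card; the dtype and device settings are assumptions):
```python
# Sketch: attach the LoRA adapter from this repo to the unsloth/Qwen2.5-3B base model.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base = AutoModelForCausalLM.from_pretrained(
    "unsloth/Qwen2.5-3B", torch_dtype="auto", device_map="auto"
)
model = PeftModel.from_pretrained(base, "dada22231/e93b15dc-a886-41d4-82be-188d20cae77e")
tokenizer = AutoTokenizer.from_pretrained("unsloth/Qwen2.5-3B")
```
Note that the reported validation loss is `nan`, so the adapter may not produce useful outputs without further training.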
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- distributed_type: multi-GPU
- num_devices: 4
- gradient_accumulation_steps: 32
- total_train_batch_size: 128
- total_eval_batch_size: 4
- optimizer: Use OptimizerNames.ADAMW_TORCH with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=adam_beta1=0.9,adam_beta2=0.95,adam_epsilon=1e-5
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 2
- training_steps: 25
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| 5791798.0 | 0.0002 | 1 | nan |
| 0.0 | 0.0061 | 25 | nan |
### Framework versions
- PEFT 0.13.2
- Transformers 4.46.0
- Pytorch 2.5.0+cu124
- Datasets 3.0.1
- Tokenizers 0.20.1
|
mradermacher/Hebrew-Gemma-11B-V2-GGUF
|
mradermacher
| 2024-12-04T16:19:49Z | 22 | 0 |
transformers
|
[
"transformers",
"gguf",
"en",
"he",
"base_model:yam-peleg/Hebrew-Gemma-11B-V2",
"base_model:quantized:yam-peleg/Hebrew-Gemma-11B-V2",
"license:other",
"endpoints_compatible",
"region:us",
"conversational"
] | null | 2024-12-03T22:50:03Z |
---
base_model: yam-peleg/Hebrew-Gemma-11B-V2
language:
- en
- he
library_name: transformers
license: other
license_link: https://ai.google.dev/gemma/terms
license_name: gemma-terms-of-use
quantized_by: mradermacher
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: nicoboss -->
static quants of https://huggingface.co/yam-peleg/Hebrew-Gemma-11B-V2
<!-- provided-files -->
weighted/imatrix quants are available at https://huggingface.co/mradermacher/Hebrew-Gemma-11B-V2-i1-GGUF
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including on how to concatenate multi-part files.
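As a concrete starting point, a single-file quant from the table below can be loaded with `llama-cpp-python` (a sketch, assuming the GGUF file has already been downloaded; the context size is an arbitrary choice):
```python
# Sketch: run one of the GGUF quants locally with llama-cpp-python.
from llama_cpp import Llama

llm = Llama(model_path="Hebrew-Gemma-11B-V2.Q4_K_M.gguf", n_ctx=4096)
out = llm("Write one sentence about Tel Aviv.", max_tokens=64)
print(out["choices"][0]["text"])
```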
## Provided Quants
(sorted by size, not necessarily quality. IQ-quants are often preferable over similar sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/Hebrew-Gemma-11B-V2-GGUF/resolve/main/Hebrew-Gemma-11B-V2.Q2_K.gguf) | Q2_K | 4.3 | |
| [GGUF](https://huggingface.co/mradermacher/Hebrew-Gemma-11B-V2-GGUF/resolve/main/Hebrew-Gemma-11B-V2.Q3_K_S.gguf) | Q3_K_S | 4.9 | |
| [GGUF](https://huggingface.co/mradermacher/Hebrew-Gemma-11B-V2-GGUF/resolve/main/Hebrew-Gemma-11B-V2.Q3_K_M.gguf) | Q3_K_M | 5.4 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/Hebrew-Gemma-11B-V2-GGUF/resolve/main/Hebrew-Gemma-11B-V2.Q3_K_L.gguf) | Q3_K_L | 5.8 | |
| [GGUF](https://huggingface.co/mradermacher/Hebrew-Gemma-11B-V2-GGUF/resolve/main/Hebrew-Gemma-11B-V2.IQ4_XS.gguf) | IQ4_XS | 5.9 | |
| [GGUF](https://huggingface.co/mradermacher/Hebrew-Gemma-11B-V2-GGUF/resolve/main/Hebrew-Gemma-11B-V2.Q4_0_4_4.gguf) | Q4_0_4_4 | 6.2 | fast on arm, low quality |
| [GGUF](https://huggingface.co/mradermacher/Hebrew-Gemma-11B-V2-GGUF/resolve/main/Hebrew-Gemma-11B-V2.Q4_K_S.gguf) | Q4_K_S | 6.2 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/Hebrew-Gemma-11B-V2-GGUF/resolve/main/Hebrew-Gemma-11B-V2.Q4_K_M.gguf) | Q4_K_M | 6.6 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/Hebrew-Gemma-11B-V2-GGUF/resolve/main/Hebrew-Gemma-11B-V2.Q5_K_S.gguf) | Q5_K_S | 7.4 | |
| [GGUF](https://huggingface.co/mradermacher/Hebrew-Gemma-11B-V2-GGUF/resolve/main/Hebrew-Gemma-11B-V2.Q5_K_M.gguf) | Q5_K_M | 7.6 | |
| [GGUF](https://huggingface.co/mradermacher/Hebrew-Gemma-11B-V2-GGUF/resolve/main/Hebrew-Gemma-11B-V2.Q6_K.gguf) | Q6_K | 8.7 | very good quality |
| [GGUF](https://huggingface.co/mradermacher/Hebrew-Gemma-11B-V2-GGUF/resolve/main/Hebrew-Gemma-11B-V2.Q8_0.gguf) | Q8_0 | 11.2 | fast, best quality |
| [GGUF](https://huggingface.co/mradermacher/Hebrew-Gemma-11B-V2-GGUF/resolve/main/Hebrew-Gemma-11B-V2.f16.gguf) | f16 | 21.1 | 16 bpw, overkill |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.
<!-- end -->
|
OpenBuddy/openbuddy-qwen2.5coder-32b-v24.1q-200k
|
OpenBuddy
| 2024-12-04T16:09:04Z | 6 | 2 | null |
[
"safetensors",
"qwen2",
"qwen2.5",
"text-generation",
"conversational",
"zh",
"en",
"fr",
"de",
"ja",
"ko",
"it",
"fi",
"base_model:Qwen/Qwen2.5-Coder-32B-Instruct",
"base_model:finetune:Qwen/Qwen2.5-Coder-32B-Instruct",
"license:apache-2.0",
"region:us"
] |
text-generation
| 2024-12-04T12:28:29Z |
---
language:
- zh
- en
- fr
- de
- ja
- ko
- it
- fi
license: apache-2.0
tags:
- qwen2.5
pipeline_tag: text-generation
base_model: Qwen/Qwen2.5-Coder-32B-Instruct
---
# ⚛️ Q Model: Optimized for Enhanced Quantized Inference Capability
This model has been specially optimized to improve the performance of quantized inference and is recommended for use in 3 to 8-bit quantization scenarios.
Quantized version: https://huggingface.co/OpenBuddy/openbuddy-qwen2.5coder-32b-v24.1q-200k-gguf
# OpenBuddy - Open Multilingual Chatbot
GitHub and Usage Guide: [https://github.com/OpenBuddy/OpenBuddy](https://github.com/OpenBuddy/OpenBuddy)
Website and Demo: [https://openbuddy.ai](https://openbuddy.ai)
Evaluation result of this model: [Evaluation.txt](Evaluation.txt)

# Copyright Notice
Base Model: Qwen2.5-Coder-32B-Instruct
License: Apache 2.0
# Prompt Format
We recommend using the fast tokenizer from `transformers`, which should be enabled by default in the `transformers` and `vllm` libraries. Other implementations including `sentencepiece` may not work as expected, especially for special tokens like `<|role|>`, `<|says|>` and `<|end|>`.
```
<|role|>system<|says|>You(assistant) are a helpful, respectful and honest INTP-T AI Assistant named Buddy. You are talking to a human(user).
Always answer as helpfully and logically as possible, while being safe. Your answers should not include any harmful, political, religious, unethical, racist, sexist, toxic, dangerous, or illegal content. Please ensure that your responses are socially unbiased and positive in nature.
You cannot access the internet, but you have vast knowledge, cutoff: 2023-04.
You are trained by OpenBuddy team, (https://openbuddy.ai, https://github.com/OpenBuddy/OpenBuddy), not related to GPT or OpenAI.<|end|>
<|role|>user<|says|>History input 1<|end|>
<|role|>assistant<|says|>History output 1<|end|>
<|role|>user<|says|>History input 2<|end|>
<|role|>assistant<|says|>History output 2<|end|>
<|role|>user<|says|>Current input<|end|>
<|role|>assistant<|says|>
```
This format is also defined in `tokenizer_config.json`, which means you can directly use `vllm` to deploy an OpenAI-like API service. For more information, please refer to the [vllm documentation](https://docs.vllm.ai/en/latest/serving/openai_compatible_server.html).
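As a sketch of what that means in practice with `transformers` (the chat template shipped in `tokenizer_config.json` should render the format above; the message content here is only an example):
```python
# Sketch: render the OpenBuddy prompt format via the tokenizer's chat template.
from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained("OpenBuddy/openbuddy-qwen2.5coder-32b-v24.1q-200k")
messages = [
    {"role": "user", "content": "Write a hello-world program in Python."},
]
prompt = tok.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
print(prompt)  # should end with "<|role|>assistant<|says|>"
```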
## Disclaimer
All OpenBuddy models have inherent limitations and may potentially produce outputs that are erroneous, harmful, offensive, or otherwise undesirable. Users should not use these models in critical or high-stakes situations that may lead to personal injury, property damage, or significant losses. Examples of such scenarios include, but are not limited to, the medical field, controlling software and hardware systems that may cause harm, and making important financial or legal decisions.
OpenBuddy is provided "as-is" without any warranty of any kind, either express or implied, including, but not limited to, the implied warranties of merchantability, fitness for a particular purpose, and non-infringement. In no event shall the authors, contributors, or copyright holders be liable for any claim, damages, or other liabilities, whether in an action of contract, tort, or otherwise, arising from, out of, or in connection with the software or the use or other dealings in the software.
By using OpenBuddy, you agree to these terms and conditions, and acknowledge that you understand the potential risks associated with its use. You also agree to indemnify and hold harmless the authors, contributors, and copyright holders from any claims, damages, or liabilities arising from your use of OpenBuddy.
## Disclaimer
All OpenBuddy models have inherent limitations and may produce erroneous, harmful, offensive, or otherwise undesirable outputs. Users should exercise caution and avoid using these models in critical or high-risk scenarios, so as to prevent personal injury, property damage, or major losses. Examples of such scenarios include, but are not limited to, the medical field, the control of software and hardware systems that may cause harm, and important financial or legal decision-making.
OpenBuddy is provided "as is", without warranty of any kind, express or implied, including but not limited to the implied warranties of merchantability, fitness for a particular purpose, and non-infringement. In no event shall the authors, contributors, or copyright holders be liable for any claim, damages, or other liability, whether in an action of contract, tort, or otherwise, arising from, out of, or in connection with the software or the use of or other dealings in the software.
By using OpenBuddy, you agree to these terms and conditions and acknowledge that you understand the potential risks associated with its use. You also agree to indemnify and hold harmless the authors, contributors, and copyright holders from any claims, damages, or liabilities arising from your use of OpenBuddy.
|
MayBashendy/ArabicNewSplits3_FineTuningAraBERT_run3_AugV5_k2_task1_organization
|
MayBashendy
| 2024-12-04T16:03:48Z | 163 | 0 |
transformers
|
[
"transformers",
"safetensors",
"bert",
"text-classification",
"generated_from_trainer",
"base_model:aubmindlab/bert-base-arabertv02",
"base_model:finetune:aubmindlab/bert-base-arabertv02",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-classification
| 2024-12-04T16:01:19Z |
---
library_name: transformers
base_model: aubmindlab/bert-base-arabertv02
tags:
- generated_from_trainer
model-index:
- name: ArabicNewSplits3_FineTuningAraBERT_run3_AugV5_k2_task1_organization
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# ArabicNewSplits3_FineTuningAraBERT_run3_AugV5_k2_task1_organization
This model is a fine-tuned version of [aubmindlab/bert-base-arabertv02](https://huggingface.co/aubmindlab/bert-base-arabertv02) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 1.0411
- Qwk: 0.4933
- Mse: 1.0411
- Rmse: 1.0203
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|:-------------:|:------:|:----:|:---------------:|:-------:|:------:|:------:|
| No log | 0.1538 | 2 | 4.8755 | -0.0722 | 4.8755 | 2.2081 |
| No log | 0.3077 | 4 | 3.0731 | 0.0738 | 3.0731 | 1.7530 |
| No log | 0.4615 | 6 | 1.7811 | 0.1254 | 1.7811 | 1.3346 |
| No log | 0.6154 | 8 | 1.3472 | 0.1296 | 1.3472 | 1.1607 |
| No log | 0.7692 | 10 | 1.2827 | 0.0701 | 1.2827 | 1.1326 |
| No log | 0.9231 | 12 | 1.2158 | 0.0899 | 1.2158 | 1.1026 |
| No log | 1.0769 | 14 | 1.0297 | 0.3445 | 1.0297 | 1.0147 |
| No log | 1.2308 | 16 | 1.1103 | 0.2736 | 1.1103 | 1.0537 |
| No log | 1.3846 | 18 | 1.1941 | 0.1648 | 1.1941 | 1.0927 |
| No log | 1.5385 | 20 | 1.1982 | 0.2471 | 1.1982 | 1.0946 |
| No log | 1.6923 | 22 | 1.1769 | 0.3050 | 1.1769 | 1.0848 |
| No log | 1.8462 | 24 | 1.1307 | 0.3145 | 1.1307 | 1.0633 |
| No log | 2.0 | 26 | 1.1225 | 0.3198 | 1.1225 | 1.0595 |
| No log | 2.1538 | 28 | 1.2005 | 0.1301 | 1.2005 | 1.0957 |
| No log | 2.3077 | 30 | 1.2531 | 0.1503 | 1.2531 | 1.1194 |
| No log | 2.4615 | 32 | 1.1448 | 0.2230 | 1.1448 | 1.0700 |
| No log | 2.6154 | 34 | 1.0100 | 0.3793 | 1.0100 | 1.0050 |
| No log | 2.7692 | 36 | 0.9753 | 0.3571 | 0.9753 | 0.9876 |
| No log | 2.9231 | 38 | 0.9982 | 0.2771 | 0.9982 | 0.9991 |
| No log | 3.0769 | 40 | 0.9904 | 0.3423 | 0.9904 | 0.9952 |
| No log | 3.2308 | 42 | 1.0048 | 0.3717 | 1.0048 | 1.0024 |
| No log | 3.3846 | 44 | 1.0774 | 0.2528 | 1.0774 | 1.0380 |
| No log | 3.5385 | 46 | 1.3022 | 0.2826 | 1.3022 | 1.1411 |
| No log | 3.6923 | 48 | 1.4397 | 0.1888 | 1.4397 | 1.1999 |
| No log | 3.8462 | 50 | 1.2720 | 0.4219 | 1.2720 | 1.1278 |
| No log | 4.0 | 52 | 1.1023 | 0.2721 | 1.1023 | 1.0499 |
| No log | 4.1538 | 54 | 1.0140 | 0.5071 | 1.0140 | 1.0070 |
| No log | 4.3077 | 56 | 1.0495 | 0.4357 | 1.0495 | 1.0244 |
| No log | 4.4615 | 58 | 1.0388 | 0.4068 | 1.0388 | 1.0192 |
| No log | 4.6154 | 60 | 1.0341 | 0.4583 | 1.0341 | 1.0169 |
| No log | 4.7692 | 62 | 1.0435 | 0.4559 | 1.0435 | 1.0215 |
| No log | 4.9231 | 64 | 1.0465 | 0.4393 | 1.0465 | 1.0230 |
| No log | 5.0769 | 66 | 1.0186 | 0.5103 | 1.0186 | 1.0093 |
| No log | 5.2308 | 68 | 0.9823 | 0.5777 | 0.9823 | 0.9911 |
| No log | 5.3846 | 70 | 0.9632 | 0.5745 | 0.9632 | 0.9814 |
| No log | 5.5385 | 72 | 0.9583 | 0.5745 | 0.9583 | 0.9789 |
| No log | 5.6923 | 74 | 0.9579 | 0.5629 | 0.9579 | 0.9787 |
| No log | 5.8462 | 76 | 0.9809 | 0.5745 | 0.9809 | 0.9904 |
| No log | 6.0 | 78 | 1.0332 | 0.4669 | 1.0332 | 1.0165 |
| No log | 6.1538 | 80 | 1.0979 | 0.4391 | 1.0979 | 1.0478 |
| No log | 6.3077 | 82 | 1.1242 | 0.4253 | 1.1242 | 1.0603 |
| No log | 6.4615 | 84 | 1.0957 | 0.4391 | 1.0957 | 1.0468 |
| No log | 6.6154 | 86 | 1.0546 | 0.4945 | 1.0546 | 1.0270 |
| No log | 6.7692 | 88 | 1.0216 | 0.5462 | 1.0216 | 1.0108 |
| No log | 6.9231 | 90 | 1.0302 | 0.5601 | 1.0302 | 1.0150 |
| No log | 7.0769 | 92 | 1.0255 | 0.5601 | 1.0255 | 1.0127 |
| No log | 7.2308 | 94 | 1.0372 | 0.5082 | 1.0372 | 1.0184 |
| No log | 7.3846 | 96 | 1.0460 | 0.5067 | 1.0461 | 1.0228 |
| No log | 7.5385 | 98 | 1.0610 | 0.5067 | 1.0610 | 1.0301 |
| No log | 7.6923 | 100 | 1.0750 | 0.4933 | 1.0750 | 1.0368 |
| No log | 7.8462 | 102 | 1.0843 | 0.4933 | 1.0843 | 1.0413 |
| No log | 8.0 | 104 | 1.0914 | 0.4933 | 1.0914 | 1.0447 |
| No log | 8.1538 | 106 | 1.0905 | 0.4933 | 1.0905 | 1.0443 |
| No log | 8.3077 | 108 | 1.0880 | 0.4933 | 1.0880 | 1.0431 |
| No log | 8.4615 | 110 | 1.0890 | 0.4933 | 1.0890 | 1.0435 |
| No log | 8.6154 | 112 | 1.0865 | 0.4933 | 1.0865 | 1.0424 |
| No log | 8.7692 | 114 | 1.0780 | 0.4933 | 1.0780 | 1.0383 |
| No log | 8.9231 | 116 | 1.0789 | 0.4933 | 1.0789 | 1.0387 |
| No log | 9.0769 | 118 | 1.0787 | 0.4798 | 1.0787 | 1.0386 |
| No log | 9.2308 | 120 | 1.0701 | 0.4798 | 1.0701 | 1.0344 |
| No log | 9.3846 | 122 | 1.0583 | 0.4798 | 1.0583 | 1.0287 |
| No log | 9.5385 | 124 | 1.0499 | 0.4933 | 1.0499 | 1.0247 |
| No log | 9.6923 | 126 | 1.0453 | 0.4933 | 1.0453 | 1.0224 |
| No log | 9.8462 | 128 | 1.0424 | 0.4933 | 1.0424 | 1.0210 |
| No log | 10.0 | 130 | 1.0411 | 0.4933 | 1.0411 | 1.0203 |
### Framework versions
- Transformers 4.44.2
- Pytorch 2.4.0+cu118
- Datasets 2.21.0
- Tokenizers 0.19.1
|
CogACT/CogACT-Base
|
CogACT
| 2024-12-04T16:02:40Z | 3,828 | 6 |
transformers
|
[
"transformers",
"robotics",
"vla",
"diffusion",
"multimodal",
"pretraining",
"en",
"arxiv:2411.19650",
"license:mit",
"endpoints_compatible",
"region:us"
] |
robotics
| 2024-11-29T03:42:05Z |
---
license: mit
library_name: transformers
tags:
- robotics
- vla
- diffusion
- multimodal
- pretraining
language:
- en
pipeline_tag: robotics
---
# CogACT-Base
CogACT is a new, advanced VLA architecture derived from a VLM. Unlike previous works that directly repurpose a VLM for action prediction through simple action quantization, we propose a componentized VLA architecture with a specialized action module conditioned on the VLM's output. CogACT-Base employs a [DiT-Base](https://github.com/facebookresearch/DiT) model as the action module.
All of our [code](https://github.com/microsoft/CogACT) and [pre-trained model weights](https://huggingface.co/CogACT) are licensed under the MIT license.
Please refer to our [project page](https://cogact.github.io/) and [paper](https://arxiv.org/abs/2411.19650) for more details.
## Model Summary
- **Developed by:** The CogACT team, consisting of researchers from [Microsoft Research Asia](https://www.microsoft.com/en-us/research/lab/microsoft-research-asia/).
- **Model type:** Vision-Language-Action (language, image => robot actions)
- **Language(s) (NLP):** en
- **License:** MIT
- **Model components:**
+ **Vision Backbone**: DINOv2 ViT-L/14 and SigLIP ViT-So400M/14
+ **Language Model**: Llama-2
+ **Action Model**: DiT-Base
- **Pretraining Dataset:** A subset of [Open X-Embodiment](https://robotics-transformer-x.github.io/)
- **Repository:** [https://github.com/microsoft/CogACT](https://github.com/microsoft/CogACT)
- **Paper:** [CogACT: A Foundational Vision-Language-Action Model for Synergizing Cognition and Action in Robotic Manipulation](https://arxiv.org/abs/2411.19650)
- **Project Page:** [https://cogact.github.io/](https://cogact.github.io/)
## Uses
CogACT takes a language instruction and a single-view RGB image as input and predicts the next 16 normalized robot actions (consisting of the 7-DoF end-effector deltas of the form ``x, y, z, roll, pitch, yaw, gripper``). These actions should be unnormalized and, optionally, integrated by our ``Adaptive Action Ensemble``. Both the unnormalization and the ensemble depend on the dataset statistics.
CogACT models can be used zero-shot to control robots for setups seen in the [Open-X](https://robotics-transformer-x.github.io/) pretraining mixture. They can also be fine-tuned for new tasks and robot setups with an extremely small number of demonstrations. See [our repository](https://github.com/microsoft/CogACT) for more information.
Here is a simple example for inference.
```python
# Please clone and install dependencies in our repo
# Install minimal dependencies (`torch`, `transformers`, `timm`, `tokenizers`, ...)
from PIL import Image
from vla import load_vla
import torch
model = load_vla(
'CogACT/CogACT-Base',
load_for_training=False,
action_model_type='DiT-B',
future_action_window_size=15,
)
# about 30G Memory in fp32;
# (Optional) use "model.vlm = model.vlm.to(torch.bfloat16)" to load vlm in bf16
model.to('cuda:0').eval()
image: Image.Image = <input_your_image>
prompt = "move sponge near apple" # input your prompt
# Predict Action (7-DoF; un-normalize for RT-1 google robot data, i.e. fractal20220817_data)
actions, _ = model.predict_action(
image,
prompt,
unnorm_key='fractal20220817_data', # input your unnorm_key of dataset
cfg_scale = 1.5, # cfg from 1.5 to 7 also performs well
use_ddim = True, # use DDIM sampling
num_ddim_steps = 10, # number of steps for DDIM sampling
)
# results in 7-DoF actions of 16 steps with shape [16, 7]
```
## Citation
```bibtex
@article{li2024cogact,
title={CogACT: A Foundational Vision-Language-Action Model for Synergizing Cognition and Action in Robotic Manipulation},
author={Li, Qixiu and Liang, Yaobo and Wang, Zeyu and Luo, Lin and Chen, Xi and Liao, Mozheng and Wei, Fangyun and Deng, Yu and Xu, Sicheng and Zhang, Yizhong and others},
journal={arXiv preprint arXiv:2411.19650},
year={2024}
}
```
|
gokulsrinivasagan/bert_tiny_lda_20_v1_mnli
|
gokulsrinivasagan
| 2024-12-04T16:02:24Z | 122 | 0 |
transformers
|
[
"transformers",
"tensorboard",
"safetensors",
"distilbert",
"text-classification",
"generated_from_trainer",
"en",
"dataset:glue",
"base_model:gokulsrinivasagan/bert_tiny_lda_20_v1",
"base_model:finetune:gokulsrinivasagan/bert_tiny_lda_20_v1",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-classification
| 2024-11-26T22:05:18Z |
---
library_name: transformers
language:
- en
base_model: gokulsrinivasagan/bert_tiny_lda_20_v1
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- accuracy
model-index:
- name: bert_tiny_lda_20_v1_mnli
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: GLUE MNLI
type: glue
args: mnli
metrics:
- name: Accuracy
type: accuracy
value: 0.6954841334418226
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert_tiny_lda_20_v1_mnli
This model is a fine-tuned version of [gokulsrinivasagan/bert_tiny_lda_20_v1](https://huggingface.co/gokulsrinivasagan/bert_tiny_lda_20_v1) on the GLUE MNLI dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7126
- Accuracy: 0.6955
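A minimal inference sketch (not part of the original card): MNLI takes a premise/hypothesis pair, and the mapping from logit indices to entailment labels should be read from `model.config.id2label` rather than assumed:
```python
# Sketch: score a premise/hypothesis pair with this MNLI classifier.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

name = "gokulsrinivasagan/bert_tiny_lda_20_v1_mnli"
tok = AutoTokenizer.from_pretrained(name)
model = AutoModelForSequenceClassification.from_pretrained(name)

inputs = tok("A man is playing a guitar.", "A person plays an instrument.", return_tensors="pt")
with torch.no_grad():
    probs = model(**inputs).logits.softmax(-1)
print({model.config.id2label[i]: p.item() for i, p in enumerate(probs[0])})
```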
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 256
- eval_batch_size: 256
- seed: 10
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 50
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|
| 0.9696 | 1.0 | 1534 | 0.8635 | 0.6102 |
| 0.8307 | 2.0 | 3068 | 0.7849 | 0.6501 |
| 0.7523 | 3.0 | 4602 | 0.7467 | 0.6728 |
| 0.6962 | 4.0 | 6136 | 0.7247 | 0.6862 |
| 0.6472 | 5.0 | 7670 | 0.7248 | 0.6957 |
| 0.6032 | 6.0 | 9204 | 0.7455 | 0.6984 |
| 0.5606 | 7.0 | 10738 | 0.7510 | 0.6987 |
| 0.5204 | 8.0 | 12272 | 0.7849 | 0.6915 |
| 0.4808 | 9.0 | 13806 | 0.8428 | 0.6963 |
### Framework versions
- Transformers 4.46.3
- Pytorch 2.2.1+cu118
- Datasets 2.17.0
- Tokenizers 0.20.3
|
MayBashendy/ArabicNewSplits3_FineTuningAraBERT_run3_AugV5_k1_task1_organization
|
MayBashendy
| 2024-12-04T16:00:52Z | 266 | 0 |
transformers
|
[
"transformers",
"safetensors",
"bert",
"text-classification",
"generated_from_trainer",
"base_model:aubmindlab/bert-base-arabertv02",
"base_model:finetune:aubmindlab/bert-base-arabertv02",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-classification
| 2024-12-04T15:58:29Z |
---
library_name: transformers
base_model: aubmindlab/bert-base-arabertv02
tags:
- generated_from_trainer
model-index:
- name: ArabicNewSplits3_FineTuningAraBERT_run3_AugV5_k1_task1_organization
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# ArabicNewSplits3_FineTuningAraBERT_run3_AugV5_k1_task1_organization
This model is a fine-tuned version of [aubmindlab/bert-base-arabertv02](https://huggingface.co/aubmindlab/bert-base-arabertv02) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7176
- Qwk: 0.6693
- Mse: 0.7176
- Rmse: 0.8471
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|:-------------:|:------:|:----:|:---------------:|:-------:|:------:|:------:|
| No log | 0.2222 | 2 | 4.7780 | -0.0582 | 4.7780 | 2.1859 |
| No log | 0.4444 | 4 | 3.1128 | 0.1066 | 3.1128 | 1.7643 |
| No log | 0.6667 | 6 | 2.0032 | 0.1713 | 2.0032 | 1.4153 |
| No log | 0.8889 | 8 | 1.2087 | 0.2215 | 1.2087 | 1.0994 |
| No log | 1.1111 | 10 | 1.0608 | 0.2530 | 1.0608 | 1.0299 |
| No log | 1.3333 | 12 | 1.0632 | 0.3827 | 1.0632 | 1.0311 |
| No log | 1.5556 | 14 | 1.2407 | 0.3485 | 1.2407 | 1.1139 |
| No log | 1.7778 | 16 | 1.0530 | 0.3684 | 1.0530 | 1.0262 |
| No log | 2.0 | 18 | 0.9129 | 0.4613 | 0.9129 | 0.9555 |
| No log | 2.2222 | 20 | 0.8556 | 0.5 | 0.8556 | 0.9250 |
| No log | 2.4444 | 22 | 1.0047 | 0.4265 | 1.0047 | 1.0023 |
| No log | 2.6667 | 24 | 1.0017 | 0.4306 | 1.0017 | 1.0008 |
| No log | 2.8889 | 26 | 0.7764 | 0.5206 | 0.7764 | 0.8811 |
| No log | 3.1111 | 28 | 0.7065 | 0.6373 | 0.7065 | 0.8406 |
| No log | 3.3333 | 30 | 0.7061 | 0.6282 | 0.7061 | 0.8403 |
| No log | 3.5556 | 32 | 0.8003 | 0.6119 | 0.8003 | 0.8946 |
| No log | 3.7778 | 34 | 0.9348 | 0.5889 | 0.9348 | 0.9668 |
| No log | 4.0 | 36 | 0.8889 | 0.6241 | 0.8889 | 0.9428 |
| No log | 4.2222 | 38 | 0.7831 | 0.6505 | 0.7831 | 0.8849 |
| No log | 4.4444 | 40 | 0.7470 | 0.6769 | 0.7470 | 0.8643 |
| No log | 4.6667 | 42 | 0.8496 | 0.6230 | 0.8496 | 0.9217 |
| No log | 4.8889 | 44 | 0.8149 | 0.6241 | 0.8149 | 0.9027 |
| No log | 5.1111 | 46 | 0.7461 | 0.6932 | 0.7461 | 0.8638 |
| No log | 5.3333 | 48 | 0.8198 | 0.6382 | 0.8198 | 0.9054 |
| No log | 5.5556 | 50 | 0.8182 | 0.6039 | 0.8182 | 0.9046 |
| No log | 5.7778 | 52 | 0.7212 | 0.7030 | 0.7212 | 0.8492 |
| No log | 6.0 | 54 | 0.7459 | 0.7208 | 0.7459 | 0.8636 |
| No log | 6.2222 | 56 | 0.7669 | 0.6970 | 0.7669 | 0.8757 |
| No log | 6.4444 | 58 | 0.7488 | 0.7 | 0.7488 | 0.8653 |
| No log | 6.6667 | 60 | 0.7425 | 0.6872 | 0.7425 | 0.8617 |
| No log | 6.8889 | 62 | 0.8120 | 0.6530 | 0.8120 | 0.9011 |
| No log | 7.1111 | 64 | 0.8103 | 0.6713 | 0.8103 | 0.9001 |
| No log | 7.3333 | 66 | 0.8061 | 0.6777 | 0.8061 | 0.8978 |
| No log | 7.5556 | 68 | 0.7488 | 0.6872 | 0.7488 | 0.8653 |
| No log | 7.7778 | 70 | 0.7326 | 0.6877 | 0.7326 | 0.8559 |
| No log | 8.0 | 72 | 0.7338 | 0.6877 | 0.7338 | 0.8566 |
| No log | 8.2222 | 74 | 0.7250 | 0.6662 | 0.7250 | 0.8515 |
| No log | 8.4444 | 76 | 0.7223 | 0.6662 | 0.7223 | 0.8499 |
| No log | 8.6667 | 78 | 0.7172 | 0.6662 | 0.7172 | 0.8469 |
| No log | 8.8889 | 80 | 0.7163 | 0.6662 | 0.7163 | 0.8463 |
| No log | 9.1111 | 82 | 0.7150 | 0.6662 | 0.7150 | 0.8456 |
| No log | 9.3333 | 84 | 0.7146 | 0.6662 | 0.7146 | 0.8453 |
| No log | 9.5556 | 86 | 0.7161 | 0.6693 | 0.7161 | 0.8462 |
| No log | 9.7778 | 88 | 0.7169 | 0.6693 | 0.7169 | 0.8467 |
| No log | 10.0 | 90 | 0.7176 | 0.6693 | 0.7176 | 0.8471 |
### Framework versions
- Transformers 4.44.2
- Pytorch 2.4.0+cu118
- Datasets 2.21.0
- Tokenizers 0.19.1
|
Chinnu1103/ROSLlama
|
Chinnu1103
| 2024-12-04T16:00:28Z | 150 | 0 |
transformers
|
[
"transformers",
"pytorch",
"llama",
"text-generation",
"text-generation-inference",
"unsloth",
"trl",
"sft",
"conversational",
"en",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-generation
| 2024-12-04T15:57:07Z |
---
base_model: unsloth/llama-3.2-3b-bnb-4bit
tags:
- text-generation-inference
- transformers
- unsloth
- llama
- trl
- sft
license: apache-2.0
language:
- en
---
# Uploaded model
- **Developed by:** Chinnu1103
- **License:** apache-2.0
- **Finetuned from model :** unsloth/llama-3.2-3b-bnb-4bit
This llama model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Hugging Face's TRL library.
[<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
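A minimal inference sketch, assuming the checkpoint loads through Unsloth's `FastLanguageModel` the same way its 4-bit base model does; the context length and prompt below are placeholders:
```python
# Sketch: loading the fine-tuned checkpoint for inference with Unsloth (assumed setup).
from unsloth import FastLanguageModel

model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="Chinnu1103/ROSLlama",
    max_seq_length=2048,        # placeholder context length
    load_in_4bit=True,
)
FastLanguageModel.for_inference(model)  # switch to the faster inference path

prompt = "Explain what a ROS node is."  # placeholder prompt
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```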
|
gokulsrinivasagan/bert_tiny_lda_5_v1_mnli
|
gokulsrinivasagan
| 2024-12-04T15:50:07Z | 130 | 0 |
transformers
|
[
"transformers",
"tensorboard",
"safetensors",
"distilbert",
"text-classification",
"generated_from_trainer",
"en",
"dataset:glue",
"base_model:gokulsrinivasagan/bert_tiny_lda_5_v1",
"base_model:finetune:gokulsrinivasagan/bert_tiny_lda_5_v1",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-classification
| 2024-11-26T21:15:44Z |
---
library_name: transformers
language:
- en
base_model: gokulsrinivasagan/bert_tiny_lda_5_v1
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- accuracy
model-index:
- name: bert_tiny_lda_5_v1_mnli
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: GLUE MNLI
type: glue
args: mnli
metrics:
- name: Accuracy
type: accuracy
value: 0.6503254678600489
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert_tiny_lda_5_v1_mnli
This model is a fine-tuned version of [gokulsrinivasagan/bert_tiny_lda_5_v1](https://huggingface.co/gokulsrinivasagan/bert_tiny_lda_5_v1) on the GLUE MNLI dataset.
It achieves the following results on the evaluation set:
- Loss: 0.8033
- Accuracy: 0.6503
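A minimal inference sketch for the three-way MNLI task; the premise/hypothesis pair is a placeholder and the label names come from the `id2label` mapping saved with the checkpoint:
```python
# Sketch: classifying a premise/hypothesis pair with the fine-tuned checkpoint (assumed usage).
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

name = "gokulsrinivasagan/bert_tiny_lda_5_v1_mnli"
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModelForSequenceClassification.from_pretrained(name)

premise = "A soccer game with multiple males playing."      # placeholder
hypothesis = "Some men are playing a sport."                 # placeholder
inputs = tokenizer(premise, hypothesis, return_tensors="pt", truncation=True)

with torch.no_grad():
    logits = model(**inputs).logits
print(model.config.id2label[logits.argmax(dim=-1).item()])   # e.g. entailment / neutral / contradiction
```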
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 256
- eval_batch_size: 256
- seed: 10
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 50
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|
| 0.9869 | 1.0 | 1534 | 0.9311 | 0.5462 |
| 0.9068 | 2.0 | 3068 | 0.8786 | 0.5912 |
| 0.8437 | 3.0 | 4602 | 0.8481 | 0.6158 |
| 0.7821 | 4.0 | 6136 | 0.8241 | 0.6377 |
| 0.727 | 5.0 | 7670 | 0.8183 | 0.6445 |
| 0.6752 | 6.0 | 9204 | 0.8311 | 0.6532 |
| 0.6251 | 7.0 | 10738 | 0.8516 | 0.6500 |
| 0.5774 | 8.0 | 12272 | 0.9028 | 0.6482 |
| 0.5308 | 9.0 | 13806 | 0.9652 | 0.6511 |
| 0.4879 | 10.0 | 15340 | 0.9956 | 0.6522 |
### Framework versions
- Transformers 4.46.3
- Pytorch 2.2.1+cu118
- Datasets 2.17.0
- Tokenizers 0.20.3
|
Augusto777/BEiT-RD-DA
|
Augusto777
| 2024-12-04T15:44:02Z | 5 | 0 | null |
[
"tensorboard",
"safetensors",
"beit",
"generated_from_trainer",
"dataset:imagefolder",
"base_model:microsoft/beit-base-patch16-224",
"base_model:finetune:microsoft/beit-base-patch16-224",
"license:apache-2.0",
"model-index",
"region:us"
] | null | 2024-12-04T15:42:55Z |
---
license: apache-2.0
base_model: microsoft/beit-base-patch16-224
tags:
- generated_from_trainer
datasets:
- imagefolder
metrics:
- accuracy
model-index:
- name: BEiT-RD-DA
results:
- task:
name: Image Classification
type: image-classification
dataset:
name: imagefolder
type: imagefolder
config: default
split: validation
args: default
metrics:
- name: Accuracy
type: accuracy
value: 0.6654545454545454
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# BEiT-RD-DA
This model is a fine-tuned version of [microsoft/beit-base-patch16-224](https://huggingface.co/microsoft/beit-base-patch16-224) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 1.9617
- Accuracy: 0.6655
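A minimal inference sketch using the standard image-classification pipeline; the image path is a placeholder:
```python
# Sketch: classifying one image with the fine-tuned BEiT checkpoint (assumed usage).
from transformers import pipeline

classifier = pipeline("image-classification", model="Augusto777/BEiT-RD-DA")
for prediction in classifier("example_image.jpg"):   # placeholder path or PIL.Image
    print(f"{prediction['label']}: {prediction['score']:.3f}")
```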
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 40
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 1.4123 | 1.0 | 96 | 1.4099 | 0.4927 |
| 0.9503 | 2.0 | 192 | 1.8852 | 0.4927 |
| 0.8284 | 3.0 | 288 | 2.1702 | 0.5073 |
| 0.7677 | 4.0 | 384 | 2.0408 | 0.5345 |
| 0.788 | 5.0 | 480 | 2.7991 | 0.5127 |
| 0.5822 | 6.0 | 576 | 2.0951 | 0.5636 |
| 0.5172 | 7.0 | 672 | 2.5977 | 0.5364 |
| 0.4615 | 8.0 | 768 | 2.0968 | 0.58 |
| 0.3672 | 9.0 | 864 | 2.8535 | 0.5436 |
| 0.379 | 10.0 | 960 | 2.9515 | 0.5382 |
| 0.3301 | 11.0 | 1056 | 2.7200 | 0.5582 |
| 0.2786 | 12.0 | 1152 | 1.9000 | 0.6273 |
| 0.2746 | 13.0 | 1248 | 3.1768 | 0.5364 |
| 0.2298 | 14.0 | 1344 | 3.1003 | 0.5527 |
| 0.2013 | 15.0 | 1440 | 2.3441 | 0.6182 |
| 0.2225 | 16.0 | 1536 | 3.0214 | 0.5709 |
| 0.2229 | 17.0 | 1632 | 2.0676 | 0.6164 |
| 0.2024 | 18.0 | 1728 | 2.6478 | 0.5673 |
| 0.1401 | 19.0 | 1824 | 2.8952 | 0.5636 |
| 0.1984 | 20.0 | 1920 | 2.3083 | 0.6145 |
| 0.1788 | 21.0 | 2016 | 3.7702 | 0.52 |
| 0.1907 | 22.0 | 2112 | 1.9617 | 0.6655 |
| 0.1113 | 23.0 | 2208 | 2.6546 | 0.5964 |
| 0.1293 | 24.0 | 2304 | 2.6427 | 0.6036 |
| 0.1354 | 25.0 | 2400 | 3.4105 | 0.5527 |
| 0.1447 | 26.0 | 2496 | 2.5460 | 0.6127 |
| 0.0995 | 27.0 | 2592 | 2.9865 | 0.5855 |
| 0.1369 | 28.0 | 2688 | 3.5281 | 0.5545 |
| 0.1238 | 29.0 | 2784 | 2.8161 | 0.6018 |
| 0.1256 | 30.0 | 2880 | 3.4917 | 0.5491 |
| 0.1064 | 31.0 | 2976 | 3.0659 | 0.58 |
| 0.1333 | 32.0 | 3072 | 3.5972 | 0.5473 |
| 0.1134 | 33.0 | 3168 | 3.6116 | 0.54 |
| 0.0831 | 34.0 | 3264 | 3.5308 | 0.5509 |
| 0.1035 | 35.0 | 3360 | 3.4789 | 0.5582 |
| 0.0957 | 36.0 | 3456 | 3.6358 | 0.5509 |
| 0.0764 | 37.0 | 3552 | 3.3639 | 0.5709 |
| 0.072 | 38.0 | 3648 | 3.5639 | 0.5564 |
| 0.0727 | 39.0 | 3744 | 3.5193 | 0.5582 |
| 0.0619 | 40.0 | 3840 | 3.5836 | 0.5582 |
### Framework versions
- Transformers 4.36.2
- Pytorch 2.1.2+cu118
- Datasets 2.16.1
- Tokenizers 0.15.0
|
gokulsrinivasagan/bert_tiny_lda_100_v1_rte
|
gokulsrinivasagan
| 2024-12-04T15:43:58Z | 107 | 0 |
transformers
|
[
"transformers",
"tensorboard",
"safetensors",
"distilbert",
"text-classification",
"generated_from_trainer",
"en",
"dataset:glue",
"base_model:gokulsrinivasagan/bert_tiny_lda_100_v1",
"base_model:finetune:gokulsrinivasagan/bert_tiny_lda_100_v1",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-classification
| 2024-11-26T23:04:11Z |
---
library_name: transformers
language:
- en
base_model: gokulsrinivasagan/bert_tiny_lda_100_v1
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- accuracy
model-index:
- name: bert_tiny_lda_100_v1_rte
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: GLUE RTE
type: glue
args: rte
metrics:
- name: Accuracy
type: accuracy
value: 0.48736462093862815
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert_tiny_lda_100_v1_rte
This model is a fine-tuned version of [gokulsrinivasagan/bert_tiny_lda_100_v1](https://huggingface.co/gokulsrinivasagan/bert_tiny_lda_100_v1) on the GLUE RTE dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6925
- Accuracy: 0.4874
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 256
- eval_batch_size: 256
- seed: 10
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 50
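A rough sketch of the same configuration expressed as `TrainingArguments`; the output directory is a placeholder, the batch sizes are assumed to be per device, and the run evidently stopped after 7 of the 50 configured epochs (presumably an early-stopping callback not listed here):
```python
# Sketch: the hyperparameters above written out as transformers TrainingArguments (values only).
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="bert_tiny_lda_100_v1_rte",  # placeholder output directory
    learning_rate=5e-05,
    per_device_train_batch_size=256,        # assumed per-device
    per_device_eval_batch_size=256,
    seed=10,
    optim="adamw_torch",                    # the Adam betas/epsilon listed above are the defaults
    lr_scheduler_type="linear",
    num_train_epochs=50,
)
```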
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.7159 | 1.0 | 10 | 0.7091 | 0.4729 |
| 0.6926 | 2.0 | 20 | 0.6925 | 0.4874 |
| 0.6814 | 3.0 | 30 | 0.6944 | 0.5199 |
| 0.6663 | 4.0 | 40 | 0.6978 | 0.5271 |
| 0.6472 | 5.0 | 50 | 0.7425 | 0.5415 |
| 0.6276 | 6.0 | 60 | 0.7315 | 0.5451 |
| 0.5534 | 7.0 | 70 | 0.8165 | 0.5018 |
### Framework versions
- Transformers 4.46.3
- Pytorch 2.2.1+cu118
- Datasets 2.17.0
- Tokenizers 0.20.3
|
nickoloss/detr-resnet-50_finetuned_cppe5
|
nickoloss
| 2024-12-04T15:43:26Z | 43 | 0 |
transformers
|
[
"transformers",
"tensorboard",
"safetensors",
"detr",
"object-detection",
"generated_from_trainer",
"base_model:facebook/detr-resnet-50",
"base_model:finetune:facebook/detr-resnet-50",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
object-detection
| 2024-11-15T12:19:22Z |
---
library_name: transformers
license: apache-2.0
base_model: facebook/detr-resnet-50
tags:
- generated_from_trainer
model-index:
- name: detr-resnet-50_finetuned_cppe5
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# detr-resnet-50_finetuned_cppe5
This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 2.1397
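A minimal inference sketch using the object-detection pipeline; the image path and score threshold are placeholders:
```python
# Sketch: detecting objects in one image with the fine-tuned DETR checkpoint (assumed usage).
from transformers import pipeline

detector = pipeline("object-detection", model="nickoloss/detr-resnet-50_finetuned_cppe5")
for detection in detector("example_image.jpg", threshold=0.5):   # placeholder path and threshold
    print(detection["label"], round(detection["score"], 3), detection["box"])
```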
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 4.941058844013093e-07
- train_batch_size: 1
- eval_batch_size: 8
- seed: 42
- optimizer: Use OptimizerNames.ADAMW_TORCH with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: cosine
- num_epochs: 1000
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:------:|:-----:|:---------------:|
| No log | 1.0 | 10 | 7.0795 |
| No log | 2.0 | 20 | 5.7431 |
| No log | 3.0 | 30 | 6.7530 |
| No log | 4.0 | 40 | 5.3270 |
| 5.8012 | 5.0 | 50 | 6.1804 |
| 5.8012 | 6.0 | 60 | 6.0216 |
| 5.8012 | 7.0 | 70 | 5.2871 |
| 5.8012 | 8.0 | 80 | 5.1623 |
| 5.8012 | 9.0 | 90 | 6.9306 |
| 5.1916 | 10.0 | 100 | 5.7015 |
| 5.1916 | 11.0 | 110 | 7.5914 |
| 5.1916 | 12.0 | 120 | 6.0398 |
| 5.1916 | 13.0 | 130 | 5.3490 |
| 5.1916 | 14.0 | 140 | 5.8373 |
| 5.4281 | 15.0 | 150 | 4.1271 |
| 5.4281 | 16.0 | 160 | 6.2398 |
| 5.4281 | 17.0 | 170 | 4.2083 |
| 5.4281 | 18.0 | 180 | 4.7579 |
| 5.4281 | 19.0 | 190 | 5.2079 |
| 4.8531 | 20.0 | 200 | 3.9074 |
| 4.8531 | 21.0 | 210 | 5.1759 |
| 4.8531 | 22.0 | 220 | 3.8891 |
| 4.8531 | 23.0 | 230 | 3.8469 |
| 4.8531 | 24.0 | 240 | 4.0659 |
| 4.6505 | 25.0 | 250 | 4.7758 |
| 4.6505 | 26.0 | 260 | 4.2206 |
| 4.6505 | 27.0 | 270 | 4.8899 |
| 4.6505 | 28.0 | 280 | 4.6572 |
| 4.6505 | 29.0 | 290 | 4.1456 |
| 4.7175 | 30.0 | 300 | 4.3631 |
| 4.7175 | 31.0 | 310 | 3.9849 |
| 4.7175 | 32.0 | 320 | 5.4221 |
| 4.7175 | 33.0 | 330 | 3.7069 |
| 4.7175 | 34.0 | 340 | 5.2221 |
| 4.3464 | 35.0 | 350 | 4.0269 |
| 4.3464 | 36.0 | 360 | 3.8471 |
| 4.3464 | 37.0 | 370 | 4.0817 |
| 4.3464 | 38.0 | 380 | 4.0005 |
| 4.3464 | 39.0 | 390 | 3.7973 |
| 4.4884 | 40.0 | 400 | 4.1382 |
| 4.4884 | 41.0 | 410 | 3.9224 |
| 4.4884 | 42.0 | 420 | 4.3754 |
| 4.4884 | 43.0 | 430 | 3.9821 |
| 4.4884 | 44.0 | 440 | 3.9350 |
| 4.2526 | 45.0 | 450 | 4.0074 |
| 4.2526 | 46.0 | 460 | 3.6133 |
| 4.2526 | 47.0 | 470 | 3.8681 |
| 4.2526 | 48.0 | 480 | 3.9367 |
| 4.2526 | 49.0 | 490 | 3.8197 |
| 4.1344 | 50.0 | 500 | 3.5646 |
| 4.1344 | 51.0 | 510 | 3.7987 |
| 4.1344 | 52.0 | 520 | 3.9491 |
| 4.1344 | 53.0 | 530 | 4.1457 |
| 4.1344 | 54.0 | 540 | 3.6863 |
| 4.0492 | 55.0 | 550 | 3.8259 |
| 4.0492 | 56.0 | 560 | 3.8122 |
| 4.0492 | 57.0 | 570 | 4.0111 |
| 4.0492 | 58.0 | 580 | 3.7859 |
| 4.0492 | 59.0 | 590 | 3.7566 |
| 3.9168 | 60.0 | 600 | 3.6876 |
| 3.9168 | 61.0 | 610 | 3.7469 |
| 3.9168 | 62.0 | 620 | 4.0203 |
| 3.9168 | 63.0 | 630 | 4.7051 |
| 3.9168 | 64.0 | 640 | 3.6666 |
| 3.8979 | 65.0 | 650 | 3.5877 |
| 3.8979 | 66.0 | 660 | 3.5737 |
| 3.8979 | 67.0 | 670 | 3.6520 |
| 3.8979 | 68.0 | 680 | 3.4342 |
| 3.8979 | 69.0 | 690 | 3.9668 |
| 3.7801 | 70.0 | 700 | 4.0617 |
| 3.7801 | 71.0 | 710 | 3.8625 |
| 3.7801 | 72.0 | 720 | 3.3205 |
| 3.7801 | 73.0 | 730 | 4.0774 |
| 3.7801 | 74.0 | 740 | 4.0416 |
| 3.7303 | 75.0 | 750 | 3.4343 |
| 3.7303 | 76.0 | 760 | 3.5131 |
| 3.7303 | 77.0 | 770 | 3.5507 |
| 3.7303 | 78.0 | 780 | 3.9112 |
| 3.7303 | 79.0 | 790 | 3.3022 |
| 3.5806 | 80.0 | 800 | 3.8618 |
| 3.5806 | 81.0 | 810 | 3.6697 |
| 3.5806 | 82.0 | 820 | 3.5536 |
| 3.5806 | 83.0 | 830 | 3.3500 |
| 3.5806 | 84.0 | 840 | 3.8134 |
| 3.6183 | 85.0 | 850 | 3.4067 |
| 3.6183 | 86.0 | 860 | 3.4425 |
| 3.6183 | 87.0 | 870 | 3.2812 |
| 3.6183 | 88.0 | 880 | 3.3909 |
| 3.6183 | 89.0 | 890 | 3.6878 |
| 3.4767 | 90.0 | 900 | 3.5409 |
| 3.4767 | 91.0 | 910 | 3.5380 |
| 3.4767 | 92.0 | 920 | 3.8982 |
| 3.4767 | 93.0 | 930 | 3.4205 |
| 3.4767 | 94.0 | 940 | 4.0828 |
| 3.5444 | 95.0 | 950 | 3.2579 |
| 3.5444 | 96.0 | 960 | 3.3702 |
| 3.5444 | 97.0 | 970 | 4.2833 |
| 3.5444 | 98.0 | 980 | 3.4222 |
| 3.5444 | 99.0 | 990 | 3.4477 |
| 3.3821 | 100.0 | 1000 | 3.2484 |
| 3.3821 | 101.0 | 1010 | 3.3493 |
| 3.3821 | 102.0 | 1020 | 3.2192 |
| 3.3821 | 103.0 | 1030 | 3.2491 |
| 3.3821 | 104.0 | 1040 | 3.3853 |
| 3.3429 | 105.0 | 1050 | 3.4362 |
| 3.3429 | 106.0 | 1060 | 4.1587 |
| 3.3429 | 107.0 | 1070 | 3.9797 |
| 3.3429 | 108.0 | 1080 | 3.6257 |
| 3.3429 | 109.0 | 1090 | 3.4861 |
| 3.304 | 110.0 | 1100 | 3.3520 |
| 3.304 | 111.0 | 1110 | 3.0047 |
| 3.304 | 112.0 | 1120 | 3.4988 |
| 3.304 | 113.0 | 1130 | 3.4723 |
| 3.304 | 114.0 | 1140 | 3.4294 |
| 3.2826 | 115.0 | 1150 | 3.6923 |
| 3.2826 | 116.0 | 1160 | 3.2513 |
| 3.2826 | 117.0 | 1170 | 3.6769 |
| 3.2826 | 118.0 | 1180 | 3.5384 |
| 3.2826 | 119.0 | 1190 | 3.3773 |
| 3.1944 | 120.0 | 1200 | 3.2538 |
| 3.1944 | 121.0 | 1210 | 3.2896 |
| 3.1944 | 122.0 | 1220 | 3.4226 |
| 3.1944 | 123.0 | 1230 | 3.3085 |
| 3.1944 | 124.0 | 1240 | 3.1047 |
| 3.1978 | 125.0 | 1250 | 3.3142 |
| 3.1978 | 126.0 | 1260 | 3.4432 |
| 3.1978 | 127.0 | 1270 | 2.9309 |
| 3.1978 | 128.0 | 1280 | 3.3678 |
| 3.1978 | 129.0 | 1290 | 3.6156 |
| 3.3425 | 130.0 | 1300 | 3.3015 |
| 3.3425 | 131.0 | 1310 | 3.3181 |
| 3.3425 | 132.0 | 1320 | 3.2688 |
| 3.3425 | 133.0 | 1330 | 3.4590 |
| 3.3425 | 134.0 | 1340 | 3.0809 |
| 3.3654 | 135.0 | 1350 | 3.0907 |
| 3.3654 | 136.0 | 1360 | 3.2888 |
| 3.3654 | 137.0 | 1370 | 3.1504 |
| 3.3654 | 138.0 | 1380 | 3.4285 |
| 3.3654 | 139.0 | 1390 | 3.4080 |
| 3.1702 | 140.0 | 1400 | 3.1543 |
| 3.1702 | 141.0 | 1410 | 3.5154 |
| 3.1702 | 142.0 | 1420 | 3.1132 |
| 3.1702 | 143.0 | 1430 | 3.2503 |
| 3.1702 | 144.0 | 1440 | 3.6848 |
| 3.1929 | 145.0 | 1450 | 3.1961 |
| 3.1929 | 146.0 | 1460 | 3.4146 |
| 3.1929 | 147.0 | 1470 | 3.4162 |
| 3.1929 | 148.0 | 1480 | 3.2388 |
| 3.1929 | 149.0 | 1490 | 3.6281 |
| 3.0 | 150.0 | 1500 | 2.9830 |
| 3.0 | 151.0 | 1510 | 3.1817 |
| 3.0 | 152.0 | 1520 | 3.2862 |
| 3.0 | 153.0 | 1530 | 3.0465 |
| 3.0 | 154.0 | 1540 | 3.1208 |
| 2.9975 | 155.0 | 1550 | 3.3041 |
| 2.9975 | 156.0 | 1560 | 3.4944 |
| 2.9975 | 157.0 | 1570 | 3.5826 |
| 2.9975 | 158.0 | 1580 | 3.5453 |
| 2.9975 | 159.0 | 1590 | 4.0256 |
| 3.0312 | 160.0 | 1600 | 3.3678 |
| 3.0312 | 161.0 | 1610 | 2.9384 |
| 3.0312 | 162.0 | 1620 | 3.0596 |
| 3.0312 | 163.0 | 1630 | 3.3952 |
| 3.0312 | 164.0 | 1640 | 3.5299 |
| 2.9855 | 165.0 | 1650 | 2.9930 |
| 2.9855 | 166.0 | 1660 | 3.3869 |
| 2.9855 | 167.0 | 1670 | 3.1676 |
| 2.9855 | 168.0 | 1680 | 3.1330 |
| 2.9855 | 169.0 | 1690 | 3.2595 |
| 2.863 | 170.0 | 1700 | 3.1151 |
| 2.863 | 171.0 | 1710 | 3.1382 |
| 2.863 | 172.0 | 1720 | 3.7265 |
| 2.863 | 173.0 | 1730 | 2.8716 |
| 2.863 | 174.0 | 1740 | 3.0285 |
| 2.8942 | 175.0 | 1750 | 3.0285 |
| 2.8942 | 176.0 | 1760 | 3.7873 |
| 2.8942 | 177.0 | 1770 | 2.9266 |
| 2.8942 | 178.0 | 1780 | 2.9751 |
| 2.8942 | 179.0 | 1790 | 3.1875 |
| 2.7614 | 180.0 | 1800 | 2.6317 |
| 2.7614 | 181.0 | 1810 | 3.3780 |
| 2.7614 | 182.0 | 1820 | 3.1680 |
| 2.7614 | 183.0 | 1830 | 3.3270 |
| 2.7614 | 184.0 | 1840 | 3.2822 |
| 2.9 | 185.0 | 1850 | 3.0026 |
| 2.9 | 186.0 | 1860 | 3.0610 |
| 2.9 | 187.0 | 1870 | 3.2631 |
| 2.9 | 188.0 | 1880 | 2.8804 |
| 2.9 | 189.0 | 1890 | 3.2069 |
| 2.9176 | 190.0 | 1900 | 2.8339 |
| 2.9176 | 191.0 | 1910 | 2.9836 |
| 2.9176 | 192.0 | 1920 | 3.0211 |
| 2.9176 | 193.0 | 1930 | 2.8448 |
| 2.9176 | 194.0 | 1940 | 4.1654 |
| 2.8189 | 195.0 | 1950 | 3.0910 |
| 2.8189 | 196.0 | 1960 | 2.7972 |
| 2.8189 | 197.0 | 1970 | 3.5421 |
| 2.8189 | 198.0 | 1980 | 2.8334 |
| 2.8189 | 199.0 | 1990 | 3.0457 |
| 2.7236 | 200.0 | 2000 | 3.0531 |
| 2.7236 | 201.0 | 2010 | 3.0384 |
| 2.7236 | 202.0 | 2020 | 3.0183 |
| 2.7236 | 203.0 | 2030 | 3.1019 |
| 2.7236 | 204.0 | 2040 | 2.6909 |
| 2.6289 | 205.0 | 2050 | 2.8969 |
| 2.6289 | 206.0 | 2060 | 2.8063 |
| 2.6289 | 207.0 | 2070 | 3.3533 |
| 2.6289 | 208.0 | 2080 | 3.0578 |
| 2.6289 | 209.0 | 2090 | 3.0081 |
| 2.6592 | 210.0 | 2100 | 3.1674 |
| 2.6592 | 211.0 | 2110 | 3.0982 |
| 2.6592 | 212.0 | 2120 | 2.9070 |
| 2.6592 | 213.0 | 2130 | 2.8881 |
| 2.6592 | 214.0 | 2140 | 2.7869 |
| 2.6898 | 215.0 | 2150 | 2.9736 |
| 2.6898 | 216.0 | 2160 | 2.7309 |
| 2.6898 | 217.0 | 2170 | 3.2656 |
| 2.6898 | 218.0 | 2180 | 2.7734 |
| 2.6898 | 219.0 | 2190 | 2.6135 |
| 2.6117 | 220.0 | 2200 | 3.0652 |
| 2.6117 | 221.0 | 2210 | 3.0918 |
| 2.6117 | 222.0 | 2220 | 3.2191 |
| 2.6117 | 223.0 | 2230 | 2.8947 |
| 2.6117 | 224.0 | 2240 | 2.6307 |
| 2.6281 | 225.0 | 2250 | 2.6585 |
| 2.6281 | 226.0 | 2260 | 3.0801 |
| 2.6281 | 227.0 | 2270 | 2.9075 |
| 2.6281 | 228.0 | 2280 | 3.1795 |
| 2.6281 | 229.0 | 2290 | 2.8762 |
| 2.4503 | 230.0 | 2300 | 2.6883 |
| 2.4503 | 231.0 | 2310 | 3.0329 |
| 2.4503 | 232.0 | 2320 | 2.8990 |
| 2.4503 | 233.0 | 2330 | 2.7381 |
| 2.4503 | 234.0 | 2340 | 2.8102 |
| 2.5171 | 235.0 | 2350 | 3.0730 |
| 2.5171 | 236.0 | 2360 | 2.9376 |
| 2.5171 | 237.0 | 2370 | 2.5781 |
| 2.5171 | 238.0 | 2380 | 2.9466 |
| 2.5171 | 239.0 | 2390 | 2.6868 |
| 2.5004 | 240.0 | 2400 | 2.6414 |
| 2.5004 | 241.0 | 2410 | 3.0623 |
| 2.5004 | 242.0 | 2420 | 2.8071 |
| 2.5004 | 243.0 | 2430 | 2.4406 |
| 2.5004 | 244.0 | 2440 | 2.6247 |
| 2.5338 | 245.0 | 2450 | 2.7334 |
| 2.5338 | 246.0 | 2460 | 2.8576 |
| 2.5338 | 247.0 | 2470 | 2.6042 |
| 2.5338 | 248.0 | 2480 | 2.8519 |
| 2.5338 | 249.0 | 2490 | 3.0416 |
| 2.429 | 250.0 | 2500 | 2.7010 |
| 2.429 | 251.0 | 2510 | 4.0268 |
| 2.429 | 252.0 | 2520 | 2.9236 |
| 2.429 | 253.0 | 2530 | 2.5467 |
| 2.429 | 254.0 | 2540 | 2.7355 |
| 2.4368 | 255.0 | 2550 | 3.1205 |
| 2.4368 | 256.0 | 2560 | 2.8335 |
| 2.4368 | 257.0 | 2570 | 2.7752 |
| 2.4368 | 258.0 | 2580 | 2.7598 |
| 2.4368 | 259.0 | 2590 | 2.6409 |
| 2.3204 | 260.0 | 2600 | 2.7808 |
| 2.3204 | 261.0 | 2610 | 2.4784 |
| 2.3204 | 262.0 | 2620 | 2.9005 |
| 2.3204 | 263.0 | 2630 | 2.6729 |
| 2.3204 | 264.0 | 2640 | 2.6290 |
| 2.4044 | 265.0 | 2650 | 2.8760 |
| 2.4044 | 266.0 | 2660 | 2.5683 |
| 2.4044 | 267.0 | 2670 | 2.8607 |
| 2.4044 | 268.0 | 2680 | 2.5760 |
| 2.4044 | 269.0 | 2690 | 2.6616 |
| 2.3464 | 270.0 | 2700 | 2.6968 |
| 2.3464 | 271.0 | 2710 | 2.7200 |
| 2.3464 | 272.0 | 2720 | 2.7963 |
| 2.3464 | 273.0 | 2730 | 2.5230 |
| 2.3464 | 274.0 | 2740 | 2.7015 |
| 2.2999 | 275.0 | 2750 | 2.9836 |
| 2.2999 | 276.0 | 2760 | 2.6443 |
| 2.2999 | 277.0 | 2770 | 2.5045 |
| 2.2999 | 278.0 | 2780 | 3.2068 |
| 2.2999 | 279.0 | 2790 | 2.5038 |
| 2.3102 | 280.0 | 2800 | 2.7581 |
| 2.3102 | 281.0 | 2810 | 2.6092 |
| 2.3102 | 282.0 | 2820 | 2.4482 |
| 2.3102 | 283.0 | 2830 | 3.0941 |
| 2.3102 | 284.0 | 2840 | 2.3476 |
| 2.2134 | 285.0 | 2850 | 2.8535 |
| 2.2134 | 286.0 | 2860 | 2.6361 |
| 2.2134 | 287.0 | 2870 | 2.6033 |
| 2.2134 | 288.0 | 2880 | 2.4526 |
| 2.2134 | 289.0 | 2890 | 2.7966 |
| 2.3276 | 290.0 | 2900 | 2.6472 |
| 2.3276 | 291.0 | 2910 | 2.6410 |
| 2.3276 | 292.0 | 2920 | 2.5670 |
| 2.3276 | 293.0 | 2930 | 2.7832 |
| 2.3276 | 294.0 | 2940 | 2.5031 |
| 2.287 | 295.0 | 2950 | 2.5614 |
| 2.287 | 296.0 | 2960 | 3.0045 |
| 2.287 | 297.0 | 2970 | 2.5755 |
| 2.287 | 298.0 | 2980 | 2.5132 |
| 2.287 | 299.0 | 2990 | 2.6427 |
| 2.1723 | 300.0 | 3000 | 3.2675 |
| 2.1723 | 301.0 | 3010 | 2.5890 |
| 2.1723 | 302.0 | 3020 | 2.7935 |
| 2.1723 | 303.0 | 3030 | 2.5836 |
| 2.1723 | 304.0 | 3040 | 2.4359 |
| 2.237 | 305.0 | 3050 | 2.7048 |
| 2.237 | 306.0 | 3060 | 2.4640 |
| 2.237 | 307.0 | 3070 | 2.5528 |
| 2.237 | 308.0 | 3080 | 2.4919 |
| 2.237 | 309.0 | 3090 | 2.5067 |
| 2.1502 | 310.0 | 3100 | 2.6569 |
| 2.1502 | 311.0 | 3110 | 2.6649 |
| 2.1502 | 312.0 | 3120 | 2.7721 |
| 2.1502 | 313.0 | 3130 | 2.3934 |
| 2.1502 | 314.0 | 3140 | 2.4799 |
| 2.2248 | 315.0 | 3150 | 2.6882 |
| 2.2248 | 316.0 | 3160 | 2.8493 |
| 2.2248 | 317.0 | 3170 | 2.5919 |
| 2.2248 | 318.0 | 3180 | 2.4124 |
| 2.2248 | 319.0 | 3190 | 2.5997 |
| 2.2399 | 320.0 | 3200 | 2.3440 |
| 2.2399 | 321.0 | 3210 | 2.6292 |
| 2.2399 | 322.0 | 3220 | 3.2851 |
| 2.2399 | 323.0 | 3230 | 2.4422 |
| 2.2399 | 324.0 | 3240 | 2.3866 |
| 2.1759 | 325.0 | 3250 | 2.4307 |
| 2.1759 | 326.0 | 3260 | 2.2842 |
| 2.1759 | 327.0 | 3270 | 2.5418 |
| 2.1759 | 328.0 | 3280 | 2.5840 |
| 2.1759 | 329.0 | 3290 | 2.9884 |
| 2.2557 | 330.0 | 3300 | 2.5096 |
| 2.2557 | 331.0 | 3310 | 3.2382 |
| 2.2557 | 332.0 | 3320 | 2.5237 |
| 2.2557 | 333.0 | 3330 | 2.4346 |
| 2.2557 | 334.0 | 3340 | 2.4034 |
| 2.216 | 335.0 | 3350 | 2.4259 |
| 2.216 | 336.0 | 3360 | 2.4239 |
| 2.216 | 337.0 | 3370 | 2.5417 |
| 2.216 | 338.0 | 3380 | 2.7757 |
| 2.216 | 339.0 | 3390 | 2.6264 |
| 2.2112 | 340.0 | 3400 | 2.6611 |
| 2.2112 | 341.0 | 3410 | 2.6828 |
| 2.2112 | 342.0 | 3420 | 2.4541 |
| 2.2112 | 343.0 | 3430 | 2.4426 |
| 2.2112 | 344.0 | 3440 | 2.4566 |
| 2.1473 | 345.0 | 3450 | 2.8140 |
| 2.1473 | 346.0 | 3460 | 2.3079 |
| 2.1473 | 347.0 | 3470 | 2.4263 |
| 2.1473 | 348.0 | 3480 | 2.4176 |
| 2.1473 | 349.0 | 3490 | 2.5132 |
| 2.0273 | 350.0 | 3500 | 2.5695 |
| 2.0273 | 351.0 | 3510 | 2.3300 |
| 2.0273 | 352.0 | 3520 | 2.3673 |
| 2.0273 | 353.0 | 3530 | 2.4108 |
| 2.0273 | 354.0 | 3540 | 2.1937 |
| 1.9724 | 355.0 | 3550 | 2.4282 |
| 1.9724 | 356.0 | 3560 | 2.5854 |
| 1.9724 | 357.0 | 3570 | 2.3549 |
| 1.9724 | 358.0 | 3580 | 2.7288 |
| 1.9724 | 359.0 | 3590 | 2.3138 |
| 2.1214 | 360.0 | 3600 | 2.6228 |
| 2.1214 | 361.0 | 3610 | 2.5202 |
| 2.1214 | 362.0 | 3620 | 2.3395 |
| 2.1214 | 363.0 | 3630 | 2.7839 |
| 2.1214 | 364.0 | 3640 | 2.3686 |
| 2.0616 | 365.0 | 3650 | 2.1838 |
| 2.0616 | 366.0 | 3660 | 2.1441 |
| 2.0616 | 367.0 | 3670 | 2.3893 |
| 2.0616 | 368.0 | 3680 | 2.3090 |
| 2.0616 | 369.0 | 3690 | 2.5005 |
| 2.0561 | 370.0 | 3700 | 2.5149 |
| 2.0561 | 371.0 | 3710 | 2.4185 |
| 2.0561 | 372.0 | 3720 | 2.2988 |
| 2.0561 | 373.0 | 3730 | 2.5609 |
| 2.0561 | 374.0 | 3740 | 2.4859 |
| 1.9504 | 375.0 | 3750 | 2.6781 |
| 1.9504 | 376.0 | 3760 | 2.4028 |
| 1.9504 | 377.0 | 3770 | 2.2976 |
| 1.9504 | 378.0 | 3780 | 2.6518 |
| 1.9504 | 379.0 | 3790 | 2.4606 |
| 1.9662 | 380.0 | 3800 | 2.0894 |
| 1.9662 | 381.0 | 3810 | 2.7766 |
| 1.9662 | 382.0 | 3820 | 2.6676 |
| 1.9662 | 383.0 | 3830 | 2.3832 |
| 1.9662 | 384.0 | 3840 | 2.3459 |
| 2.007 | 385.0 | 3850 | 2.5191 |
| 2.007 | 386.0 | 3860 | 2.5370 |
| 2.007 | 387.0 | 3870 | 2.3437 |
| 2.007 | 388.0 | 3880 | 2.5367 |
| 2.007 | 389.0 | 3890 | 2.3221 |
| 1.9401 | 390.0 | 3900 | 2.2395 |
| 1.9401 | 391.0 | 3910 | 2.3589 |
| 1.9401 | 392.0 | 3920 | 2.3799 |
| 1.9401 | 393.0 | 3930 | 2.3295 |
| 1.9401 | 394.0 | 3940 | 2.6330 |
| 1.9375 | 395.0 | 3950 | 2.4340 |
| 1.9375 | 396.0 | 3960 | 2.5184 |
| 1.9375 | 397.0 | 3970 | 2.1730 |
| 1.9375 | 398.0 | 3980 | 2.2300 |
| 1.9375 | 399.0 | 3990 | 2.4796 |
| 1.9703 | 400.0 | 4000 | 2.2612 |
| 1.9703 | 401.0 | 4010 | 2.3175 |
| 1.9703 | 402.0 | 4020 | 2.5344 |
| 1.9703 | 403.0 | 4030 | 2.1123 |
| 1.9703 | 404.0 | 4040 | 2.2479 |
| 1.8652 | 405.0 | 4050 | 2.6316 |
| 1.8652 | 406.0 | 4060 | 2.1574 |
| 1.8652 | 407.0 | 4070 | 2.4231 |
| 1.8652 | 408.0 | 4080 | 2.1255 |
| 1.8652 | 409.0 | 4090 | 2.2994 |
| 1.9834 | 410.0 | 4100 | 2.3541 |
| 1.9834 | 411.0 | 4110 | 2.3113 |
| 1.9834 | 412.0 | 4120 | 2.3966 |
| 1.9834 | 413.0 | 4130 | 2.3865 |
| 1.9834 | 414.0 | 4140 | 3.0955 |
| 1.976 | 415.0 | 4150 | 2.6212 |
| 1.976 | 416.0 | 4160 | 2.3237 |
| 1.976 | 417.0 | 4170 | 3.3010 |
| 1.976 | 418.0 | 4180 | 2.7378 |
| 1.976 | 419.0 | 4190 | 2.4063 |
| 1.9165 | 420.0 | 4200 | 2.9853 |
| 1.9165 | 421.0 | 4210 | 2.0776 |
| 1.9165 | 422.0 | 4220 | 2.3036 |
| 1.9165 | 423.0 | 4230 | 2.1934 |
| 1.9165 | 424.0 | 4240 | 2.1535 |
| 1.9224 | 425.0 | 4250 | 2.3000 |
| 1.9224 | 426.0 | 4260 | 2.6858 |
| 1.9224 | 427.0 | 4270 | 2.4825 |
| 1.9224 | 428.0 | 4280 | 2.4776 |
| 1.9224 | 429.0 | 4290 | 2.2042 |
| 1.9091 | 430.0 | 4300 | 2.2847 |
| 1.9091 | 431.0 | 4310 | 2.0935 |
| 1.9091 | 432.0 | 4320 | 2.6040 |
| 1.9091 | 433.0 | 4330 | 2.2520 |
| 1.9091 | 434.0 | 4340 | 2.5126 |
| 1.9543 | 435.0 | 4350 | 2.3081 |
| 1.9543 | 436.0 | 4360 | 2.5018 |
| 1.9543 | 437.0 | 4370 | 2.4462 |
| 1.9543 | 438.0 | 4380 | 2.1927 |
| 1.9543 | 439.0 | 4390 | 2.1584 |
| 1.7975 | 440.0 | 4400 | 2.2996 |
| 1.7975 | 441.0 | 4410 | 2.2288 |
| 1.7975 | 442.0 | 4420 | 2.4102 |
| 1.7975 | 443.0 | 4430 | 2.3321 |
| 1.7975 | 444.0 | 4440 | 1.9341 |
| 1.9595 | 445.0 | 4450 | 2.1064 |
| 1.9595 | 446.0 | 4460 | 2.4024 |
| 1.9595 | 447.0 | 4470 | 2.1377 |
| 1.9595 | 448.0 | 4480 | 2.2580 |
| 1.9595 | 449.0 | 4490 | 2.2505 |
| 1.8746 | 450.0 | 4500 | 2.3562 |
| 1.8746 | 451.0 | 4510 | 2.2730 |
| 1.8746 | 452.0 | 4520 | 2.1447 |
| 1.8746 | 453.0 | 4530 | 2.2458 |
| 1.8746 | 454.0 | 4540 | 2.2136 |
| 2.0722 | 455.0 | 4550 | 2.1459 |
| 2.0722 | 456.0 | 4560 | 1.9991 |
| 2.0722 | 457.0 | 4570 | 2.1572 |
| 2.0722 | 458.0 | 4580 | 2.2700 |
| 2.0722 | 459.0 | 4590 | 2.3094 |
| 1.9179 | 460.0 | 4600 | 2.2721 |
| 1.9179 | 461.0 | 4610 | 2.2809 |
| 1.9179 | 462.0 | 4620 | 2.4517 |
| 1.9179 | 463.0 | 4630 | 2.2500 |
| 1.9179 | 464.0 | 4640 | 2.2107 |
| 2.0428 | 465.0 | 4650 | 2.1489 |
| 2.0428 | 466.0 | 4660 | 2.2571 |
| 2.0428 | 467.0 | 4670 | 2.2047 |
| 2.0428 | 468.0 | 4680 | 2.5041 |
| 2.0428 | 469.0 | 4690 | 2.2354 |
| 1.8738 | 470.0 | 4700 | 2.0811 |
| 1.8738 | 471.0 | 4710 | 2.1300 |
| 1.8738 | 472.0 | 4720 | 2.3041 |
| 1.8738 | 473.0 | 4730 | 2.1780 |
| 1.8738 | 474.0 | 4740 | 2.0481 |
| 1.8625 | 475.0 | 4750 | 2.2354 |
| 1.8625 | 476.0 | 4760 | 2.1670 |
| 1.8625 | 477.0 | 4770 | 2.1575 |
| 1.8625 | 478.0 | 4780 | 2.0797 |
| 1.8625 | 479.0 | 4790 | 2.2353 |
| 1.7743 | 480.0 | 4800 | 2.2478 |
| 1.7743 | 481.0 | 4810 | 2.1120 |
| 1.7743 | 482.0 | 4820 | 2.1790 |
| 1.7743 | 483.0 | 4830 | 3.1939 |
| 1.7743 | 484.0 | 4840 | 2.0575 |
| 1.7955 | 485.0 | 4850 | 2.3685 |
| 1.7955 | 486.0 | 4860 | 2.1021 |
| 1.7955 | 487.0 | 4870 | 2.3043 |
| 1.7955 | 488.0 | 4880 | 2.1155 |
| 1.7955 | 489.0 | 4890 | 2.0982 |
| 1.7685 | 490.0 | 4900 | 2.2740 |
| 1.7685 | 491.0 | 4910 | 2.1216 |
| 1.7685 | 492.0 | 4920 | 2.0764 |
| 1.7685 | 493.0 | 4930 | 2.1182 |
| 1.7685 | 494.0 | 4940 | 2.0343 |
| 1.8806 | 495.0 | 4950 | 2.0229 |
| 1.8806 | 496.0 | 4960 | 2.9971 |
| 1.8806 | 497.0 | 4970 | 2.1848 |
| 1.8806 | 498.0 | 4980 | 2.6586 |
| 1.8806 | 499.0 | 4990 | 2.3622 |
| 1.8554 | 500.0 | 5000 | 2.5255 |
| 1.8554 | 501.0 | 5010 | 2.1792 |
| 1.8554 | 502.0 | 5020 | 2.2098 |
| 1.8554 | 503.0 | 5030 | 3.0466 |
| 1.8554 | 504.0 | 5040 | 2.2054 |
| 1.8123 | 505.0 | 5050 | 2.0846 |
| 1.8123 | 506.0 | 5060 | 2.4480 |
| 1.8123 | 507.0 | 5070 | 2.1692 |
| 1.8123 | 508.0 | 5080 | 2.1262 |
| 1.8123 | 509.0 | 5090 | 2.0610 |
| 1.8189 | 510.0 | 5100 | 2.1438 |
| 1.8189 | 511.0 | 5110 | 1.9691 |
| 1.8189 | 512.0 | 5120 | 1.9818 |
| 1.8189 | 513.0 | 5130 | 2.1824 |
| 1.8189 | 514.0 | 5140 | 2.3053 |
| 1.7296 | 515.0 | 5150 | 2.0095 |
| 1.7296 | 516.0 | 5160 | 2.3895 |
| 1.7296 | 517.0 | 5170 | 2.4203 |
| 1.7296 | 518.0 | 5180 | 2.8143 |
| 1.7296 | 519.0 | 5190 | 1.9249 |
| 1.9353 | 520.0 | 5200 | 1.9745 |
| 1.9353 | 521.0 | 5210 | 2.3712 |
| 1.9353 | 522.0 | 5220 | 2.2221 |
| 1.9353 | 523.0 | 5230 | 2.3223 |
| 1.9353 | 524.0 | 5240 | 2.0649 |
| 1.8203 | 525.0 | 5250 | 2.4524 |
| 1.8203 | 526.0 | 5260 | 2.1729 |
| 1.8203 | 527.0 | 5270 | 2.3503 |
| 1.8203 | 528.0 | 5280 | 1.8859 |
| 1.8203 | 529.0 | 5290 | 2.5795 |
| 1.9042 | 530.0 | 5300 | 2.2665 |
| 1.9042 | 531.0 | 5310 | 1.9231 |
| 1.9042 | 532.0 | 5320 | 2.1896 |
| 1.9042 | 533.0 | 5330 | 2.1866 |
| 1.9042 | 534.0 | 5340 | 2.1273 |
| 1.797 | 535.0 | 5350 | 2.1864 |
| 1.797 | 536.0 | 5360 | 2.1360 |
| 1.797 | 537.0 | 5370 | 2.1195 |
| 1.797 | 538.0 | 5380 | 1.9885 |
| 1.797 | 539.0 | 5390 | 1.9990 |
| 1.8289 | 540.0 | 5400 | 2.0208 |
| 1.8289 | 541.0 | 5410 | 1.9337 |
| 1.8289 | 542.0 | 5420 | 2.0515 |
| 1.8289 | 543.0 | 5430 | 2.3292 |
| 1.8289 | 544.0 | 5440 | 1.8969 |
| 1.7952 | 545.0 | 5450 | 2.0917 |
| 1.7952 | 546.0 | 5460 | 2.2664 |
| 1.7952 | 547.0 | 5470 | 2.1886 |
| 1.7952 | 548.0 | 5480 | 2.2333 |
| 1.7952 | 549.0 | 5490 | 2.1483 |
| 1.8083 | 550.0 | 5500 | 2.2158 |
| 1.8083 | 551.0 | 5510 | 2.2681 |
| 1.8083 | 552.0 | 5520 | 2.7891 |
| 1.8083 | 553.0 | 5530 | 1.9523 |
| 1.8083 | 554.0 | 5540 | 2.2605 |
| 1.8217 | 555.0 | 5550 | 2.4190 |
| 1.8217 | 556.0 | 5560 | 2.1206 |
| 1.8217 | 557.0 | 5570 | 2.5011 |
| 1.8217 | 558.0 | 5580 | 2.1416 |
| 1.8217 | 559.0 | 5590 | 2.1722 |
| 1.7937 | 560.0 | 5600 | 2.0521 |
| 1.7937 | 561.0 | 5610 | 2.1215 |
| 1.7937 | 562.0 | 5620 | 2.7153 |
| 1.7937 | 563.0 | 5630 | 2.1914 |
| 1.7937 | 564.0 | 5640 | 2.1923 |
| 1.7143 | 565.0 | 5650 | 2.4663 |
| 1.7143 | 566.0 | 5660 | 1.9746 |
| 1.7143 | 567.0 | 5670 | 2.0240 |
| 1.7143 | 568.0 | 5680 | 2.5691 |
| 1.7143 | 569.0 | 5690 | 2.3204 |
| 1.6601 | 570.0 | 5700 | 2.1723 |
| 1.6601 | 571.0 | 5710 | 1.9296 |
| 1.6601 | 572.0 | 5720 | 2.1570 |
| 1.6601 | 573.0 | 5730 | 2.1298 |
| 1.6601 | 574.0 | 5740 | 2.3539 |
| 1.8999 | 575.0 | 5750 | 2.1365 |
| 1.8999 | 576.0 | 5760 | 2.0601 |
| 1.8999 | 577.0 | 5770 | 2.0550 |
| 1.8999 | 578.0 | 5780 | 2.5869 |
| 1.8999 | 579.0 | 5790 | 2.1311 |
| 1.6806 | 580.0 | 5800 | 1.9451 |
| 1.6806 | 581.0 | 5810 | 2.1228 |
| 1.6806 | 582.0 | 5820 | 2.3437 |
| 1.6806 | 583.0 | 5830 | 2.3398 |
| 1.6806 | 584.0 | 5840 | 2.1228 |
| 1.7643 | 585.0 | 5850 | 2.0135 |
| 1.7643 | 586.0 | 5860 | 1.9824 |
| 1.7643 | 587.0 | 5870 | 2.2028 |
| 1.7643 | 588.0 | 5880 | 2.4352 |
| 1.7643 | 589.0 | 5890 | 1.9458 |
| 1.803 | 590.0 | 5900 | 2.3152 |
| 1.803 | 591.0 | 5910 | 2.0768 |
| 1.803 | 592.0 | 5920 | 2.2836 |
| 1.803 | 593.0 | 5930 | 2.1446 |
| 1.803 | 594.0 | 5940 | 2.1702 |
| 1.6866 | 595.0 | 5950 | 2.3142 |
| 1.6866 | 596.0 | 5960 | 2.1351 |
| 1.6866 | 597.0 | 5970 | 1.9202 |
| 1.6866 | 598.0 | 5980 | 2.0712 |
| 1.6866 | 599.0 | 5990 | 1.9634 |
| 1.6967 | 600.0 | 6000 | 2.3699 |
| 1.6967 | 601.0 | 6010 | 2.1562 |
| 1.6967 | 602.0 | 6020 | 2.3168 |
| 1.6967 | 603.0 | 6030 | 2.2248 |
| 1.6967 | 604.0 | 6040 | 2.2533 |
| 1.6627 | 605.0 | 6050 | 1.8170 |
| 1.6627 | 606.0 | 6060 | 2.3989 |
| 1.6627 | 607.0 | 6070 | 2.0302 |
| 1.6627 | 608.0 | 6080 | 2.3638 |
| 1.6627 | 609.0 | 6090 | 1.9077 |
| 1.6703 | 610.0 | 6100 | 1.9806 |
| 1.6703 | 611.0 | 6110 | 1.9167 |
| 1.6703 | 612.0 | 6120 | 2.2209 |
| 1.6703 | 613.0 | 6130 | 2.2042 |
| 1.6703 | 614.0 | 6140 | 1.7366 |
| 1.6809 | 615.0 | 6150 | 2.1843 |
| 1.6809 | 616.0 | 6160 | 2.9500 |
| 1.6809 | 617.0 | 6170 | 2.1226 |
| 1.6809 | 618.0 | 6180 | 2.2124 |
| 1.6809 | 619.0 | 6190 | 2.8095 |
| 1.762 | 620.0 | 6200 | 1.9578 |
| 1.762 | 621.0 | 6210 | 2.0715 |
| 1.762 | 622.0 | 6220 | 2.1241 |
| 1.762 | 623.0 | 6230 | 2.4005 |
| 1.762 | 624.0 | 6240 | 1.9467 |
| 1.7518 | 625.0 | 6250 | 1.9363 |
| 1.7518 | 626.0 | 6260 | 2.3800 |
| 1.7518 | 627.0 | 6270 | 2.0086 |
| 1.7518 | 628.0 | 6280 | 2.0844 |
| 1.7518 | 629.0 | 6290 | 1.9936 |
| 1.7146 | 630.0 | 6300 | 2.9278 |
| 1.7146 | 631.0 | 6310 | 2.2130 |
| 1.7146 | 632.0 | 6320 | 1.8916 |
| 1.7146 | 633.0 | 6330 | 1.9770 |
| 1.7146 | 634.0 | 6340 | 1.9727 |
| 1.7078 | 635.0 | 6350 | 2.5519 |
| 1.7078 | 636.0 | 6360 | 1.8578 |
| 1.7078 | 637.0 | 6370 | 2.1396 |
| 1.7078 | 638.0 | 6380 | 2.1651 |
| 1.7078 | 639.0 | 6390 | 1.9666 |
| 1.7668 | 640.0 | 6400 | 2.1160 |
| 1.7668 | 641.0 | 6410 | 2.0328 |
| 1.7668 | 642.0 | 6420 | 2.0711 |
| 1.7668 | 643.0 | 6430 | 2.1058 |
| 1.7668 | 644.0 | 6440 | 2.0504 |
| 1.7245 | 645.0 | 6450 | 2.2605 |
| 1.7245 | 646.0 | 6460 | 2.3964 |
| 1.7245 | 647.0 | 6470 | 2.0940 |
| 1.7245 | 648.0 | 6480 | 2.4811 |
| 1.7245 | 649.0 | 6490 | 2.2603 |
| 1.66 | 650.0 | 6500 | 2.0771 |
| 1.66 | 651.0 | 6510 | 2.0068 |
| 1.66 | 652.0 | 6520 | 1.9992 |
| 1.66 | 653.0 | 6530 | 2.0482 |
| 1.66 | 654.0 | 6540 | 2.1352 |
| 1.6753 | 655.0 | 6550 | 2.0777 |
| 1.6753 | 656.0 | 6560 | 1.9601 |
| 1.6753 | 657.0 | 6570 | 2.0755 |
| 1.6753 | 658.0 | 6580 | 2.0130 |
| 1.6753 | 659.0 | 6590 | 2.5618 |
| 1.6751 | 660.0 | 6600 | 2.0391 |
| 1.6751 | 661.0 | 6610 | 1.9881 |
| 1.6751 | 662.0 | 6620 | 2.0105 |
| 1.6751 | 663.0 | 6630 | 2.0397 |
| 1.6751 | 664.0 | 6640 | 1.9171 |
| 1.7329 | 665.0 | 6650 | 2.1690 |
| 1.7329 | 666.0 | 6660 | 1.9315 |
| 1.7329 | 667.0 | 6670 | 2.4457 |
| 1.7329 | 668.0 | 6680 | 2.0552 |
| 1.7329 | 669.0 | 6690 | 2.1250 |
| 1.7304 | 670.0 | 6700 | 1.9498 |
| 1.7304 | 671.0 | 6710 | 2.1620 |
| 1.7304 | 672.0 | 6720 | 2.1663 |
| 1.7304 | 673.0 | 6730 | 2.2802 |
| 1.7304 | 674.0 | 6740 | 2.0857 |
| 1.6356 | 675.0 | 6750 | 2.2656 |
| 1.6356 | 676.0 | 6760 | 1.9959 |
| 1.6356 | 677.0 | 6770 | 2.0719 |
| 1.6356 | 678.0 | 6780 | 2.0429 |
| 1.6356 | 679.0 | 6790 | 1.9561 |
| 1.6098 | 680.0 | 6800 | 2.3071 |
| 1.6098 | 681.0 | 6810 | 2.2920 |
| 1.6098 | 682.0 | 6820 | 2.1268 |
| 1.6098 | 683.0 | 6830 | 1.9186 |
| 1.6098 | 684.0 | 6840 | 1.8820 |
| 1.6784 | 685.0 | 6850 | 2.1013 |
| 1.6784 | 686.0 | 6860 | 2.0973 |
| 1.6784 | 687.0 | 6870 | 2.3960 |
| 1.6784 | 688.0 | 6880 | 1.8338 |
| 1.6784 | 689.0 | 6890 | 2.0245 |
| 1.689 | 690.0 | 6900 | 2.1786 |
| 1.689 | 691.0 | 6910 | 2.0254 |
| 1.689 | 692.0 | 6920 | 1.9316 |
| 1.689 | 693.0 | 6930 | 1.9776 |
| 1.689 | 694.0 | 6940 | 2.1271 |
| 1.6889 | 695.0 | 6950 | 2.3542 |
| 1.6889 | 696.0 | 6960 | 2.1932 |
| 1.6889 | 697.0 | 6970 | 1.8910 |
| 1.6889 | 698.0 | 6980 | 2.1252 |
| 1.6889 | 699.0 | 6990 | 1.9726 |
| 1.7028 | 700.0 | 7000 | 2.0448 |
| 1.7028 | 701.0 | 7010 | 2.1499 |
| 1.7028 | 702.0 | 7020 | 1.8854 |
| 1.7028 | 703.0 | 7030 | 1.9297 |
| 1.7028 | 704.0 | 7040 | 2.1054 |
| 1.6484 | 705.0 | 7050 | 1.9997 |
| 1.6484 | 706.0 | 7060 | 2.0114 |
| 1.6484 | 707.0 | 7070 | 2.0139 |
| 1.6484 | 708.0 | 7080 | 2.9272 |
| 1.6484 | 709.0 | 7090 | 1.8419 |
| 1.6615 | 710.0 | 7100 | 3.2302 |
| 1.6615 | 711.0 | 7110 | 2.0337 |
| 1.6615 | 712.0 | 7120 | 2.0933 |
| 1.6615 | 713.0 | 7130 | 2.0162 |
| 1.6615 | 714.0 | 7140 | 2.0073 |
| 1.6318 | 715.0 | 7150 | 2.1256 |
| 1.6318 | 716.0 | 7160 | 1.8836 |
| 1.6318 | 717.0 | 7170 | 2.0321 |
| 1.6318 | 718.0 | 7180 | 2.0796 |
| 1.6318 | 719.0 | 7190 | 1.9985 |
| 1.7706 | 720.0 | 7200 | 2.6352 |
| 1.7706 | 721.0 | 7210 | 1.9618 |
| 1.7706 | 722.0 | 7220 | 1.8866 |
| 1.7706 | 723.0 | 7230 | 1.9311 |
| 1.7706 | 724.0 | 7240 | 2.2133 |
| 1.7221 | 725.0 | 7250 | 1.8637 |
| 1.7221 | 726.0 | 7260 | 2.1916 |
| 1.7221 | 727.0 | 7270 | 1.8545 |
| 1.7221 | 728.0 | 7280 | 2.1350 |
| 1.7221 | 729.0 | 7290 | 2.0091 |
| 1.754 | 730.0 | 7300 | 1.9316 |
| 1.754 | 731.0 | 7310 | 2.0585 |
| 1.754 | 732.0 | 7320 | 2.0417 |
| 1.754 | 733.0 | 7330 | 2.1116 |
| 1.754 | 734.0 | 7340 | 2.0630 |
| 1.6204 | 735.0 | 7350 | 1.9218 |
| 1.6204 | 736.0 | 7360 | 2.5058 |
| 1.6204 | 737.0 | 7370 | 2.2771 |
| 1.6204 | 738.0 | 7380 | 1.9493 |
| 1.6204 | 739.0 | 7390 | 2.1200 |
| 1.6891 | 740.0 | 7400 | 2.0596 |
| 1.6891 | 741.0 | 7410 | 2.0757 |
| 1.6891 | 742.0 | 7420 | 1.9904 |
| 1.6891 | 743.0 | 7430 | 2.1336 |
| 1.6891 | 744.0 | 7440 | 2.4599 |
| 1.7584 | 745.0 | 7450 | 2.1578 |
| 1.7584 | 746.0 | 7460 | 1.9749 |
| 1.7584 | 747.0 | 7470 | 2.1406 |
| 1.7584 | 748.0 | 7480 | 2.3524 |
| 1.7584 | 749.0 | 7490 | 2.0798 |
| 1.5819 | 750.0 | 7500 | 1.8948 |
| 1.5819 | 751.0 | 7510 | 1.8562 |
| 1.5819 | 752.0 | 7520 | 3.5239 |
| 1.5819 | 753.0 | 7530 | 2.2157 |
| 1.5819 | 754.0 | 7540 | 2.7353 |
| 1.6342 | 755.0 | 7550 | 2.2190 |
| 1.6342 | 756.0 | 7560 | 2.3935 |
| 1.6342 | 757.0 | 7570 | 2.0825 |
| 1.6342 | 758.0 | 7580 | 2.0174 |
| 1.6342 | 759.0 | 7590 | 1.9563 |
| 1.7279 | 760.0 | 7600 | 2.1491 |
| 1.7279 | 761.0 | 7610 | 1.9795 |
| 1.7279 | 762.0 | 7620 | 1.9805 |
| 1.7279 | 763.0 | 7630 | 1.9753 |
| 1.7279 | 764.0 | 7640 | 2.0721 |
| 1.5626 | 765.0 | 7650 | 2.1229 |
| 1.5626 | 766.0 | 7660 | 2.0831 |
| 1.5626 | 767.0 | 7670 | 2.8723 |
| 1.5626 | 768.0 | 7680 | 1.9799 |
| 1.5626 | 769.0 | 7690 | 2.0792 |
| 1.6589 | 770.0 | 7700 | 1.9836 |
| 1.6589 | 771.0 | 7710 | 1.8836 |
| 1.6589 | 772.0 | 7720 | 2.1195 |
| 1.6589 | 773.0 | 7730 | 2.2073 |
| 1.6589 | 774.0 | 7740 | 1.9880 |
| 1.641 | 775.0 | 7750 | 2.2762 |
| 1.641 | 776.0 | 7760 | 2.0996 |
| 1.641 | 777.0 | 7770 | 2.0157 |
| 1.641 | 778.0 | 7780 | 1.9012 |
| 1.641 | 779.0 | 7790 | 3.4505 |
| 1.8726 | 780.0 | 7800 | 1.9617 |
| 1.8726 | 781.0 | 7810 | 2.0913 |
| 1.8726 | 782.0 | 7820 | 1.9486 |
| 1.8726 | 783.0 | 7830 | 2.0114 |
| 1.8726 | 784.0 | 7840 | 1.9957 |
| 1.6342 | 785.0 | 7850 | 2.1678 |
| 1.6342 | 786.0 | 7860 | 2.1731 |
| 1.6342 | 787.0 | 7870 | 1.9840 |
| 1.6342 | 788.0 | 7880 | 2.2147 |
| 1.6342 | 789.0 | 7890 | 2.4845 |
| 1.6656 | 790.0 | 7900 | 2.0647 |
| 1.6656 | 791.0 | 7910 | 1.9105 |
| 1.6656 | 792.0 | 7920 | 1.9711 |
| 1.6656 | 793.0 | 7930 | 2.8114 |
| 1.6656 | 794.0 | 7940 | 2.1196 |
| 1.7298 | 795.0 | 7950 | 2.0664 |
| 1.7298 | 796.0 | 7960 | 2.2231 |
| 1.7298 | 797.0 | 7970 | 1.9946 |
| 1.7298 | 798.0 | 7980 | 2.3052 |
| 1.7298 | 799.0 | 7990 | 2.4928 |
| 1.7294 | 800.0 | 8000 | 2.0689 |
| 1.7294 | 801.0 | 8010 | 2.1222 |
| 1.7294 | 802.0 | 8020 | 1.9995 |
| 1.7294 | 803.0 | 8030 | 2.0070 |
| 1.7294 | 804.0 | 8040 | 1.8976 |
| 1.6905 | 805.0 | 8050 | 2.0889 |
| 1.6905 | 806.0 | 8060 | 2.0273 |
| 1.6905 | 807.0 | 8070 | 1.8873 |
| 1.6905 | 808.0 | 8080 | 2.5260 |
| 1.6905 | 809.0 | 8090 | 2.0703 |
| 1.6383 | 810.0 | 8100 | 2.1421 |
| 1.6383 | 811.0 | 8110 | 1.9730 |
| 1.6383 | 812.0 | 8120 | 2.2552 |
| 1.6383 | 813.0 | 8130 | 1.8962 |
| 1.6383 | 814.0 | 8140 | 2.0572 |
| 1.6897 | 815.0 | 8150 | 2.0349 |
| 1.6897 | 816.0 | 8160 | 2.0451 |
| 1.6897 | 817.0 | 8170 | 2.0762 |
| 1.6897 | 818.0 | 8180 | 2.0079 |
| 1.6897 | 819.0 | 8190 | 2.1432 |
| 1.5845 | 820.0 | 8200 | 2.5644 |
| 1.5845 | 821.0 | 8210 | 2.1259 |
| 1.5845 | 822.0 | 8220 | 2.1217 |
| 1.5845 | 823.0 | 8230 | 1.8807 |
| 1.5845 | 824.0 | 8240 | 2.2475 |
| 1.6942 | 825.0 | 8250 | 2.7079 |
| 1.6942 | 826.0 | 8260 | 2.1418 |
| 1.6942 | 827.0 | 8270 | 1.9854 |
| 1.6942 | 828.0 | 8280 | 2.1039 |
| 1.6942 | 829.0 | 8290 | 1.9488 |
| 1.5919 | 830.0 | 8300 | 2.1037 |
| 1.5919 | 831.0 | 8310 | 2.0170 |
| 1.5919 | 832.0 | 8320 | 1.8831 |
| 1.5919 | 833.0 | 8330 | 1.7501 |
| 1.5919 | 834.0 | 8340 | 2.5991 |
| 1.6626 | 835.0 | 8350 | 2.0915 |
| 1.6626 | 836.0 | 8360 | 2.0901 |
| 1.6626 | 837.0 | 8370 | 2.0779 |
| 1.6626 | 838.0 | 8380 | 1.9901 |
| 1.6626 | 839.0 | 8390 | 2.1458 |
| 1.5978 | 840.0 | 8400 | 2.1409 |
| 1.5978 | 841.0 | 8410 | 2.2341 |
| 1.5978 | 842.0 | 8420 | 2.3387 |
| 1.5978 | 843.0 | 8430 | 2.0669 |
| 1.5978 | 844.0 | 8440 | 2.1725 |
| 1.6153 | 845.0 | 8450 | 1.9977 |
| 1.6153 | 846.0 | 8460 | 2.3008 |
| 1.6153 | 847.0 | 8470 | 2.0032 |
| 1.6153 | 848.0 | 8480 | 2.0802 |
| 1.6153 | 849.0 | 8490 | 2.1358 |
| 1.6977 | 850.0 | 8500 | 2.2539 |
| 1.6977 | 851.0 | 8510 | 2.3892 |
| 1.6977 | 852.0 | 8520 | 1.8730 |
| 1.6977 | 853.0 | 8530 | 2.4494 |
| 1.6977 | 854.0 | 8540 | 1.7971 |
| 1.6117 | 855.0 | 8550 | 1.8645 |
| 1.6117 | 856.0 | 8560 | 2.1854 |
| 1.6117 | 857.0 | 8570 | 1.7846 |
| 1.6117 | 858.0 | 8580 | 2.0895 |
| 1.6117 | 859.0 | 8590 | 1.9494 |
| 1.6776 | 860.0 | 8600 | 3.0806 |
| 1.6776 | 861.0 | 8610 | 2.5941 |
| 1.6776 | 862.0 | 8620 | 1.8778 |
| 1.6776 | 863.0 | 8630 | 1.9408 |
| 1.6776 | 864.0 | 8640 | 2.0962 |
| 1.7326 | 865.0 | 8650 | 1.8876 |
| 1.7326 | 866.0 | 8660 | 1.9434 |
| 1.7326 | 867.0 | 8670 | 2.0616 |
| 1.7326 | 868.0 | 8680 | 2.4041 |
| 1.7326 | 869.0 | 8690 | 2.8890 |
| 1.6468 | 870.0 | 8700 | 2.1031 |
| 1.6468 | 871.0 | 8710 | 2.1359 |
| 1.6468 | 872.0 | 8720 | 1.8292 |
| 1.6468 | 873.0 | 8730 | 2.0762 |
| 1.6468 | 874.0 | 8740 | 2.1207 |
| 1.7116 | 875.0 | 8750 | 1.8605 |
| 1.7116 | 876.0 | 8760 | 1.8536 |
| 1.7116 | 877.0 | 8770 | 2.0260 |
| 1.7116 | 878.0 | 8780 | 2.6150 |
| 1.7116 | 879.0 | 8790 | 1.9157 |
| 1.5673 | 880.0 | 8800 | 1.9184 |
| 1.5673 | 881.0 | 8810 | 1.9319 |
| 1.5673 | 882.0 | 8820 | 2.4362 |
| 1.5673 | 883.0 | 8830 | 1.9637 |
| 1.5673 | 884.0 | 8840 | 1.8797 |
| 1.7281 | 885.0 | 8850 | 1.9358 |
| 1.7281 | 886.0 | 8860 | 2.0570 |
| 1.7281 | 887.0 | 8870 | 1.8167 |
| 1.7281 | 888.0 | 8880 | 2.4525 |
| 1.7281 | 889.0 | 8890 | 2.0002 |
| 1.6826 | 890.0 | 8900 | 2.1198 |
| 1.6826 | 891.0 | 8910 | 2.0699 |
| 1.6826 | 892.0 | 8920 | 1.9274 |
| 1.6826 | 893.0 | 8930 | 2.1415 |
| 1.6826 | 894.0 | 8940 | 2.2883 |
| 1.6198 | 895.0 | 8950 | 2.0476 |
| 1.6198 | 896.0 | 8960 | 2.2307 |
| 1.6198 | 897.0 | 8970 | 2.0366 |
| 1.6198 | 898.0 | 8980 | 2.2318 |
| 1.6198 | 899.0 | 8990 | 1.8846 |
| 1.6745 | 900.0 | 9000 | 2.1018 |
| 1.6745 | 901.0 | 9010 | 1.9280 |
| 1.6745 | 902.0 | 9020 | 1.9235 |
| 1.6745 | 903.0 | 9030 | 1.9320 |
| 1.6745 | 904.0 | 9040 | 2.0586 |
| 1.6756 | 905.0 | 9050 | 2.2404 |
| 1.6756 | 906.0 | 9060 | 1.7918 |
| 1.6756 | 907.0 | 9070 | 2.0683 |
| 1.6756 | 908.0 | 9080 | 2.1354 |
| 1.6756 | 909.0 | 9090 | 1.8801 |
| 1.6787 | 910.0 | 9100 | 1.9743 |
| 1.6787 | 911.0 | 9110 | 1.9033 |
| 1.6787 | 912.0 | 9120 | 1.9763 |
| 1.6787 | 913.0 | 9130 | 2.4240 |
| 1.6787 | 914.0 | 9140 | 2.1385 |
| 1.7097 | 915.0 | 9150 | 2.1198 |
| 1.7097 | 916.0 | 9160 | 2.0050 |
| 1.7097 | 917.0 | 9170 | 2.2088 |
| 1.7097 | 918.0 | 9180 | 2.1206 |
| 1.7097 | 919.0 | 9190 | 2.0948 |
| 1.6659 | 920.0 | 9200 | 1.8802 |
| 1.6659 | 921.0 | 9210 | 2.1338 |
| 1.6659 | 922.0 | 9220 | 2.1038 |
| 1.6659 | 923.0 | 9230 | 1.9181 |
| 1.6659 | 924.0 | 9240 | 2.7046 |
| 1.6811 | 925.0 | 9250 | 2.0183 |
| 1.6811 | 926.0 | 9260 | 1.8901 |
| 1.6811 | 927.0 | 9270 | 1.9689 |
| 1.6811 | 928.0 | 9280 | 2.0394 |
| 1.6811 | 929.0 | 9290 | 2.2120 |
| 1.6563 | 930.0 | 9300 | 2.0195 |
| 1.6563 | 931.0 | 9310 | 1.9242 |
| 1.6563 | 932.0 | 9320 | 1.9250 |
| 1.6563 | 933.0 | 9330 | 2.0381 |
| 1.6563 | 934.0 | 9340 | 2.0593 |
| 1.6305 | 935.0 | 9350 | 2.0884 |
| 1.6305 | 936.0 | 9360 | 2.2510 |
| 1.6305 | 937.0 | 9370 | 2.1661 |
| 1.6305 | 938.0 | 9380 | 2.1428 |
| 1.6305 | 939.0 | 9390 | 1.9285 |
| 1.7281 | 940.0 | 9400 | 2.2593 |
| 1.7281 | 941.0 | 9410 | 1.9035 |
| 1.7281 | 942.0 | 9420 | 2.1112 |
| 1.7281 | 943.0 | 9430 | 1.8724 |
| 1.7281 | 944.0 | 9440 | 2.1733 |
| 1.7082 | 945.0 | 9450 | 2.0155 |
| 1.7082 | 946.0 | 9460 | 2.3869 |
| 1.7082 | 947.0 | 9470 | 1.8851 |
| 1.7082 | 948.0 | 9480 | 2.0056 |
| 1.7082 | 949.0 | 9490 | 2.2667 |
| 1.6896 | 950.0 | 9500 | 1.8944 |
| 1.6896 | 951.0 | 9510 | 2.1082 |
| 1.6896 | 952.0 | 9520 | 1.9545 |
| 1.6896 | 953.0 | 9530 | 1.8668 |
| 1.6896 | 954.0 | 9540 | 2.0611 |
| 1.6217 | 955.0 | 9550 | 1.9020 |
| 1.6217 | 956.0 | 9560 | 1.9017 |
| 1.6217 | 957.0 | 9570 | 1.8864 |
| 1.6217 | 958.0 | 9580 | 1.8889 |
| 1.6217 | 959.0 | 9590 | 2.1421 |
| 1.6448 | 960.0 | 9600 | 2.0292 |
| 1.6448 | 961.0 | 9610 | 1.9317 |
| 1.6448 | 962.0 | 9620 | 2.1516 |
| 1.6448 | 963.0 | 9630 | 1.9716 |
| 1.6448 | 964.0 | 9640 | 2.1114 |
| 1.6824 | 965.0 | 9650 | 2.1036 |
| 1.6824 | 966.0 | 9660 | 2.0659 |
| 1.6824 | 967.0 | 9670 | 1.9232 |
| 1.6824 | 968.0 | 9680 | 1.9512 |
| 1.6824 | 969.0 | 9690 | 1.9665 |
| 1.6428 | 970.0 | 9700 | 2.0697 |
| 1.6428 | 971.0 | 9710 | 2.4811 |
| 1.6428 | 972.0 | 9720 | 2.2800 |
| 1.6428 | 973.0 | 9730 | 2.0109 |
| 1.6428 | 974.0 | 9740 | 1.9637 |
| 1.6795 | 975.0 | 9750 | 1.7978 |
| 1.6795 | 976.0 | 9760 | 2.2613 |
| 1.6795 | 977.0 | 9770 | 2.0626 |
| 1.6795 | 978.0 | 9780 | 1.9644 |
| 1.6795 | 979.0 | 9790 | 1.9700 |
| 1.668 | 980.0 | 9800 | 2.0342 |
| 1.668 | 981.0 | 9810 | 1.9443 |
| 1.668 | 982.0 | 9820 | 1.9675 |
| 1.668 | 983.0 | 9830 | 1.8887 |
| 1.668 | 984.0 | 9840 | 1.9073 |
| 1.6776 | 985.0 | 9850 | 2.0161 |
| 1.6776 | 986.0 | 9860 | 1.8777 |
| 1.6776 | 987.0 | 9870 | 2.4692 |
| 1.6776 | 988.0 | 9880 | 2.0462 |
| 1.6776 | 989.0 | 9890 | 1.9776 |
| 1.744 | 990.0 | 9900 | 2.0838 |
| 1.744 | 991.0 | 9910 | 2.1438 |
| 1.744 | 992.0 | 9920 | 2.2172 |
| 1.744 | 993.0 | 9930 | 2.4513 |
| 1.744 | 994.0 | 9940 | 1.8723 |
| 1.644 | 995.0 | 9950 | 2.9081 |
| 1.644 | 996.0 | 9960 | 1.8090 |
| 1.644 | 997.0 | 9970 | 1.9621 |
| 1.644 | 998.0 | 9980 | 2.1157 |
| 1.644 | 999.0 | 9990 | 1.9026 |
| 1.6751 | 1000.0 | 10000 | 2.1397 |
### Framework versions
- Transformers 4.46.3
- Pytorch 2.5.1+cu124
- Datasets 3.1.0
- Tokenizers 0.20.3
|
gokulsrinivasagan/bert_tiny_lda_100_v1_qqp
|
gokulsrinivasagan
| 2024-12-04T15:43:12Z | 121 | 0 |
transformers
|
[
"transformers",
"tensorboard",
"safetensors",
"distilbert",
"text-classification",
"generated_from_trainer",
"en",
"dataset:glue",
"base_model:gokulsrinivasagan/bert_tiny_lda_100_v1",
"base_model:finetune:gokulsrinivasagan/bert_tiny_lda_100_v1",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-classification
| 2024-11-26T22:36:39Z |
---
library_name: transformers
language:
- en
base_model: gokulsrinivasagan/bert_tiny_lda_100_v1
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- accuracy
- f1
model-index:
- name: bert_tiny_lda_100_v1_qqp
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: GLUE QQP
type: glue
args: qqp
metrics:
- name: Accuracy
type: accuracy
value: 0.8543408360128617
- name: F1
type: f1
value: 0.8063020096700984
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert_tiny_lda_100_v1_qqp
This model is a fine-tuned version of [gokulsrinivasagan/bert_tiny_lda_100_v1](https://huggingface.co/gokulsrinivasagan/bert_tiny_lda_100_v1) on the GLUE QQP dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3551
- Accuracy: 0.8543
- F1: 0.8063
- Combined Score: 0.8303
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 256
- eval_batch_size: 256
- seed: 10
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 50
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | Combined Score |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|:------:|:--------------:|
| 0.4874 | 1.0 | 1422 | 0.4274 | 0.7980 | 0.7125 | 0.7553 |
| 0.388 | 2.0 | 2844 | 0.3786 | 0.8224 | 0.7726 | 0.7975 |
| 0.3354 | 3.0 | 4266 | 0.3613 | 0.8372 | 0.7899 | 0.8136 |
| 0.2928 | 4.0 | 5688 | 0.3564 | 0.8447 | 0.7830 | 0.8139 |
| 0.2583 | 5.0 | 7110 | 0.3614 | 0.8509 | 0.7997 | 0.8253 |
| 0.2277 | 6.0 | 8532 | 0.3551 | 0.8543 | 0.8063 | 0.8303 |
| 0.2014 | 7.0 | 9954 | 0.3854 | 0.8552 | 0.8093 | 0.8322 |
| 0.1784 | 8.0 | 11376 | 0.3979 | 0.8545 | 0.8064 | 0.8305 |
| 0.1578 | 9.0 | 12798 | 0.4261 | 0.8558 | 0.8102 | 0.8330 |
| 0.1403 | 10.0 | 14220 | 0.4443 | 0.8588 | 0.8108 | 0.8348 |
| 0.1246 | 11.0 | 15642 | 0.4678 | 0.8567 | 0.8093 | 0.8330 |
### Framework versions
- Transformers 4.46.3
- Pytorch 2.2.1+cu118
- Datasets 2.17.0
- Tokenizers 0.20.3
|
MarsupialAI/Monstral-123B-v2_GGUF
|
MarsupialAI
| 2024-12-04T15:40:43Z | 1,990 | 3 |
transformers
|
[
"transformers",
"gguf",
"chat",
"text-generation",
"en",
"license:other",
"endpoints_compatible",
"region:us",
"imatrix",
"conversational"
] |
text-generation
| 2024-12-03T16:37:58Z |
---
license: other
license_name: mrl
language:
- en
tags:
- chat
pipeline_tag: text-generation
library_name: transformers
---
iMatrix GGUF quantizations of https://huggingface.co/MarsupialAI/Monstral-123B-v2
The importance matrix (iMatrix) was generated with Kalomaze's groups_merged.txt
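A minimal sketch of running one of these files locally, assuming the `llama-cpp-python` bindings and an already-downloaded GGUF; the file name, context size, and GPU layer count are placeholders:
```python
# Sketch: loading a downloaded Monstral GGUF with llama-cpp-python (assumed setup).
from llama_cpp import Llama

llm = Llama(
    model_path="Monstral-123B-v2.Q4_K_M.gguf",  # placeholder file name
    n_ctx=8192,                                 # placeholder context window
    n_gpu_layers=-1,                            # offload everything if VRAM allows
)
out = llm("Write a short scene set on a stormy coast.", max_tokens=256)
print(out["choices"][0]["text"])
```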
|
gokulsrinivasagan/bert_tiny_lda_50_v1_wnli
|
gokulsrinivasagan
| 2024-12-04T15:30:31Z | 122 | 0 |
transformers
|
[
"transformers",
"tensorboard",
"safetensors",
"distilbert",
"text-classification",
"generated_from_trainer",
"en",
"dataset:glue",
"base_model:gokulsrinivasagan/bert_tiny_lda_50_v1",
"base_model:finetune:gokulsrinivasagan/bert_tiny_lda_50_v1",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-classification
| 2024-11-26T22:08:32Z |
---
library_name: transformers
language:
- en
base_model: gokulsrinivasagan/bert_tiny_lda_50_v1
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- accuracy
model-index:
- name: bert_tiny_lda_50_v1_wnli
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: GLUE WNLI
type: glue
args: wnli
metrics:
- name: Accuracy
type: accuracy
value: 0.5633802816901409
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert_tiny_lda_50_v1_wnli
This model is a fine-tuned version of [gokulsrinivasagan/bert_tiny_lda_50_v1](https://huggingface.co/gokulsrinivasagan/bert_tiny_lda_50_v1) on the GLUE WNLI dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6909
- Accuracy: 0.5634
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 256
- eval_batch_size: 256
- seed: 10
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 50
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.6929 | 1.0 | 3 | 0.6941 | 0.5634 |
| 0.6983 | 2.0 | 6 | 0.7115 | 0.4507 |
| 0.7073 | 3.0 | 9 | 0.7136 | 0.4507 |
| 0.7008 | 4.0 | 12 | 0.6909 | 0.5634 |
| 0.703 | 5.0 | 15 | 0.6937 | 0.5634 |
| 0.6958 | 6.0 | 18 | 0.7127 | 0.4366 |
| 0.6937 | 7.0 | 21 | 0.7092 | 0.4225 |
| 0.6955 | 8.0 | 24 | 0.7010 | 0.4789 |
| 0.6926 | 9.0 | 27 | 0.7048 | 0.4507 |
### Framework versions
- Transformers 4.46.3
- Pytorch 2.2.1+cu118
- Datasets 2.17.0
- Tokenizers 0.20.3
|
gokulsrinivasagan/bert_tiny_lda_50_v1_stsb
|
gokulsrinivasagan
| 2024-12-04T15:29:48Z | 107 | 0 |
transformers
|
[
"transformers",
"tensorboard",
"safetensors",
"distilbert",
"text-classification",
"generated_from_trainer",
"en",
"dataset:glue",
"base_model:gokulsrinivasagan/bert_tiny_lda_50_v1",
"base_model:finetune:gokulsrinivasagan/bert_tiny_lda_50_v1",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-classification
| 2024-11-26T22:07:37Z |
---
library_name: transformers
language:
- en
base_model: gokulsrinivasagan/bert_tiny_lda_50_v1
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- spearmanr
model-index:
- name: bert_tiny_lda_50_v1_stsb
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: GLUE STSB
type: glue
args: stsb
metrics:
- name: Spearmanr
type: spearmanr
value: 0.18445543404254133
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert_tiny_lda_50_v1_stsb
This model is a fine-tuned version of [gokulsrinivasagan/bert_tiny_lda_50_v1](https://huggingface.co/gokulsrinivasagan/bert_tiny_lda_50_v1) on the GLUE STSB dataset.
It achieves the following results on the evaluation set:
- Loss: 2.3765
- Pearson: 0.1882
- Spearmanr: 0.1845
- Combined Score: 0.1863
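A minimal sketch of scoring sentence pairs and recomputing the correlations, assuming the single-logit regression head that GLUE STS-B fine-tuning produces; the sentence pairs and gold scores are placeholders:
```python
# Sketch: predicting similarity scores and correlating them with gold labels (placeholder data).
import torch
from scipy.stats import pearsonr, spearmanr
from transformers import AutoTokenizer, AutoModelForSequenceClassification

name = "gokulsrinivasagan/bert_tiny_lda_50_v1_stsb"
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModelForSequenceClassification.from_pretrained(name)  # assumed num_labels=1 head

pairs = [("A man is playing a guitar.", "A person plays an instrument."),
         ("A dog runs in the park.", "The stock market fell today."),
         ("Two kids are baking a cake.", "Children are making dessert.")]
gold = [4.2, 0.1, 3.8]  # placeholder gold similarity scores on the 0-5 scale

inputs = tokenizer([a for a, _ in pairs], [b for _, b in pairs],
                   padding=True, truncation=True, return_tensors="pt")
with torch.no_grad():
    preds = model(**inputs).logits.squeeze(-1).tolist()

print("Pearson:", pearsonr(preds, gold)[0], "Spearmanr:", spearmanr(preds, gold)[0])
```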
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 256
- eval_batch_size: 256
- seed: 10
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 50
### Training results
| Training Loss | Epoch | Step | Validation Loss | Pearson | Spearmanr | Combined Score |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:---------:|:--------------:|
| 2.7864 | 1.0 | 23 | 2.4307 | 0.0696 | 0.0608 | 0.0652 |
| 2.0367 | 2.0 | 46 | 2.7090 | 0.0822 | 0.0803 | 0.0812 |
| 1.9366 | 3.0 | 69 | 2.6122 | 0.1308 | 0.1278 | 0.1293 |
| 1.801 | 4.0 | 92 | 2.3765 | 0.1882 | 0.1845 | 0.1863 |
| 1.6026 | 5.0 | 115 | 2.5150 | 0.1998 | 0.2016 | 0.2007 |
| 1.3683 | 6.0 | 138 | 2.7483 | 0.1906 | 0.1931 | 0.1918 |
| 1.2143 | 7.0 | 161 | 2.5671 | 0.2152 | 0.2193 | 0.2173 |
| 1.0711 | 8.0 | 184 | 3.4697 | 0.1887 | 0.1904 | 0.1895 |
| 1.1482 | 9.0 | 207 | 2.6297 | 0.2285 | 0.2319 | 0.2302 |
### Framework versions
- Transformers 4.46.3
- Pytorch 2.2.1+cu118
- Datasets 2.17.0
- Tokenizers 0.20.3
|
asmaRida00/model
|
asmaRida00
| 2024-12-04T15:29:26Z | 76 | 0 |
transformers
|
[
"transformers",
"safetensors",
"gguf",
"llama",
"text-generation-inference",
"unsloth",
"trl",
"en",
"license:apache-2.0",
"endpoints_compatible",
"8-bit",
"region:us"
] | null | 2024-12-04T14:28:24Z |
---
base_model: unsloth/llama-3.2-3b-instruct-bnb-4bit
tags:
- text-generation-inference
- transformers
- unsloth
- llama
- trl
license: apache-2.0
language:
- en
---
# Uploaded model
- **Developed by:** asmaRida00
- **License:** apache-2.0
- **Finetuned from model :** unsloth/llama-3.2-3b-instruct-bnb-4bit
This llama model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Hugging Face's TRL library.
[<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
|
gokulsrinivasagan/bert_tiny_lda_20_v1_sst2
|
gokulsrinivasagan
| 2024-12-04T15:27:06Z | 122 | 0 |
transformers
|
[
"transformers",
"tensorboard",
"safetensors",
"distilbert",
"text-classification",
"generated_from_trainer",
"en",
"dataset:glue",
"base_model:gokulsrinivasagan/bert_tiny_lda_20_v1",
"base_model:finetune:gokulsrinivasagan/bert_tiny_lda_20_v1",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-classification
| 2024-11-26T21:57:38Z |
---
library_name: transformers
language:
- en
base_model: gokulsrinivasagan/bert_tiny_lda_20_v1
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- accuracy
model-index:
- name: bert_tiny_lda_20_v1_sst2
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: GLUE SST2
type: glue
args: sst2
metrics:
- name: Accuracy
type: accuracy
value: 0.8165137614678899
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert_tiny_lda_20_v1_sst2
This model is a fine-tuned version of [gokulsrinivasagan/bert_tiny_lda_20_v1](https://huggingface.co/gokulsrinivasagan/bert_tiny_lda_20_v1) on the GLUE SST2 dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4656
- Accuracy: 0.8165
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a reproduction sketch follows the list):
- learning_rate: 5e-05
- train_batch_size: 256
- eval_batch_size: 256
- seed: 10
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 50
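As a minimal sketch (not the exact training script), the hyperparameters above map onto `TrainingArguments`/`Trainer` roughly as follows; the GLUE SST-2 loading, tokenization, and output directory name are assumptions:
```python
from datasets import load_dataset
from transformers import (AutoTokenizer, AutoModelForSequenceClassification,
                          Trainer, TrainingArguments)

base = "gokulsrinivasagan/bert_tiny_lda_20_v1"
tokenizer = AutoTokenizer.from_pretrained(base)
model = AutoModelForSequenceClassification.from_pretrained(base, num_labels=2)

# GLUE SST-2: single-sentence binary classification
raw = load_dataset("glue", "sst2")
encoded = raw.map(lambda ex: tokenizer(ex["sentence"], truncation=True), batched=True)

args = TrainingArguments(
    output_dir="bert_tiny_lda_20_v1_sst2",  # assumed
    learning_rate=5e-05,
    per_device_train_batch_size=256,
    per_device_eval_batch_size=256,
    seed=10,
    optim="adamw_torch",
    lr_scheduler_type="linear",
    num_train_epochs=50,
)
trainer = Trainer(model=model, args=args,
                  train_dataset=encoded["train"],
                  eval_dataset=encoded["validation"],
                  tokenizer=tokenizer)  # default collator pads batches dynamically
trainer.train()
```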
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.4233 | 1.0 | 264 | 0.4932 | 0.7844 |
| 0.2592 | 2.0 | 528 | 0.4656 | 0.8165 |
| 0.1987 | 3.0 | 792 | 0.4725 | 0.8131 |
| 0.161 | 4.0 | 1056 | 0.5317 | 0.8108 |
| 0.1363 | 5.0 | 1320 | 0.5661 | 0.8073 |
| 0.1147 | 6.0 | 1584 | 0.6462 | 0.8085 |
| 0.0959 | 7.0 | 1848 | 0.7103 | 0.7970 |
### Framework versions
- Transformers 4.46.3
- Pytorch 2.2.1+cu118
- Datasets 2.17.0
- Tokenizers 0.20.3
|
miasetya/fine_tuned_t5_small_model_sec_5_v8
|
miasetya
| 2024-12-04T15:24:23Z | 68 | 0 |
transformers
|
[
"transformers",
"safetensors",
"t5",
"text2text-generation",
"generated_from_trainer",
"base_model:google-t5/t5-small",
"base_model:finetune:google-t5/t5-small",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text2text-generation
| 2024-12-04T15:24:11Z |
---
library_name: transformers
license: apache-2.0
base_model: t5-small
tags:
- generated_from_trainer
metrics:
- rouge
model-index:
- name: fine_tuned_t5_small_model_sec_5_v8
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# fine_tuned_t5_small_model_sec_5_v8
This model is a fine-tuned version of [t5-small](https://huggingface.co/t5-small) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 3.0291
- Rouge1: 0.4097
- Rouge2: 0.1755
- Rougel: 0.265
- Rougelsum: 0.265
- Gen Len: 91.6842
- Bert F1: 0.8773
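As a minimal usage sketch for this summarization checkpoint (the repo id is taken from this card; the `summarize:` prefix and generation length are assumptions, not tuned values):
```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "miasetya/fine_tuned_t5_small_model_sec_5_v8"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

text = "summarize: " + "Your source document goes here."
inputs = tokenizer(text, return_tensors="pt", truncation=True)
summary_ids = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```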
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 5
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum | Gen Len | Bert F1 |
|:-------------:|:------:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:-------:|:-------:|
| 4.0511 | 0.8333 | 40 | 3.4679 | 0.3889 | 0.1606 | 0.2546 | 0.2544 | 76.5053 | 0.8751 |
| 3.5797 | 1.6667 | 80 | 3.2230 | 0.3977 | 0.1652 | 0.2577 | 0.2572 | 83.4368 | 0.8754 |
| 3.412 | 2.5 | 120 | 3.1147 | 0.4011 | 0.1665 | 0.2601 | 0.2595 | 86.7526 | 0.8758 |
| 3.4241 | 3.3333 | 160 | 3.0614 | 0.4082 | 0.1739 | 0.2624 | 0.2622 | 89.2895 | 0.877 |
| 3.3084 | 4.1667 | 200 | 3.0368 | 0.4066 | 0.1733 | 0.2633 | 0.2639 | 89.6579 | 0.8769 |
| 3.3262 | 5.0 | 240 | 3.0291 | 0.4097 | 0.1755 | 0.265 | 0.265 | 91.6842 | 0.8773 |
### Framework versions
- Transformers 4.46.3
- Pytorch 2.4.0
- Datasets 3.1.0
- Tokenizers 0.20.3
|
gokulsrinivasagan/bert_tiny_lda_50_v1_rte
|
gokulsrinivasagan
| 2024-12-04T15:24:08Z | 76 | 0 |
transformers
|
[
"transformers",
"tensorboard",
"safetensors",
"distilbert",
"text-classification",
"generated_from_trainer",
"en",
"dataset:glue",
"base_model:gokulsrinivasagan/bert_tiny_lda_50_v1",
"base_model:finetune:gokulsrinivasagan/bert_tiny_lda_50_v1",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-classification
| 2024-11-26T22:00:09Z |
---
library_name: transformers
language:
- en
base_model: gokulsrinivasagan/bert_tiny_lda_50_v1
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- accuracy
model-index:
- name: bert_tiny_lda_50_v1_rte
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: GLUE RTE
type: glue
args: rte
metrics:
- name: Accuracy
type: accuracy
value: 0.516245487364621
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert_tiny_lda_50_v1_rte
This model is a fine-tuned version of [gokulsrinivasagan/bert_tiny_lda_50_v1](https://huggingface.co/gokulsrinivasagan/bert_tiny_lda_50_v1) on the GLUE RTE dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6947
- Accuracy: 0.5162
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 256
- eval_batch_size: 256
- seed: 10
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 50
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.7102 | 1.0 | 10 | 0.7030 | 0.4513 |
| 0.6931 | 2.0 | 20 | 0.6947 | 0.5162 |
| 0.684 | 3.0 | 30 | 0.6952 | 0.5596 |
| 0.6672 | 4.0 | 40 | 0.6998 | 0.5415 |
| 0.6337 | 5.0 | 50 | 0.7336 | 0.5162 |
| 0.5785 | 6.0 | 60 | 0.8523 | 0.4693 |
| 0.5141 | 7.0 | 70 | 0.8520 | 0.5415 |
### Framework versions
- Transformers 4.46.3
- Pytorch 2.2.1+cu118
- Datasets 2.17.0
- Tokenizers 0.20.3
|
gokulsrinivasagan/bert_tiny_lda_50_v1_qqp
|
gokulsrinivasagan
| 2024-12-04T15:23:03Z | 73 | 0 |
transformers
|
[
"transformers",
"tensorboard",
"safetensors",
"distilbert",
"text-classification",
"generated_from_trainer",
"en",
"dataset:glue",
"base_model:gokulsrinivasagan/bert_tiny_lda_50_v1",
"base_model:finetune:gokulsrinivasagan/bert_tiny_lda_50_v1",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-classification
| 2024-11-26T21:16:30Z |
---
library_name: transformers
language:
- en
base_model: gokulsrinivasagan/bert_tiny_lda_50_v1
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- accuracy
- f1
model-index:
- name: bert_tiny_lda_50_v1_qqp
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: GLUE QQP
type: glue
args: qqp
metrics:
- name: Accuracy
type: accuracy
value: 0.824956715310413
- name: F1
type: f1
value: 0.7710671885614465
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert_tiny_lda_50_v1_qqp
This model is a fine-tuned version of [gokulsrinivasagan/bert_tiny_lda_50_v1](https://huggingface.co/gokulsrinivasagan/bert_tiny_lda_50_v1) on the GLUE QQP dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3820
- Accuracy: 0.8250
- F1: 0.7711
- Combined Score: 0.7980
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 256
- eval_batch_size: 256
- seed: 10
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 50
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | Combined Score |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|:------:|:--------------:|
| 0.4989 | 1.0 | 1422 | 0.4520 | 0.7819 | 0.6719 | 0.7269 |
| 0.403 | 2.0 | 2844 | 0.3968 | 0.8146 | 0.7527 | 0.7836 |
| 0.3459 | 3.0 | 4266 | 0.3820 | 0.8250 | 0.7711 | 0.7980 |
| 0.2995 | 4.0 | 5688 | 0.3891 | 0.8334 | 0.7644 | 0.7989 |
| 0.2618 | 5.0 | 7110 | 0.4069 | 0.8376 | 0.7723 | 0.8050 |
| 0.2277 | 6.0 | 8532 | 0.3923 | 0.8411 | 0.7899 | 0.8155 |
| 0.1997 | 7.0 | 9954 | 0.4387 | 0.8435 | 0.7863 | 0.8149 |
| 0.1755 | 8.0 | 11376 | 0.4638 | 0.8448 | 0.7911 | 0.8180 |
### Framework versions
- Transformers 4.46.3
- Pytorch 2.2.1+cu118
- Datasets 2.17.0
- Tokenizers 0.20.3
|
gokulsrinivasagan/bert_tiny_lda_20_v1_qqp
|
gokulsrinivasagan
| 2024-12-04T15:21:16Z | 90 | 0 |
transformers
|
[
"transformers",
"tensorboard",
"safetensors",
"distilbert",
"text-classification",
"generated_from_trainer",
"en",
"dataset:glue",
"base_model:gokulsrinivasagan/bert_tiny_lda_20_v1",
"base_model:finetune:gokulsrinivasagan/bert_tiny_lda_20_v1",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-classification
| 2024-11-26T21:28:12Z |
---
library_name: transformers
language:
- en
base_model: gokulsrinivasagan/bert_tiny_lda_20_v1
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- accuracy
- f1
model-index:
- name: bert_tiny_lda_20_v1_qqp
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: GLUE QQP
type: glue
args: qqp
metrics:
- name: Accuracy
type: accuracy
value: 0.8355181795696265
- name: F1
type: f1
value: 0.7821386450006552
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert_tiny_lda_20_v1_qqp
This model is a fine-tuned version of [gokulsrinivasagan/bert_tiny_lda_20_v1](https://huggingface.co/gokulsrinivasagan/bert_tiny_lda_20_v1) on the GLUE QQP dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3641
- Accuracy: 0.8355
- F1: 0.7821
- Combined Score: 0.8088
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 256
- eval_batch_size: 256
- seed: 10
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 50
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | Combined Score |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|:------:|:--------------:|
| 0.4896 | 1.0 | 1422 | 0.4397 | 0.7902 | 0.6817 | 0.7359 |
| 0.3891 | 2.0 | 2844 | 0.3806 | 0.8247 | 0.7674 | 0.7960 |
| 0.3332 | 3.0 | 4266 | 0.3641 | 0.8355 | 0.7821 | 0.8088 |
| 0.29 | 4.0 | 5688 | 0.3666 | 0.8448 | 0.7868 | 0.8158 |
| 0.2535 | 5.0 | 7110 | 0.3724 | 0.8485 | 0.7977 | 0.8231 |
| 0.2212 | 6.0 | 8532 | 0.3716 | 0.8517 | 0.8042 | 0.8280 |
| 0.1947 | 7.0 | 9954 | 0.4039 | 0.8528 | 0.8050 | 0.8289 |
| 0.1711 | 8.0 | 11376 | 0.4276 | 0.8535 | 0.7964 | 0.8249 |
### Framework versions
- Transformers 4.46.3
- Pytorch 2.2.1+cu118
- Datasets 2.17.0
- Tokenizers 0.20.3
|
gokulsrinivasagan/bert_tiny_lda_5_v1_wnli
|
gokulsrinivasagan
| 2024-12-04T15:13:44Z | 107 | 0 |
transformers
|
[
"transformers",
"tensorboard",
"safetensors",
"distilbert",
"text-classification",
"generated_from_trainer",
"en",
"dataset:glue",
"base_model:gokulsrinivasagan/bert_tiny_lda_5_v1",
"base_model:finetune:gokulsrinivasagan/bert_tiny_lda_5_v1",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-classification
| 2024-11-26T21:14:05Z |
---
library_name: transformers
language:
- en
base_model: gokulsrinivasagan/bert_tiny_lda_5_v1
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- accuracy
model-index:
- name: bert_tiny_lda_5_v1_wnli
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: GLUE WNLI
type: glue
args: wnli
metrics:
- name: Accuracy
type: accuracy
value: 0.4507042253521127
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert_tiny_lda_5_v1_wnli
This model is a fine-tuned version of [gokulsrinivasagan/bert_tiny_lda_5_v1](https://huggingface.co/gokulsrinivasagan/bert_tiny_lda_5_v1) on the GLUE WNLI dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7044
- Accuracy: 0.4507
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 256
- eval_batch_size: 256
- seed: 10
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 50
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.7046 | 1.0 | 3 | 0.7044 | 0.4507 |
| 0.6988 | 2.0 | 6 | 0.7140 | 0.4789 |
| 0.6966 | 3.0 | 9 | 0.7436 | 0.2958 |
| 0.6905 | 4.0 | 12 | 0.7424 | 0.2394 |
| 0.6915 | 5.0 | 15 | 0.7469 | 0.2958 |
| 0.6853 | 6.0 | 18 | 0.7605 | 0.2113 |
### Framework versions
- Transformers 4.46.3
- Pytorch 2.2.1+cu118
- Datasets 2.17.0
- Tokenizers 0.20.3
|
gokulsrinivasagan/distilbert_lda_20_v1_mnli
|
gokulsrinivasagan
| 2024-12-04T15:13:12Z | 119 | 0 |
transformers
|
[
"transformers",
"tensorboard",
"safetensors",
"distilbert",
"text-classification",
"generated_from_trainer",
"en",
"dataset:glue",
"base_model:gokulsrinivasagan/distilbert_lda_20_v1",
"base_model:finetune:gokulsrinivasagan/distilbert_lda_20_v1",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-classification
| 2024-11-26T21:03:19Z |
---
library_name: transformers
language:
- en
base_model: gokulsrinivasagan/distilbert_lda_20_v1
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- accuracy
model-index:
- name: distilbert_lda_20_v1_mnli
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: GLUE MNLI
type: glue
args: mnli
metrics:
- name: Accuracy
type: accuracy
value: 0.7496948738812043
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert_lda_20_v1_mnli
This model is a fine-tuned version of [gokulsrinivasagan/distilbert_lda_20_v1](https://huggingface.co/gokulsrinivasagan/distilbert_lda_20_v1) on the GLUE MNLI dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6113
- Accuracy: 0.7497
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 256
- eval_batch_size: 256
- seed: 10
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 50
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|
| 0.7997 | 1.0 | 1534 | 0.7015 | 0.6993 |
| 0.6366 | 2.0 | 3068 | 0.6519 | 0.7348 |
| 0.5415 | 3.0 | 4602 | 0.6345 | 0.7454 |
| 0.4571 | 4.0 | 6136 | 0.6564 | 0.7495 |
| 0.3787 | 5.0 | 7670 | 0.7103 | 0.7490 |
| 0.3108 | 6.0 | 9204 | 0.7443 | 0.7472 |
| 0.2508 | 7.0 | 10738 | 0.8307 | 0.7462 |
| 0.2051 | 8.0 | 12272 | 0.9429 | 0.7401 |
### Framework versions
- Transformers 4.46.3
- Pytorch 2.2.1+cu118
- Datasets 2.17.0
- Tokenizers 0.20.3
|
AstraMindAI/xtts2-gpt
|
AstraMindAI
| 2024-12-04T15:09:44Z | 9,974 | 6 | null |
[
"safetensors",
"xtts_gpt",
"custom_code",
"base_model:coqui/XTTS-v2",
"base_model:finetune:coqui/XTTS-v2",
"license:apache-2.0",
"region:us"
] | null | 2024-10-24T09:39:52Z |
---
license: apache-2.0
base_model:
- coqui/XTTS-v2
---
# Auralis 🌌
## Model Details 🛠️
**Model Name:** Auralis
**Model Architecture:** Based on [Coqui XTTS-v2](https://huggingface.co/coqui/XTTS-v2)
**License:**
- license: Apache 2.0
- base_model: XTTS-v2 Components [Coqui AI License](https://coqui.ai/cpml)
**Language Support:** English, Spanish, French, German, Italian, Portuguese, Polish, Turkish, Russian, Dutch, Czech, Arabic, Chinese (Simplified), Hungarian, Korean, Japanese, Hindi
**Developed by:** [AstraMind.ai](https://www.astramind.ai)
**GitHub:** [AstraMind AI](https://github.com/astramind-ai/Auralis/tree/main)
**Primary Use Case:** Text-to-Speech (TTS) generation for real-world applications, including books, dialogues, and multilingual tasks.
---
## Model Description 🚀
Auralis transforms text into natural, high-quality speech with exceptional speed and scalability. It is powered by [Coqui XTTS-v2](https://huggingface.co/coqui/XTTS-v2) and optimized for both consumer-grade and high-performance GPUs. Auralis is designed to meet real-world needs like long-text processing, voice cloning, and concurrent request handling.
### Key Features:
- **Warp-Speed Processing:** Generate speech for an entire novel (e.g., Harry Potter) in ~10 minutes.
- **Hardware Friendly:** Requires <10GB VRAM on a single NVIDIA RTX 3090.
- **Scalable:** Handles multiple requests simultaneously.
- **Streaming:** Seamlessly processes long texts in a streaming format.
- **Custom Voices:** Enables voice cloning from short reference audio.
---
## Quick Start ⭐
```python
from auralis import TTS, TTSRequest
# Initialize the model
tts = TTS().from_pretrained("AstraMindAI/xtts2-gpt")
# Create a TTS request
request = TTSRequest(
text="Hello Earth! This is Auralis speaking.",
speaker_files=["reference.wav"]
)
# Generate speech
output = tts.generate_speech(request)
output.save("output.wav")
```
---
## Ebook Generation 📚
Auralis converts ebooks into audio formats at lightning speed. For a complete Python script, check out [ebook_audio_generator.py](https://github.com/astramind-ai/Auralis/blob/main/examples/vocalize_a_ebook.py).
```python
from auralis import TTS, TTSRequest, AudioPreprocessingConfig  # import path assumed to mirror the quick start

# Reuse a model instance, as in the quick start above
tts = TTS().from_pretrained("AstraMindAI/xtts2-gpt")

def process_book(chapter_file: str, speaker_file: str):
    # Read chapter
    with open(chapter_file, 'r') as f:
        chapter = f.read()
    # You can pass the whole book, auralis will take care of splitting
    request = TTSRequest(
        text=chapter,
        speaker_files=[speaker_file],
        audio_config=AudioPreprocessingConfig(
            enhance_speech=True,
            normalize=True
        )
    )
    output = tts.generate_speech(request)
    output.play()
    output.save("chapter_output.wav")

# Example usage
process_book("chapter1.txt", "reference_voice.wav")
```
---
## Intended Use 🌟
Auralis is designed for:
- **Content Creators:** Generate audiobooks, podcasts, or voiceovers.
- **Developers:** Integrate TTS into applications via a simple Python API.
- **Accessibility:** Provide audio versions of digital content for people with visual or reading difficulties.
- **Multilingual Scenarios:** Convert text to speech in multiple supported languages.
---
## Performance 📊
**Benchmarks on NVIDIA RTX 3090:**
- Short phrases (<100 characters): ~1 second
- Medium texts (<1,000 characters): ~5-10 seconds
- Full books (~100,000 characters): ~10 minutes
**Memory Usage:**
- Base VRAM: ~4GB
- Peak VRAM: ~10GB
---
## Model Features 🛸
1. **Speed & Efficiency:**
- Smart batching for rapid processing of long texts.
- Memory-optimized for consumer GPUs.
2. **Easy Integration:**
- Python API with support for synchronous and asynchronous workflows.
- Streaming mode for continuous playback during generation.
3. **Audio Quality Enhancements:**
- Background noise reduction.
- Voice clarity and volume normalization.
- Customizable audio preprocessing.
4. **Multilingual Support:**
- Automatic language detection.
- High-quality speech in 15+ languages.
5. **Customization:**
- Voice cloning using short reference clips.
- Adjustable parameters for tone, pacing, and language.
---
## Limitations & Ethical Considerations ⚠️
- **Voice Cloning Risks:** Auralis supports voice cloning, which may raise ethical concerns about misuse. Use responsibly and ensure proper consent.
- **Accent Limitations:** While robust for many languages, accents and intonations may vary based on the input.
---
## Citation 📜
If you use Auralis in your research or projects, please cite:
```bibtex
@misc{auralis2024,
author = {AstraMind AI},
title = {Auralis: High-Performance Text-to-Speech Engine},
year = {2024},
url = {https://huggingface.co/AstraMindAI/auralis}
}
```
|
gokulsrinivasagan/distilbert_lda_100_v1_mnli
|
gokulsrinivasagan
| 2024-12-04T15:00:49Z | 120 | 0 |
transformers
|
[
"transformers",
"tensorboard",
"safetensors",
"distilbert",
"text-classification",
"generated_from_trainer",
"en",
"dataset:glue",
"base_model:gokulsrinivasagan/distilbert_lda_100_v1",
"base_model:finetune:gokulsrinivasagan/distilbert_lda_100_v1",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-classification
| 2024-11-26T21:03:14Z |
---
library_name: transformers
language:
- en
base_model: gokulsrinivasagan/distilbert_lda_100_v1
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- accuracy
model-index:
- name: distilbert_lda_100_v1_mnli
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: GLUE MNLI
type: glue
args: mnli
metrics:
- name: Accuracy
type: accuracy
value: 0.7477624084621644
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert_lda_100_v1_mnli
This model is a fine-tuned version of [gokulsrinivasagan/distilbert_lda_100_v1](https://huggingface.co/gokulsrinivasagan/distilbert_lda_100_v1) on the GLUE MNLI dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6210
- Accuracy: 0.7478
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 256
- eval_batch_size: 256
- seed: 10
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 50
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|
| 0.8015 | 1.0 | 1534 | 0.7065 | 0.6998 |
| 0.6395 | 2.0 | 3068 | 0.6530 | 0.7296 |
| 0.5434 | 3.0 | 4602 | 0.6438 | 0.7388 |
| 0.459 | 4.0 | 6136 | 0.6610 | 0.7388 |
| 0.3802 | 5.0 | 7670 | 0.7116 | 0.7474 |
| 0.3083 | 6.0 | 9204 | 0.7747 | 0.7442 |
| 0.2483 | 7.0 | 10738 | 0.8570 | 0.7382 |
| 0.202 | 8.0 | 12272 | 0.9470 | 0.7383 |
### Framework versions
- Transformers 4.46.3
- Pytorch 2.2.1+cu118
- Datasets 2.17.0
- Tokenizers 0.20.3
|
maghrane/speecht5_finetuned_marar1000
|
maghrane
| 2024-12-04T14:54:54Z | 76 | 0 |
transformers
|
[
"transformers",
"tensorboard",
"safetensors",
"speecht5",
"text-to-audio",
"generated_from_trainer",
"base_model:microsoft/speecht5_tts",
"base_model:finetune:microsoft/speecht5_tts",
"license:mit",
"endpoints_compatible",
"region:us"
] |
text-to-audio
| 2024-12-04T14:24:45Z |
---
library_name: transformers
license: mit
base_model: microsoft/speecht5_tts
tags:
- generated_from_trainer
model-index:
- name: speecht5_finetuned_marar1000
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# speecht5_finetuned_marar1000
This model is a fine-tuned version of [microsoft/speecht5_tts](https://huggingface.co/microsoft/speecht5_tts) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4988
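A minimal, hedged inference sketch for this fine-tuned SpeechT5 checkpoint; it assumes the repo ships a processor (otherwise fall back to `microsoft/speecht5_tts`) and that a real 512-dim x-vector replaces the random placeholder speaker embedding:
```python
import torch
from transformers import SpeechT5Processor, SpeechT5ForTextToSpeech, SpeechT5HifiGan

repo = "maghrane/speecht5_finetuned_marar1000"
processor = SpeechT5Processor.from_pretrained(repo)
model = SpeechT5ForTextToSpeech.from_pretrained(repo)
vocoder = SpeechT5HifiGan.from_pretrained("microsoft/speecht5_hifigan")

inputs = processor(text="Hello from the fine-tuned model.", return_tensors="pt")
speaker_embeddings = torch.randn(1, 512)  # placeholder x-vector; use a real one in practice
speech = model.generate_speech(inputs["input_ids"], speaker_embeddings, vocoder=vocoder)
```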
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 4
- eval_batch_size: 2
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 32
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 100
- training_steps: 1000
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-------:|:----:|:---------------:|
| 0.7169 | 3.1873 | 100 | 0.6299 |
| 0.5985 | 6.3745 | 200 | 0.5623 |
| 0.5607 | 9.5618 | 300 | 0.5406 |
| 0.5473 | 12.7490 | 400 | 0.5500 |
| 0.5191 | 15.9363 | 500 | 0.5234 |
| 0.5276 | 19.1235 | 600 | 0.5260 |
| 0.5116 | 22.3108 | 700 | 0.5064 |
| 0.504 | 25.4980 | 800 | 0.5191 |
| 0.4838 | 28.6853 | 900 | 0.5001 |
| 0.4825 | 31.8725 | 1000 | 0.4988 |
### Framework versions
- Transformers 4.46.2
- Pytorch 2.5.1+cu121
- Tokenizers 0.20.3
|
gokulsrinivasagan/bert_tiny_lda_100_v1_cola
|
gokulsrinivasagan
| 2024-12-04T14:53:13Z | 106 | 0 |
transformers
|
[
"transformers",
"tensorboard",
"safetensors",
"distilbert",
"text-classification",
"generated_from_trainer",
"en",
"dataset:glue",
"base_model:gokulsrinivasagan/bert_tiny_lda_100_v1",
"base_model:finetune:gokulsrinivasagan/bert_tiny_lda_100_v1",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-classification
| 2024-11-26T22:26:48Z |
---
library_name: transformers
language:
- en
base_model: gokulsrinivasagan/bert_tiny_lda_100_v1
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
- accuracy
model-index:
- name: bert_tiny_lda_100_v1_cola
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: GLUE COLA
type: glue
args: cola
metrics:
- name: Matthews Correlation
type: matthews_correlation
value: 0.07380739515541786
- name: Accuracy
type: accuracy
value: 0.6941514611244202
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert_tiny_lda_100_v1_cola
This model is a fine-tuned version of [gokulsrinivasagan/bert_tiny_lda_100_v1](https://huggingface.co/gokulsrinivasagan/bert_tiny_lda_100_v1) on the GLUE COLA dataset.
It achieves the following results on the evaluation set (a short metric-computation sketch follows the list):
- Loss: 0.6106
- Matthews Correlation: 0.0738
- Accuracy: 0.6942
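As referenced above, the reported metrics can be recomputed with the `evaluate` library; the prediction/reference lists below are placeholders, not this model's outputs:
```python
import evaluate

matthews = evaluate.load("matthews_correlation")
accuracy = evaluate.load("accuracy")

preds = [1, 0, 1, 1]  # placeholder model predictions
refs  = [1, 0, 0, 1]  # placeholder gold labels
print(matthews.compute(predictions=preds, references=refs))
print(accuracy.compute(predictions=preds, references=refs))
```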
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 256
- eval_batch_size: 256
- seed: 10
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 50
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|:--------:|
| 0.6154 | 1.0 | 34 | 0.6166 | 0.0 | 0.6913 |
| 0.6048 | 2.0 | 68 | 0.6133 | 0.0 | 0.6913 |
| 0.5957 | 3.0 | 102 | 0.6106 | 0.0738 | 0.6942 |
| 0.5772 | 4.0 | 136 | 0.6261 | 0.0274 | 0.6903 |
| 0.5353 | 5.0 | 170 | 0.6314 | 0.1252 | 0.6692 |
| 0.496 | 6.0 | 204 | 0.6528 | 0.0916 | 0.6261 |
| 0.4518 | 7.0 | 238 | 0.6883 | 0.0837 | 0.6251 |
| 0.4203 | 8.0 | 272 | 0.7735 | 0.0681 | 0.6405 |
### Framework versions
- Transformers 4.46.3
- Pytorch 2.2.1+cu118
- Datasets 2.17.0
- Tokenizers 0.20.3
|
fbaldassarri/meta-llama_Llama-3.2-3B-auto_awq-int4-gs128-sym
|
fbaldassarri
| 2024-12-04T14:51:57Z | 95 | 0 |
transformers
|
[
"transformers",
"safetensors",
"llama",
"text-generation",
"autoround",
"autoawq",
"awq",
"gptq",
"woq",
"meta",
"pytorch",
"llama-3",
"intel-autoround",
"intel",
"en",
"de",
"fr",
"it",
"pt",
"hi",
"es",
"th",
"base_model:meta-llama/Llama-3.2-3B",
"base_model:quantized:meta-llama/Llama-3.2-3B",
"license:llama3.2",
"autotrain_compatible",
"text-generation-inference",
"4-bit",
"region:us"
] |
text-generation
| 2024-12-04T14:50:26Z |
---
language:
- en
- de
- fr
- it
- pt
- hi
- es
- th
license: llama3.2
library_name: transformers
tags:
- autoround
- autoawq
- awq
- gptq
- woq
- meta
- pytorch
- llama
- llama-3
- intel-autoround
- intel
model_name: Llama 3.2 3B
base_model: meta-llama/Llama-3.2-3B
inference: false
model_creator: meta-llama
pipeline_tag: text-generation
prompt_template: '{prompt}
'
quantized_by: fbaldassarri
---
## Model Information
Quantized version of [meta-llama/Llama-3.2-3B](https://huggingface.co/meta-llama/Llama-3.2-3B) using torch.float32 for quantization tuning.
- 4 bits (INT4)
- group size = 128
- Symmetrical Quantization
- Method AutoAWQ
Quantization framework: [Intel AutoRound](https://github.com/intel/auto-round)
Note: this INT4 version of Llama-3.2-3B has been quantized to run inference on CPU.
## Replication Recipe
### Step 1 Install Requirements
I suggest installing the requirements into a dedicated Python virtualenv or a conda environment.
```
python -m pip install <package> --upgrade
```
- accelerate==1.0.1
- auto_gptq==0.7.1
- neural_compressor==3.1
- torch==2.3.0+cpu
- torchaudio==2.5.0+cpu
- torchvision==0.18.0+cpu
- transformers==4.45.2
### Step 2 Build Intel Autoround wheel from sources
```
python -m pip install git+https://github.com/intel/auto-round.git
```
### Step 3 Script for Quantization
```
from transformers import AutoModelForCausalLM, AutoTokenizer
model_name = "meta-llama/Llama-3.2-3B"
model = AutoModelForCausalLM.from_pretrained(model_name)
tokenizer = AutoTokenizer.from_pretrained(model_name)
from auto_round import AutoRound
bits, group_size, sym = 4, 128, True
autoround = AutoRound(model, tokenizer, nsamples=128, iters=200, seqlen=512, batch_size=4, bits=bits, group_size=group_size, sym=sym)
autoround.quantize()
output_dir = "./AutoRound/meta-llama_Llama-3.2-3B-auto_awq-int4-gs128-sym"
autoround.save_quantized(output_dir, format='auto_awq', inplace=True)
```
## License
[Llama 3.2 Community License](https://github.com/meta-llama/llama-models/blob/main/models/llama3_2/LICENSE)
## Disclaimer
This quantized model comes with no warranty. It has been developed only for research purposes.
|
gokulsrinivasagan/bert_base_lda_100_v1_mnli
|
gokulsrinivasagan
| 2024-12-04T14:51:41Z | 11 | 0 |
transformers
|
[
"transformers",
"tensorboard",
"safetensors",
"distilbert",
"text-classification",
"generated_from_trainer",
"en",
"dataset:glue",
"base_model:gokulsrinivasagan/bert_base_lda_100_v1",
"base_model:finetune:gokulsrinivasagan/bert_base_lda_100_v1",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-classification
| 2024-11-26T19:56:36Z |
---
library_name: transformers
language:
- en
base_model: gokulsrinivasagan/bert_base_lda_100_v1
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- accuracy
model-index:
- name: bert_base_lda_100_v1_mnli
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: GLUE MNLI
type: glue
args: mnli
metrics:
- name: Accuracy
type: accuracy
value: 0.7162327095199349
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert_base_lda_100_v1_mnli
This model is a fine-tuned version of [gokulsrinivasagan/bert_base_lda_100_v1](https://huggingface.co/gokulsrinivasagan/bert_base_lda_100_v1) on the GLUE MNLI dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6799
- Accuracy: 0.7162
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 256
- eval_batch_size: 256
- seed: 10
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 50
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|
| 0.9588 | 1.0 | 1534 | 0.8420 | 0.6249 |
| 0.7857 | 2.0 | 3068 | 0.7451 | 0.6808 |
| 0.6825 | 3.0 | 4602 | 0.7162 | 0.6976 |
| 0.5973 | 4.0 | 6136 | 0.7056 | 0.7113 |
| 0.5208 | 5.0 | 7670 | 0.7460 | 0.7144 |
| 0.4464 | 6.0 | 9204 | 0.7907 | 0.7078 |
| 0.3775 | 7.0 | 10738 | 0.8362 | 0.7172 |
| 0.316 | 8.0 | 12272 | 0.9463 | 0.7101 |
| 0.2617 | 9.0 | 13806 | 1.0094 | 0.7111 |
### Framework versions
- Transformers 4.46.3
- Pytorch 2.2.1+cu118
- Datasets 2.17.0
- Tokenizers 0.20.3
|
miasetya/fine_tuned_t5_small_model_sec_5_v7
|
miasetya
| 2024-12-04T14:50:43Z | 106 | 0 |
transformers
|
[
"transformers",
"safetensors",
"t5",
"text2text-generation",
"generated_from_trainer",
"base_model:google-t5/t5-small",
"base_model:finetune:google-t5/t5-small",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text2text-generation
| 2024-12-04T14:50:32Z |
---
library_name: transformers
license: apache-2.0
base_model: t5-small
tags:
- generated_from_trainer
metrics:
- rouge
model-index:
- name: fine_tuned_t5_small_model_sec_5_v7
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# fine_tuned_t5_small_model_sec_5_v7
This model is a fine-tuned version of [t5-small](https://huggingface.co/t5-small) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 2.9444
- Rouge1: 0.4164
- Rouge2: 0.1713
- Rougel: 0.2629
- Rougelsum: 0.263
- Gen Len: 97.6789
- Bert F1: 0.8782
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 5
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum | Gen Len | Bert F1 |
|:-------------:|:------:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:--------:|:-------:|
| 3.7935 | 0.8333 | 40 | 3.2861 | 0.4017 | 0.1617 | 0.2553 | 0.2555 | 85.1947 | 0.8757 |
| 3.4033 | 1.6667 | 80 | 3.0832 | 0.4134 | 0.1674 | 0.2577 | 0.258 | 96.5105 | 0.8765 |
| 3.2373 | 2.5 | 120 | 3.0022 | 0.4197 | 0.1746 | 0.2653 | 0.2655 | 100.2632 | 0.8777 |
| 3.1774 | 3.3333 | 160 | 2.9664 | 0.4146 | 0.1695 | 0.262 | 0.2619 | 98.3895 | 0.8773 |
| 3.1783 | 4.1667 | 200 | 2.9495 | 0.4151 | 0.1708 | 0.2633 | 0.2633 | 97.4684 | 0.878 |
| 3.1614 | 5.0 | 240 | 2.9444 | 0.4164 | 0.1713 | 0.2629 | 0.263 | 97.6789 | 0.8782 |
### Framework versions
- Transformers 4.46.3
- Pytorch 2.4.0
- Datasets 3.1.0
- Tokenizers 0.20.3
|
code135/scene_segmentation
|
code135
| 2024-12-04T14:49:36Z | 32 | 0 |
transformers
|
[
"transformers",
"tf",
"segformer",
"generated_from_keras_callback",
"base_model:nvidia/mit-b0",
"base_model:finetune:nvidia/mit-b0",
"license:other",
"endpoints_compatible",
"region:us"
] | null | 2024-11-14T08:08:35Z |
---
library_name: transformers
license: other
base_model: nvidia/mit-b0
tags:
- generated_from_keras_callback
model-index:
- name: code135/scene_segmentation
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# code135/scene_segmentation
This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: nan
- Validation Loss: nan
- Validation Mean Iou: 0.0183
- Validation Mean Accuracy: 0.1667
- Validation Overall Accuracy: 0.1607
- Validation Accuracy Ciel: 1.0
- Validation Accuracy Vegetation: 0.0
- Validation Accuracy Batiment peu vitre (<50%): 0.0
- Validation Accuracy Batiment tres vitre (>50%): 0.0
- Validation Accuracy Couvert: 0.0
- Validation Accuracy Autre: 0.0
- Validation Iou Ciel: 0.1098
- Validation Iou Vegetation: 0.0
- Validation Iou Batiment peu vitre (<50%): 0.0
- Validation Iou Batiment tres vitre (>50%): 0.0
- Validation Iou Couvert: 0.0
- Validation Iou Autre: 0.0
- Epoch: 2
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': {'module': 'keras.optimizers.schedules', 'class_name': 'PolynomialDecay', 'config': {'initial_learning_rate': 6e-05, 'decay_steps': 120, 'end_learning_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}, 'registered_name': None}, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
### Training results
| Train Loss | Validation Loss | Validation Mean Iou | Validation Mean Accuracy | Validation Overall Accuracy | Validation Accuracy Ciel | Validation Accuracy Vegetation | Validation Accuracy Batiment peu vitre (<50%) | Validation Accuracy Batiment tres vitre (>50%) | Validation Accuracy Couvert | Validation Accuracy Autre | Validation Iou Ciel | Validation Iou Vegetation | Validation Iou Batiment peu vitre (<50%) | Validation Iou Batiment tres vitre (>50%) | Validation Iou Couvert | Validation Iou Autre | Epoch |
|:----------:|:---------------:|:-------------------:|:------------------------:|:---------------------------:|:------------------------:|:------------------------------:|:---------------------------------------------:|:----------------------------------------------:|:---------------------------:|:-------------------------:|:-------------------:|:-------------------------:|:----------------------------------------:|:-----------------------------------------:|:----------------------:|:--------------------:|:-----:|
| nan | nan | 0.0183 | 0.1667 | 0.1607 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1098 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0 |
| nan | nan | 0.0183 | 0.1667 | 0.1607 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1098 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 1 |
| nan | nan | 0.0183 | 0.1667 | 0.1607 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1098 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 2 |
### Framework versions
- Transformers 4.46.2
- TensorFlow 2.17.1
- Datasets 3.1.0
- Tokenizers 0.20.3
|
fbaldassarri/meta-llama_Llama-3.2-3B-auto_gptq-int4-gs128-sym
|
fbaldassarri
| 2024-12-04T14:48:10Z | 78 | 0 |
transformers
|
[
"transformers",
"safetensors",
"llama",
"text-generation",
"autoround",
"autogptq",
"gptq",
"woq",
"meta",
"pytorch",
"llama-3",
"intel-autoround",
"intel",
"en",
"de",
"fr",
"it",
"pt",
"hi",
"es",
"th",
"base_model:meta-llama/Llama-3.2-3B",
"base_model:quantized:meta-llama/Llama-3.2-3B",
"license:llama3.2",
"autotrain_compatible",
"text-generation-inference",
"4-bit",
"region:us"
] |
text-generation
| 2024-12-04T14:47:01Z |
---
language:
- en
- de
- fr
- it
- pt
- hi
- es
- th
license: llama3.2
library_name: transformers
tags:
- autoround
- autogptq
- gptq
- woq
- meta
- pytorch
- llama
- llama-3
- intel-autoround
- intel
model_name: Llama 3.2 3B
base_model: meta-llama/Llama-3.2-3B
inference: false
model_creator: meta-llama
pipeline_tag: text-generation
prompt_template: '{prompt}
'
quantized_by: fbaldassarri
---
## Model Information
Quantized version of [meta-llama/Llama-3.2-3B](https://huggingface.co/meta-llama/Llama-3.2-3B) using torch.float32 for quantization tuning.
- 4 bits (INT4)
- group size = 128
- Symmetrical Quantization
- Method AutoGPTQ
Quantization framework: [Intel AutoRound](https://github.com/intel/auto-round)
Note: this INT4 version of Llama-3.2-3B has been quantized to run inference on CPU.
## Replication Recipe
### Step 1 Install Requirements
I suggest installing the requirements into a dedicated Python virtualenv or a conda environment.
```
python -m pip install <package> --upgrade
```
- accelerate==1.0.1
- auto_gptq==0.7.1
- neural_compressor==3.1
- torch==2.3.0+cpu
- torchaudio==2.5.0+cpu
- torchvision==0.18.0+cpu
- transformers==4.45.2
### Step 2 Build Intel Autoround wheel from sources
```
python -m pip install git+https://github.com/intel/auto-round.git
```
### Step 3 Script for Quantization
```
from transformers import AutoModelForCausalLM, AutoTokenizer
model_name = "meta-llama/Llama-3.2-3B"
model = AutoModelForCausalLM.from_pretrained(model_name)
tokenizer = AutoTokenizer.from_pretrained(model_name)
from auto_round import AutoRound
bits, group_size, sym = 4, 128, True
autoround = AutoRound(model, tokenizer, nsamples=128, iters=200, seqlen=512, batch_size=4, bits=bits, group_size=group_size, sym=sym)
autoround.quantize()
output_dir = "./AutoRound/meta-llama_Llama-3.2-3B-auto_gptq-int4-gs128-sym"
autoround.save_quantized(output_dir, format='auto_gptq', inplace=True)
```
## License
[Llama 3.2 Community License](https://github.com/meta-llama/llama-models/blob/main/models/llama3_2/LICENSE)
## Disclaimer
This quantized model comes with no warranty. It has been developed only for research purposes.
|
fbaldassarri/meta-llama_Llama-3.2-3B-auto_round-int4-gs128-sym
|
fbaldassarri
| 2024-12-04T14:45:07Z | 77 | 0 |
transformers
|
[
"transformers",
"safetensors",
"llama",
"text-generation",
"autoround",
"intel",
"gptq",
"woq",
"meta",
"pytorch",
"llama-3",
"intel-autoround",
"en",
"de",
"fr",
"it",
"pt",
"hi",
"es",
"th",
"base_model:meta-llama/Llama-3.2-3B",
"base_model:quantized:meta-llama/Llama-3.2-3B",
"license:llama3.2",
"autotrain_compatible",
"text-generation-inference",
"4-bit",
"intel/auto-round",
"region:us"
] |
text-generation
| 2024-12-04T14:43:57Z |
---
language:
- en
- de
- fr
- it
- pt
- hi
- es
- th
license: llama3.2
library_name: transformers
tags:
- autoround
- intel
- gptq
- woq
- meta
- pytorch
- llama
- llama-3
- intel-autoround
model_name: Llama 3.2 3B
base_model: meta-llama/Llama-3.2-3B
inference: false
model_creator: meta-llama
pipeline_tag: text-generation
prompt_template: '{prompt}
'
quantized_by: fbaldassarri
---
## Model Information
Quantized version of [meta-llama/Llama-3.2-3B](https://huggingface.co/meta-llama/Llama-3.2-3B) using torch.float32 for quantization tuning.
- 4 bits (INT4)
- group size = 128
- Symmetrical Quantization
- Method WoQ (AutoRound format)
Fast and low memory, 2-3X speedup (slight accuracy drop at W4G128)
Quantization framework: [Intel AutoRound](https://github.com/intel/auto-round)
Note: this INT4 version of Llama-3.2-3B has been quantized to run inference on CPU.
## Replication Recipe
### Step 1 Install Requirements
I suggest installing the requirements into a dedicated Python virtualenv or a conda environment.
```
python -m pip install <package> --upgrade
```
- accelerate==1.0.1
- auto_gptq==0.7.1
- neural_compressor==3.1
- torch==2.3.0+cpu
- torchaudio==2.5.0+cpu
- torchvision==0.18.0+cpu
- transformers==4.45.2
### Step 2 Build Intel Autoround wheel from sources
```
python -m pip install git+https://github.com/intel/auto-round.git
```
### Step 3 Script for Quantization
```
from transformers import AutoModelForCausalLM, AutoTokenizer
model_name = "meta-llama/Llama-3.2-3B"
model = AutoModelForCausalLM.from_pretrained(model_name)
tokenizer = AutoTokenizer.from_pretrained(model_name)
from auto_round import AutoRound
bits, group_size, sym = 4, 128, True
autoround = AutoRound(model, tokenizer, nsamples=128, iters=200, seqlen=512, batch_size=4, bits=bits, group_size=group_size, sym=sym)
autoround.quantize()
output_dir = "./AutoRound/meta-llama_Llama-3.2-3B-auto_round-int4-gs128-sym"
autoround.save_quantized(output_dir, format='auto_round', inplace=True)
```
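A minimal loading sketch: it assumes the AutoRound-format checkpoint saved above can be read back through transformers' standard `from_pretrained` once the `auto_round` package is installed (backend registration on import is an assumption, not documented behaviour here).
```
from transformers import AutoModelForCausalLM, AutoTokenizer
import auto_round  # assumed to register the AutoRound quantization backend

quantized_dir = "./AutoRound/meta-llama_Llama-3.2-3B-auto_round-int4-gs128-sym"
model = AutoModelForCausalLM.from_pretrained(quantized_dir, device_map="cpu")
tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-3.2-3B")

inputs = tokenizer("The capital of Italy is", return_tensors="pt")
print(tokenizer.decode(model.generate(**inputs, max_new_tokens=20)[0]))
```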
## License
[Llama 3.2 Community License](https://github.com/meta-llama/llama-models/blob/main/models/llama3_2/LICENSE)
## Disclaimer
This quantized model comes with no warranty. It has been developed only for research purposes.
|
fbaldassarri/meta-llama_Llama-3.2-3B-auto_round-int4-gs128-asym
|
fbaldassarri
| 2024-12-04T14:43:30Z | 79 | 0 |
transformers
|
[
"transformers",
"safetensors",
"llama",
"text-generation",
"autoround",
"intel",
"gptq",
"woq",
"meta",
"pytorch",
"llama-3",
"intel-autoround",
"en",
"de",
"fr",
"it",
"pt",
"hi",
"es",
"th",
"base_model:meta-llama/Llama-3.2-3B",
"base_model:quantized:meta-llama/Llama-3.2-3B",
"license:llama3.2",
"autotrain_compatible",
"text-generation-inference",
"4-bit",
"intel/auto-round",
"region:us"
] |
text-generation
| 2024-12-04T14:42:06Z |
---
language:
- en
- de
- fr
- it
- pt
- hi
- es
- th
license: llama3.2
library_name: transformers
tags:
- autoround
- intel
- gptq
- woq
- meta
- pytorch
- llama
- llama-3
- intel-autoround
model_name: Llama 3.2 3B
base_model: meta-llama/Llama-3.2-3B
inference: false
model_creator: meta-llama
pipeline_tag: text-generation
prompt_template: '{prompt}
'
quantized_by: fbaldassarri
---
## Model Information
Quantized version of [meta-llama/Llama-3.2-3B](https://huggingface.co/meta-llama/Llama-3.2-3B) using torch.float32 for quantization tuning.
- 4 bits (INT4)
- group size = 128
- Asymmetrical Quantization
- Method WoQ (AutoRound format)
Fast and low memory, 2-3X speedup (slight accuracy drop at W4G128)
Quantization framework: [Intel AutoRound](https://github.com/intel/auto-round)
Note: this INT4 version of Llama-3.2-3B has been quantized to run inference on CPU.
## Replication Recipe
### Step 1 Install Requirements
I suggest installing the requirements into a dedicated Python virtualenv or a conda environment.
```
python -m pip install <package> --upgrade
```
- accelerate==1.0.1
- auto_gptq==0.7.1
- neural_compressor==3.1
- torch==2.3.0+cpu
- torchaudio==2.5.0+cpu
- torchvision==0.18.0+cpu
- transformers==4.45.2
### Step 2 Build Intel Autoround wheel from sources
```
python -m pip install git+https://github.com/intel/auto-round.git
```
### Step 3 Script for Quantization
```
from transformers import AutoModelForCausalLM, AutoTokenizer
model_name = "meta-llama/Llama-3.2-3B"
model = AutoModelForCausalLM.from_pretrained(model_name)
tokenizer = AutoTokenizer.from_pretrained(model_name)
from auto_round import AutoRound
bits, group_size, sym = 4, 128, False
autoround = AutoRound(model, tokenizer, nsamples=128, iters=200, seqlen=512, batch_size=4, bits=bits, group_size=group_size, sym=sym)
autoround.quantize()
output_dir = "./AutoRound/meta-llama_Llama-3.2-3B-auto_round-int4-gs128-asym"
autoround.save_quantized(output_dir, format='auto_round', inplace=True)
```
## License
[Llama 3.2 Community License](https://github.com/meta-llama/llama-models/blob/main/models/llama3_2/LICENSE)
## Disclaimer
This quantized model comes with no warranty. It has been developed only for research purposes.
|
gokulsrinivasagan/bert_base_lda_50_v1_mnli
|
gokulsrinivasagan
| 2024-12-04T14:40:39Z | 107 | 0 |
transformers
|
[
"transformers",
"tensorboard",
"safetensors",
"distilbert",
"text-classification",
"generated_from_trainer",
"en",
"dataset:glue",
"base_model:gokulsrinivasagan/bert_base_lda_50_v1",
"base_model:finetune:gokulsrinivasagan/bert_base_lda_50_v1",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-classification
| 2024-11-26T19:49:30Z |
---
library_name: transformers
language:
- en
base_model: gokulsrinivasagan/bert_base_lda_50_v1
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- accuracy
model-index:
- name: bert_base_lda_50_v1_mnli
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: GLUE MNLI
type: glue
args: mnli
metrics:
- name: Accuracy
type: accuracy
value: 0.6771765663140765
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert_base_lda_50_v1_mnli
This model is a fine-tuned version of [gokulsrinivasagan/bert_base_lda_50_v1](https://huggingface.co/gokulsrinivasagan/bert_base_lda_50_v1) on the GLUE MNLI dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7495
- Accuracy: 0.6772
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 256
- eval_batch_size: 256
- seed: 10
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 50
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|
| 0.9617 | 1.0 | 1534 | 0.8662 | 0.6088 |
| 0.814 | 2.0 | 3068 | 0.8016 | 0.6440 |
| 0.7181 | 3.0 | 4602 | 0.7586 | 0.6704 |
| 0.6352 | 4.0 | 6136 | 0.7738 | 0.6728 |
| 0.5553 | 5.0 | 7670 | 0.8012 | 0.6811 |
| 0.4748 | 6.0 | 9204 | 0.8789 | 0.6837 |
| 0.3985 | 7.0 | 10738 | 0.9567 | 0.6792 |
| 0.3311 | 8.0 | 12272 | 1.0359 | 0.6737 |
### Framework versions
- Transformers 4.46.3
- Pytorch 2.2.1+cu118
- Datasets 2.17.0
- Tokenizers 0.20.3
|
gokulsrinivasagan/bert_tiny_lda_20_v1_cola
|
gokulsrinivasagan
| 2024-12-04T14:40:16Z | 107 | 0 |
transformers
|
[
"transformers",
"tensorboard",
"safetensors",
"distilbert",
"text-classification",
"generated_from_trainer",
"en",
"dataset:glue",
"base_model:gokulsrinivasagan/bert_tiny_lda_20_v1",
"base_model:finetune:gokulsrinivasagan/bert_tiny_lda_20_v1",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-classification
| 2024-11-26T21:11:19Z |
---
library_name: transformers
language:
- en
base_model: gokulsrinivasagan/bert_tiny_lda_20_v1
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
- accuracy
model-index:
- name: bert_tiny_lda_20_v1_cola
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: GLUE COLA
type: glue
args: cola
metrics:
- name: Matthews Correlation
type: matthews_correlation
value: 0.0
- name: Accuracy
type: accuracy
value: 0.6912751793861389
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert_tiny_lda_20_v1_cola
This model is a fine-tuned version of [gokulsrinivasagan/bert_tiny_lda_20_v1](https://huggingface.co/gokulsrinivasagan/bert_tiny_lda_20_v1) on the GLUE COLA dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6166
- Matthews Correlation: 0.0
- Accuracy: 0.6913
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 256
- eval_batch_size: 256
- seed: 10
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 50
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|:--------:|
| 0.6144 | 1.0 | 34 | 0.6166 | 0.0 | 0.6913 |
| 0.605 | 2.0 | 68 | 0.6190 | 0.0213 | 0.6903 |
| 0.5934 | 3.0 | 102 | 0.6171 | 0.0043 | 0.6759 |
| 0.567 | 4.0 | 136 | 0.6516 | 0.0362 | 0.6836 |
| 0.518 | 5.0 | 170 | 0.6389 | 0.0675 | 0.6692 |
| 0.4781 | 6.0 | 204 | 0.7010 | 0.0997 | 0.6663 |
### Framework versions
- Transformers 4.46.3
- Pytorch 2.2.1+cu118
- Datasets 2.17.0
- Tokenizers 0.20.3
|
gokulsrinivasagan/bert_tiny_lda_5_v1_qnli
|
gokulsrinivasagan
| 2024-12-04T14:37:24Z | 107 | 0 |
transformers
|
[
"transformers",
"tensorboard",
"safetensors",
"distilbert",
"text-classification",
"generated_from_trainer",
"en",
"dataset:glue",
"base_model:gokulsrinivasagan/bert_tiny_lda_5_v1",
"base_model:finetune:gokulsrinivasagan/bert_tiny_lda_5_v1",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-classification
| 2024-11-26T20:23:50Z |
---
library_name: transformers
language:
- en
base_model: gokulsrinivasagan/bert_tiny_lda_5_v1
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- accuracy
model-index:
- name: bert_tiny_lda_5_v1_qnli
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: GLUE QNLI
type: glue
args: qnli
metrics:
- name: Accuracy
type: accuracy
value: 0.6587955335896027
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert_tiny_lda_5_v1_qnli
This model is a fine-tuned version of [gokulsrinivasagan/bert_tiny_lda_5_v1](https://huggingface.co/gokulsrinivasagan/bert_tiny_lda_5_v1) on the GLUE QNLI dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6180
- Accuracy: 0.6588
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 256
- eval_batch_size: 256
- seed: 10
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 50
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.6621 | 1.0 | 410 | 0.6380 | 0.6324 |
| 0.6225 | 2.0 | 820 | 0.6180 | 0.6588 |
| 0.5606 | 3.0 | 1230 | 0.6307 | 0.6562 |
| 0.482 | 4.0 | 1640 | 0.6562 | 0.6544 |
| 0.401 | 5.0 | 2050 | 0.7177 | 0.6614 |
| 0.3259 | 6.0 | 2460 | 0.8533 | 0.6612 |
| 0.2652 | 7.0 | 2870 | 0.9130 | 0.6553 |
### Framework versions
- Transformers 4.46.3
- Pytorch 2.2.1+cu118
- Datasets 2.17.0
- Tokenizers 0.20.3
|
DevQuasar/meta-llama.Llama-3.2-1B-GGUF
|
DevQuasar
| 2024-12-04T14:34:02Z | 160 | 0 | null |
[
"gguf",
"text-generation",
"base_model:meta-llama/Llama-3.2-1B",
"base_model:quantized:meta-llama/Llama-3.2-1B",
"license:llama3.2",
"endpoints_compatible",
"region:us"
] |
text-generation
| 2024-09-26T00:34:43Z |
---
license: llama3.2
base_model:
- meta-llama/Llama-3.2-1B
pipeline_tag: text-generation
---
I'm doing this to 'Make knowledge free for everyone', using my personal time and resources.
If you want to support my efforts please visit my ko-fi page: https://ko-fi.com/devquasar
Also feel free to visit my website https://devquasar.com/
|
mradermacher/34b-beta-i1-GGUF
|
mradermacher
| 2024-12-04T14:33:23Z | 8 | 1 |
transformers
|
[
"transformers",
"gguf",
"en",
"base_model:CausalLM/34b-beta",
"base_model:quantized:CausalLM/34b-beta",
"license:gpl-3.0",
"endpoints_compatible",
"region:us",
"imatrix",
"conversational"
] | null | 2024-12-04T11:48:41Z |
---
base_model: CausalLM/34b-beta
language:
- en
library_name: transformers
license: gpl-3.0
quantized_by: mradermacher
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: nicoboss -->
weighted/imatrix quants of https://huggingface.co/CausalLM/34b-beta
<!-- provided-files -->
static quants are available at https://huggingface.co/mradermacher/34b-beta-GGUF
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including how to concatenate multi-part files.
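A minimal usage sketch (not part of the original card), assuming the `llama-cpp-python` bindings and the single-file Q4_K_M quant listed in the table below:
```python
# Hedged sketch: download one of the single-file quants from this repo and run it with
# llama-cpp-python. The filename matches the i1-Q4_K_M entry in the table below;
# adjust n_ctx / n_gpu_layers to your hardware.
from huggingface_hub import hf_hub_download
from llama_cpp import Llama
gguf_path = hf_hub_download(
    repo_id="mradermacher/34b-beta-i1-GGUF",
    filename="34b-beta.i1-Q4_K_M.gguf",
)
llm = Llama(model_path=gguf_path, n_ctx=4096)
out = llm("Question: What is an imatrix quant?\nAnswer:", max_tokens=64)
print(out["choices"][0]["text"])
```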
## Provided Quants
(sorted by size, not necessarily quality. IQ-quants are often preferable to similarly sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/34b-beta-i1-GGUF/resolve/main/34b-beta.i1-IQ1_S.gguf) | i1-IQ1_S | 7.6 | for the desperate |
| [GGUF](https://huggingface.co/mradermacher/34b-beta-i1-GGUF/resolve/main/34b-beta.i1-IQ1_M.gguf) | i1-IQ1_M | 8.3 | mostly desperate |
| [GGUF](https://huggingface.co/mradermacher/34b-beta-i1-GGUF/resolve/main/34b-beta.i1-IQ2_XXS.gguf) | i1-IQ2_XXS | 9.4 | |
| [GGUF](https://huggingface.co/mradermacher/34b-beta-i1-GGUF/resolve/main/34b-beta.i1-IQ2_XS.gguf) | i1-IQ2_XS | 10.4 | |
| [GGUF](https://huggingface.co/mradermacher/34b-beta-i1-GGUF/resolve/main/34b-beta.i1-IQ2_S.gguf) | i1-IQ2_S | 11.0 | |
| [GGUF](https://huggingface.co/mradermacher/34b-beta-i1-GGUF/resolve/main/34b-beta.i1-IQ2_M.gguf) | i1-IQ2_M | 11.9 | |
| [GGUF](https://huggingface.co/mradermacher/34b-beta-i1-GGUF/resolve/main/34b-beta.i1-Q2_K_S.gguf) | i1-Q2_K_S | 12.0 | very low quality |
| [GGUF](https://huggingface.co/mradermacher/34b-beta-i1-GGUF/resolve/main/34b-beta.i1-Q2_K.gguf) | i1-Q2_K | 12.9 | IQ3_XXS probably better |
| [GGUF](https://huggingface.co/mradermacher/34b-beta-i1-GGUF/resolve/main/34b-beta.i1-IQ3_XXS.gguf) | i1-IQ3_XXS | 13.4 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/34b-beta-i1-GGUF/resolve/main/34b-beta.i1-IQ3_XS.gguf) | i1-IQ3_XS | 14.3 | |
| [GGUF](https://huggingface.co/mradermacher/34b-beta-i1-GGUF/resolve/main/34b-beta.i1-Q3_K_S.gguf) | i1-Q3_K_S | 15.1 | IQ3_XS probably better |
| [GGUF](https://huggingface.co/mradermacher/34b-beta-i1-GGUF/resolve/main/34b-beta.i1-IQ3_S.gguf) | i1-IQ3_S | 15.1 | beats Q3_K* |
| [GGUF](https://huggingface.co/mradermacher/34b-beta-i1-GGUF/resolve/main/34b-beta.i1-IQ3_M.gguf) | i1-IQ3_M | 15.7 | |
| [GGUF](https://huggingface.co/mradermacher/34b-beta-i1-GGUF/resolve/main/34b-beta.i1-Q3_K_M.gguf) | i1-Q3_K_M | 16.8 | IQ3_S probably better |
| [GGUF](https://huggingface.co/mradermacher/34b-beta-i1-GGUF/resolve/main/34b-beta.i1-Q3_K_L.gguf) | i1-Q3_K_L | 18.2 | IQ3_M probably better |
| [GGUF](https://huggingface.co/mradermacher/34b-beta-i1-GGUF/resolve/main/34b-beta.i1-IQ4_XS.gguf) | i1-IQ4_XS | 18.6 | |
| [GGUF](https://huggingface.co/mradermacher/34b-beta-i1-GGUF/resolve/main/34b-beta.i1-Q4_0.gguf) | i1-Q4_0 | 19.6 | fast, low quality |
| [GGUF](https://huggingface.co/mradermacher/34b-beta-i1-GGUF/resolve/main/34b-beta.i1-Q4_K_S.gguf) | i1-Q4_K_S | 19.7 | optimal size/speed/quality |
| [GGUF](https://huggingface.co/mradermacher/34b-beta-i1-GGUF/resolve/main/34b-beta.i1-Q4_K_M.gguf) | i1-Q4_K_M | 20.8 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/34b-beta-i1-GGUF/resolve/main/34b-beta.i1-Q5_K_S.gguf) | i1-Q5_K_S | 23.8 | |
| [GGUF](https://huggingface.co/mradermacher/34b-beta-i1-GGUF/resolve/main/34b-beta.i1-Q5_K_M.gguf) | i1-Q5_K_M | 24.4 | |
| [GGUF](https://huggingface.co/mradermacher/34b-beta-i1-GGUF/resolve/main/34b-beta.i1-Q6_K.gguf) | i1-Q6_K | 28.3 | practically like static Q6_K |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.
<!-- end -->
|
DevQuasar/xLAM-1b-fc-r-GGUF
|
DevQuasar
| 2024-12-04T14:31:32Z | 21 | 0 | null |
[
"gguf",
"text-generation",
"base_model:Salesforce/xLAM-1b-fc-r",
"base_model:quantized:Salesforce/xLAM-1b-fc-r",
"endpoints_compatible",
"region:us",
"conversational"
] |
text-generation
| 2024-09-29T19:58:59Z |
---
base_model:
- Salesforce/xLAM-1b-fc-r
pipeline_tag: text-generation
---
I'm doing this to 'Make knowledge free for everyone', using my personal time and resources.
If you want to support my efforts, please visit my ko-fi page: https://ko-fi.com/devquasar
Also feel free to visit my website https://devquasar.com/
|
DevQuasar/granite-20b-code-instruct-8k-GGUF
|
DevQuasar
| 2024-12-04T14:30:51Z | 26 | 1 | null |
[
"gguf",
"text-generation",
"base_model:ibm-granite/granite-20b-code-instruct-8k",
"base_model:quantized:ibm-granite/granite-20b-code-instruct-8k",
"endpoints_compatible",
"region:us",
"conversational"
] |
text-generation
| 2024-09-28T15:43:34Z |
---
base_model:
- ibm-granite/granite-20b-code-instruct-8k
pipeline_tag: text-generation
---
I'm doing this to 'Make knowledge free for everyone', using my personal time and resources.
If you want to support my efforts, please visit my ko-fi page: https://ko-fi.com/devquasar
Also feel free to visit my website https://devquasar.com/
|
nlingampallyy/marian-finetuned-kde4-en-to-fr
|
nlingampallyy
| 2024-12-04T14:30:43Z | 105 | 0 |
transformers
|
[
"transformers",
"tensorboard",
"safetensors",
"marian",
"text2text-generation",
"translation",
"generated_from_trainer",
"dataset:kde4",
"base_model:Helsinki-NLP/opus-mt-en-fr",
"base_model:finetune:Helsinki-NLP/opus-mt-en-fr",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
translation
| 2024-12-04T03:57:00Z |
---
library_name: transformers
license: apache-2.0
base_model: Helsinki-NLP/opus-mt-en-fr
tags:
- translation
- generated_from_trainer
datasets:
- kde4
metrics:
- bleu
model-index:
- name: marian-finetuned-kde4-en-to-fr
results:
- task:
name: Sequence-to-sequence Language Modeling
type: text2text-generation
dataset:
name: kde4
type: kde4
config: en-fr
split: train
args: en-fr
metrics:
- name: Bleu
type: bleu
value: 45.85529482217844
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# marian-finetuned-kde4-en-to-fr
This model is a fine-tuned version of [Helsinki-NLP/opus-mt-en-fr](https://huggingface.co/Helsinki-NLP/opus-mt-en-fr) on the kde4 dataset.
It achieves the following results on the evaluation set:
- Loss: 1.2789
- Bleu: 45.8553
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 32
- eval_batch_size: 64
- seed: 42
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 3
- mixed_precision_training: Native AMP
### Training results
### Framework versions
- Transformers 4.46.2
- Pytorch 2.5.1+cu121
- Datasets 3.1.0
- Tokenizers 0.20.3
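A minimal usage sketch (not part of the auto-generated card), assuming the checkpoint is published under the repo id named in this card:
```python
# Hedged sketch: English-to-French translation with the fine-tuned Marian checkpoint
# via the transformers translation pipeline.
from transformers import pipeline
translator = pipeline(
    "translation_en_to_fr",
    model="nlingampallyy/marian-finetuned-kde4-en-to-fr",
)
print(translator("Default to expanded threads")[0]["translation_text"])
```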
|
DevQuasar/granite-8b-code-base-128k-GGUF
|
DevQuasar
| 2024-12-04T14:28:24Z | 9 | 0 | null |
[
"gguf",
"text-generation",
"base_model:ibm-granite/granite-8b-code-base-128k",
"base_model:quantized:ibm-granite/granite-8b-code-base-128k",
"endpoints_compatible",
"region:us"
] |
text-generation
| 2024-09-28T05:33:02Z |
---
base_model:
- ibm-granite/granite-8b-code-base-128k
pipeline_tag: text-generation
---
I'm doing this to 'Make knowledge free for everyone', using my personal time and resources.
If you want to support my efforts, please visit my ko-fi page: https://ko-fi.com/devquasar
Also feel free to visit my website https://devquasar.com/
|
gokulsrinivasagan/bert_tiny_lda_5_v1_cola
|
gokulsrinivasagan
| 2024-12-04T14:28:23Z | 107 | 0 |
transformers
|
[
"transformers",
"tensorboard",
"safetensors",
"distilbert",
"text-classification",
"generated_from_trainer",
"en",
"dataset:glue",
"base_model:gokulsrinivasagan/bert_tiny_lda_5_v1",
"base_model:finetune:gokulsrinivasagan/bert_tiny_lda_5_v1",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-classification
| 2024-11-26T20:21:04Z |
---
library_name: transformers
language:
- en
base_model: gokulsrinivasagan/bert_tiny_lda_5_v1
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
- accuracy
model-index:
- name: bert_tiny_lda_5_v1_cola
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: GLUE COLA
type: glue
args: cola
metrics:
- name: Matthews Correlation
type: matthews_correlation
value: 0.0
- name: Accuracy
type: accuracy
value: 0.6912751793861389
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert_tiny_lda_5_v1_cola
This model is a fine-tuned version of [gokulsrinivasagan/bert_tiny_lda_5_v1](https://huggingface.co/gokulsrinivasagan/bert_tiny_lda_5_v1) on the GLUE COLA dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6138
- Matthews Correlation: 0.0
- Accuracy: 0.6913
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 256
- eval_batch_size: 256
- seed: 10
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 50
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|:--------:|
| 0.6154 | 1.0 | 34 | 0.6138 | 0.0 | 0.6913 |
| 0.5986 | 2.0 | 68 | 0.6147 | 0.0064 | 0.6865 |
| 0.5752 | 3.0 | 102 | 0.6158 | 0.0616 | 0.6894 |
| 0.5356 | 4.0 | 136 | 0.6378 | 0.1110 | 0.6807 |
| 0.4841 | 5.0 | 170 | 0.6615 | 0.0969 | 0.6721 |
| 0.4382 | 6.0 | 204 | 0.7818 | 0.0896 | 0.6673 |
### Framework versions
- Transformers 4.46.3
- Pytorch 2.2.1+cu118
- Datasets 2.17.0
- Tokenizers 0.20.3
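A minimal inference sketch (not part of the auto-generated card): CoLA is single-sentence acceptability classification, so the pipeline takes one sentence at a time. The repo id is assumed to be the one used in this card.
```python
# Hedged sketch: score a sentence for linguistic acceptability with the fine-tuned
# CoLA checkpoint through the transformers pipeline.
from transformers import pipeline
cola = pipeline(
    "text-classification",
    model="gokulsrinivasagan/bert_tiny_lda_5_v1_cola",
)
print(cola("The book was written by the author."))
```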
|
DevQuasar/Llama-3.2-3B-GGUF
|
DevQuasar
| 2024-12-04T14:28:09Z | 9 | 0 | null |
[
"gguf",
"text-generation",
"base_model:meta-llama/Llama-3.2-3B",
"base_model:quantized:meta-llama/Llama-3.2-3B",
"license:llama3.2",
"endpoints_compatible",
"region:us"
] |
text-generation
| 2024-09-26T00:35:30Z |
---
license: llama3.2
base_model:
- meta-llama/Llama-3.2-3B
pipeline_tag: text-generation
---
I'm doing this to 'Make knowledge free for everyone', using my personal time and resources.
If you want to support my efforts, please visit my ko-fi page: https://ko-fi.com/devquasar
Also feel free to visit my website https://devquasar.com/
|
DevQuasar/Llama-Guard-3-1B-GGUF
|
DevQuasar
| 2024-12-04T14:27:55Z | 8 | 0 | null |
[
"gguf",
"text-classification",
"base_model:meta-llama/Llama-Guard-3-1B",
"base_model:quantized:meta-llama/Llama-Guard-3-1B",
"endpoints_compatible",
"region:us",
"conversational"
] |
text-classification
| 2024-09-26T23:13:07Z |
---
base_model:
- meta-llama/Llama-Guard-3-1B
pipeline_tag: text-classification
---
I'm doing this to 'Make knowledge free for everyone', using my personal time and resources.
If you want to support my efforts, please visit my ko-fi page: https://ko-fi.com/devquasar
Also feel free to visit my website https://devquasar.com/
|
gokulsrinivasagan/bert_base_lda_5_v1_mnli
|
gokulsrinivasagan
| 2024-12-04T14:27:10Z | 118 | 0 |
transformers
|
[
"transformers",
"tensorboard",
"safetensors",
"distilbert",
"text-classification",
"generated_from_trainer",
"en",
"dataset:glue",
"base_model:gokulsrinivasagan/bert_base_lda_5_v1",
"base_model:finetune:gokulsrinivasagan/bert_base_lda_5_v1",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-classification
| 2024-11-26T19:05:23Z |
---
library_name: transformers
language:
- en
base_model: gokulsrinivasagan/bert_base_lda_5_v1
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- accuracy
model-index:
- name: bert_base_lda_5_v1_mnli
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: GLUE MNLI
type: glue
args: mnli
metrics:
- name: Accuracy
type: accuracy
value: 0.6910089503661514
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert_base_lda_5_v1_mnli
This model is a fine-tuned version of [gokulsrinivasagan/bert_base_lda_5_v1](https://huggingface.co/gokulsrinivasagan/bert_base_lda_5_v1) on the GLUE MNLI dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7249
- Accuracy: 0.6910
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 256
- eval_batch_size: 256
- seed: 10
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 50
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|
| 0.9579 | 1.0 | 1534 | 0.8463 | 0.6233 |
| 0.7978 | 2.0 | 3068 | 0.7706 | 0.6570 |
| 0.703 | 3.0 | 4602 | 0.7456 | 0.6771 |
| 0.6236 | 4.0 | 6136 | 0.7698 | 0.6787 |
| 0.5471 | 5.0 | 7670 | 0.7924 | 0.6860 |
| 0.4694 | 6.0 | 9204 | 0.9036 | 0.6861 |
| 0.3948 | 7.0 | 10738 | 0.9024 | 0.6824 |
| 0.3291 | 8.0 | 12272 | 0.9799 | 0.6773 |
### Framework versions
- Transformers 4.46.3
- Pytorch 2.2.1+cu118
- Datasets 2.17.0
- Tokenizers 0.20.3
|
DevQuasar/openbuddy-llama3.1-8b-v22.3-131k-GGUF
|
DevQuasar
| 2024-12-04T14:26:36Z | 7 | 0 | null |
[
"gguf",
"text-generation",
"base_model:OpenBuddy/openbuddy-llama3.1-8b-v22.3-131k",
"base_model:quantized:OpenBuddy/openbuddy-llama3.1-8b-v22.3-131k",
"endpoints_compatible",
"region:us",
"conversational"
] |
text-generation
| 2024-09-24T19:34:10Z |
---
base_model:
- OpenBuddy/openbuddy-llama3.1-8b-v22.3-131k
pipeline_tag: text-generation
---
I'm doing this to 'Make knowledge free for everyone', using my personal time and resources.
If you want to support my efforts, please visit my ko-fi page: https://ko-fi.com/devquasar
Also feel free to visit my website https://devquasar.com/
|
DevQuasar/Qwen2.5-Math-72B-GGUF
|
DevQuasar
| 2024-12-04T14:25:49Z | 19 | 0 | null |
[
"gguf",
"text-generation",
"base_model:Qwen/Qwen2.5-Math-72B",
"base_model:quantized:Qwen/Qwen2.5-Math-72B",
"endpoints_compatible",
"region:us",
"conversational"
] |
text-generation
| 2024-09-22T17:19:47Z |
---
base_model:
- Qwen/Qwen2.5-Math-72B
pipeline_tag: text-generation
---
I'm doing this to 'Make knowledge free for everyone', using my personal time and resources.
If you want to support my efforts, please visit my ko-fi page: https://ko-fi.com/devquasar
Also feel free to visit my website https://devquasar.com/
|
DevQuasar/shieldgemma-9b-GGUF
|
DevQuasar
| 2024-12-04T14:24:03Z | 50 | 0 | null |
[
"gguf",
"text-generation",
"base_model:google/shieldgemma-9b",
"base_model:quantized:google/shieldgemma-9b",
"endpoints_compatible",
"region:us",
"conversational"
] |
text-generation
| 2024-09-19T05:14:25Z |
---
base_model:
- google/shieldgemma-9b
pipeline_tag: text-generation
---
I'm doing this to 'Make knowledge free for everyone', using my personal time and resources.
If you want to support my efforts, please visit my ko-fi page: https://ko-fi.com/devquasar
Also feel free to visit my website https://devquasar.com/
|
DevQuasar/datagemma-rig-27b-it-GGUF
|
DevQuasar
| 2024-12-04T14:23:06Z | 30 | 0 | null |
[
"gguf",
"text-generation",
"base_model:google/datagemma-rig-27b-it",
"base_model:quantized:google/datagemma-rig-27b-it",
"endpoints_compatible",
"region:us",
"conversational"
] |
text-generation
| 2024-09-12T21:17:04Z |
---
base_model:
- google/datagemma-rig-27b-it
pipeline_tag: text-generation
---
I'm doing this to 'Make knowledge free for everyone', using my personal time and resources.
If you want to support my efforts, please visit my ko-fi page: https://ko-fi.com/devquasar
Also feel free to visit my website https://devquasar.com/
|
MayBashendy/ArabicNewSplits2_FineTuningAraBERT_run1_AugV4_k50_task1_organization
|
MayBashendy
| 2024-12-04T14:20:57Z | 183 | 0 |
transformers
|
[
"transformers",
"safetensors",
"bert",
"text-classification",
"generated_from_trainer",
"base_model:aubmindlab/bert-base-arabertv02",
"base_model:finetune:aubmindlab/bert-base-arabertv02",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-classification
| 2024-12-04T13:20:30Z |
---
library_name: transformers
base_model: aubmindlab/bert-base-arabertv02
tags:
- generated_from_trainer
model-index:
- name: ArabicNewSplits2_FineTuningAraBERT_run1_AugV4_k50_task1_organization
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# ArabicNewSplits2_FineTuningAraBERT_run1_AugV4_k50_task1_organization
This model is a fine-tuned version of [aubmindlab/bert-base-arabertv02](https://huggingface.co/aubmindlab/bert-base-arabertv02) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.8964
- Qwk: 0.6410
- Mse: 0.8964
- Rmse: 0.9468
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
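The validation columns below report quadratic weighted kappa (Qwk), MSE and RMSE. A minimal sketch of how such metrics are commonly computed for integer ordinal labels (an illustration, not this card's own evaluation code):
```python
# Hedged sketch: Qwk / Mse / Rmse as commonly defined for ordinal predictions,
# using scikit-learn and numpy on toy integer labels.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error
y_true = np.array([0, 1, 2, 2, 3, 1])
y_pred = np.array([0, 1, 1, 2, 3, 2])
qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")
mse = mean_squared_error(y_true, y_pred)
rmse = np.sqrt(mse)
print(f"Qwk={qwk:.4f}  Mse={mse:.4f}  Rmse={rmse:.4f}")
```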
### Training results
| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|:-------------:|:------:|:----:|:---------------:|:-------:|:------:|:------:|
| No log | 0.0048 | 2 | 4.5785 | -0.0102 | 4.5785 | 2.1397 |
| No log | 0.0096 | 4 | 2.5376 | 0.0501 | 2.5376 | 1.5930 |
| No log | 0.0144 | 6 | 1.6809 | -0.0312 | 1.6809 | 1.2965 |
| No log | 0.0192 | 8 | 1.2607 | 0.0760 | 1.2607 | 1.1228 |
| No log | 0.0240 | 10 | 1.1288 | 0.2560 | 1.1288 | 1.0625 |
| No log | 0.0288 | 12 | 1.4414 | 0.1380 | 1.4414 | 1.2006 |
| No log | 0.0337 | 14 | 2.0116 | 0.1772 | 2.0116 | 1.4183 |
| No log | 0.0385 | 16 | 2.4212 | 0.1293 | 2.4212 | 1.5560 |
| No log | 0.0433 | 18 | 2.4305 | 0.1293 | 2.4305 | 1.5590 |
| No log | 0.0481 | 20 | 1.9867 | 0.2095 | 1.9867 | 1.4095 |
| No log | 0.0529 | 22 | 1.7828 | 0.1509 | 1.7828 | 1.3352 |
| No log | 0.0577 | 24 | 1.6944 | 0.2349 | 1.6944 | 1.3017 |
| No log | 0.0625 | 26 | 1.8129 | 0.1911 | 1.8129 | 1.3464 |
| No log | 0.0673 | 28 | 1.9478 | 0.1227 | 1.9478 | 1.3956 |
| No log | 0.0721 | 30 | 1.7661 | 0.1724 | 1.7661 | 1.3289 |
| No log | 0.0769 | 32 | 1.6062 | 0.1478 | 1.6062 | 1.2674 |
| No log | 0.0817 | 34 | 1.8835 | 0.1591 | 1.8835 | 1.3724 |
| No log | 0.0865 | 36 | 2.6850 | 0.1470 | 2.6850 | 1.6386 |
| No log | 0.0913 | 38 | 2.4763 | 0.1921 | 2.4763 | 1.5736 |
| No log | 0.0962 | 40 | 1.6846 | 0.3059 | 1.6846 | 1.2979 |
| No log | 0.1010 | 42 | 1.7020 | 0.3589 | 1.7020 | 1.3046 |
| No log | 0.1058 | 44 | 2.3162 | 0.2159 | 2.3162 | 1.5219 |
| No log | 0.1106 | 46 | 2.2339 | 0.2471 | 2.2339 | 1.4946 |
| No log | 0.1154 | 48 | 1.4998 | 0.4110 | 1.4998 | 1.2246 |
| No log | 0.1202 | 50 | 1.5650 | 0.3966 | 1.5650 | 1.2510 |
| No log | 0.125 | 52 | 1.7989 | 0.3721 | 1.7989 | 1.3412 |
| No log | 0.1298 | 54 | 1.9343 | 0.3558 | 1.9343 | 1.3908 |
| No log | 0.1346 | 56 | 1.6069 | 0.3850 | 1.6069 | 1.2676 |
| No log | 0.1394 | 58 | 1.4107 | 0.3485 | 1.4107 | 1.1877 |
| No log | 0.1442 | 60 | 1.6766 | 0.2732 | 1.6766 | 1.2948 |
| No log | 0.1490 | 62 | 1.5660 | 0.3182 | 1.5660 | 1.2514 |
| No log | 0.1538 | 64 | 1.5078 | 0.3105 | 1.5078 | 1.2279 |
| No log | 0.1587 | 66 | 1.6542 | 0.2925 | 1.6542 | 1.2862 |
| No log | 0.1635 | 68 | 1.9378 | 0.2573 | 1.9378 | 1.3920 |
| No log | 0.1683 | 70 | 1.7430 | 0.2727 | 1.7430 | 1.3202 |
| No log | 0.1731 | 72 | 1.4994 | 0.4104 | 1.4994 | 1.2245 |
| No log | 0.1779 | 74 | 1.8129 | 0.2768 | 1.8129 | 1.3464 |
| No log | 0.1827 | 76 | 2.9558 | 0.2337 | 2.9558 | 1.7192 |
| No log | 0.1875 | 78 | 3.4980 | 0.2349 | 3.4980 | 1.8703 |
| No log | 0.1923 | 80 | 2.9849 | 0.2225 | 2.9849 | 1.7277 |
| No log | 0.1971 | 82 | 1.6225 | 0.3031 | 1.6225 | 1.2738 |
| No log | 0.2019 | 84 | 1.2579 | 0.3631 | 1.2579 | 1.1216 |
| No log | 0.2067 | 86 | 1.4193 | 0.3402 | 1.4193 | 1.1913 |
| No log | 0.2115 | 88 | 1.7236 | 0.3645 | 1.7236 | 1.3128 |
| No log | 0.2163 | 90 | 2.2150 | 0.3189 | 2.2150 | 1.4883 |
| No log | 0.2212 | 92 | 2.7823 | 0.2059 | 2.7823 | 1.6680 |
| No log | 0.2260 | 94 | 2.5745 | 0.2752 | 2.5745 | 1.6045 |
| No log | 0.2308 | 96 | 1.7644 | 0.2667 | 1.7644 | 1.3283 |
| No log | 0.2356 | 98 | 1.6347 | 0.3347 | 1.6347 | 1.2785 |
| No log | 0.2404 | 100 | 1.9091 | 0.2518 | 1.9091 | 1.3817 |
| No log | 0.2452 | 102 | 2.4210 | 0.2617 | 2.4210 | 1.5560 |
| No log | 0.25 | 104 | 3.0730 | 0.2241 | 3.0730 | 1.7530 |
| No log | 0.2548 | 106 | 3.1097 | 0.2241 | 3.1097 | 1.7634 |
| No log | 0.2596 | 108 | 2.6093 | 0.2714 | 2.6093 | 1.6153 |
| No log | 0.2644 | 110 | 2.3688 | 0.2927 | 2.3688 | 1.5391 |
| No log | 0.2692 | 112 | 2.2465 | 0.2330 | 2.2465 | 1.4988 |
| No log | 0.2740 | 114 | 1.5614 | 0.3423 | 1.5614 | 1.2496 |
| No log | 0.2788 | 116 | 0.9234 | 0.5072 | 0.9234 | 0.9609 |
| No log | 0.2837 | 118 | 0.8127 | 0.5848 | 0.8127 | 0.9015 |
| No log | 0.2885 | 120 | 0.9567 | 0.5188 | 0.9567 | 0.9781 |
| No log | 0.2933 | 122 | 1.4920 | 0.3753 | 1.4920 | 1.2215 |
| No log | 0.2981 | 124 | 1.9918 | 0.3707 | 1.9918 | 1.4113 |
| No log | 0.3029 | 126 | 2.4146 | 0.3173 | 2.4146 | 1.5539 |
| No log | 0.3077 | 128 | 2.1557 | 0.3514 | 2.1557 | 1.4682 |
| No log | 0.3125 | 130 | 1.5124 | 0.3409 | 1.5124 | 1.2298 |
| No log | 0.3173 | 132 | 0.9676 | 0.5326 | 0.9676 | 0.9837 |
| No log | 0.3221 | 134 | 0.8188 | 0.6045 | 0.8188 | 0.9049 |
| No log | 0.3269 | 136 | 0.9493 | 0.4898 | 0.9493 | 0.9743 |
| No log | 0.3317 | 138 | 1.2347 | 0.4482 | 1.2347 | 1.1112 |
| No log | 0.3365 | 140 | 1.5474 | 0.3789 | 1.5474 | 1.2439 |
| No log | 0.3413 | 142 | 1.4186 | 0.3973 | 1.4186 | 1.1911 |
| No log | 0.3462 | 144 | 1.3061 | 0.4400 | 1.3061 | 1.1429 |
| No log | 0.3510 | 146 | 1.4393 | 0.4134 | 1.4393 | 1.1997 |
| No log | 0.3558 | 148 | 1.7746 | 0.3971 | 1.7746 | 1.3321 |
| No log | 0.3606 | 150 | 1.7968 | 0.3787 | 1.7968 | 1.3404 |
| No log | 0.3654 | 152 | 1.8679 | 0.3971 | 1.8679 | 1.3667 |
| No log | 0.3702 | 154 | 2.1114 | 0.3822 | 2.1114 | 1.4531 |
| No log | 0.375 | 156 | 2.1136 | 0.3768 | 2.1136 | 1.4538 |
| No log | 0.3798 | 158 | 2.0309 | 0.3768 | 2.0309 | 1.4251 |
| No log | 0.3846 | 160 | 1.6320 | 0.3586 | 1.6320 | 1.2775 |
| No log | 0.3894 | 162 | 1.5046 | 0.3213 | 1.5046 | 1.2266 |
| No log | 0.3942 | 164 | 1.5847 | 0.3371 | 1.5847 | 1.2588 |
| No log | 0.3990 | 166 | 1.5081 | 0.3820 | 1.5081 | 1.2280 |
| No log | 0.4038 | 168 | 1.0901 | 0.4643 | 1.0901 | 1.0441 |
| No log | 0.4087 | 170 | 0.9336 | 0.5773 | 0.9336 | 0.9662 |
| No log | 0.4135 | 172 | 1.2571 | 0.5250 | 1.2571 | 1.1212 |
| No log | 0.4183 | 174 | 1.4535 | 0.4701 | 1.4535 | 1.2056 |
| No log | 0.4231 | 176 | 1.3786 | 0.5142 | 1.3786 | 1.1742 |
| No log | 0.4279 | 178 | 1.3028 | 0.5406 | 1.3028 | 1.1414 |
| No log | 0.4327 | 180 | 1.3070 | 0.5199 | 1.3070 | 1.1433 |
| No log | 0.4375 | 182 | 1.0938 | 0.5357 | 1.0938 | 1.0459 |
| No log | 0.4423 | 184 | 1.0955 | 0.5570 | 1.0955 | 1.0467 |
| No log | 0.4471 | 186 | 1.0432 | 0.5595 | 1.0432 | 1.0214 |
| No log | 0.4519 | 188 | 1.3057 | 0.5199 | 1.3057 | 1.1427 |
| No log | 0.4567 | 190 | 1.2332 | 0.5701 | 1.2332 | 1.1105 |
| No log | 0.4615 | 192 | 1.4539 | 0.5058 | 1.4539 | 1.2058 |
| No log | 0.4663 | 194 | 1.8243 | 0.3689 | 1.8243 | 1.3507 |
| No log | 0.4712 | 196 | 1.8802 | 0.3444 | 1.8802 | 1.3712 |
| No log | 0.4760 | 198 | 1.3979 | 0.4476 | 1.3979 | 1.1823 |
| No log | 0.4808 | 200 | 1.0150 | 0.5538 | 1.0150 | 1.0075 |
| No log | 0.4856 | 202 | 1.0340 | 0.5263 | 1.0340 | 1.0169 |
| No log | 0.4904 | 204 | 1.3196 | 0.4713 | 1.3196 | 1.1487 |
| No log | 0.4952 | 206 | 1.9796 | 0.3046 | 1.9796 | 1.4070 |
| No log | 0.5 | 208 | 2.1554 | 0.3189 | 2.1554 | 1.4681 |
| No log | 0.5048 | 210 | 1.8102 | 0.3799 | 1.8102 | 1.3455 |
| No log | 0.5096 | 212 | 1.5742 | 0.4706 | 1.5742 | 1.2547 |
| No log | 0.5144 | 214 | 1.2772 | 0.5236 | 1.2772 | 1.1302 |
| No log | 0.5192 | 216 | 1.2687 | 0.5061 | 1.2687 | 1.1264 |
| No log | 0.5240 | 218 | 1.2781 | 0.4950 | 1.2781 | 1.1305 |
| No log | 0.5288 | 220 | 1.4885 | 0.3696 | 1.4885 | 1.2201 |
| No log | 0.5337 | 222 | 2.0089 | 0.3328 | 2.0089 | 1.4174 |
| No log | 0.5385 | 224 | 2.2985 | 0.3189 | 2.2985 | 1.5161 |
| No log | 0.5433 | 226 | 2.3323 | 0.3189 | 2.3323 | 1.5272 |
| No log | 0.5481 | 228 | 2.0546 | 0.2273 | 2.0546 | 1.4334 |
| No log | 0.5529 | 230 | 1.6071 | 0.3182 | 1.6071 | 1.2677 |
| No log | 0.5577 | 232 | 1.4835 | 0.4073 | 1.4835 | 1.2180 |
| No log | 0.5625 | 234 | 1.5404 | 0.3389 | 1.5404 | 1.2411 |
| No log | 0.5673 | 236 | 1.6830 | 0.3036 | 1.6830 | 1.2973 |
| No log | 0.5721 | 238 | 1.8945 | 0.2649 | 1.8945 | 1.3764 |
| No log | 0.5769 | 240 | 1.6789 | 0.3182 | 1.6789 | 1.2957 |
| No log | 0.5817 | 242 | 1.7196 | 0.3050 | 1.7196 | 1.3114 |
| No log | 0.5865 | 244 | 1.8647 | 0.3362 | 1.8647 | 1.3656 |
| No log | 0.5913 | 246 | 2.3505 | 0.3328 | 2.3505 | 1.5331 |
| No log | 0.5962 | 248 | 2.6512 | 0.3088 | 2.6512 | 1.6283 |
| No log | 0.6010 | 250 | 2.6637 | 0.2856 | 2.6637 | 1.6321 |
| No log | 0.6058 | 252 | 2.3140 | 0.2794 | 2.3140 | 1.5212 |
| No log | 0.6106 | 254 | 2.0143 | 0.2917 | 2.0143 | 1.4193 |
| No log | 0.6154 | 256 | 1.8500 | 0.2672 | 1.8500 | 1.3602 |
| No log | 0.6202 | 258 | 1.4212 | 0.3157 | 1.4212 | 1.1921 |
| No log | 0.625 | 260 | 1.2817 | 0.3950 | 1.2817 | 1.1321 |
| No log | 0.6298 | 262 | 1.5820 | 0.3205 | 1.5820 | 1.2578 |
| No log | 0.6346 | 264 | 1.6017 | 0.2821 | 1.6017 | 1.2656 |
| No log | 0.6394 | 266 | 1.4060 | 0.3589 | 1.4060 | 1.1858 |
| No log | 0.6442 | 268 | 1.1841 | 0.4198 | 1.1841 | 1.0882 |
| No log | 0.6490 | 270 | 1.3887 | 0.4346 | 1.3887 | 1.1784 |
| No log | 0.6538 | 272 | 1.3870 | 0.4591 | 1.3870 | 1.1777 |
| No log | 0.6587 | 274 | 1.0494 | 0.6146 | 1.0494 | 1.0244 |
| No log | 0.6635 | 276 | 0.9451 | 0.5773 | 0.9451 | 0.9722 |
| No log | 0.6683 | 278 | 1.0839 | 0.5375 | 1.0839 | 1.0411 |
| No log | 0.6731 | 280 | 1.0563 | 0.5762 | 1.0563 | 1.0278 |
| No log | 0.6779 | 282 | 0.9448 | 0.6146 | 0.9448 | 0.9720 |
| No log | 0.6827 | 284 | 1.1744 | 0.5404 | 1.1744 | 1.0837 |
| No log | 0.6875 | 286 | 1.2827 | 0.5273 | 1.2827 | 1.1326 |
| No log | 0.6923 | 288 | 1.2384 | 0.5441 | 1.2384 | 1.1128 |
| No log | 0.6971 | 290 | 1.1826 | 0.5465 | 1.1826 | 1.0875 |
| No log | 0.7019 | 292 | 1.0292 | 0.5378 | 1.0292 | 1.0145 |
| No log | 0.7067 | 294 | 0.8250 | 0.5921 | 0.8250 | 0.9083 |
| No log | 0.7115 | 296 | 0.7838 | 0.6337 | 0.7838 | 0.8853 |
| No log | 0.7163 | 298 | 0.8242 | 0.625 | 0.8242 | 0.9078 |
| No log | 0.7212 | 300 | 0.8437 | 0.6061 | 0.8437 | 0.9185 |
| No log | 0.7260 | 302 | 0.8414 | 0.6061 | 0.8414 | 0.9173 |
| No log | 0.7308 | 304 | 0.9165 | 0.6062 | 0.9165 | 0.9573 |
| No log | 0.7356 | 306 | 0.8922 | 0.5770 | 0.8922 | 0.9446 |
| No log | 0.7404 | 308 | 0.9118 | 0.5770 | 0.9118 | 0.9549 |
| No log | 0.7452 | 310 | 1.0634 | 0.5568 | 1.0634 | 1.0312 |
| No log | 0.75 | 312 | 1.3483 | 0.4423 | 1.3483 | 1.1612 |
| No log | 0.7548 | 314 | 1.4766 | 0.3951 | 1.4766 | 1.2152 |
| No log | 0.7596 | 316 | 1.2672 | 0.5269 | 1.2672 | 1.1257 |
| No log | 0.7644 | 318 | 1.0219 | 0.5618 | 1.0219 | 1.0109 |
| No log | 0.7692 | 320 | 1.0812 | 0.5239 | 1.0812 | 1.0398 |
| No log | 0.7740 | 322 | 1.2859 | 0.5227 | 1.2859 | 1.1340 |
| No log | 0.7788 | 324 | 1.4938 | 0.4233 | 1.4938 | 1.2222 |
| No log | 0.7837 | 326 | 1.7260 | 0.3675 | 1.7260 | 1.3138 |
| No log | 0.7885 | 328 | 1.6483 | 0.3465 | 1.6483 | 1.2839 |
| No log | 0.7933 | 330 | 1.3870 | 0.3639 | 1.3870 | 1.1777 |
| No log | 0.7981 | 332 | 1.1639 | 0.4752 | 1.1639 | 1.0788 |
| No log | 0.8029 | 334 | 0.9433 | 0.5619 | 0.9433 | 0.9712 |
| No log | 0.8077 | 336 | 0.8776 | 0.6092 | 0.8776 | 0.9368 |
| No log | 0.8125 | 338 | 0.9989 | 0.5846 | 0.9989 | 0.9994 |
| No log | 0.8173 | 340 | 1.2252 | 0.5389 | 1.2252 | 1.1069 |
| No log | 0.8221 | 342 | 1.1867 | 0.5909 | 1.1867 | 1.0894 |
| No log | 0.8269 | 344 | 1.2392 | 0.6143 | 1.2392 | 1.1132 |
| No log | 0.8317 | 346 | 1.0924 | 0.5855 | 1.0924 | 1.0452 |
| No log | 0.8365 | 348 | 0.9728 | 0.5920 | 0.9728 | 0.9863 |
| No log | 0.8413 | 350 | 1.0220 | 0.5678 | 1.0220 | 1.0110 |
| No log | 0.8462 | 352 | 1.1443 | 0.496 | 1.1443 | 1.0697 |
| No log | 0.8510 | 354 | 1.0960 | 0.5464 | 1.0960 | 1.0469 |
| No log | 0.8558 | 356 | 0.9329 | 0.5919 | 0.9329 | 0.9658 |
| No log | 0.8606 | 358 | 0.7883 | 0.6133 | 0.7883 | 0.8878 |
| No log | 0.8654 | 360 | 0.6597 | 0.6786 | 0.6597 | 0.8122 |
| No log | 0.8702 | 362 | 0.7203 | 0.6568 | 0.7203 | 0.8487 |
| No log | 0.875 | 364 | 0.9090 | 0.5873 | 0.9090 | 0.9534 |
| No log | 0.8798 | 366 | 1.3183 | 0.5518 | 1.3183 | 1.1482 |
| No log | 0.8846 | 368 | 1.6170 | 0.4512 | 1.6170 | 1.2716 |
| No log | 0.8894 | 370 | 1.5310 | 0.4596 | 1.5310 | 1.2373 |
| No log | 0.8942 | 372 | 1.2063 | 0.4526 | 1.2063 | 1.0983 |
| No log | 0.8990 | 374 | 1.1178 | 0.4655 | 1.1178 | 1.0572 |
| No log | 0.9038 | 376 | 1.2166 | 0.4526 | 1.2166 | 1.1030 |
| No log | 0.9087 | 378 | 1.4463 | 0.3810 | 1.4463 | 1.2026 |
| No log | 0.9135 | 380 | 1.3789 | 0.4422 | 1.3789 | 1.1743 |
| No log | 0.9183 | 382 | 1.0740 | 0.4857 | 1.0740 | 1.0363 |
| No log | 0.9231 | 384 | 0.8671 | 0.5691 | 0.8671 | 0.9312 |
| No log | 0.9279 | 386 | 0.9148 | 0.5960 | 0.9148 | 0.9565 |
| No log | 0.9327 | 388 | 1.0619 | 0.5935 | 1.0619 | 1.0305 |
| No log | 0.9375 | 390 | 1.0697 | 0.5985 | 1.0697 | 1.0343 |
| No log | 0.9423 | 392 | 0.9683 | 0.5991 | 0.9683 | 0.9840 |
| No log | 0.9471 | 394 | 0.7514 | 0.6286 | 0.7514 | 0.8668 |
| No log | 0.9519 | 396 | 0.7262 | 0.6293 | 0.7262 | 0.8522 |
| No log | 0.9567 | 398 | 0.8199 | 0.65 | 0.8199 | 0.9055 |
| No log | 0.9615 | 400 | 1.0021 | 0.5851 | 1.0021 | 1.0010 |
| No log | 0.9663 | 402 | 0.8983 | 0.5867 | 0.8983 | 0.9478 |
| No log | 0.9712 | 404 | 0.9510 | 0.5817 | 0.9510 | 0.9752 |
| No log | 0.9760 | 406 | 1.0538 | 0.5642 | 1.0538 | 1.0266 |
| No log | 0.9808 | 408 | 1.0753 | 0.5772 | 1.0753 | 1.0370 |
| No log | 0.9856 | 410 | 0.9953 | 0.5615 | 0.9953 | 0.9977 |
| No log | 0.9904 | 412 | 0.9385 | 0.5205 | 0.9385 | 0.9688 |
| No log | 0.9952 | 414 | 0.9152 | 0.5527 | 0.9152 | 0.9566 |
| No log | 1.0 | 416 | 1.0330 | 0.5424 | 1.0330 | 1.0164 |
| No log | 1.0048 | 418 | 1.1932 | 0.5866 | 1.1932 | 1.0923 |
| No log | 1.0096 | 420 | 1.2062 | 0.5657 | 1.2062 | 1.0983 |
| No log | 1.0144 | 422 | 1.0100 | 0.5909 | 1.0100 | 1.0050 |
| No log | 1.0192 | 424 | 0.9347 | 0.5967 | 0.9347 | 0.9668 |
| No log | 1.0240 | 426 | 1.0490 | 0.6290 | 1.0490 | 1.0242 |
| No log | 1.0288 | 428 | 1.1005 | 0.6290 | 1.1005 | 1.0491 |
| No log | 1.0337 | 430 | 0.9918 | 0.6277 | 0.9918 | 0.9959 |
| No log | 1.0385 | 432 | 0.8045 | 0.6135 | 0.8045 | 0.8969 |
| No log | 1.0433 | 434 | 0.6764 | 0.6050 | 0.6764 | 0.8224 |
| No log | 1.0481 | 436 | 0.6861 | 0.6050 | 0.6861 | 0.8283 |
| No log | 1.0529 | 438 | 0.8116 | 0.6362 | 0.8116 | 0.9009 |
| No log | 1.0577 | 440 | 1.2223 | 0.5863 | 1.2223 | 1.1056 |
| No log | 1.0625 | 442 | 1.5686 | 0.3768 | 1.5686 | 1.2524 |
| No log | 1.0673 | 444 | 1.4790 | 0.3810 | 1.4790 | 1.2161 |
| No log | 1.0721 | 446 | 1.1714 | 0.4735 | 1.1714 | 1.0823 |
| No log | 1.0769 | 448 | 0.9957 | 0.5623 | 0.9957 | 0.9979 |
| No log | 1.0817 | 450 | 0.9618 | 0.6029 | 0.9618 | 0.9807 |
| No log | 1.0865 | 452 | 0.9126 | 0.6173 | 0.9126 | 0.9553 |
| No log | 1.0913 | 454 | 1.0865 | 0.6283 | 1.0865 | 1.0424 |
| No log | 1.0962 | 456 | 1.3427 | 0.5357 | 1.3427 | 1.1588 |
| No log | 1.1010 | 458 | 1.2514 | 0.5491 | 1.2514 | 1.1187 |
| No log | 1.1058 | 460 | 0.9139 | 0.5951 | 0.9139 | 0.9560 |
| No log | 1.1106 | 462 | 0.6483 | 0.5884 | 0.6483 | 0.8051 |
| No log | 1.1154 | 464 | 0.6158 | 0.6182 | 0.6158 | 0.7847 |
| No log | 1.1202 | 466 | 0.7025 | 0.6362 | 0.7025 | 0.8382 |
| No log | 1.125 | 468 | 1.0035 | 0.6415 | 1.0035 | 1.0017 |
| No log | 1.1298 | 470 | 1.4929 | 0.3631 | 1.4929 | 1.2218 |
| No log | 1.1346 | 472 | 1.6160 | 0.3116 | 1.6160 | 1.2712 |
| No log | 1.1394 | 474 | 1.4090 | 0.4115 | 1.4090 | 1.1870 |
| No log | 1.1442 | 476 | 1.0964 | 0.5256 | 1.0964 | 1.0471 |
| No log | 1.1490 | 478 | 0.8767 | 0.6111 | 0.8767 | 0.9363 |
| No log | 1.1538 | 480 | 0.8938 | 0.6111 | 0.8938 | 0.9454 |
| No log | 1.1587 | 482 | 1.0824 | 0.5256 | 1.0824 | 1.0404 |
| No log | 1.1635 | 484 | 1.2204 | 0.5022 | 1.2204 | 1.1047 |
| No log | 1.1683 | 486 | 1.3676 | 0.4607 | 1.3676 | 1.1694 |
| No log | 1.1731 | 488 | 1.5089 | 0.4244 | 1.5089 | 1.2284 |
| No log | 1.1779 | 490 | 1.3414 | 0.4591 | 1.3414 | 1.1582 |
| No log | 1.1827 | 492 | 1.1870 | 0.5353 | 1.1870 | 1.0895 |
| No log | 1.1875 | 494 | 1.0176 | 0.5448 | 1.0176 | 1.0088 |
| No log | 1.1923 | 496 | 1.0070 | 0.5740 | 1.0070 | 1.0035 |
| No log | 1.1971 | 498 | 1.0337 | 0.5593 | 1.0337 | 1.0167 |
| 0.4404 | 1.2019 | 500 | 1.1673 | 0.4986 | 1.1673 | 1.0804 |
| 0.4404 | 1.2067 | 502 | 1.1953 | 0.5312 | 1.1953 | 1.0933 |
| 0.4404 | 1.2115 | 504 | 1.1618 | 0.5803 | 1.1618 | 1.0779 |
| 0.4404 | 1.2163 | 506 | 1.1046 | 0.5299 | 1.1046 | 1.0510 |
| 0.4404 | 1.2212 | 508 | 1.0527 | 0.5299 | 1.0527 | 1.0260 |
| 0.4404 | 1.2260 | 510 | 1.0346 | 0.4781 | 1.0346 | 1.0171 |
| 0.4404 | 1.2308 | 512 | 1.0886 | 0.4340 | 1.0886 | 1.0434 |
| 0.4404 | 1.2356 | 514 | 1.0904 | 0.4340 | 1.0904 | 1.0442 |
| 0.4404 | 1.2404 | 516 | 1.0971 | 0.4340 | 1.0971 | 1.0474 |
| 0.4404 | 1.2452 | 518 | 1.1864 | 0.4243 | 1.1864 | 1.0892 |
| 0.4404 | 1.25 | 520 | 1.1172 | 0.5443 | 1.1172 | 1.0570 |
| 0.4404 | 1.2548 | 522 | 1.0718 | 0.5263 | 1.0718 | 1.0353 |
| 0.4404 | 1.2596 | 524 | 1.0706 | 0.5012 | 1.0706 | 1.0347 |
| 0.4404 | 1.2644 | 526 | 1.0794 | 0.5387 | 1.0794 | 1.0390 |
| 0.4404 | 1.2692 | 528 | 1.1157 | 0.5461 | 1.1157 | 1.0563 |
| 0.4404 | 1.2740 | 530 | 1.0374 | 0.5705 | 1.0374 | 1.0185 |
| 0.4404 | 1.2788 | 532 | 1.0677 | 0.5742 | 1.0677 | 1.0333 |
| 0.4404 | 1.2837 | 534 | 1.1140 | 0.5719 | 1.1140 | 1.0555 |
| 0.4404 | 1.2885 | 536 | 1.2117 | 0.5424 | 1.2117 | 1.1008 |
| 0.4404 | 1.2933 | 538 | 1.4103 | 0.5166 | 1.4103 | 1.1876 |
| 0.4404 | 1.2981 | 540 | 1.3105 | 0.5227 | 1.3105 | 1.1448 |
| 0.4404 | 1.3029 | 542 | 1.3240 | 0.4962 | 1.3240 | 1.1506 |
| 0.4404 | 1.3077 | 544 | 1.5389 | 0.3333 | 1.5389 | 1.2405 |
| 0.4404 | 1.3125 | 546 | 1.7362 | 0.2742 | 1.7362 | 1.3177 |
| 0.4404 | 1.3173 | 548 | 1.6206 | 0.3231 | 1.6206 | 1.2730 |
| 0.4404 | 1.3221 | 550 | 1.4466 | 0.4036 | 1.4466 | 1.2028 |
| 0.4404 | 1.3269 | 552 | 1.4361 | 0.4315 | 1.4361 | 1.1984 |
| 0.4404 | 1.3317 | 554 | 1.3225 | 0.4510 | 1.3225 | 1.1500 |
| 0.4404 | 1.3365 | 556 | 1.1657 | 0.5512 | 1.1657 | 1.0797 |
| 0.4404 | 1.3413 | 558 | 1.1765 | 0.5422 | 1.1765 | 1.0847 |
| 0.4404 | 1.3462 | 560 | 1.2855 | 0.5103 | 1.2855 | 1.1338 |
| 0.4404 | 1.3510 | 562 | 1.1302 | 0.5772 | 1.1302 | 1.0631 |
| 0.4404 | 1.3558 | 564 | 1.0808 | 0.6161 | 1.0808 | 1.0396 |
| 0.4404 | 1.3606 | 566 | 1.2216 | 0.4924 | 1.2216 | 1.1052 |
| 0.4404 | 1.3654 | 568 | 1.4432 | 0.4315 | 1.4432 | 1.2013 |
| 0.4404 | 1.3702 | 570 | 1.3490 | 0.4279 | 1.3490 | 1.1615 |
| 0.4404 | 1.375 | 572 | 1.1775 | 0.4609 | 1.1775 | 1.0851 |
| 0.4404 | 1.3798 | 574 | 0.9670 | 0.6208 | 0.9670 | 0.9834 |
| 0.4404 | 1.3846 | 576 | 0.9920 | 0.6208 | 0.9920 | 0.9960 |
| 0.4404 | 1.3894 | 578 | 1.1687 | 0.5772 | 1.1687 | 1.0810 |
| 0.4404 | 1.3942 | 580 | 1.3064 | 0.5103 | 1.3064 | 1.1430 |
| 0.4404 | 1.3990 | 582 | 1.3611 | 0.4830 | 1.3611 | 1.1667 |
| 0.4404 | 1.4038 | 584 | 1.3979 | 0.4712 | 1.3979 | 1.1823 |
| 0.4404 | 1.4087 | 586 | 1.4361 | 0.4400 | 1.4361 | 1.1984 |
| 0.4404 | 1.4135 | 588 | 1.3052 | 0.4728 | 1.3052 | 1.1425 |
| 0.4404 | 1.4183 | 590 | 1.1678 | 0.4744 | 1.1678 | 1.0806 |
| 0.4404 | 1.4231 | 592 | 1.1738 | 0.5465 | 1.1738 | 1.0834 |
| 0.4404 | 1.4279 | 594 | 1.2196 | 0.5551 | 1.2196 | 1.1043 |
| 0.4404 | 1.4327 | 596 | 1.0317 | 0.65 | 1.0317 | 1.0157 |
| 0.4404 | 1.4375 | 598 | 0.7959 | 0.6698 | 0.7959 | 0.8922 |
| 0.4404 | 1.4423 | 600 | 0.7739 | 0.6195 | 0.7739 | 0.8797 |
| 0.4404 | 1.4471 | 602 | 0.9080 | 0.6475 | 0.9080 | 0.9529 |
| 0.4404 | 1.4519 | 604 | 1.0254 | 0.6036 | 1.0254 | 1.0126 |
| 0.4404 | 1.4567 | 606 | 1.1547 | 0.5547 | 1.1547 | 1.0746 |
| 0.4404 | 1.4615 | 608 | 1.1255 | 0.5183 | 1.1255 | 1.0609 |
| 0.4404 | 1.4663 | 610 | 1.0367 | 0.5942 | 1.0367 | 1.0182 |
| 0.4404 | 1.4712 | 612 | 0.9930 | 0.5762 | 0.9930 | 0.9965 |
| 0.4404 | 1.4760 | 614 | 1.0915 | 0.5878 | 1.0915 | 1.0448 |
| 0.4404 | 1.4808 | 616 | 1.0595 | 0.6231 | 1.0595 | 1.0293 |
| 0.4404 | 1.4856 | 618 | 0.9811 | 0.5898 | 0.9811 | 0.9905 |
| 0.4404 | 1.4904 | 620 | 0.8945 | 0.6324 | 0.8945 | 0.9458 |
| 0.4404 | 1.4952 | 622 | 0.9418 | 0.6753 | 0.9418 | 0.9705 |
| 0.4404 | 1.5 | 624 | 1.0762 | 0.6065 | 1.0762 | 1.0374 |
| 0.4404 | 1.5048 | 626 | 0.9577 | 0.6753 | 0.9577 | 0.9786 |
| 0.4404 | 1.5096 | 628 | 0.9086 | 0.6482 | 0.9086 | 0.9532 |
| 0.4404 | 1.5144 | 630 | 0.8613 | 0.6415 | 0.8613 | 0.9280 |
| 0.4404 | 1.5192 | 632 | 0.8950 | 0.6819 | 0.8950 | 0.9461 |
| 0.4404 | 1.5240 | 634 | 0.8211 | 0.6680 | 0.8211 | 0.9061 |
| 0.4404 | 1.5288 | 636 | 0.7987 | 0.6945 | 0.7987 | 0.8937 |
| 0.4404 | 1.5337 | 638 | 0.8033 | 0.6738 | 0.8033 | 0.8963 |
| 0.4404 | 1.5385 | 640 | 0.8460 | 0.6617 | 0.8460 | 0.9198 |
| 0.4404 | 1.5433 | 642 | 0.7784 | 0.6819 | 0.7784 | 0.8823 |
| 0.4404 | 1.5481 | 644 | 0.6683 | 0.6384 | 0.6683 | 0.8175 |
| 0.4404 | 1.5529 | 646 | 0.6838 | 0.6507 | 0.6838 | 0.8269 |
| 0.4404 | 1.5577 | 648 | 0.7563 | 0.6390 | 0.7563 | 0.8697 |
| 0.4404 | 1.5625 | 650 | 0.9178 | 0.6232 | 0.9178 | 0.9580 |
| 0.4404 | 1.5673 | 652 | 1.1508 | 0.5639 | 1.1508 | 1.0728 |
| 0.4404 | 1.5721 | 654 | 1.1112 | 0.5661 | 1.1112 | 1.0541 |
| 0.4404 | 1.5769 | 656 | 0.9939 | 0.6146 | 0.9939 | 0.9970 |
| 0.4404 | 1.5817 | 658 | 0.9687 | 0.6392 | 0.9687 | 0.9842 |
| 0.4404 | 1.5865 | 660 | 0.9904 | 0.6146 | 0.9904 | 0.9952 |
| 0.4404 | 1.5913 | 662 | 0.9903 | 0.6392 | 0.9903 | 0.9952 |
| 0.4404 | 1.5962 | 664 | 1.1182 | 0.5661 | 1.1182 | 1.0575 |
| 0.4404 | 1.6010 | 666 | 1.1676 | 0.5661 | 1.1676 | 1.0806 |
| 0.4404 | 1.6058 | 668 | 1.0972 | 0.6277 | 1.0972 | 1.0475 |
| 0.4404 | 1.6106 | 670 | 0.9526 | 0.6280 | 0.9526 | 0.9760 |
| 0.4404 | 1.6154 | 672 | 0.9494 | 0.6280 | 0.9494 | 0.9744 |
| 0.4404 | 1.6202 | 674 | 1.0391 | 0.6070 | 1.0391 | 1.0194 |
| 0.4404 | 1.625 | 676 | 1.1124 | 0.6273 | 1.1124 | 1.0547 |
| 0.4404 | 1.6298 | 678 | 1.0604 | 0.6266 | 1.0604 | 1.0298 |
| 0.4404 | 1.6346 | 680 | 0.8704 | 0.6170 | 0.8704 | 0.9329 |
| 0.4404 | 1.6394 | 682 | 0.8285 | 0.6271 | 0.8285 | 0.9102 |
| 0.4404 | 1.6442 | 684 | 0.8555 | 0.5850 | 0.8555 | 0.9249 |
| 0.4404 | 1.6490 | 686 | 0.9662 | 0.5801 | 0.9662 | 0.9830 |
| 0.4404 | 1.6538 | 688 | 1.2287 | 0.5776 | 1.2287 | 1.1085 |
| 0.4404 | 1.6587 | 690 | 1.3785 | 0.5077 | 1.3785 | 1.1741 |
| 0.4404 | 1.6635 | 692 | 1.2534 | 0.5529 | 1.2534 | 1.1195 |
| 0.4404 | 1.6683 | 694 | 1.0999 | 0.5705 | 1.0999 | 1.0487 |
| 0.4404 | 1.6731 | 696 | 1.1729 | 0.5529 | 1.1729 | 1.0830 |
| 0.4404 | 1.6779 | 698 | 1.1398 | 0.6053 | 1.1398 | 1.0676 |
| 0.4404 | 1.6827 | 700 | 0.9596 | 0.6093 | 0.9596 | 0.9796 |
| 0.4404 | 1.6875 | 702 | 0.8301 | 0.6135 | 0.8301 | 0.9111 |
| 0.4404 | 1.6923 | 704 | 0.7653 | 0.6009 | 0.7653 | 0.8748 |
| 0.4404 | 1.6971 | 706 | 0.7905 | 0.6009 | 0.7905 | 0.8891 |
| 0.4404 | 1.7019 | 708 | 0.9895 | 0.6146 | 0.9895 | 0.9948 |
| 0.4404 | 1.7067 | 710 | 1.1511 | 0.6273 | 1.1511 | 1.0729 |
| 0.4404 | 1.7115 | 712 | 1.0381 | 0.6146 | 1.0381 | 1.0189 |
| 0.4404 | 1.7163 | 714 | 0.8501 | 0.6387 | 0.8501 | 0.9220 |
| 0.4404 | 1.7212 | 716 | 0.6895 | 0.5884 | 0.6895 | 0.8303 |
| 0.4404 | 1.7260 | 718 | 0.6831 | 0.5884 | 0.6831 | 0.8265 |
| 0.4404 | 1.7308 | 720 | 0.7462 | 0.5970 | 0.7462 | 0.8638 |
| 0.4404 | 1.7356 | 722 | 0.9430 | 0.6348 | 0.9430 | 0.9711 |
| 0.4404 | 1.7404 | 724 | 1.1611 | 0.5522 | 1.1611 | 1.0775 |
| 0.4404 | 1.7452 | 726 | 1.2847 | 0.4520 | 1.2847 | 1.1335 |
| 0.4404 | 1.75 | 728 | 1.1931 | 0.5225 | 1.1931 | 1.0923 |
| 0.4404 | 1.7548 | 730 | 1.0440 | 0.5898 | 1.0440 | 1.0218 |
| 0.4404 | 1.7596 | 732 | 0.9004 | 0.625 | 0.9004 | 0.9489 |
| 0.4404 | 1.7644 | 734 | 0.8587 | 0.625 | 0.8587 | 0.9267 |
| 0.4404 | 1.7692 | 736 | 0.8103 | 0.6529 | 0.8103 | 0.9002 |
| 0.4404 | 1.7740 | 738 | 0.7432 | 0.7203 | 0.7432 | 0.8621 |
| 0.4404 | 1.7788 | 740 | 0.6121 | 0.7017 | 0.6121 | 0.7824 |
| 0.4404 | 1.7837 | 742 | 0.5886 | 0.7017 | 0.5886 | 0.7672 |
| 0.4404 | 1.7885 | 744 | 0.6678 | 0.7160 | 0.6678 | 0.8172 |
| 0.4404 | 1.7933 | 746 | 0.7808 | 0.7044 | 0.7808 | 0.8836 |
| 0.4404 | 1.7981 | 748 | 0.7581 | 0.7016 | 0.7581 | 0.8707 |
| 0.4404 | 1.8029 | 750 | 0.6175 | 0.6358 | 0.6175 | 0.7858 |
| 0.4404 | 1.8077 | 752 | 0.5755 | 0.6184 | 0.5755 | 0.7586 |
| 0.4404 | 1.8125 | 754 | 0.5796 | 0.6184 | 0.5796 | 0.7613 |
| 0.4404 | 1.8173 | 756 | 0.6557 | 0.6358 | 0.6557 | 0.8097 |
| 0.4404 | 1.8221 | 758 | 0.7692 | 0.6707 | 0.7692 | 0.8770 |
| 0.4404 | 1.8269 | 760 | 0.8088 | 0.6331 | 0.8088 | 0.8993 |
| 0.4404 | 1.8317 | 762 | 0.8743 | 0.6578 | 0.8743 | 0.9350 |
| 0.4404 | 1.8365 | 764 | 0.9374 | 0.6578 | 0.9374 | 0.9682 |
| 0.4404 | 1.8413 | 766 | 0.9602 | 0.6331 | 0.9602 | 0.9799 |
| 0.4404 | 1.8462 | 768 | 0.8608 | 0.6149 | 0.8608 | 0.9278 |
| 0.4404 | 1.8510 | 770 | 0.8171 | 0.6409 | 0.8171 | 0.9039 |
| 0.4404 | 1.8558 | 772 | 0.7709 | 0.6409 | 0.7709 | 0.8780 |
| 0.4404 | 1.8606 | 774 | 0.7462 | 0.6606 | 0.7462 | 0.8638 |
| 0.4404 | 1.8654 | 776 | 0.8555 | 0.6445 | 0.8555 | 0.9249 |
| 0.4404 | 1.8702 | 778 | 1.1278 | 0.613 | 1.1278 | 1.0620 |
| 0.4404 | 1.875 | 780 | 1.3889 | 0.5406 | 1.3889 | 1.1785 |
| 0.4404 | 1.8798 | 782 | 1.3558 | 0.5231 | 1.3558 | 1.1644 |
| 0.4404 | 1.8846 | 784 | 1.1410 | 0.5639 | 1.1410 | 1.0682 |
| 0.4404 | 1.8894 | 786 | 0.9587 | 0.6324 | 0.9587 | 0.9791 |
| 0.4404 | 1.8942 | 788 | 0.8677 | 0.6173 | 0.8677 | 0.9315 |
| 0.4404 | 1.8990 | 790 | 0.7574 | 0.5946 | 0.7574 | 0.8703 |
| 0.4404 | 1.9038 | 792 | 0.7185 | 0.6239 | 0.7185 | 0.8476 |
| 0.4404 | 1.9087 | 794 | 0.7307 | 0.625 | 0.7307 | 0.8548 |
| 0.4404 | 1.9135 | 796 | 0.7845 | 0.6831 | 0.7845 | 0.8857 |
| 0.4404 | 1.9183 | 798 | 0.7324 | 0.6944 | 0.7324 | 0.8558 |
| 0.4404 | 1.9231 | 800 | 0.7330 | 0.6944 | 0.7330 | 0.8561 |
| 0.4404 | 1.9279 | 802 | 0.7496 | 0.6927 | 0.7496 | 0.8658 |
| 0.4404 | 1.9327 | 804 | 0.8550 | 0.6809 | 0.8550 | 0.9247 |
| 0.4404 | 1.9375 | 806 | 0.9769 | 0.6269 | 0.9769 | 0.9884 |
| 0.4404 | 1.9423 | 808 | 1.0180 | 0.5997 | 1.0180 | 1.0090 |
| 0.4404 | 1.9471 | 810 | 0.9298 | 0.6415 | 0.9298 | 0.9643 |
| 0.4404 | 1.9519 | 812 | 0.8614 | 0.6390 | 0.8614 | 0.9281 |
| 0.4404 | 1.9567 | 814 | 0.8991 | 0.6019 | 0.8991 | 0.9482 |
| 0.4404 | 1.9615 | 816 | 1.0374 | 0.6318 | 1.0374 | 1.0185 |
| 0.4404 | 1.9663 | 818 | 1.0473 | 0.6318 | 1.0473 | 1.0234 |
| 0.4404 | 1.9712 | 820 | 1.0485 | 0.6083 | 1.0485 | 1.0240 |
| 0.4404 | 1.9760 | 822 | 1.1768 | 0.5724 | 1.1768 | 1.0848 |
| 0.4404 | 1.9808 | 824 | 1.4089 | 0.4695 | 1.4089 | 1.1870 |
| 0.4404 | 1.9856 | 826 | 1.5111 | 0.4343 | 1.5111 | 1.2293 |
| 0.4404 | 1.9904 | 828 | 1.4163 | 0.4314 | 1.4163 | 1.1901 |
| 0.4404 | 1.9952 | 830 | 1.2183 | 0.5152 | 1.2183 | 1.1038 |
| 0.4404 | 2.0 | 832 | 1.1020 | 0.5233 | 1.1020 | 1.0498 |
| 0.4404 | 2.0048 | 834 | 0.9780 | 0.5591 | 0.9780 | 0.9889 |
| 0.4404 | 2.0096 | 836 | 0.8967 | 0.6097 | 0.8967 | 0.9469 |
| 0.4404 | 2.0144 | 838 | 0.9309 | 0.625 | 0.9309 | 0.9648 |
| 0.4404 | 2.0192 | 840 | 0.8955 | 0.6430 | 0.8955 | 0.9463 |
| 0.4404 | 2.0240 | 842 | 0.8366 | 0.6369 | 0.8366 | 0.9147 |
| 0.4404 | 2.0288 | 844 | 0.7885 | 0.6300 | 0.7885 | 0.8880 |
| 0.4404 | 2.0337 | 846 | 0.7677 | 0.6362 | 0.7677 | 0.8762 |
| 0.4404 | 2.0385 | 848 | 0.7860 | 0.6173 | 0.7860 | 0.8866 |
| 0.4404 | 2.0433 | 850 | 0.7983 | 0.6173 | 0.7983 | 0.8935 |
| 0.4404 | 2.0481 | 852 | 0.8154 | 0.6390 | 0.8154 | 0.9030 |
| 0.4404 | 2.0529 | 854 | 0.8327 | 0.6348 | 0.8327 | 0.9125 |
| 0.4404 | 2.0577 | 856 | 0.8565 | 0.6391 | 0.8565 | 0.9255 |
| 0.4404 | 2.0625 | 858 | 0.8275 | 0.6512 | 0.8275 | 0.9097 |
| 0.4404 | 2.0673 | 860 | 0.7938 | 0.6679 | 0.7938 | 0.8910 |
| 0.4404 | 2.0721 | 862 | 0.8686 | 0.6077 | 0.8686 | 0.9320 |
| 0.4404 | 2.0769 | 864 | 1.0019 | 0.5951 | 1.0019 | 1.0009 |
| 0.4404 | 2.0817 | 866 | 0.9639 | 0.6129 | 0.9639 | 0.9818 |
| 0.4404 | 2.0865 | 868 | 0.9785 | 0.5755 | 0.9785 | 0.9892 |
| 0.4404 | 2.0913 | 870 | 0.8667 | 0.6461 | 0.8667 | 0.9310 |
| 0.4404 | 2.0962 | 872 | 0.8336 | 0.6597 | 0.8336 | 0.9130 |
| 0.4404 | 2.1010 | 874 | 0.8153 | 0.6189 | 0.8153 | 0.9030 |
| 0.4404 | 2.1058 | 876 | 0.8158 | 0.6633 | 0.8158 | 0.9032 |
| 0.4404 | 2.1106 | 878 | 0.7451 | 0.6667 | 0.7451 | 0.8632 |
| 0.4404 | 2.1154 | 880 | 0.7141 | 0.6486 | 0.7141 | 0.8451 |
| 0.4404 | 2.1202 | 882 | 0.8097 | 0.6663 | 0.8097 | 0.8998 |
| 0.4404 | 2.125 | 884 | 0.8796 | 0.6718 | 0.8796 | 0.9379 |
| 0.4404 | 2.1298 | 886 | 0.9026 | 0.6604 | 0.9026 | 0.9501 |
| 0.4404 | 2.1346 | 888 | 0.8238 | 0.6189 | 0.8238 | 0.9077 |
| 0.4404 | 2.1394 | 890 | 0.6843 | 0.6116 | 0.6843 | 0.8272 |
| 0.4404 | 2.1442 | 892 | 0.6171 | 0.6426 | 0.6171 | 0.7856 |
| 0.4404 | 2.1490 | 894 | 0.6207 | 0.6426 | 0.6207 | 0.7878 |
| 0.4404 | 2.1538 | 896 | 0.6718 | 0.6217 | 0.6718 | 0.8196 |
| 0.4404 | 2.1587 | 898 | 0.8387 | 0.6691 | 0.8387 | 0.9158 |
| 0.4404 | 2.1635 | 900 | 1.0331 | 0.6489 | 1.0331 | 1.0164 |
| 0.4404 | 2.1683 | 902 | 1.1024 | 0.6413 | 1.1024 | 1.0500 |
| 0.4404 | 2.1731 | 904 | 1.0161 | 0.6129 | 1.0161 | 1.0080 |
| 0.4404 | 2.1779 | 906 | 0.8685 | 0.5650 | 0.8685 | 0.9319 |
| 0.4404 | 2.1827 | 908 | 0.8027 | 0.5798 | 0.8027 | 0.8959 |
| 0.4404 | 2.1875 | 910 | 0.7354 | 0.6093 | 0.7354 | 0.8576 |
| 0.4404 | 2.1923 | 912 | 0.7213 | 0.6093 | 0.7213 | 0.8493 |
| 0.4404 | 2.1971 | 914 | 0.8115 | 0.6209 | 0.8115 | 0.9008 |
| 0.4404 | 2.2019 | 916 | 1.0856 | 0.6207 | 1.0856 | 1.0419 |
| 0.4404 | 2.2067 | 918 | 1.2537 | 0.5884 | 1.2537 | 1.1197 |
| 0.4404 | 2.2115 | 920 | 1.2156 | 0.5661 | 1.2156 | 1.1026 |
| 0.4404 | 2.2163 | 922 | 1.0508 | 0.6232 | 1.0508 | 1.0251 |
| 0.4404 | 2.2212 | 924 | 1.0056 | 0.6220 | 1.0056 | 1.0028 |
| 0.4404 | 2.2260 | 926 | 1.0690 | 0.6055 | 1.0690 | 1.0339 |
| 0.4404 | 2.2308 | 928 | 1.0627 | 0.6055 | 1.0627 | 1.0309 |
| 0.4404 | 2.2356 | 930 | 1.0155 | 0.5088 | 1.0155 | 1.0077 |
| 0.4404 | 2.2404 | 932 | 1.0594 | 0.5310 | 1.0594 | 1.0293 |
| 0.4404 | 2.2452 | 934 | 1.1096 | 0.5839 | 1.1096 | 1.0534 |
| 0.4404 | 2.25 | 936 | 1.0294 | 0.6010 | 1.0294 | 1.0146 |
| 0.4404 | 2.2548 | 938 | 0.9480 | 0.6351 | 0.9480 | 0.9737 |
| 0.4404 | 2.2596 | 940 | 0.8481 | 0.5972 | 0.8481 | 0.9209 |
| 0.4404 | 2.2644 | 942 | 0.7548 | 0.625 | 0.7548 | 0.8688 |
| 0.4404 | 2.2692 | 944 | 0.7302 | 0.5933 | 0.7302 | 0.8545 |
| 0.4404 | 2.2740 | 946 | 0.7668 | 0.6 | 0.7668 | 0.8757 |
| 0.4404 | 2.2788 | 948 | 0.8767 | 0.6351 | 0.8767 | 0.9363 |
| 0.4404 | 2.2837 | 950 | 0.9780 | 0.6224 | 0.9780 | 0.9890 |
| 0.4404 | 2.2885 | 952 | 0.9740 | 0.6351 | 0.9740 | 0.9869 |
| 0.4404 | 2.2933 | 954 | 0.8770 | 0.5955 | 0.8770 | 0.9365 |
| 0.4404 | 2.2981 | 956 | 0.8538 | 0.5991 | 0.8538 | 0.9240 |
| 0.4404 | 2.3029 | 958 | 0.8239 | 0.5714 | 0.8239 | 0.9077 |
| 0.4404 | 2.3077 | 960 | 0.8150 | 0.5456 | 0.8150 | 0.9027 |
| 0.4404 | 2.3125 | 962 | 0.8525 | 0.5527 | 0.8525 | 0.9233 |
| 0.4404 | 2.3173 | 964 | 0.9523 | 0.5817 | 0.9523 | 0.9758 |
| 0.4404 | 2.3221 | 966 | 1.0378 | 0.6196 | 1.0378 | 1.0187 |
| 0.4404 | 2.3269 | 968 | 1.0823 | 0.6077 | 1.0823 | 1.0404 |
| 0.4404 | 2.3317 | 970 | 1.0459 | 0.5951 | 1.0459 | 1.0227 |
| 0.4404 | 2.3365 | 972 | 1.0589 | 0.5951 | 1.0589 | 1.0290 |
| 0.4404 | 2.3413 | 974 | 1.0591 | 0.5887 | 1.0591 | 1.0291 |
| 0.4404 | 2.3462 | 976 | 1.1579 | 0.5488 | 1.1579 | 1.0760 |
| 0.4404 | 2.3510 | 978 | 1.2250 | 0.472 | 1.2250 | 1.1068 |
| 0.4404 | 2.3558 | 980 | 1.1861 | 0.4987 | 1.1861 | 1.0891 |
| 0.4404 | 2.3606 | 982 | 1.1385 | 0.5488 | 1.1385 | 1.0670 |
| 0.4404 | 2.3654 | 984 | 1.1799 | 0.5215 | 1.1799 | 1.0862 |
| 0.4404 | 2.3702 | 986 | 1.1925 | 0.5424 | 1.1925 | 1.0920 |
| 0.4404 | 2.375 | 988 | 1.1650 | 0.5824 | 1.1650 | 1.0793 |
| 0.4404 | 2.3798 | 990 | 1.0194 | 0.6483 | 1.0194 | 1.0096 |
| 0.4404 | 2.3846 | 992 | 0.8414 | 0.6311 | 0.8414 | 0.9173 |
| 0.4404 | 2.3894 | 994 | 0.8395 | 0.6663 | 0.8395 | 0.9162 |
| 0.4404 | 2.3942 | 996 | 0.9327 | 0.6478 | 0.9327 | 0.9658 |
| 0.4404 | 2.3990 | 998 | 1.0596 | 0.625 | 1.0596 | 1.0293 |
| 0.1328 | 2.4038 | 1000 | 1.2228 | 0.5830 | 1.2228 | 1.1058 |
| 0.1328 | 2.4087 | 1002 | 1.2686 | 0.4846 | 1.2686 | 1.1263 |
| 0.1328 | 2.4135 | 1004 | 1.1648 | 0.5341 | 1.1648 | 1.0793 |
| 0.1328 | 2.4183 | 1006 | 1.0463 | 0.5403 | 1.0463 | 1.0229 |
| 0.1328 | 2.4231 | 1008 | 0.9599 | 0.6390 | 0.9599 | 0.9797 |
| 0.1328 | 2.4279 | 1010 | 0.9124 | 0.6390 | 0.9124 | 0.9552 |
| 0.1328 | 2.4327 | 1012 | 0.9392 | 0.6189 | 0.9392 | 0.9691 |
| 0.1328 | 2.4375 | 1014 | 0.9451 | 0.6240 | 0.9451 | 0.9722 |
| 0.1328 | 2.4423 | 1016 | 0.9369 | 0.6718 | 0.9369 | 0.9680 |
| 0.1328 | 2.4471 | 1018 | 0.9192 | 0.6538 | 0.9192 | 0.9588 |
| 0.1328 | 2.4519 | 1020 | 0.9302 | 0.6429 | 0.9302 | 0.9645 |
| 0.1328 | 2.4567 | 1022 | 0.9014 | 0.6718 | 0.9014 | 0.9494 |
| 0.1328 | 2.4615 | 1024 | 0.8927 | 0.6691 | 0.8927 | 0.9448 |
| 0.1328 | 2.4663 | 1026 | 0.8053 | 0.6014 | 0.8053 | 0.8974 |
| 0.1328 | 2.4712 | 1028 | 0.8073 | 0.6216 | 0.8073 | 0.8985 |
| 0.1328 | 2.4760 | 1030 | 0.9030 | 0.6597 | 0.9030 | 0.9502 |
| 0.1328 | 2.4808 | 1032 | 1.0468 | 0.5984 | 1.0468 | 1.0231 |
| 0.1328 | 2.4856 | 1034 | 1.0539 | 0.5618 | 1.0539 | 1.0266 |
| 0.1328 | 2.4904 | 1036 | 0.9253 | 0.625 | 0.9253 | 0.9619 |
| 0.1328 | 2.4952 | 1038 | 0.8063 | 0.5985 | 0.8063 | 0.8979 |
| 0.1328 | 2.5 | 1040 | 0.8102 | 0.5741 | 0.8102 | 0.9001 |
| 0.1328 | 2.5048 | 1042 | 0.8899 | 0.6216 | 0.8899 | 0.9434 |
| 0.1328 | 2.5096 | 1044 | 1.0616 | 0.5887 | 1.0616 | 1.0304 |
| 0.1328 | 2.5144 | 1046 | 1.1628 | 0.6002 | 1.1628 | 1.0783 |
| 0.1328 | 2.5192 | 1048 | 1.1166 | 0.5919 | 1.1166 | 1.0567 |
| 0.1328 | 2.5240 | 1050 | 0.9593 | 0.6461 | 0.9593 | 0.9794 |
| 0.1328 | 2.5288 | 1052 | 0.8524 | 0.6437 | 0.8524 | 0.9233 |
| 0.1328 | 2.5337 | 1054 | 0.7740 | 0.6135 | 0.7740 | 0.8798 |
| 0.1328 | 2.5385 | 1056 | 0.8056 | 0.6362 | 0.8056 | 0.8975 |
| 0.1328 | 2.5433 | 1058 | 0.8404 | 0.6322 | 0.8404 | 0.9167 |
| 0.1328 | 2.5481 | 1060 | 0.8688 | 0.652 | 0.8688 | 0.9321 |
| 0.1328 | 2.5529 | 1062 | 0.8936 | 0.652 | 0.8936 | 0.9453 |
| 0.1328 | 2.5577 | 1064 | 0.8252 | 0.5985 | 0.8252 | 0.9084 |
| 0.1328 | 2.5625 | 1066 | 0.7843 | 0.6135 | 0.7843 | 0.8856 |
| 0.1328 | 2.5673 | 1068 | 0.7883 | 0.6135 | 0.7883 | 0.8878 |
| 0.1328 | 2.5721 | 1070 | 0.8753 | 0.6316 | 0.8753 | 0.9356 |
| 0.1328 | 2.5769 | 1072 | 1.0248 | 0.6604 | 1.0248 | 1.0123 |
| 0.1328 | 2.5817 | 1074 | 1.0883 | 0.6013 | 1.0883 | 1.0432 |
| 0.1328 | 2.5865 | 1076 | 1.0410 | 0.6041 | 1.0410 | 1.0203 |
| 0.1328 | 2.5913 | 1078 | 0.9738 | 0.6604 | 0.9738 | 0.9868 |
| 0.1328 | 2.5962 | 1080 | 0.8888 | 0.5789 | 0.8888 | 0.9428 |
| 0.1328 | 2.6010 | 1082 | 0.8072 | 0.6135 | 0.8072 | 0.8984 |
| 0.1328 | 2.6058 | 1084 | 0.7951 | 0.6135 | 0.7951 | 0.8917 |
| 0.1328 | 2.6106 | 1086 | 0.8551 | 0.5985 | 0.8551 | 0.9247 |
| 0.1328 | 2.6154 | 1088 | 0.9583 | 0.6260 | 0.9583 | 0.9789 |
| 0.1328 | 2.6202 | 1090 | 1.1406 | 0.6045 | 1.1406 | 1.0680 |
| 0.1328 | 2.625 | 1092 | 1.2203 | 0.5804 | 1.2203 | 1.1047 |
| 0.1328 | 2.6298 | 1094 | 1.1821 | 0.5804 | 1.1821 | 1.0872 |
| 0.1328 | 2.6346 | 1096 | 1.1743 | 0.5522 | 1.1743 | 1.0837 |
| 0.1328 | 2.6394 | 1098 | 1.1604 | 0.5225 | 1.1604 | 1.0772 |
| 0.1328 | 2.6442 | 1100 | 1.0226 | 0.5826 | 1.0226 | 1.0112 |
| 0.1328 | 2.6490 | 1102 | 0.9784 | 0.5826 | 0.9784 | 0.9891 |
| 0.1328 | 2.6538 | 1104 | 0.9506 | 0.6070 | 0.9506 | 0.9750 |
| 0.1328 | 2.6587 | 1106 | 0.8722 | 0.6070 | 0.8722 | 0.9339 |
| 0.1328 | 2.6635 | 1108 | 0.8105 | 0.6437 | 0.8105 | 0.9003 |
| 0.1328 | 2.6683 | 1110 | 0.8641 | 0.6475 | 0.8641 | 0.9296 |
| 0.1328 | 2.6731 | 1112 | 0.9728 | 0.6467 | 0.9728 | 0.9863 |
| 0.1328 | 2.6779 | 1114 | 0.9614 | 0.6786 | 0.9614 | 0.9805 |
| 0.1328 | 2.6827 | 1116 | 0.8431 | 0.6993 | 0.8431 | 0.9182 |
| 0.1328 | 2.6875 | 1118 | 0.7139 | 0.6667 | 0.7139 | 0.8449 |
| 0.1328 | 2.6923 | 1120 | 0.7006 | 0.6478 | 0.7006 | 0.8370 |
| 0.1328 | 2.6971 | 1122 | 0.7704 | 0.675 | 0.7704 | 0.8777 |
| 0.1328 | 2.7019 | 1124 | 0.9270 | 0.6165 | 0.9270 | 0.9628 |
| 0.1328 | 2.7067 | 1126 | 1.0558 | 0.5951 | 1.0558 | 1.0275 |
| 0.1328 | 2.7115 | 1128 | 1.0085 | 0.5951 | 1.0085 | 1.0042 |
| 0.1328 | 2.7163 | 1130 | 0.8575 | 0.6301 | 0.8575 | 0.9260 |
| 0.1328 | 2.7212 | 1132 | 0.7665 | 0.6599 | 0.7665 | 0.8755 |
| 0.1328 | 2.7260 | 1134 | 0.7158 | 0.6384 | 0.7158 | 0.8460 |
| 0.1328 | 2.7308 | 1136 | 0.7215 | 0.6384 | 0.7215 | 0.8494 |
| 0.1328 | 2.7356 | 1138 | 0.7671 | 0.675 | 0.7671 | 0.8758 |
| 0.1328 | 2.7404 | 1140 | 0.8439 | 0.6574 | 0.8439 | 0.9186 |
| 0.1328 | 2.7452 | 1142 | 0.8655 | 0.6754 | 0.8655 | 0.9303 |
| 0.1328 | 2.75 | 1144 | 0.9098 | 0.6512 | 0.9098 | 0.9539 |
| 0.1328 | 2.7548 | 1146 | 0.9406 | 0.6512 | 0.9406 | 0.9699 |
| 0.1328 | 2.7596 | 1148 | 0.8696 | 0.6754 | 0.8696 | 0.9325 |
| 0.1328 | 2.7644 | 1150 | 0.7428 | 0.6548 | 0.7428 | 0.8618 |
| 0.1328 | 2.7692 | 1152 | 0.7119 | 0.6384 | 0.7119 | 0.8437 |
| 0.1328 | 2.7740 | 1154 | 0.7134 | 0.6176 | 0.7134 | 0.8447 |
| 0.1328 | 2.7788 | 1156 | 0.7919 | 0.6825 | 0.7919 | 0.8899 |
| 0.1328 | 2.7837 | 1158 | 0.8310 | 0.6846 | 0.8310 | 0.9116 |
| 0.1328 | 2.7885 | 1160 | 0.8792 | 0.6846 | 0.8792 | 0.9377 |
| 0.1328 | 2.7933 | 1162 | 0.8301 | 0.6846 | 0.8301 | 0.9111 |
| 0.1328 | 2.7981 | 1164 | 0.7465 | 0.6574 | 0.7465 | 0.8640 |
| 0.1328 | 2.8029 | 1166 | 0.7404 | 0.6585 | 0.7404 | 0.8604 |
| 0.1328 | 2.8077 | 1168 | 0.7232 | 0.6384 | 0.7232 | 0.8504 |
| 0.1328 | 2.8125 | 1170 | 0.7596 | 0.6804 | 0.7596 | 0.8716 |
| 0.1328 | 2.8173 | 1172 | 0.7876 | 0.675 | 0.7876 | 0.8875 |
| 0.1328 | 2.8221 | 1174 | 0.7723 | 0.675 | 0.7723 | 0.8788 |
| 0.1328 | 2.8269 | 1176 | 0.7870 | 0.675 | 0.7870 | 0.8871 |
| 0.1328 | 2.8317 | 1178 | 0.7873 | 0.675 | 0.7873 | 0.8873 |
| 0.1328 | 2.8365 | 1180 | 0.7679 | 0.675 | 0.7679 | 0.8763 |
| 0.1328 | 2.8413 | 1182 | 0.7465 | 0.6548 | 0.7465 | 0.8640 |
| 0.1328 | 2.8462 | 1184 | 0.7429 | 0.6548 | 0.7429 | 0.8619 |
| 0.1328 | 2.8510 | 1186 | 0.6973 | 0.6159 | 0.6973 | 0.8350 |
| 0.1328 | 2.8558 | 1188 | 0.7429 | 0.6384 | 0.7429 | 0.8619 |
| 0.1328 | 2.8606 | 1190 | 0.8414 | 0.6195 | 0.8414 | 0.9173 |
| 0.1328 | 2.8654 | 1192 | 0.8932 | 0.5798 | 0.8932 | 0.9451 |
| 0.1328 | 2.8702 | 1194 | 0.9422 | 0.5650 | 0.9422 | 0.9707 |
| 0.1328 | 2.875 | 1196 | 0.9286 | 0.5798 | 0.9286 | 0.9636 |
| 0.1328 | 2.8798 | 1198 | 0.9602 | 0.5798 | 0.9602 | 0.9799 |
| 0.1328 | 2.8846 | 1200 | 0.9850 | 0.5798 | 0.9850 | 0.9925 |
| 0.1328 | 2.8894 | 1202 | 0.9671 | 0.5946 | 0.9671 | 0.9834 |
| 0.1328 | 2.8942 | 1204 | 0.9585 | 0.5831 | 0.9585 | 0.9790 |
| 0.1328 | 2.8990 | 1206 | 0.9682 | 0.6679 | 0.9682 | 0.9840 |
| 0.1328 | 2.9038 | 1208 | 0.9803 | 0.6846 | 0.9803 | 0.9901 |
| 0.1328 | 2.9087 | 1210 | 1.0451 | 0.6554 | 1.0451 | 1.0223 |
| 0.1328 | 2.9135 | 1212 | 1.1251 | 0.6224 | 1.1251 | 1.0607 |
| 0.1328 | 2.9183 | 1214 | 1.0509 | 0.6214 | 1.0509 | 1.0251 |
| 0.1328 | 2.9231 | 1216 | 0.8921 | 0.6548 | 0.8921 | 0.9445 |
| 0.1328 | 2.9279 | 1218 | 0.7639 | 0.6337 | 0.7639 | 0.8740 |
| 0.1328 | 2.9327 | 1220 | 0.7301 | 0.6337 | 0.7301 | 0.8544 |
| 0.1328 | 2.9375 | 1222 | 0.7597 | 0.6337 | 0.7597 | 0.8716 |
| 0.1328 | 2.9423 | 1224 | 0.7561 | 0.6136 | 0.7561 | 0.8695 |
| 0.1328 | 2.9471 | 1226 | 0.7684 | 0.6429 | 0.7684 | 0.8766 |
| 0.1328 | 2.9519 | 1228 | 0.8116 | 0.6468 | 0.8116 | 0.9009 |
| 0.1328 | 2.9567 | 1230 | 0.9022 | 0.6327 | 0.9022 | 0.9499 |
| 0.1328 | 2.9615 | 1232 | 0.9101 | 0.6076 | 0.9101 | 0.9540 |
| 0.1328 | 2.9663 | 1234 | 0.9191 | 0.5798 | 0.9191 | 0.9587 |
| 0.1328 | 2.9712 | 1236 | 0.8733 | 0.5946 | 0.8733 | 0.9345 |
| 0.1328 | 2.9760 | 1238 | 0.8599 | 0.5946 | 0.8599 | 0.9273 |
| 0.1328 | 2.9808 | 1240 | 0.8933 | 0.5798 | 0.8933 | 0.9451 |
| 0.1328 | 2.9856 | 1242 | 1.0043 | 0.5940 | 1.0043 | 1.0022 |
| 0.1328 | 2.9904 | 1244 | 1.0428 | 0.6489 | 1.0428 | 1.0212 |
| 0.1328 | 2.9952 | 1246 | 0.9634 | 0.6348 | 0.9634 | 0.9816 |
| 0.1328 | 3.0 | 1248 | 0.8189 | 0.6239 | 0.8189 | 0.9049 |
| 0.1328 | 3.0048 | 1250 | 0.7259 | 0.6478 | 0.7259 | 0.8520 |
| 0.1328 | 3.0096 | 1252 | 0.7137 | 0.6358 | 0.7137 | 0.8448 |
| 0.1328 | 3.0144 | 1254 | 0.7572 | 0.6478 | 0.7572 | 0.8702 |
| 0.1328 | 3.0192 | 1256 | 0.9136 | 0.6732 | 0.9136 | 0.9558 |
| 0.1328 | 3.0240 | 1258 | 1.0242 | 0.5618 | 1.0242 | 1.0120 |
| 0.1328 | 3.0288 | 1260 | 0.9884 | 0.5755 | 0.9884 | 0.9942 |
| 0.1328 | 3.0337 | 1262 | 0.9057 | 0.6168 | 0.9057 | 0.9517 |
| 0.1328 | 3.0385 | 1264 | 0.8314 | 0.675 | 0.8314 | 0.9118 |
| 0.1328 | 3.0433 | 1266 | 0.7085 | 0.6358 | 0.7085 | 0.8417 |
| 0.1328 | 3.0481 | 1268 | 0.6665 | 0.6283 | 0.6665 | 0.8164 |
| 0.1328 | 3.0529 | 1270 | 0.6954 | 0.6358 | 0.6954 | 0.8339 |
| 0.1328 | 3.0577 | 1272 | 0.7852 | 0.6686 | 0.7852 | 0.8861 |
| 0.1328 | 3.0625 | 1274 | 0.8989 | 0.6410 | 0.8989 | 0.9481 |
| 0.1328 | 3.0673 | 1276 | 1.0611 | 0.6489 | 1.0611 | 1.0301 |
| 0.1328 | 3.0721 | 1278 | 1.1405 | 0.5747 | 1.1405 | 1.0679 |
| 0.1328 | 3.0769 | 1280 | 1.0813 | 0.6129 | 1.0813 | 1.0398 |
| 0.1328 | 3.0817 | 1282 | 0.9573 | 0.6461 | 0.9573 | 0.9784 |
| 0.1328 | 3.0865 | 1284 | 0.8711 | 0.6548 | 0.8711 | 0.9333 |
| 0.1328 | 3.0913 | 1286 | 0.7207 | 0.6358 | 0.7207 | 0.8489 |
| 0.1328 | 3.0962 | 1288 | 0.6134 | 0.6405 | 0.6134 | 0.7832 |
| 0.1328 | 3.1010 | 1290 | 0.5988 | 0.6691 | 0.5988 | 0.7738 |
| 0.1328 | 3.1058 | 1292 | 0.6454 | 0.6157 | 0.6454 | 0.8034 |
| 0.1328 | 3.1106 | 1294 | 0.7857 | 0.6860 | 0.7857 | 0.8864 |
| 0.1328 | 3.1154 | 1296 | 0.9499 | 0.6445 | 0.9499 | 0.9746 |
| 0.1328 | 3.1202 | 1298 | 0.9996 | 0.6335 | 0.9996 | 0.9998 |
| 0.1328 | 3.125 | 1300 | 0.9984 | 0.6308 | 0.9984 | 0.9992 |
| 0.1328 | 3.1298 | 1302 | 1.0028 | 0.6006 | 1.0028 | 1.0014 |
| 0.1328 | 3.1346 | 1304 | 1.0791 | 0.5446 | 1.0791 | 1.0388 |
| 0.1328 | 3.1394 | 1306 | 1.1265 | 0.4986 | 1.1265 | 1.0614 |
| 0.1328 | 3.1442 | 1308 | 1.2048 | 0.5093 | 1.2048 | 1.0976 |
| 0.1328 | 3.1490 | 1310 | 1.2150 | 0.5191 | 1.2150 | 1.1023 |
| 0.1328 | 3.1538 | 1312 | 1.1440 | 0.5594 | 1.1440 | 1.0696 |
| 0.1328 | 3.1587 | 1314 | 1.0118 | 0.6224 | 1.0118 | 1.0059 |
| 0.1328 | 3.1635 | 1316 | 0.9057 | 0.6185 | 0.9057 | 0.9517 |
| 0.1328 | 3.1683 | 1318 | 0.8059 | 0.5965 | 0.8059 | 0.8977 |
| 0.1328 | 3.1731 | 1320 | 0.7910 | 0.5965 | 0.7910 | 0.8894 |
| 0.1328 | 3.1779 | 1322 | 0.7988 | 0.5897 | 0.7988 | 0.8937 |
| 0.1328 | 3.1827 | 1324 | 0.8407 | 0.6076 | 0.8407 | 0.9169 |
| 0.1328 | 3.1875 | 1326 | 0.8363 | 0.5757 | 0.8363 | 0.9145 |
| 0.1328 | 3.1923 | 1328 | 0.8535 | 0.6173 | 0.8535 | 0.9239 |
| 0.1328 | 3.1971 | 1330 | 0.8704 | 0.6173 | 0.8704 | 0.9330 |
| 0.1328 | 3.2019 | 1332 | 0.8574 | 0.5946 | 0.8574 | 0.9260 |
| 0.1328 | 3.2067 | 1334 | 0.8219 | 0.5946 | 0.8219 | 0.9066 |
| 0.1328 | 3.2115 | 1336 | 0.8227 | 0.6410 | 0.8227 | 0.9070 |
| 0.1328 | 3.2163 | 1338 | 0.8793 | 0.6733 | 0.8793 | 0.9377 |
| 0.1328 | 3.2212 | 1340 | 0.9906 | 0.63 | 0.9906 | 0.9953 |
| 0.1328 | 3.2260 | 1342 | 1.0123 | 0.63 | 1.0123 | 1.0062 |
| 0.1328 | 3.2308 | 1344 | 0.9845 | 0.6313 | 0.9845 | 0.9922 |
| 0.1328 | 3.2356 | 1346 | 0.8946 | 0.6616 | 0.8946 | 0.9458 |
| 0.1328 | 3.2404 | 1348 | 0.8701 | 0.6616 | 0.8701 | 0.9328 |
| 0.1328 | 3.2452 | 1350 | 0.9444 | 0.6110 | 0.9444 | 0.9718 |
| 0.1328 | 3.25 | 1352 | 0.9616 | 0.5920 | 0.9616 | 0.9806 |
| 0.1328 | 3.2548 | 1354 | 0.9378 | 0.5984 | 0.9378 | 0.9684 |
| 0.1328 | 3.2596 | 1356 | 0.9291 | 0.6202 | 0.9291 | 0.9639 |
| 0.1328 | 3.2644 | 1358 | 0.9480 | 0.6269 | 0.9480 | 0.9736 |
| 0.1328 | 3.2692 | 1360 | 1.0155 | 0.6489 | 1.0155 | 1.0077 |
| 0.1328 | 3.2740 | 1362 | 1.0196 | 0.5984 | 1.0196 | 1.0098 |
| 0.1328 | 3.2788 | 1364 | 0.9506 | 0.5920 | 0.9506 | 0.9750 |
| 0.1328 | 3.2837 | 1366 | 0.8672 | 0.5885 | 0.8672 | 0.9313 |
| 0.1328 | 3.2885 | 1368 | 0.8561 | 0.5650 | 0.8561 | 0.9253 |
| 0.1328 | 3.2933 | 1370 | 0.8426 | 0.5946 | 0.8426 | 0.9179 |
| 0.1328 | 3.2981 | 1372 | 0.7885 | 0.6217 | 0.7885 | 0.8879 |
| 0.1328 | 3.3029 | 1374 | 0.7641 | 0.6022 | 0.7641 | 0.8741 |
| 0.1328 | 3.3077 | 1376 | 0.7944 | 0.6022 | 0.7944 | 0.8913 |
| 0.1328 | 3.3125 | 1378 | 0.8959 | 0.625 | 0.8959 | 0.9465 |
| 0.1328 | 3.3173 | 1380 | 0.9581 | 0.6130 | 0.9581 | 0.9788 |
| 0.1328 | 3.3221 | 1382 | 1.0482 | 0.6161 | 1.0482 | 1.0238 |
| 0.1328 | 3.3269 | 1384 | 1.1249 | 0.6161 | 1.1249 | 1.0606 |
| 0.1328 | 3.3317 | 1386 | 1.0989 | 0.5825 | 1.0989 | 1.0483 |
| 0.1328 | 3.3365 | 1388 | 1.0191 | 0.5448 | 1.0192 | 1.0095 |
| 0.1328 | 3.3413 | 1390 | 0.9738 | 0.5623 | 0.9738 | 0.9868 |
| 0.1328 | 3.3462 | 1392 | 0.9642 | 0.5702 | 0.9642 | 0.9819 |
| 0.1328 | 3.3510 | 1394 | 0.9272 | 0.6040 | 0.9272 | 0.9629 |
| 0.1328 | 3.3558 | 1396 | 0.9353 | 0.6040 | 0.9353 | 0.9671 |
| 0.1328 | 3.3606 | 1398 | 1.0376 | 0.6043 | 1.0376 | 1.0186 |
| 0.1328 | 3.3654 | 1400 | 1.0876 | 0.6354 | 1.0876 | 1.0429 |
| 0.1328 | 3.3702 | 1402 | 1.0361 | 0.6313 | 1.0361 | 1.0179 |
| 0.1328 | 3.375 | 1404 | 0.9472 | 0.5898 | 0.9472 | 0.9732 |
| 0.1328 | 3.3798 | 1406 | 0.8810 | 0.6190 | 0.8810 | 0.9386 |
| 0.1328 | 3.3846 | 1408 | 0.8884 | 0.6117 | 0.8884 | 0.9426 |
| 0.1328 | 3.3894 | 1410 | 0.9144 | 0.5831 | 0.9144 | 0.9562 |
| 0.1328 | 3.3942 | 1412 | 0.9604 | 0.6240 | 0.9604 | 0.9800 |
| 0.1328 | 3.3990 | 1414 | 0.9141 | 0.6173 | 0.9141 | 0.9561 |
| 0.1328 | 3.4038 | 1416 | 0.9077 | 0.5798 | 0.9077 | 0.9527 |
| 0.1328 | 3.4087 | 1418 | 0.8854 | 0.5798 | 0.8854 | 0.9410 |
| 0.1328 | 3.4135 | 1420 | 0.9002 | 0.5798 | 0.9002 | 0.9488 |
| 0.1328 | 3.4183 | 1422 | 0.9054 | 0.5798 | 0.9054 | 0.9515 |
| 0.1328 | 3.4231 | 1424 | 0.9099 | 0.5798 | 0.9099 | 0.9539 |
| 0.1328 | 3.4279 | 1426 | 0.9345 | 0.5616 | 0.9345 | 0.9667 |
| 0.1328 | 3.4327 | 1428 | 0.9497 | 0.6055 | 0.9497 | 0.9745 |
| 0.1328 | 3.4375 | 1430 | 0.9067 | 0.6209 | 0.9067 | 0.9522 |
| 0.1328 | 3.4423 | 1432 | 0.8404 | 0.6 | 0.8404 | 0.9167 |
| 0.1328 | 3.4471 | 1434 | 0.8875 | 0.5946 | 0.8875 | 0.9421 |
| 0.1328 | 3.4519 | 1436 | 0.9503 | 0.5840 | 0.9503 | 0.9749 |
| 0.1328 | 3.4567 | 1438 | 0.9724 | 0.6029 | 0.9724 | 0.9861 |
| 0.1328 | 3.4615 | 1440 | 0.9278 | 0.5798 | 0.9278 | 0.9632 |
| 0.1328 | 3.4663 | 1442 | 0.9253 | 0.5798 | 0.9253 | 0.9619 |
| 0.1328 | 3.4712 | 1444 | 0.9332 | 0.5798 | 0.9332 | 0.9660 |
| 0.1328 | 3.4760 | 1446 | 0.9451 | 0.5946 | 0.9451 | 0.9722 |
| 0.1328 | 3.4808 | 1448 | 1.0340 | 0.5304 | 1.0340 | 1.0169 |
| 0.1328 | 3.4856 | 1450 | 1.0812 | 0.5899 | 1.0812 | 1.0398 |
| 0.1328 | 3.4904 | 1452 | 1.0087 | 0.6287 | 1.0087 | 1.0043 |
| 0.1328 | 3.4952 | 1454 | 0.8725 | 0.5757 | 0.8725 | 0.9341 |
| 0.1328 | 3.5 | 1456 | 0.8044 | 0.6093 | 0.8044 | 0.8969 |
| 0.1328 | 3.5048 | 1458 | 0.7869 | 0.6093 | 0.7869 | 0.8871 |
| 0.1328 | 3.5096 | 1460 | 0.7973 | 0.6093 | 0.7973 | 0.8929 |
| 0.1328 | 3.5144 | 1462 | 0.8525 | 0.5946 | 0.8525 | 0.9233 |
| 0.1328 | 3.5192 | 1464 | 0.8552 | 0.5946 | 0.8552 | 0.9247 |
| 0.1328 | 3.5240 | 1466 | 0.9054 | 0.5681 | 0.9054 | 0.9515 |
| 0.1328 | 3.5288 | 1468 | 1.0004 | 0.6150 | 1.0004 | 1.0002 |
| 0.1328 | 3.5337 | 1470 | 1.0093 | 0.5861 | 1.0093 | 1.0046 |
| 0.1328 | 3.5385 | 1472 | 0.9320 | 0.5681 | 0.9320 | 0.9654 |
| 0.1328 | 3.5433 | 1474 | 0.8790 | 0.5681 | 0.8790 | 0.9376 |
| 0.1328 | 3.5481 | 1476 | 0.8682 | 0.5946 | 0.8682 | 0.9318 |
| 0.1328 | 3.5529 | 1478 | 0.9197 | 0.5921 | 0.9197 | 0.9590 |
| 0.1328 | 3.5577 | 1480 | 1.0269 | 0.5678 | 1.0269 | 1.0134 |
| 0.1328 | 3.5625 | 1482 | 1.0976 | 0.55 | 1.0976 | 1.0477 |
| 0.1328 | 3.5673 | 1484 | 1.0818 | 0.5839 | 1.0818 | 1.0401 |
| 0.1328 | 3.5721 | 1486 | 0.9774 | 0.5861 | 0.9774 | 0.9887 |
| 0.1328 | 3.5769 | 1488 | 0.8323 | 0.5681 | 0.8323 | 0.9123 |
| 0.1328 | 3.5817 | 1490 | 0.7672 | 0.6135 | 0.7672 | 0.8759 |
| 0.1328 | 3.5865 | 1492 | 0.7520 | 0.6135 | 0.7520 | 0.8672 |
| 0.1328 | 3.5913 | 1494 | 0.8028 | 0.5965 | 0.8028 | 0.8960 |
| 0.1328 | 3.5962 | 1496 | 0.8325 | 0.6450 | 0.8325 | 0.9124 |
| 0.1328 | 3.6010 | 1498 | 0.8953 | 0.6547 | 0.8953 | 0.9462 |
| 0.0901 | 3.6058 | 1500 | 0.8808 | 0.6547 | 0.8808 | 0.9385 |
| 0.0901 | 3.6106 | 1502 | 0.8131 | 0.6170 | 0.8131 | 0.9017 |
| 0.0901 | 3.6154 | 1504 | 0.7638 | 0.6093 | 0.7638 | 0.8740 |
| 0.0901 | 3.6202 | 1506 | 0.7518 | 0.6135 | 0.7518 | 0.8671 |
| 0.0901 | 3.625 | 1508 | 0.8056 | 0.5681 | 0.8056 | 0.8975 |
| 0.0901 | 3.6298 | 1510 | 0.9075 | 0.5681 | 0.9075 | 0.9526 |
| 0.0901 | 3.6346 | 1512 | 1.0178 | 0.5244 | 1.0178 | 1.0089 |
| 0.0901 | 3.6394 | 1514 | 1.0428 | 0.4864 | 1.0428 | 1.0212 |
| 0.0901 | 3.6442 | 1516 | 1.0029 | 0.5299 | 1.0029 | 1.0015 |
| 0.0901 | 3.6490 | 1518 | 1.0404 | 0.5244 | 1.0404 | 1.0200 |
| 0.0901 | 3.6538 | 1520 | 1.0294 | 0.5570 | 1.0294 | 1.0146 |
| 0.0901 | 3.6587 | 1522 | 0.9505 | 0.5673 | 0.9505 | 0.9750 |
| 0.0901 | 3.6635 | 1524 | 0.8973 | 0.5808 | 0.8973 | 0.9473 |
| 0.0901 | 3.6683 | 1526 | 0.8590 | 0.5726 | 0.8590 | 0.9268 |
| 0.0901 | 3.6731 | 1528 | 0.8752 | 0.5943 | 0.8752 | 0.9355 |
| 0.0901 | 3.6779 | 1530 | 0.9395 | 0.6308 | 0.9395 | 0.9693 |
| 0.0901 | 3.6827 | 1532 | 0.9215 | 0.6633 | 0.9215 | 0.9600 |
| 0.0901 | 3.6875 | 1534 | 0.8440 | 0.6170 | 0.8440 | 0.9187 |
| 0.0901 | 3.6923 | 1536 | 0.7580 | 0.6337 | 0.7580 | 0.8707 |
| 0.0901 | 3.6971 | 1538 | 0.7187 | 0.6384 | 0.7187 | 0.8478 |
| 0.0901 | 3.7019 | 1540 | 0.7386 | 0.6135 | 0.7386 | 0.8594 |
| 0.0901 | 3.7067 | 1542 | 0.8219 | 0.6316 | 0.8219 | 0.9066 |
| 0.0901 | 3.7115 | 1544 | 0.9730 | 0.6019 | 0.9730 | 0.9864 |
| 0.0901 | 3.7163 | 1546 | 1.1003 | 0.6092 | 1.1003 | 1.0489 |
| 0.0901 | 3.7212 | 1548 | 1.1124 | 0.6143 | 1.1124 | 1.0547 |
| 0.0901 | 3.7260 | 1550 | 1.0293 | 0.6183 | 1.0293 | 1.0146 |
| 0.0901 | 3.7308 | 1552 | 0.9386 | 0.6240 | 0.9386 | 0.9688 |
| 0.0901 | 3.7356 | 1554 | 0.8436 | 0.6529 | 0.8436 | 0.9185 |
| 0.0901 | 3.7404 | 1556 | 0.7735 | 0.6316 | 0.7735 | 0.8795 |
| 0.0901 | 3.7452 | 1558 | 0.7761 | 0.6316 | 0.7761 | 0.8810 |
| 0.0901 | 3.75 | 1560 | 0.7680 | 0.6548 | 0.7680 | 0.8764 |
| 0.0901 | 3.7548 | 1562 | 0.7948 | 0.6367 | 0.7948 | 0.8915 |
| 0.0901 | 3.7596 | 1564 | 0.7802 | 0.6574 | 0.7802 | 0.8833 |
| 0.0901 | 3.7644 | 1566 | 0.7450 | 0.6386 | 0.7450 | 0.8631 |
| 0.0901 | 3.7692 | 1568 | 0.7657 | 0.6386 | 0.7657 | 0.8750 |
| 0.0901 | 3.7740 | 1570 | 0.8123 | 0.6574 | 0.8123 | 0.9013 |
| 0.0901 | 3.7788 | 1572 | 0.8107 | 0.6529 | 0.8107 | 0.9004 |
| 0.0901 | 3.7837 | 1574 | 0.7938 | 0.6529 | 0.7938 | 0.8909 |
| 0.0901 | 3.7885 | 1576 | 0.8210 | 0.6529 | 0.8210 | 0.9061 |
| 0.0901 | 3.7933 | 1578 | 0.8360 | 0.6529 | 0.8360 | 0.9143 |
| 0.0901 | 3.7981 | 1580 | 0.8334 | 0.6529 | 0.8334 | 0.9129 |
| 0.0901 | 3.8029 | 1582 | 0.8138 | 0.6634 | 0.8138 | 0.9021 |
| 0.0901 | 3.8077 | 1584 | 0.7845 | 0.6634 | 0.7845 | 0.8857 |
| 0.0901 | 3.8125 | 1586 | 0.7890 | 0.6634 | 0.7890 | 0.8883 |
| 0.0901 | 3.8173 | 1588 | 0.7543 | 0.6431 | 0.7543 | 0.8685 |
| 0.0901 | 3.8221 | 1590 | 0.7199 | 0.6431 | 0.7199 | 0.8485 |
| 0.0901 | 3.8269 | 1592 | 0.7307 | 0.6431 | 0.7307 | 0.8548 |
| 0.0901 | 3.8317 | 1594 | 0.7604 | 0.6634 | 0.7604 | 0.8720 |
| 0.0901 | 3.8365 | 1596 | 0.7625 | 0.6634 | 0.7625 | 0.8732 |
| 0.0901 | 3.8413 | 1598 | 0.8011 | 0.6634 | 0.8011 | 0.8951 |
| 0.0901 | 3.8462 | 1600 | 0.8168 | 0.6634 | 0.8168 | 0.9038 |
| 0.0901 | 3.8510 | 1602 | 0.8119 | 0.6634 | 0.8119 | 0.9010 |
| 0.0901 | 3.8558 | 1604 | 0.8333 | 0.6634 | 0.8333 | 0.9129 |
| 0.0901 | 3.8606 | 1606 | 0.8540 | 0.6634 | 0.8540 | 0.9241 |
| 0.0901 | 3.8654 | 1608 | 0.9272 | 0.6300 | 0.9272 | 0.9629 |
| 0.0901 | 3.8702 | 1610 | 0.9120 | 0.6468 | 0.9120 | 0.9550 |
| 0.0901 | 3.875 | 1612 | 0.8307 | 0.6634 | 0.8307 | 0.9114 |
| 0.0901 | 3.8798 | 1614 | 0.7721 | 0.6634 | 0.7721 | 0.8787 |
| 0.0901 | 3.8846 | 1616 | 0.7389 | 0.6768 | 0.7389 | 0.8596 |
| 0.0901 | 3.8894 | 1618 | 0.7542 | 0.6634 | 0.7542 | 0.8685 |
| 0.0901 | 3.8942 | 1620 | 0.8289 | 0.6468 | 0.8289 | 0.9104 |
| 0.0901 | 3.8990 | 1622 | 0.9542 | 0.6461 | 0.9542 | 0.9769 |
| 0.0901 | 3.9038 | 1624 | 0.9863 | 0.6445 | 0.9863 | 0.9931 |
| 0.0901 | 3.9087 | 1626 | 0.9698 | 0.6554 | 0.9698 | 0.9848 |
| 0.0901 | 3.9135 | 1628 | 0.9286 | 0.6574 | 0.9286 | 0.9636 |
| 0.0901 | 3.9183 | 1630 | 0.9327 | 0.6574 | 0.9327 | 0.9657 |
| 0.0901 | 3.9231 | 1632 | 0.8866 | 0.6538 | 0.8866 | 0.9416 |
| 0.0901 | 3.9279 | 1634 | 0.8436 | 0.6548 | 0.8436 | 0.9185 |
| 0.0901 | 3.9327 | 1636 | 0.8267 | 0.6217 | 0.8267 | 0.9092 |
| 0.0901 | 3.9375 | 1638 | 0.8205 | 0.6022 | 0.8205 | 0.9058 |
| 0.0901 | 3.9423 | 1640 | 0.8867 | 0.6574 | 0.8867 | 0.9417 |
| 0.0901 | 3.9471 | 1642 | 0.9396 | 0.6574 | 0.9396 | 0.9693 |
| 0.0901 | 3.9519 | 1644 | 0.9347 | 0.6574 | 0.9347 | 0.9668 |
| 0.0901 | 3.9567 | 1646 | 0.8980 | 0.6574 | 0.8980 | 0.9476 |
| 0.0901 | 3.9615 | 1648 | 0.8806 | 0.6341 | 0.8806 | 0.9384 |
| 0.0901 | 3.9663 | 1650 | 0.8279 | 0.6136 | 0.8279 | 0.9099 |
| 0.0901 | 3.9712 | 1652 | 0.7728 | 0.6337 | 0.7728 | 0.8791 |
| 0.0901 | 3.9760 | 1654 | 0.7636 | 0.6337 | 0.7636 | 0.8739 |
| 0.0901 | 3.9808 | 1656 | 0.7952 | 0.6337 | 0.7952 | 0.8918 |
| 0.0901 | 3.9856 | 1658 | 0.8663 | 0.6619 | 0.8663 | 0.9308 |
| 0.0901 | 3.9904 | 1660 | 0.9017 | 0.6650 | 0.9017 | 0.9496 |
| 0.0901 | 3.9952 | 1662 | 0.8602 | 0.6341 | 0.8602 | 0.9275 |
| 0.0901 | 4.0 | 1664 | 0.7805 | 0.6384 | 0.7805 | 0.8834 |
| 0.0901 | 4.0048 | 1666 | 0.7399 | 0.6384 | 0.7399 | 0.8602 |
| 0.0901 | 4.0096 | 1668 | 0.7162 | 0.6384 | 0.7162 | 0.8463 |
| 0.0901 | 4.0144 | 1670 | 0.7524 | 0.6384 | 0.7524 | 0.8674 |
| 0.0901 | 4.0192 | 1672 | 0.7836 | 0.6341 | 0.7836 | 0.8852 |
| 0.0901 | 4.0240 | 1674 | 0.8513 | 0.6825 | 0.8513 | 0.9227 |
| 0.0901 | 4.0288 | 1676 | 0.8733 | 0.6846 | 0.8733 | 0.9345 |
| 0.0901 | 4.0337 | 1678 | 0.8809 | 0.6846 | 0.8809 | 0.9385 |
| 0.0901 | 4.0385 | 1680 | 0.9007 | 0.6606 | 0.9007 | 0.9491 |
| 0.0901 | 4.0433 | 1682 | 0.8622 | 0.6538 | 0.8622 | 0.9285 |
| 0.0901 | 4.0481 | 1684 | 0.7782 | 0.675 | 0.7782 | 0.8822 |
| 0.0901 | 4.0529 | 1686 | 0.7510 | 0.6431 | 0.7510 | 0.8666 |
| 0.0901 | 4.0577 | 1688 | 0.7547 | 0.6634 | 0.7547 | 0.8687 |
| 0.0901 | 4.0625 | 1690 | 0.7736 | 0.6634 | 0.7736 | 0.8795 |
| 0.0901 | 4.0673 | 1692 | 0.8287 | 0.675 | 0.8287 | 0.9103 |
| 0.0901 | 4.0721 | 1694 | 0.8824 | 0.6538 | 0.8824 | 0.9394 |
| 0.0901 | 4.0769 | 1696 | 0.8838 | 0.6538 | 0.8838 | 0.9401 |
| 0.0901 | 4.0817 | 1698 | 0.8207 | 0.6538 | 0.8207 | 0.9059 |
| 0.0901 | 4.0865 | 1700 | 0.8022 | 0.6341 | 0.8022 | 0.8957 |
| 0.0901 | 4.0913 | 1702 | 0.8088 | 0.6337 | 0.8088 | 0.8993 |
| 0.0901 | 4.0962 | 1704 | 0.8408 | 0.6529 | 0.8408 | 0.9170 |
| 0.0901 | 4.1010 | 1706 | 0.9321 | 0.6189 | 0.9321 | 0.9654 |
| 0.0901 | 4.1058 | 1708 | 1.0066 | 0.6189 | 1.0066 | 1.0033 |
| 0.0901 | 4.1106 | 1710 | 1.0497 | 0.5537 | 1.0497 | 1.0245 |
| 0.0901 | 4.1154 | 1712 | 1.0813 | 0.5283 | 1.0813 | 1.0398 |
| 0.0901 | 4.1202 | 1714 | 1.0500 | 0.5850 | 1.0500 | 1.0247 |
| 0.0901 | 4.125 | 1716 | 1.0262 | 0.5991 | 1.0262 | 1.0130 |
| 0.0901 | 4.1298 | 1718 | 1.0279 | 0.5991 | 1.0279 | 1.0139 |
| 0.0901 | 4.1346 | 1720 | 0.9792 | 0.6390 | 0.9792 | 0.9895 |
| 0.0901 | 4.1394 | 1722 | 0.9806 | 0.6390 | 0.9806 | 0.9903 |
| 0.0901 | 4.1442 | 1724 | 0.9474 | 0.6390 | 0.9474 | 0.9733 |
| 0.0901 | 4.1490 | 1726 | 0.9066 | 0.5946 | 0.9066 | 0.9522 |
| 0.0901 | 4.1538 | 1728 | 0.9147 | 0.5946 | 0.9147 | 0.9564 |
| 0.0901 | 4.1587 | 1730 | 0.9556 | 0.6189 | 0.9556 | 0.9775 |
| 0.0901 | 4.1635 | 1732 | 1.0438 | 0.6547 | 1.0438 | 1.0217 |
| 0.0901 | 4.1683 | 1734 | 1.0873 | 0.6547 | 1.0873 | 1.0427 |
| 0.0901 | 4.1731 | 1736 | 1.0546 | 0.6547 | 1.0546 | 1.0269 |
| 0.0901 | 4.1779 | 1738 | 1.0114 | 0.6547 | 1.0114 | 1.0057 |
| 0.0901 | 4.1827 | 1740 | 1.0013 | 0.6077 | 1.0013 | 1.0006 |
| 0.0901 | 4.1875 | 1742 | 1.0707 | 0.5720 | 1.0707 | 1.0348 |
| 0.0901 | 4.1923 | 1744 | 1.1323 | 0.5378 | 1.1323 | 1.0641 |
| 0.0901 | 4.1971 | 1746 | 1.2477 | 0.4740 | 1.2477 | 1.1170 |
| 0.0901 | 4.2019 | 1748 | 1.3492 | 0.4375 | 1.3492 | 1.1615 |
| 0.0901 | 4.2067 | 1750 | 1.3820 | 0.4492 | 1.3820 | 1.1756 |
| 0.0901 | 4.2115 | 1752 | 1.3252 | 0.4159 | 1.3252 | 1.1512 |
| 0.0901 | 4.2163 | 1754 | 1.2056 | 0.5317 | 1.2056 | 1.0980 |
| 0.0901 | 4.2212 | 1756 | 1.0490 | 0.5350 | 1.0490 | 1.0242 |
| 0.0901 | 4.2260 | 1758 | 0.9411 | 0.5248 | 0.9411 | 0.9701 |
| 0.0901 | 4.2308 | 1760 | 0.8884 | 0.5870 | 0.8884 | 0.9425 |
| 0.0901 | 4.2356 | 1762 | 0.9129 | 0.5645 | 0.9129 | 0.9555 |
| 0.0901 | 4.2404 | 1764 | 0.9822 | 0.5720 | 0.9822 | 0.9910 |
| 0.0901 | 4.2452 | 1766 | 1.0328 | 0.6010 | 1.0328 | 1.0162 |
| 0.0901 | 4.25 | 1768 | 1.0844 | 0.5638 | 1.0844 | 1.0414 |
| 0.0901 | 4.2548 | 1770 | 1.0626 | 0.5720 | 1.0626 | 1.0308 |
| 0.0901 | 4.2596 | 1772 | 1.0970 | 0.5720 | 1.0970 | 1.0474 |
| 0.0901 | 4.2644 | 1774 | 1.0790 | 0.5593 | 1.0790 | 1.0387 |
| 0.0901 | 4.2692 | 1776 | 1.0273 | 0.55 | 1.0273 | 1.0135 |
| 0.0901 | 4.2740 | 1778 | 0.9921 | 0.5248 | 0.9921 | 0.9961 |
| 0.0901 | 4.2788 | 1780 | 0.9722 | 0.5248 | 0.9722 | 0.9860 |
| 0.0901 | 4.2837 | 1782 | 1.0127 | 0.5093 | 1.0127 | 1.0063 |
| 0.0901 | 4.2885 | 1784 | 1.0900 | 0.4941 | 1.0900 | 1.0440 |
| 0.0901 | 4.2933 | 1786 | 1.1753 | 0.5 | 1.1753 | 1.0841 |
| 0.0901 | 4.2981 | 1788 | 1.2248 | 0.4740 | 1.2248 | 1.1067 |
| 0.0901 | 4.3029 | 1790 | 1.2037 | 0.4740 | 1.2037 | 1.0971 |
| 0.0901 | 4.3077 | 1792 | 1.0996 | 0.4792 | 1.0996 | 1.0486 |
| 0.0901 | 4.3125 | 1794 | 1.0102 | 0.5088 | 1.0102 | 1.0051 |
| 0.0901 | 4.3173 | 1796 | 0.9989 | 0.5088 | 0.9989 | 0.9994 |
| 0.0901 | 4.3221 | 1798 | 1.0215 | 0.5088 | 1.0215 | 1.0107 |
| 0.0901 | 4.3269 | 1800 | 0.9912 | 0.55 | 0.9912 | 0.9956 |
| 0.0901 | 4.3317 | 1802 | 0.9935 | 0.5733 | 0.9935 | 0.9968 |
| 0.0901 | 4.3365 | 1804 | 1.0353 | 0.5817 | 1.0353 | 1.0175 |
| 0.0901 | 4.3413 | 1806 | 1.0069 | 0.5955 | 1.0069 | 1.0034 |
| 0.0901 | 4.3462 | 1808 | 0.9499 | 0.5921 | 0.9499 | 0.9746 |
| 0.0901 | 4.3510 | 1810 | 0.8610 | 0.5714 | 0.8610 | 0.9279 |
| 0.0901 | 4.3558 | 1812 | 0.7905 | 0.5714 | 0.7905 | 0.8891 |
| 0.0901 | 4.3606 | 1814 | 0.7859 | 0.5714 | 0.7859 | 0.8865 |
| 0.0901 | 4.3654 | 1816 | 0.8273 | 0.5714 | 0.8273 | 0.9095 |
| 0.0901 | 4.3702 | 1818 | 0.9029 | 0.5921 | 0.9029 | 0.9502 |
| 0.0901 | 4.375 | 1820 | 0.9183 | 0.5921 | 0.9183 | 0.9583 |
| 0.0901 | 4.3798 | 1822 | 0.9506 | 0.5733 | 0.9506 | 0.9750 |
| 0.0901 | 4.3846 | 1824 | 0.9507 | 0.5733 | 0.9507 | 0.9750 |
| 0.0901 | 4.3894 | 1826 | 0.9846 | 0.5733 | 0.9846 | 0.9923 |
| 0.0901 | 4.3942 | 1828 | 0.9498 | 0.55 | 0.9498 | 0.9746 |
| 0.0901 | 4.3990 | 1830 | 0.9467 | 0.55 | 0.9467 | 0.9730 |
| 0.0901 | 4.4038 | 1832 | 0.9432 | 0.5757 | 0.9432 | 0.9712 |
| 0.0901 | 4.4087 | 1834 | 0.8837 | 0.5789 | 0.8837 | 0.9401 |
| 0.0901 | 4.4135 | 1836 | 0.8481 | 0.5985 | 0.8481 | 0.9209 |
| 0.0901 | 4.4183 | 1838 | 0.8433 | 0.5985 | 0.8433 | 0.9183 |
| 0.0901 | 4.4231 | 1840 | 0.8864 | 0.5985 | 0.8864 | 0.9415 |
| 0.0901 | 4.4279 | 1842 | 0.9643 | 0.55 | 0.9643 | 0.9820 |
| 0.0901 | 4.4327 | 1844 | 1.0560 | 0.5817 | 1.0560 | 1.0276 |
| 0.0901 | 4.4375 | 1846 | 1.1284 | 0.5671 | 1.1284 | 1.0623 |
| 0.0901 | 4.4423 | 1848 | 1.1181 | 0.5104 | 1.1181 | 1.0574 |
| 0.0901 | 4.4471 | 1850 | 1.1173 | 0.5104 | 1.1173 | 1.0570 |
| 0.0901 | 4.4519 | 1852 | 1.0642 | 0.5043 | 1.0642 | 1.0316 |
| 0.0901 | 4.4567 | 1854 | 1.0184 | 0.5188 | 1.0184 | 1.0092 |
| 0.0901 | 4.4615 | 1856 | 0.9856 | 0.5354 | 0.9856 | 0.9928 |
| 0.0901 | 4.4663 | 1858 | 0.9545 | 0.5527 | 0.9545 | 0.9770 |
| 0.0901 | 4.4712 | 1860 | 0.8996 | 0.5527 | 0.8996 | 0.9485 |
| 0.0901 | 4.4760 | 1862 | 0.8514 | 0.5527 | 0.8514 | 0.9227 |
| 0.0901 | 4.4808 | 1864 | 0.8453 | 0.5676 | 0.8453 | 0.9194 |
| 0.0901 | 4.4856 | 1866 | 0.8528 | 0.5676 | 0.8528 | 0.9235 |
| 0.0901 | 4.4904 | 1868 | 0.9337 | 0.5840 | 0.9337 | 0.9663 |
| 0.0901 | 4.4952 | 1870 | 1.0661 | 0.6530 | 1.0661 | 1.0325 |
| 0.0901 | 4.5 | 1872 | 1.0850 | 0.6146 | 1.0850 | 1.0416 |
| 0.0901 | 4.5048 | 1874 | 1.0344 | 0.5955 | 1.0344 | 1.0171 |
| 0.0901 | 4.5096 | 1876 | 0.9686 | 0.5921 | 0.9686 | 0.9842 |
| 0.0901 | 4.5144 | 1878 | 0.9618 | 0.5960 | 0.9618 | 0.9807 |
| 0.0901 | 4.5192 | 1880 | 0.9573 | 0.5960 | 0.9573 | 0.9784 |
| 0.0901 | 4.5240 | 1882 | 0.8716 | 0.5870 | 0.8716 | 0.9336 |
| 0.0901 | 4.5288 | 1884 | 0.7996 | 0.5870 | 0.7996 | 0.8942 |
| 0.0901 | 4.5337 | 1886 | 0.8029 | 0.5676 | 0.8029 | 0.8961 |
| 0.0901 | 4.5385 | 1888 | 0.8598 | 0.5831 | 0.8598 | 0.9272 |
| 0.0901 | 4.5433 | 1890 | 0.9264 | 0.6287 | 0.9264 | 0.9625 |
| 0.0901 | 4.5481 | 1892 | 1.0139 | 0.6351 | 1.0139 | 1.0069 |
| 0.0901 | 4.5529 | 1894 | 1.0519 | 0.6196 | 1.0519 | 1.0256 |
| 0.0901 | 4.5577 | 1896 | 0.9984 | 0.6168 | 0.9984 | 0.9992 |
| 0.0901 | 4.5625 | 1898 | 0.9138 | 0.6150 | 0.9138 | 0.9559 |
| 0.0901 | 4.5673 | 1900 | 0.8624 | 0.5833 | 0.8624 | 0.9287 |
| 0.0901 | 4.5721 | 1902 | 0.7916 | 0.6135 | 0.7916 | 0.8897 |
| 0.0901 | 4.5769 | 1904 | 0.7855 | 0.6135 | 0.7855 | 0.8863 |
| 0.0901 | 4.5817 | 1906 | 0.8188 | 0.6093 | 0.8188 | 0.9049 |
| 0.0901 | 4.5865 | 1908 | 0.9023 | 0.5681 | 0.9023 | 0.9499 |
| 0.0901 | 4.5913 | 1910 | 0.9734 | 0.5817 | 0.9734 | 0.9866 |
| 0.0901 | 4.5962 | 1912 | 0.9918 | 0.6033 | 0.9918 | 0.9959 |
| 0.0901 | 4.6010 | 1914 | 1.0021 | 0.6033 | 1.0021 | 1.0010 |
| 0.0901 | 4.6058 | 1916 | 0.9421 | 0.5978 | 0.9422 | 0.9706 |
| 0.0901 | 4.6106 | 1918 | 0.8429 | 0.6093 | 0.8429 | 0.9181 |
| 0.0901 | 4.6154 | 1920 | 0.7986 | 0.6135 | 0.7986 | 0.8936 |
| 0.0901 | 4.6202 | 1922 | 0.7842 | 0.6135 | 0.7842 | 0.8856 |
| 0.0901 | 4.625 | 1924 | 0.8180 | 0.5897 | 0.8180 | 0.9045 |
| 0.0901 | 4.6298 | 1926 | 0.9166 | 0.6287 | 0.9166 | 0.9574 |
| 0.0901 | 4.6346 | 1928 | 0.9778 | 0.6718 | 0.9778 | 0.9889 |
| 0.0901 | 4.6394 | 1930 | 0.9627 | 0.6547 | 0.9627 | 0.9812 |
| 0.0901 | 4.6442 | 1932 | 0.8928 | 0.6409 | 0.8928 | 0.9449 |
| 0.0901 | 4.6490 | 1934 | 0.7924 | 0.5933 | 0.7924 | 0.8902 |
| 0.0901 | 4.6538 | 1936 | 0.7245 | 0.6135 | 0.7245 | 0.8512 |
| 0.0901 | 4.6587 | 1938 | 0.7103 | 0.6135 | 0.7103 | 0.8428 |
| 0.0901 | 4.6635 | 1940 | 0.7326 | 0.6135 | 0.7326 | 0.8559 |
| 0.0901 | 4.6683 | 1942 | 0.7906 | 0.6135 | 0.7906 | 0.8891 |
| 0.0901 | 4.6731 | 1944 | 0.8458 | 0.6362 | 0.8458 | 0.9197 |
| 0.0901 | 4.6779 | 1946 | 0.8641 | 0.6316 | 0.8641 | 0.9295 |
| 0.0901 | 4.6827 | 1948 | 0.8498 | 0.6135 | 0.8498 | 0.9218 |
| 0.0901 | 4.6875 | 1950 | 0.8264 | 0.6135 | 0.8264 | 0.9091 |
| 0.0901 | 4.6923 | 1952 | 0.8087 | 0.6135 | 0.8087 | 0.8993 |
| 0.0901 | 4.6971 | 1954 | 0.7682 | 0.6135 | 0.7682 | 0.8765 |
| 0.0901 | 4.7019 | 1956 | 0.7509 | 0.6135 | 0.7509 | 0.8665 |
| 0.0901 | 4.7067 | 1958 | 0.7508 | 0.6135 | 0.7508 | 0.8665 |
| 0.0901 | 4.7115 | 1960 | 0.8051 | 0.5897 | 0.8051 | 0.8973 |
| 0.0901 | 4.7163 | 1962 | 0.8855 | 0.6170 | 0.8855 | 0.9410 |
| 0.0901 | 4.7212 | 1964 | 0.9410 | 0.6221 | 0.9410 | 0.9701 |
| 0.0901 | 4.7260 | 1966 | 0.9853 | 0.6409 | 0.9853 | 0.9926 |
| 0.0901 | 4.7308 | 1968 | 0.9874 | 0.6240 | 0.9874 | 0.9937 |
| 0.0901 | 4.7356 | 1970 | 0.9281 | 0.6316 | 0.9281 | 0.9634 |
| 0.0901 | 4.7404 | 1972 | 0.8780 | 0.6093 | 0.8780 | 0.9370 |
| 0.0901 | 4.7452 | 1974 | 0.8435 | 0.6093 | 0.8435 | 0.9184 |
| 0.0901 | 4.75 | 1976 | 0.8253 | 0.6093 | 0.8253 | 0.9085 |
| 0.0901 | 4.7548 | 1978 | 0.8538 | 0.5946 | 0.8538 | 0.9240 |
| 0.0901 | 4.7596 | 1980 | 0.9164 | 0.5955 | 0.9164 | 0.9573 |
| 0.0901 | 4.7644 | 1982 | 0.9844 | 0.5537 | 0.9844 | 0.9921 |
| 0.0901 | 4.7692 | 1984 | 1.0546 | 0.5465 | 1.0546 | 1.0270 |
| 0.0901 | 4.7740 | 1986 | 1.0867 | 0.5546 | 1.0867 | 1.0425 |
| 0.0901 | 4.7788 | 1988 | 1.0238 | 0.5593 | 1.0238 | 1.0118 |
| 0.0901 | 4.7837 | 1990 | 0.9324 | 0.6040 | 0.9324 | 0.9656 |
| 0.0901 | 4.7885 | 1992 | 0.8485 | 0.6170 | 0.8485 | 0.9211 |
| 0.0901 | 4.7933 | 1994 | 0.8111 | 0.6170 | 0.8111 | 0.9006 |
| 0.0901 | 4.7981 | 1996 | 0.8055 | 0.6170 | 0.8055 | 0.8975 |
| 0.0901 | 4.8029 | 1998 | 0.8310 | 0.6170 | 0.8310 | 0.9116 |
| 0.0735 | 4.8077 | 2000 | 0.8579 | 0.6367 | 0.8579 | 0.9262 |
| 0.0735 | 4.8125 | 2002 | 0.8458 | 0.6170 | 0.8458 | 0.9197 |
| 0.0735 | 4.8173 | 2004 | 0.8644 | 0.6170 | 0.8644 | 0.9297 |
| 0.0735 | 4.8221 | 2006 | 0.8462 | 0.6365 | 0.8462 | 0.9199 |
| 0.0735 | 4.8269 | 2008 | 0.8344 | 0.6316 | 0.8344 | 0.9135 |
| 0.0735 | 4.8317 | 2010 | 0.8362 | 0.6093 | 0.8362 | 0.9144 |
| 0.0735 | 4.8365 | 2012 | 0.8986 | 0.6316 | 0.8986 | 0.9480 |
| 0.0735 | 4.8413 | 2014 | 0.9874 | 0.5593 | 0.9874 | 0.9937 |
| 0.0735 | 4.8462 | 2016 | 1.0190 | 0.5826 | 1.0190 | 1.0095 |
| 0.0735 | 4.8510 | 2018 | 0.9678 | 0.5350 | 0.9678 | 0.9838 |
| 0.0735 | 4.8558 | 2020 | 0.9418 | 0.5921 | 0.9418 | 0.9704 |
| 0.0735 | 4.8606 | 2022 | 0.9503 | 0.6173 | 0.9503 | 0.9748 |
| 0.0735 | 4.8654 | 2024 | 0.9384 | 0.6316 | 0.9384 | 0.9687 |
| 0.0735 | 4.8702 | 2026 | 0.9629 | 0.6529 | 0.9629 | 0.9813 |
| 0.0735 | 4.875 | 2028 | 0.9905 | 0.6150 | 0.9905 | 0.9952 |
| 0.0735 | 4.8798 | 2030 | 0.9958 | 0.6150 | 0.9958 | 0.9979 |
| 0.0735 | 4.8846 | 2032 | 1.0531 | 0.5968 | 1.0531 | 1.0262 |
| 0.0735 | 4.8894 | 2034 | 1.0906 | 0.5826 | 1.0906 | 1.0443 |
| 0.0735 | 4.8942 | 2036 | 1.0348 | 0.5968 | 1.0348 | 1.0172 |
| 0.0735 | 4.8990 | 2038 | 0.9473 | 0.6732 | 0.9473 | 0.9733 |
| 0.0735 | 4.9038 | 2040 | 0.8690 | 0.6316 | 0.8690 | 0.9322 |
| 0.0735 | 4.9087 | 2042 | 0.7950 | 0.6093 | 0.7950 | 0.8917 |
| 0.0735 | 4.9135 | 2044 | 0.7501 | 0.6093 | 0.7501 | 0.8661 |
| 0.0735 | 4.9183 | 2046 | 0.7409 | 0.6093 | 0.7409 | 0.8608 |
| 0.0735 | 4.9231 | 2048 | 0.7556 | 0.6093 | 0.7556 | 0.8693 |
| 0.0735 | 4.9279 | 2050 | 0.7807 | 0.6093 | 0.7807 | 0.8835 |
| 0.0735 | 4.9327 | 2052 | 0.8111 | 0.6093 | 0.8111 | 0.9006 |
| 0.0735 | 4.9375 | 2054 | 0.8924 | 0.6069 | 0.8924 | 0.9447 |
| 0.0735 | 4.9423 | 2056 | 0.9663 | 0.5817 | 0.9663 | 0.9830 |
| 0.0735 | 4.9471 | 2058 | 1.0571 | 0.5512 | 1.0571 | 1.0281 |
| 0.0735 | 4.9519 | 2060 | 1.0664 | 0.5422 | 1.0664 | 1.0327 |
| 0.0735 | 4.9567 | 2062 | 1.0406 | 0.4924 | 1.0406 | 1.0201 |
| 0.0735 | 4.9615 | 2064 | 1.0346 | 0.4498 | 1.0346 | 1.0172 |
| 0.0735 | 4.9663 | 2066 | 0.9911 | 0.4656 | 0.9911 | 0.9955 |
| 0.0735 | 4.9712 | 2068 | 0.9523 | 0.4813 | 0.9523 | 0.9759 |
| 0.0735 | 4.9760 | 2070 | 0.9020 | 0.5681 | 0.9020 | 0.9497 |
| 0.0735 | 4.9808 | 2072 | 0.8393 | 0.5617 | 0.8393 | 0.9161 |
| 0.0735 | 4.9856 | 2074 | 0.8043 | 0.5617 | 0.8043 | 0.8968 |
| 0.0735 | 4.9904 | 2076 | 0.8128 | 0.5833 | 0.8128 | 0.9015 |
| 0.0735 | 4.9952 | 2078 | 0.8793 | 0.6229 | 0.8793 | 0.9377 |
| 0.0735 | 5.0 | 2080 | 0.9342 | 0.6287 | 0.9342 | 0.9665 |
| 0.0735 | 5.0048 | 2082 | 0.9245 | 0.6287 | 0.9245 | 0.9615 |
| 0.0735 | 5.0096 | 2084 | 0.8773 | 0.6097 | 0.8773 | 0.9366 |
| 0.0735 | 5.0144 | 2086 | 0.8257 | 0.5833 | 0.8257 | 0.9087 |
| 0.0735 | 5.0192 | 2088 | 0.7747 | 0.5833 | 0.7747 | 0.8801 |
| 0.0735 | 5.0240 | 2090 | 0.7883 | 0.5833 | 0.7883 | 0.8879 |
| 0.0735 | 5.0288 | 2092 | 0.8522 | 0.5833 | 0.8522 | 0.9232 |
| 0.0735 | 5.0337 | 2094 | 0.9425 | 0.6509 | 0.9425 | 0.9708 |
| 0.0735 | 5.0385 | 2096 | 0.9680 | 0.6509 | 0.9680 | 0.9839 |
| 0.0735 | 5.0433 | 2098 | 0.9210 | 0.6069 | 0.9210 | 0.9597 |
| 0.0735 | 5.0481 | 2100 | 0.8673 | 0.5833 | 0.8673 | 0.9313 |
| 0.0735 | 5.0529 | 2102 | 0.8179 | 0.5833 | 0.8179 | 0.9044 |
| 0.0735 | 5.0577 | 2104 | 0.7896 | 0.5833 | 0.7896 | 0.8886 |
| 0.0735 | 5.0625 | 2106 | 0.7598 | 0.5833 | 0.7598 | 0.8717 |
| 0.0735 | 5.0673 | 2108 | 0.7640 | 0.5870 | 0.7640 | 0.8740 |
| 0.0735 | 5.0721 | 2110 | 0.7904 | 0.5833 | 0.7904 | 0.8890 |
| 0.0735 | 5.0769 | 2112 | 0.8261 | 0.5833 | 0.8261 | 0.9089 |
| 0.0735 | 5.0817 | 2114 | 0.8597 | 0.5833 | 0.8597 | 0.9272 |
| 0.0735 | 5.0865 | 2116 | 0.8801 | 0.6069 | 0.8801 | 0.9381 |
| 0.0735 | 5.0913 | 2118 | 0.8814 | 0.6069 | 0.8814 | 0.9388 |
| 0.0735 | 5.0962 | 2120 | 0.8479 | 0.5833 | 0.8479 | 0.9208 |
| 0.0735 | 5.1010 | 2122 | 0.8481 | 0.5833 | 0.8481 | 0.9209 |
| 0.0735 | 5.1058 | 2124 | 0.8734 | 0.5833 | 0.8734 | 0.9346 |
| 0.0735 | 5.1106 | 2126 | 0.9532 | 0.5921 | 0.9532 | 0.9763 |
| 0.0735 | 5.1154 | 2128 | 0.9994 | 0.5645 | 0.9994 | 0.9997 |
| 0.0735 | 5.1202 | 2130 | 0.9965 | 0.5645 | 0.9965 | 0.9983 |
| 0.0735 | 5.125 | 2132 | 1.0175 | 0.5645 | 1.0175 | 1.0087 |
| 0.0735 | 5.1298 | 2134 | 1.0646 | 0.5545 | 1.0646 | 1.0318 |
| 0.0735 | 5.1346 | 2136 | 1.0609 | 0.5545 | 1.0609 | 1.0300 |
| 0.0735 | 5.1394 | 2138 | 1.0235 | 0.5041 | 1.0235 | 1.0117 |
| 0.0735 | 5.1442 | 2140 | 1.0066 | 0.5188 | 1.0066 | 1.0033 |
| 0.0735 | 5.1490 | 2142 | 0.9841 | 0.5921 | 0.9841 | 0.9920 |
| 0.0735 | 5.1538 | 2144 | 1.0130 | 0.5808 | 1.0130 | 1.0065 |
| 0.0735 | 5.1587 | 2146 | 1.0891 | 0.5357 | 1.0891 | 1.0436 |
| 0.0735 | 5.1635 | 2148 | 1.1317 | 0.5597 | 1.1317 | 1.0638 |
| 0.0735 | 5.1683 | 2150 | 1.0954 | 0.5225 | 1.0954 | 1.0466 |
| 0.0735 | 5.1731 | 2152 | 1.0572 | 0.5277 | 1.0573 | 1.0282 |
| 0.0735 | 5.1779 | 2154 | 0.9758 | 0.4924 | 0.9758 | 0.9878 |
| 0.0735 | 5.1827 | 2156 | 0.9260 | 0.5681 | 0.9260 | 0.9623 |
| 0.0735 | 5.1875 | 2158 | 0.9089 | 0.5681 | 0.9089 | 0.9534 |
| 0.0735 | 5.1923 | 2160 | 0.9310 | 0.5921 | 0.9310 | 0.9649 |
| 0.0735 | 5.1971 | 2162 | 0.9782 | 0.5773 | 0.9782 | 0.9891 |
| 0.0735 | 5.2019 | 2164 | 0.9872 | 0.5773 | 0.9872 | 0.9936 |
| 0.0735 | 5.2067 | 2166 | 1.0176 | 0.6232 | 1.0176 | 1.0088 |
| 0.0735 | 5.2115 | 2168 | 1.0005 | 0.6718 | 1.0005 | 1.0002 |
| 0.0735 | 5.2163 | 2170 | 0.9283 | 0.6229 | 0.9283 | 0.9635 |
| 0.0735 | 5.2212 | 2172 | 0.8638 | 0.6548 | 0.8638 | 0.9294 |
| 0.0735 | 5.2260 | 2174 | 0.8611 | 0.6410 | 0.8611 | 0.9279 |
| 0.0735 | 5.2308 | 2176 | 0.8365 | 0.6384 | 0.8365 | 0.9146 |
| 0.0735 | 5.2356 | 2178 | 0.8169 | 0.6384 | 0.8169 | 0.9038 |
| 0.0735 | 5.2404 | 2180 | 0.8098 | 0.6384 | 0.8098 | 0.8999 |
| 0.0735 | 5.2452 | 2182 | 0.8242 | 0.6239 | 0.8242 | 0.9078 |
| 0.0735 | 5.25 | 2184 | 0.8565 | 0.6410 | 0.8565 | 0.9255 |
| 0.0735 | 5.2548 | 2186 | 0.8638 | 0.6410 | 0.8638 | 0.9294 |
| 0.0735 | 5.2596 | 2188 | 0.8890 | 0.6410 | 0.8890 | 0.9429 |
| 0.0735 | 5.2644 | 2190 | 0.8652 | 0.6410 | 0.8652 | 0.9301 |
| 0.0735 | 5.2692 | 2192 | 0.8371 | 0.6410 | 0.8371 | 0.9149 |
| 0.0735 | 5.2740 | 2194 | 0.8298 | 0.6410 | 0.8298 | 0.9109 |
| 0.0735 | 5.2788 | 2196 | 0.8374 | 0.6410 | 0.8374 | 0.9151 |
| 0.0735 | 5.2837 | 2198 | 0.8654 | 0.6410 | 0.8654 | 0.9302 |
| 0.0735 | 5.2885 | 2200 | 0.8694 | 0.6410 | 0.8694 | 0.9324 |
| 0.0735 | 5.2933 | 2202 | 0.8478 | 0.6410 | 0.8478 | 0.9207 |
| 0.0735 | 5.2981 | 2204 | 0.8264 | 0.6548 | 0.8264 | 0.9091 |
| 0.0735 | 5.3029 | 2206 | 0.8248 | 0.6548 | 0.8248 | 0.9082 |
| 0.0735 | 5.3077 | 2208 | 0.8435 | 0.6548 | 0.8435 | 0.9184 |
| 0.0735 | 5.3125 | 2210 | 0.9193 | 0.652 | 0.9193 | 0.9588 |
| 0.0735 | 5.3173 | 2212 | 0.9510 | 0.6475 | 0.9510 | 0.9752 |
| 0.0735 | 5.3221 | 2214 | 0.9366 | 0.652 | 0.9366 | 0.9678 |
| 0.0735 | 5.3269 | 2216 | 0.8936 | 0.6616 | 0.8936 | 0.9453 |
| 0.0735 | 5.3317 | 2218 | 0.8351 | 0.6884 | 0.8351 | 0.9138 |
| 0.0735 | 5.3365 | 2220 | 0.8174 | 0.6686 | 0.8174 | 0.9041 |
| 0.0735 | 5.3413 | 2222 | 0.7766 | 0.6568 | 0.7766 | 0.8813 |
| 0.0735 | 5.3462 | 2224 | 0.7266 | 0.6568 | 0.7266 | 0.8524 |
| 0.0735 | 5.3510 | 2226 | 0.7265 | 0.6568 | 0.7265 | 0.8524 |
| 0.0735 | 5.3558 | 2228 | 0.7731 | 0.6686 | 0.7731 | 0.8793 |
| 0.0735 | 5.3606 | 2230 | 0.8385 | 0.6548 | 0.8385 | 0.9157 |
| 0.0735 | 5.3654 | 2232 | 0.8757 | 0.6410 | 0.8757 | 0.9358 |
| 0.0735 | 5.3702 | 2234 | 0.8627 | 0.6410 | 0.8627 | 0.9288 |
| 0.0735 | 5.375 | 2236 | 0.8540 | 0.6410 | 0.8540 | 0.9241 |
| 0.0735 | 5.3798 | 2238 | 0.8779 | 0.6410 | 0.8779 | 0.9370 |
| 0.0735 | 5.3846 | 2240 | 0.9264 | 0.6410 | 0.9264 | 0.9625 |
| 0.0735 | 5.3894 | 2242 | 0.9034 | 0.6410 | 0.9034 | 0.9505 |
| 0.0735 | 5.3942 | 2244 | 0.8699 | 0.6410 | 0.8699 | 0.9327 |
| 0.0735 | 5.3990 | 2246 | 0.8586 | 0.6410 | 0.8586 | 0.9266 |
| 0.0735 | 5.4038 | 2248 | 0.8694 | 0.6410 | 0.8694 | 0.9324 |
| 0.0735 | 5.4087 | 2250 | 0.8539 | 0.6410 | 0.8539 | 0.9241 |
| 0.0735 | 5.4135 | 2252 | 0.7852 | 0.6686 | 0.7852 | 0.8861 |
| 0.0735 | 5.4183 | 2254 | 0.6767 | 0.6405 | 0.6767 | 0.8226 |
| 0.0735 | 5.4231 | 2256 | 0.6185 | 0.6549 | 0.6185 | 0.7865 |
| 0.0735 | 5.4279 | 2258 | 0.6032 | 0.6549 | 0.6032 | 0.7767 |
| 0.0735 | 5.4327 | 2260 | 0.6178 | 0.6405 | 0.6178 | 0.7860 |
| 0.0735 | 5.4375 | 2262 | 0.6686 | 0.6405 | 0.6686 | 0.8177 |
| 0.0735 | 5.4423 | 2264 | 0.7392 | 0.6528 | 0.7392 | 0.8598 |
| 0.0735 | 5.4471 | 2266 | 0.8106 | 0.6548 | 0.8106 | 0.9004 |
| 0.0735 | 5.4519 | 2268 | 0.8618 | 0.6410 | 0.8618 | 0.9283 |
| 0.0735 | 5.4567 | 2270 | 0.8930 | 0.6410 | 0.8930 | 0.9450 |
| 0.0735 | 5.4615 | 2272 | 0.8730 | 0.6410 | 0.8730 | 0.9343 |
| 0.0735 | 5.4663 | 2274 | 0.8788 | 0.6410 | 0.8788 | 0.9374 |
| 0.0735 | 5.4712 | 2276 | 0.8599 | 0.6410 | 0.8599 | 0.9273 |
| 0.0735 | 5.4760 | 2278 | 0.8318 | 0.6458 | 0.8318 | 0.9120 |
| 0.0735 | 5.4808 | 2280 | 0.7844 | 0.6384 | 0.7844 | 0.8857 |
| 0.0735 | 5.4856 | 2282 | 0.7894 | 0.6384 | 0.7894 | 0.8885 |
| 0.0735 | 5.4904 | 2284 | 0.8264 | 0.6599 | 0.8264 | 0.9090 |
| 0.0735 | 5.4952 | 2286 | 0.8756 | 0.6410 | 0.8756 | 0.9357 |
| 0.0735 | 5.5 | 2288 | 0.8730 | 0.6410 | 0.8730 | 0.9343 |
| 0.0735 | 5.5048 | 2290 | 0.8340 | 0.6599 | 0.8340 | 0.9132 |
| 0.0735 | 5.5096 | 2292 | 0.7918 | 0.6599 | 0.7918 | 0.8899 |
| 0.0735 | 5.5144 | 2294 | 0.7779 | 0.6599 | 0.7779 | 0.8820 |
| 0.0735 | 5.5192 | 2296 | 0.8076 | 0.6599 | 0.8076 | 0.8987 |
| 0.0735 | 5.5240 | 2298 | 0.8724 | 0.6014 | 0.8724 | 0.9340 |
| 0.0735 | 5.5288 | 2300 | 0.9146 | 0.5616 | 0.9146 | 0.9564 |
| 0.0735 | 5.5337 | 2302 | 0.9007 | 0.6014 | 0.9007 | 0.9491 |
| 0.0735 | 5.5385 | 2304 | 0.8645 | 0.6410 | 0.8645 | 0.9298 |
| 0.0735 | 5.5433 | 2306 | 0.8280 | 0.6458 | 0.8280 | 0.9100 |
| 0.0735 | 5.5481 | 2308 | 0.8228 | 0.6458 | 0.8228 | 0.9071 |
| 0.0735 | 5.5529 | 2310 | 0.8568 | 0.6458 | 0.8568 | 0.9256 |
| 0.0735 | 5.5577 | 2312 | 0.8837 | 0.6410 | 0.8837 | 0.9401 |
| 0.0735 | 5.5625 | 2314 | 0.9506 | 0.6229 | 0.9506 | 0.9750 |
| 0.0735 | 5.5673 | 2316 | 0.9666 | 0.5840 | 0.9666 | 0.9832 |
| 0.0735 | 5.5721 | 2318 | 0.9501 | 0.6229 | 0.9501 | 0.9747 |
| 0.0735 | 5.5769 | 2320 | 0.8898 | 0.6410 | 0.8898 | 0.9433 |
| 0.0735 | 5.5817 | 2322 | 0.8079 | 0.6239 | 0.8079 | 0.8988 |
| 0.0735 | 5.5865 | 2324 | 0.7801 | 0.6384 | 0.7801 | 0.8832 |
| 0.0735 | 5.5913 | 2326 | 0.7566 | 0.6384 | 0.7566 | 0.8698 |
| 0.0735 | 5.5962 | 2328 | 0.7793 | 0.6384 | 0.7793 | 0.8828 |
| 0.0735 | 5.6010 | 2330 | 0.8388 | 0.6239 | 0.8388 | 0.9159 |
| 0.0735 | 5.6058 | 2332 | 0.9278 | 0.5789 | 0.9278 | 0.9632 |
| 0.0735 | 5.6106 | 2334 | 1.0090 | 0.4928 | 1.0090 | 1.0045 |
| 0.0735 | 5.6154 | 2336 | 1.0183 | 0.4783 | 1.0183 | 1.0091 |
| 0.0735 | 5.6202 | 2338 | 0.9879 | 0.4656 | 0.9879 | 0.9939 |
| 0.0735 | 5.625 | 2340 | 1.0010 | 0.4656 | 1.0010 | 1.0005 |
| 0.0735 | 5.6298 | 2342 | 1.0261 | 0.4771 | 1.0261 | 1.0129 |
| 0.0735 | 5.6346 | 2344 | 1.0474 | 0.5030 | 1.0474 | 1.0234 |
| 0.0735 | 5.6394 | 2346 | 1.0240 | 0.5030 | 1.0240 | 1.0119 |
| 0.0735 | 5.6442 | 2348 | 0.9796 | 0.5740 | 0.9796 | 0.9898 |
| 0.0735 | 5.6490 | 2350 | 0.9598 | 0.5740 | 0.9598 | 0.9797 |
| 0.0735 | 5.6538 | 2352 | 0.9410 | 0.6437 | 0.9410 | 0.9700 |
| 0.0735 | 5.6587 | 2354 | 0.8831 | 0.6362 | 0.8831 | 0.9397 |
| 0.0735 | 5.6635 | 2356 | 0.8685 | 0.6362 | 0.8685 | 0.9319 |
| 0.0735 | 5.6683 | 2358 | 0.8368 | 0.6599 | 0.8368 | 0.9148 |
| 0.0735 | 5.6731 | 2360 | 0.7907 | 0.6599 | 0.7907 | 0.8892 |
| 0.0735 | 5.6779 | 2362 | 0.7571 | 0.6384 | 0.7571 | 0.8701 |
| 0.0735 | 5.6827 | 2364 | 0.7339 | 0.6384 | 0.7339 | 0.8567 |
| 0.0735 | 5.6875 | 2366 | 0.7457 | 0.6384 | 0.7457 | 0.8636 |
| 0.0735 | 5.6923 | 2368 | 0.7458 | 0.6384 | 0.7458 | 0.8636 |
| 0.0735 | 5.6971 | 2370 | 0.7611 | 0.6384 | 0.7611 | 0.8724 |
| 0.0735 | 5.7019 | 2372 | 0.8265 | 0.6362 | 0.8265 | 0.9091 |
| 0.0735 | 5.7067 | 2374 | 0.9114 | 0.6216 | 0.9114 | 0.9547 |
| 0.0735 | 5.7115 | 2376 | 0.9795 | 0.5798 | 0.9795 | 0.9897 |
| 0.0735 | 5.7163 | 2378 | 1.0389 | 0.5528 | 1.0389 | 1.0193 |
| 0.0735 | 5.7212 | 2380 | 1.0512 | 0.5528 | 1.0512 | 1.0253 |
| 0.0735 | 5.7260 | 2382 | 1.0348 | 0.5528 | 1.0348 | 1.0173 |
| 0.0735 | 5.7308 | 2384 | 0.9994 | 0.5271 | 0.9994 | 0.9997 |
| 0.0735 | 5.7356 | 2386 | 0.9740 | 0.5271 | 0.9740 | 0.9869 |
| 0.0735 | 5.7404 | 2388 | 0.9957 | 0.5112 | 0.9957 | 0.9978 |
| 0.0735 | 5.7452 | 2390 | 0.9971 | 0.4952 | 0.9971 | 0.9986 |
| 0.0735 | 5.75 | 2392 | 0.9930 | 0.5112 | 0.9930 | 0.9965 |
| 0.0735 | 5.7548 | 2394 | 1.0308 | 0.5219 | 1.0308 | 1.0153 |
| 0.0735 | 5.7596 | 2396 | 1.1159 | 0.5030 | 1.1159 | 1.0564 |
| 0.0735 | 5.7644 | 2398 | 1.1797 | 0.4600 | 1.1797 | 1.0861 |
| 0.0735 | 5.7692 | 2400 | 1.1792 | 0.4600 | 1.1792 | 1.0859 |
| 0.0735 | 5.7740 | 2402 | 1.1290 | 0.5 | 1.1290 | 1.0626 |
| 0.0735 | 5.7788 | 2404 | 1.0469 | 0.5740 | 1.0469 | 1.0232 |
| 0.0735 | 5.7837 | 2406 | 0.9800 | 0.5528 | 0.9800 | 0.9900 |
| 0.0735 | 5.7885 | 2408 | 0.9705 | 0.5960 | 0.9705 | 0.9851 |
| 0.0735 | 5.7933 | 2410 | 0.9979 | 0.5271 | 0.9979 | 0.9990 |
| 0.0735 | 5.7981 | 2412 | 1.0401 | 0.5350 | 1.0401 | 1.0199 |
| 0.0735 | 5.8029 | 2414 | 1.0749 | 0.5350 | 1.0749 | 1.0368 |
| 0.0735 | 5.8077 | 2416 | 1.1362 | 0.4072 | 1.1362 | 1.0659 |
| 0.0735 | 5.8125 | 2418 | 1.2176 | 0.4456 | 1.2176 | 1.1034 |
| 0.0735 | 5.8173 | 2420 | 1.2714 | 0.3959 | 1.2714 | 1.1276 |
| 0.0735 | 5.8221 | 2422 | 1.2513 | 0.3950 | 1.2513 | 1.1186 |
| 0.0735 | 5.8269 | 2424 | 1.1875 | 0.3919 | 1.1875 | 1.0897 |
| 0.0735 | 5.8317 | 2426 | 1.1385 | 0.4340 | 1.1385 | 1.0670 |
| 0.0735 | 5.8365 | 2428 | 1.1082 | 0.4771 | 1.1082 | 1.0527 |
| 0.0735 | 5.8413 | 2430 | 1.0968 | 0.4924 | 1.0968 | 1.0473 |
| 0.0735 | 5.8462 | 2432 | 1.0687 | 0.5350 | 1.0687 | 1.0338 |
| 0.0735 | 5.8510 | 2434 | 1.0680 | 0.5350 | 1.0680 | 1.0334 |
| 0.0735 | 5.8558 | 2436 | 1.0548 | 0.4924 | 1.0548 | 1.0271 |
| 0.0735 | 5.8606 | 2438 | 1.0786 | 0.4924 | 1.0786 | 1.0386 |
| 0.0735 | 5.8654 | 2440 | 1.0789 | 0.4924 | 1.0789 | 1.0387 |
| 0.0735 | 5.8702 | 2442 | 1.0986 | 0.4924 | 1.0986 | 1.0482 |
| 0.0735 | 5.875 | 2444 | 1.1160 | 0.4495 | 1.1160 | 1.0564 |
| 0.0735 | 5.8798 | 2446 | 1.1447 | 0.4495 | 1.1447 | 1.0699 |
| 0.0735 | 5.8846 | 2448 | 1.1270 | 0.4495 | 1.1270 | 1.0616 |
| 0.0735 | 5.8894 | 2450 | 1.1110 | 0.4924 | 1.1110 | 1.0540 |
| 0.0735 | 5.8942 | 2452 | 1.0915 | 0.4924 | 1.0915 | 1.0448 |
| 0.0735 | 5.8990 | 2454 | 1.0740 | 0.5076 | 1.0740 | 1.0363 |
| 0.0735 | 5.9038 | 2456 | 1.0719 | 0.4924 | 1.0719 | 1.0353 |
| 0.0735 | 5.9087 | 2458 | 1.0531 | 0.5076 | 1.0531 | 1.0262 |
| 0.0735 | 5.9135 | 2460 | 1.0736 | 0.4495 | 1.0736 | 1.0361 |
| 0.0735 | 5.9183 | 2462 | 1.1007 | 0.4495 | 1.1007 | 1.0492 |
| 0.0735 | 5.9231 | 2464 | 1.0989 | 0.4495 | 1.0989 | 1.0483 |
| 0.0735 | 5.9279 | 2466 | 1.0590 | 0.55 | 1.0590 | 1.0291 |
| 0.0735 | 5.9327 | 2468 | 1.0171 | 0.5528 | 1.0171 | 1.0085 |
| 0.0735 | 5.9375 | 2470 | 0.9841 | 0.5528 | 0.9841 | 0.9920 |
| 0.0735 | 5.9423 | 2472 | 0.9896 | 0.5354 | 0.9896 | 0.9948 |
| 0.0735 | 5.9471 | 2474 | 1.0442 | 0.5426 | 1.0442 | 1.0218 |
| 0.0735 | 5.9519 | 2476 | 1.0709 | 0.5145 | 1.0709 | 1.0348 |
| 0.0735 | 5.9567 | 2478 | 1.0651 | 0.5145 | 1.0651 | 1.0320 |
| 0.0735 | 5.9615 | 2480 | 1.0777 | 0.5145 | 1.0777 | 1.0381 |
| 0.0735 | 5.9663 | 2482 | 1.0501 | 0.4941 | 1.0501 | 1.0247 |
| 0.0735 | 5.9712 | 2484 | 0.9949 | 0.5093 | 0.9949 | 0.9975 |
| 0.0735 | 5.9760 | 2486 | 0.9529 | 0.5271 | 0.9529 | 0.9761 |
| 0.0735 | 5.9808 | 2488 | 0.9292 | 0.5556 | 0.9292 | 0.9640 |
| 0.0735 | 5.9856 | 2490 | 0.9220 | 0.5556 | 0.9220 | 0.9602 |
| 0.0735 | 5.9904 | 2492 | 0.9507 | 0.5556 | 0.9507 | 0.9750 |
| 0.0735 | 5.9952 | 2494 | 0.9993 | 0.5216 | 0.9993 | 0.9997 |
| 0.0735 | 6.0 | 2496 | 1.0153 | 0.5216 | 1.0153 | 1.0076 |
| 0.0735 | 6.0048 | 2498 | 0.9912 | 0.5798 | 0.9912 | 0.9956 |
| 0.0602 | 6.0096 | 2500 | 0.9915 | 0.5124 | 0.9915 | 0.9958 |
| 0.0602 | 6.0144 | 2502 | 1.0032 | 0.5124 | 1.0032 | 1.0016 |
| 0.0602 | 6.0192 | 2504 | 1.0158 | 0.4824 | 1.0158 | 1.0079 |
| 0.0602 | 6.0240 | 2506 | 1.0097 | 0.4824 | 1.0097 | 1.0048 |
| 0.0602 | 6.0288 | 2508 | 0.9868 | 0.4824 | 0.9868 | 0.9934 |
| 0.0602 | 6.0337 | 2510 | 1.0021 | 0.4824 | 1.0021 | 1.0010 |
| 0.0602 | 6.0385 | 2512 | 1.0415 | 0.4656 | 1.0415 | 1.0206 |
| 0.0602 | 6.0433 | 2514 | 1.0724 | 0.4649 | 1.0724 | 1.0356 |
| 0.0602 | 6.0481 | 2516 | 1.0528 | 0.4656 | 1.0528 | 1.0261 |
| 0.0602 | 6.0529 | 2518 | 1.0358 | 0.5093 | 1.0358 | 1.0177 |
| 0.0602 | 6.0577 | 2520 | 1.0421 | 0.5378 | 1.0421 | 1.0208 |
| 0.0602 | 6.0625 | 2522 | 1.0625 | 0.5354 | 1.0625 | 1.0308 |
| 0.0602 | 6.0673 | 2524 | 1.0904 | 0.4941 | 1.0904 | 1.0442 |
| 0.0602 | 6.0721 | 2526 | 1.0640 | 0.5354 | 1.0640 | 1.0315 |
| 0.0602 | 6.0769 | 2528 | 0.9901 | 0.5378 | 0.9901 | 0.9950 |
| 0.0602 | 6.0817 | 2530 | 0.8970 | 0.6362 | 0.8970 | 0.9471 |
| 0.0602 | 6.0865 | 2532 | 0.8512 | 0.6135 | 0.8512 | 0.9226 |
| 0.0602 | 6.0913 | 2534 | 0.8564 | 0.6135 | 0.8564 | 0.9254 |
| 0.0602 | 6.0962 | 2536 | 0.8888 | 0.6362 | 0.8888 | 0.9428 |
| 0.0602 | 6.1010 | 2538 | 0.9262 | 0.6362 | 0.9262 | 0.9624 |
| 0.0602 | 6.1058 | 2540 | 0.9463 | 0.5946 | 0.9463 | 0.9728 |
| 0.0602 | 6.1106 | 2542 | 0.9721 | 0.5946 | 0.9721 | 0.9860 |
| 0.0602 | 6.1154 | 2544 | 0.9532 | 0.5946 | 0.9532 | 0.9763 |
| 0.0602 | 6.1202 | 2546 | 0.9416 | 0.5946 | 0.9416 | 0.9704 |
| 0.0602 | 6.125 | 2548 | 0.9155 | 0.6362 | 0.9155 | 0.9568 |
| 0.0602 | 6.1298 | 2550 | 0.8781 | 0.6362 | 0.8781 | 0.9370 |
| 0.0602 | 6.1346 | 2552 | 0.8435 | 0.6135 | 0.8435 | 0.9184 |
| 0.0602 | 6.1394 | 2554 | 0.8354 | 0.6135 | 0.8354 | 0.9140 |
| 0.0602 | 6.1442 | 2556 | 0.8358 | 0.6135 | 0.8358 | 0.9142 |
| 0.0602 | 6.1490 | 2558 | 0.8524 | 0.6362 | 0.8524 | 0.9233 |
| 0.0602 | 6.1538 | 2560 | 0.8542 | 0.6599 | 0.8542 | 0.9242 |
| 0.0602 | 6.1587 | 2562 | 0.8778 | 0.6362 | 0.8778 | 0.9369 |
| 0.0602 | 6.1635 | 2564 | 0.9091 | 0.6362 | 0.9091 | 0.9535 |
| 0.0602 | 6.1683 | 2566 | 0.9201 | 0.6362 | 0.9201 | 0.9592 |
| 0.0602 | 6.1731 | 2568 | 0.9427 | 0.6216 | 0.9427 | 0.9709 |
| 0.0602 | 6.1779 | 2570 | 0.9304 | 0.6216 | 0.9304 | 0.9646 |
| 0.0602 | 6.1827 | 2572 | 0.9051 | 0.6362 | 0.9051 | 0.9514 |
| 0.0602 | 6.1875 | 2574 | 0.8729 | 0.6135 | 0.8729 | 0.9343 |
| 0.0602 | 6.1923 | 2576 | 0.8751 | 0.6135 | 0.8751 | 0.9355 |
| 0.0602 | 6.1971 | 2578 | 0.8727 | 0.6135 | 0.8727 | 0.9342 |
| 0.0602 | 6.2019 | 2580 | 0.9003 | 0.5985 | 0.9003 | 0.9488 |
| 0.0602 | 6.2067 | 2582 | 0.9229 | 0.6216 | 0.9229 | 0.9607 |
| 0.0602 | 6.2115 | 2584 | 0.9590 | 0.6216 | 0.9590 | 0.9793 |
| 0.0602 | 6.2163 | 2586 | 0.9464 | 0.6216 | 0.9464 | 0.9729 |
| 0.0602 | 6.2212 | 2588 | 0.9386 | 0.6216 | 0.9386 | 0.9688 |
| 0.0602 | 6.2260 | 2590 | 0.9416 | 0.6216 | 0.9416 | 0.9704 |
| 0.0602 | 6.2308 | 2592 | 0.9553 | 0.6216 | 0.9553 | 0.9774 |
| 0.0602 | 6.2356 | 2594 | 0.9956 | 0.5378 | 0.9956 | 0.9978 |
| 0.0602 | 6.2404 | 2596 | 1.0428 | 0.5216 | 1.0428 | 1.0212 |
| 0.0602 | 6.2452 | 2598 | 1.0470 | 0.5216 | 1.0470 | 1.0232 |
| 0.0602 | 6.25 | 2600 | 1.0010 | 0.5378 | 1.0010 | 1.0005 |
| 0.0602 | 6.2548 | 2602 | 0.9424 | 0.5798 | 0.9424 | 0.9708 |
| 0.0602 | 6.2596 | 2604 | 0.9334 | 0.5556 | 0.9334 | 0.9661 |
| 0.0602 | 6.2644 | 2606 | 0.9250 | 0.5556 | 0.9250 | 0.9618 |
| 0.0602 | 6.2692 | 2608 | 0.9078 | 0.5556 | 0.9078 | 0.9528 |
| 0.0602 | 6.2740 | 2610 | 0.9194 | 0.5556 | 0.9194 | 0.9589 |
| 0.0602 | 6.2788 | 2612 | 0.9463 | 0.5556 | 0.9463 | 0.9728 |
| 0.0602 | 6.2837 | 2614 | 0.9347 | 0.5556 | 0.9347 | 0.9668 |
| 0.0602 | 6.2885 | 2616 | 0.9460 | 0.5556 | 0.9460 | 0.9726 |
| 0.0602 | 6.2933 | 2618 | 0.9786 | 0.5556 | 0.9786 | 0.9893 |
| 0.0602 | 6.2981 | 2620 | 1.0022 | 0.5616 | 1.0022 | 1.0011 |
| 0.0602 | 6.3029 | 2622 | 0.9925 | 0.5616 | 0.9925 | 0.9963 |
| 0.0602 | 6.3077 | 2624 | 0.9960 | 0.5616 | 0.9960 | 0.9980 |
| 0.0602 | 6.3125 | 2626 | 1.0093 | 0.5616 | 1.0093 | 1.0047 |
| 0.0602 | 6.3173 | 2628 | 0.9783 | 0.5381 | 0.9783 | 0.9891 |
| 0.0602 | 6.3221 | 2630 | 0.9643 | 0.5556 | 0.9643 | 0.9820 |
| 0.0602 | 6.3269 | 2632 | 0.9298 | 0.5708 | 0.9298 | 0.9643 |
| 0.0602 | 6.3317 | 2634 | 0.9009 | 0.6135 | 0.9009 | 0.9491 |
| 0.0602 | 6.3365 | 2636 | 0.8952 | 0.6135 | 0.8952 | 0.9462 |
| 0.0602 | 6.3413 | 2638 | 0.9199 | 0.5985 | 0.9199 | 0.9591 |
| 0.0602 | 6.3462 | 2640 | 0.9751 | 0.5556 | 0.9751 | 0.9875 |
| 0.0602 | 6.3510 | 2642 | 1.0050 | 0.5616 | 1.0050 | 1.0025 |
| 0.0602 | 6.3558 | 2644 | 0.9964 | 0.5616 | 0.9964 | 0.9982 |
| 0.0602 | 6.3606 | 2646 | 0.9544 | 0.5757 | 0.9544 | 0.9770 |
| 0.0602 | 6.3654 | 2648 | 0.8909 | 0.6362 | 0.8909 | 0.9439 |
| 0.0602 | 6.3702 | 2650 | 0.8707 | 0.6384 | 0.8707 | 0.9331 |
| 0.0602 | 6.375 | 2652 | 0.8644 | 0.6384 | 0.8644 | 0.9297 |
| 0.0602 | 6.3798 | 2654 | 0.8545 | 0.6385 | 0.8545 | 0.9244 |
| 0.0602 | 6.3846 | 2656 | 0.8725 | 0.6385 | 0.8725 | 0.9341 |
| 0.0602 | 6.3894 | 2658 | 0.8954 | 0.6385 | 0.8954 | 0.9463 |
| 0.0602 | 6.3942 | 2660 | 0.9119 | 0.6154 | 0.9119 | 0.9549 |
| 0.0602 | 6.3990 | 2662 | 0.9521 | 0.5757 | 0.9521 | 0.9758 |
| 0.0602 | 6.4038 | 2664 | 0.9677 | 0.5757 | 0.9677 | 0.9837 |
| 0.0602 | 6.4087 | 2666 | 0.9597 | 0.5757 | 0.9597 | 0.9796 |
| 0.0602 | 6.4135 | 2668 | 0.9235 | 0.6362 | 0.9235 | 0.9610 |
| 0.0602 | 6.4183 | 2670 | 0.9303 | 0.6362 | 0.9303 | 0.9645 |
| 0.0602 | 6.4231 | 2672 | 0.9308 | 0.6362 | 0.9308 | 0.9648 |
| 0.0602 | 6.4279 | 2674 | 0.9553 | 0.5946 | 0.9553 | 0.9774 |
| 0.0602 | 6.4327 | 2676 | 0.9577 | 0.5946 | 0.9577 | 0.9786 |
| 0.0602 | 6.4375 | 2678 | 0.9388 | 0.5946 | 0.9388 | 0.9689 |
| 0.0602 | 6.4423 | 2680 | 0.9145 | 0.5946 | 0.9145 | 0.9563 |
| 0.0602 | 6.4471 | 2682 | 0.9010 | 0.6362 | 0.9010 | 0.9492 |
| 0.0602 | 6.4519 | 2684 | 0.9281 | 0.5946 | 0.9281 | 0.9634 |
| 0.0602 | 6.4567 | 2686 | 0.9234 | 0.5946 | 0.9234 | 0.9610 |
| 0.0602 | 6.4615 | 2688 | 0.9402 | 0.5946 | 0.9402 | 0.9697 |
| 0.0602 | 6.4663 | 2690 | 0.9810 | 0.5757 | 0.9810 | 0.9904 |
| 0.0602 | 6.4712 | 2692 | 1.0313 | 0.5216 | 1.0313 | 1.0155 |
| 0.0602 | 6.4760 | 2694 | 1.0421 | 0.5295 | 1.0421 | 1.0208 |
| 0.0602 | 6.4808 | 2696 | 1.0124 | 0.5216 | 1.0124 | 1.0062 |
| 0.0602 | 6.4856 | 2698 | 0.9811 | 0.5757 | 0.9811 | 0.9905 |
| 0.0602 | 6.4904 | 2700 | 0.9354 | 0.5757 | 0.9354 | 0.9672 |
| 0.0602 | 6.4952 | 2702 | 0.8698 | 0.6599 | 0.8698 | 0.9326 |
| 0.0602 | 6.5 | 2704 | 0.8043 | 0.6384 | 0.8043 | 0.8968 |
| 0.0602 | 6.5048 | 2706 | 0.7850 | 0.6384 | 0.7850 | 0.8860 |
| 0.0602 | 6.5096 | 2708 | 0.7823 | 0.6384 | 0.7823 | 0.8845 |
| 0.0602 | 6.5144 | 2710 | 0.8001 | 0.6385 | 0.8001 | 0.8945 |
| 0.0602 | 6.5192 | 2712 | 0.8334 | 0.6429 | 0.8334 | 0.9129 |
| 0.0602 | 6.5240 | 2714 | 0.8606 | 0.6429 | 0.8606 | 0.9277 |
| 0.0602 | 6.5288 | 2716 | 0.8743 | 0.6429 | 0.8743 | 0.9350 |
| 0.0602 | 6.5337 | 2718 | 0.8590 | 0.6429 | 0.8590 | 0.9268 |
| 0.0602 | 6.5385 | 2720 | 0.8328 | 0.6599 | 0.8328 | 0.9126 |
| 0.0602 | 6.5433 | 2722 | 0.8316 | 0.6599 | 0.8316 | 0.9119 |
| 0.0602 | 6.5481 | 2724 | 0.8213 | 0.6599 | 0.8213 | 0.9062 |
| 0.0602 | 6.5529 | 2726 | 0.8006 | 0.6599 | 0.8006 | 0.8948 |
| 0.0602 | 6.5577 | 2728 | 0.8107 | 0.6599 | 0.8107 | 0.9004 |
| 0.0602 | 6.5625 | 2730 | 0.8260 | 0.6599 | 0.8260 | 0.9089 |
| 0.0602 | 6.5673 | 2732 | 0.8175 | 0.6599 | 0.8175 | 0.9042 |
| 0.0602 | 6.5721 | 2734 | 0.8239 | 0.6599 | 0.8239 | 0.9077 |
| 0.0602 | 6.5769 | 2736 | 0.8203 | 0.6384 | 0.8203 | 0.9057 |
| 0.0602 | 6.5817 | 2738 | 0.8200 | 0.6384 | 0.8200 | 0.9055 |
| 0.0602 | 6.5865 | 2740 | 0.8133 | 0.6384 | 0.8133 | 0.9018 |
| 0.0602 | 6.5913 | 2742 | 0.8145 | 0.6384 | 0.8145 | 0.9025 |
| 0.0602 | 6.5962 | 2744 | 0.7936 | 0.6384 | 0.7936 | 0.8908 |
| 0.0602 | 6.6010 | 2746 | 0.7816 | 0.6384 | 0.7816 | 0.8841 |
| 0.0602 | 6.6058 | 2748 | 0.7613 | 0.6528 | 0.7613 | 0.8725 |
| 0.0602 | 6.6106 | 2750 | 0.7495 | 0.6528 | 0.7495 | 0.8657 |
| 0.0602 | 6.6154 | 2752 | 0.7519 | 0.6528 | 0.7519 | 0.8671 |
| 0.0602 | 6.6202 | 2754 | 0.7840 | 0.6384 | 0.7840 | 0.8854 |
| 0.0602 | 6.625 | 2756 | 0.8360 | 0.6599 | 0.8360 | 0.9144 |
| 0.0602 | 6.6298 | 2758 | 0.8695 | 0.6599 | 0.8695 | 0.9325 |
| 0.0602 | 6.6346 | 2760 | 0.8686 | 0.6599 | 0.8686 | 0.9320 |
| 0.0602 | 6.6394 | 2762 | 0.8366 | 0.6384 | 0.8366 | 0.9146 |
| 0.0602 | 6.6442 | 2764 | 0.8028 | 0.6384 | 0.8028 | 0.8960 |
| 0.0602 | 6.6490 | 2766 | 0.7777 | 0.6528 | 0.7777 | 0.8819 |
| 0.0602 | 6.6538 | 2768 | 0.7812 | 0.6528 | 0.7812 | 0.8838 |
| 0.0602 | 6.6587 | 2770 | 0.8059 | 0.6384 | 0.8059 | 0.8977 |
| 0.0602 | 6.6635 | 2772 | 0.8401 | 0.6384 | 0.8401 | 0.9166 |
| 0.0602 | 6.6683 | 2774 | 0.9127 | 0.6410 | 0.9127 | 0.9553 |
| 0.0602 | 6.6731 | 2776 | 0.9527 | 0.6209 | 0.9527 | 0.9761 |
| 0.0602 | 6.6779 | 2778 | 0.9465 | 0.6209 | 0.9465 | 0.9729 |
| 0.0602 | 6.6827 | 2780 | 0.9264 | 0.6410 | 0.9264 | 0.9625 |
| 0.0602 | 6.6875 | 2782 | 0.8821 | 0.6548 | 0.8821 | 0.9392 |
| 0.0602 | 6.6923 | 2784 | 0.8244 | 0.6384 | 0.8244 | 0.9079 |
| 0.0602 | 6.6971 | 2786 | 0.7724 | 0.6528 | 0.7724 | 0.8788 |
| 0.0602 | 6.7019 | 2788 | 0.7433 | 0.6528 | 0.7433 | 0.8622 |
| 0.0602 | 6.7067 | 2790 | 0.7417 | 0.6528 | 0.7417 | 0.8612 |
| 0.0602 | 6.7115 | 2792 | 0.7666 | 0.6528 | 0.7666 | 0.8756 |
| 0.0602 | 6.7163 | 2794 | 0.8013 | 0.6528 | 0.8013 | 0.8952 |
| 0.0602 | 6.7212 | 2796 | 0.8670 | 0.6548 | 0.8670 | 0.9311 |
| 0.0602 | 6.7260 | 2798 | 0.9580 | 0.6548 | 0.9580 | 0.9788 |
| 0.0602 | 6.7308 | 2800 | 0.9965 | 0.6410 | 0.9965 | 0.9983 |
| 0.0602 | 6.7356 | 2802 | 1.0026 | 0.6410 | 1.0026 | 1.0013 |
| 0.0602 | 6.7404 | 2804 | 0.9859 | 0.6410 | 0.9859 | 0.9929 |
| 0.0602 | 6.7452 | 2806 | 0.9446 | 0.6410 | 0.9446 | 0.9719 |
| 0.0602 | 6.75 | 2808 | 0.9224 | 0.6410 | 0.9224 | 0.9604 |
| 0.0602 | 6.7548 | 2810 | 0.8862 | 0.6337 | 0.8862 | 0.9414 |
| 0.0602 | 6.7596 | 2812 | 0.8571 | 0.6384 | 0.8571 | 0.9258 |
| 0.0602 | 6.7644 | 2814 | 0.8362 | 0.6384 | 0.8362 | 0.9144 |
| 0.0602 | 6.7692 | 2816 | 0.8384 | 0.6384 | 0.8384 | 0.9156 |
| 0.0602 | 6.7740 | 2818 | 0.8262 | 0.6384 | 0.8262 | 0.9090 |
| 0.0602 | 6.7788 | 2820 | 0.7969 | 0.6384 | 0.7969 | 0.8927 |
| 0.0602 | 6.7837 | 2822 | 0.7662 | 0.6384 | 0.7662 | 0.8753 |
| 0.0602 | 6.7885 | 2824 | 0.7574 | 0.6384 | 0.7574 | 0.8703 |
| 0.0602 | 6.7933 | 2826 | 0.7571 | 0.6528 | 0.7571 | 0.8701 |
| 0.0602 | 6.7981 | 2828 | 0.7850 | 0.6384 | 0.7850 | 0.8860 |
| 0.0602 | 6.8029 | 2830 | 0.8127 | 0.6384 | 0.8127 | 0.9015 |
| 0.0602 | 6.8077 | 2832 | 0.8034 | 0.6384 | 0.8034 | 0.8963 |
| 0.0602 | 6.8125 | 2834 | 0.7871 | 0.6384 | 0.7871 | 0.8872 |
| 0.0602 | 6.8173 | 2836 | 0.7684 | 0.6384 | 0.7684 | 0.8766 |
| 0.0602 | 6.8221 | 2838 | 0.7763 | 0.6384 | 0.7763 | 0.8811 |
| 0.0602 | 6.8269 | 2840 | 0.7712 | 0.6384 | 0.7712 | 0.8782 |
| 0.0602 | 6.8317 | 2842 | 0.7716 | 0.6384 | 0.7716 | 0.8784 |
| 0.0602 | 6.8365 | 2844 | 0.7877 | 0.6384 | 0.7877 | 0.8875 |
| 0.0602 | 6.8413 | 2846 | 0.8194 | 0.6599 | 0.8194 | 0.9052 |
| 0.0602 | 6.8462 | 2848 | 0.8349 | 0.6599 | 0.8349 | 0.9137 |
| 0.0602 | 6.8510 | 2850 | 0.8245 | 0.6599 | 0.8245 | 0.9080 |
| 0.0602 | 6.8558 | 2852 | 0.8060 | 0.6599 | 0.8060 | 0.8978 |
| 0.0602 | 6.8606 | 2854 | 0.7857 | 0.6384 | 0.7857 | 0.8864 |
| 0.0602 | 6.8654 | 2856 | 0.7814 | 0.6384 | 0.7814 | 0.8839 |
| 0.0602 | 6.8702 | 2858 | 0.8046 | 0.6384 | 0.8046 | 0.8970 |
| 0.0602 | 6.875 | 2860 | 0.8291 | 0.6384 | 0.8291 | 0.9105 |
| 0.0602 | 6.8798 | 2862 | 0.8614 | 0.6599 | 0.8614 | 0.9281 |
| 0.0602 | 6.8846 | 2864 | 0.9015 | 0.6599 | 0.9015 | 0.9495 |
| 0.0602 | 6.8894 | 2866 | 0.9094 | 0.6599 | 0.9094 | 0.9536 |
| 0.0602 | 6.8942 | 2868 | 0.9332 | 0.6053 | 0.9332 | 0.9660 |
| 0.0602 | 6.8990 | 2870 | 0.9443 | 0.6014 | 0.9443 | 0.9718 |
| 0.0602 | 6.9038 | 2872 | 0.9357 | 0.6154 | 0.9357 | 0.9673 |
| 0.0602 | 6.9087 | 2874 | 0.9193 | 0.6599 | 0.9193 | 0.9588 |
| 0.0602 | 6.9135 | 2876 | 0.8932 | 0.6599 | 0.8932 | 0.9451 |
| 0.0602 | 6.9183 | 2878 | 0.8638 | 0.6384 | 0.8638 | 0.9294 |
| 0.0602 | 6.9231 | 2880 | 0.8381 | 0.6384 | 0.8381 | 0.9155 |
| 0.0602 | 6.9279 | 2882 | 0.8435 | 0.6384 | 0.8435 | 0.9184 |
| 0.0602 | 6.9327 | 2884 | 0.8375 | 0.6384 | 0.8375 | 0.9151 |
| 0.0602 | 6.9375 | 2886 | 0.8277 | 0.6384 | 0.8277 | 0.9098 |
| 0.0602 | 6.9423 | 2888 | 0.8290 | 0.6384 | 0.8290 | 0.9105 |
| 0.0602 | 6.9471 | 2890 | 0.8221 | 0.6384 | 0.8221 | 0.9067 |
| 0.0602 | 6.9519 | 2892 | 0.8357 | 0.6384 | 0.8357 | 0.9142 |
| 0.0602 | 6.9567 | 2894 | 0.8584 | 0.6384 | 0.8584 | 0.9265 |
| 0.0602 | 6.9615 | 2896 | 0.8669 | 0.6384 | 0.8669 | 0.9311 |
| 0.0602 | 6.9663 | 2898 | 0.8742 | 0.6384 | 0.8742 | 0.9350 |
| 0.0602 | 6.9712 | 2900 | 0.8708 | 0.6384 | 0.8708 | 0.9332 |
| 0.0602 | 6.9760 | 2902 | 0.8709 | 0.6384 | 0.8709 | 0.9332 |
| 0.0602 | 6.9808 | 2904 | 0.8694 | 0.6384 | 0.8694 | 0.9324 |
| 0.0602 | 6.9856 | 2906 | 0.8551 | 0.6384 | 0.8551 | 0.9247 |
| 0.0602 | 6.9904 | 2908 | 0.8634 | 0.6384 | 0.8634 | 0.9292 |
| 0.0602 | 6.9952 | 2910 | 0.8815 | 0.6384 | 0.8815 | 0.9389 |
| 0.0602 | 7.0 | 2912 | 0.8991 | 0.6599 | 0.8991 | 0.9482 |
| 0.0602 | 7.0048 | 2914 | 0.8892 | 0.6599 | 0.8892 | 0.9430 |
| 0.0602 | 7.0096 | 2916 | 0.8764 | 0.6599 | 0.8764 | 0.9361 |
| 0.0602 | 7.0144 | 2918 | 0.8844 | 0.6599 | 0.8844 | 0.9404 |
| 0.0602 | 7.0192 | 2920 | 0.8745 | 0.6599 | 0.8745 | 0.9352 |
| 0.0602 | 7.0240 | 2922 | 0.8562 | 0.6599 | 0.8562 | 0.9253 |
| 0.0602 | 7.0288 | 2924 | 0.8372 | 0.6599 | 0.8372 | 0.9150 |
| 0.0602 | 7.0337 | 2926 | 0.8378 | 0.6599 | 0.8378 | 0.9153 |
| 0.0602 | 7.0385 | 2928 | 0.8382 | 0.6599 | 0.8382 | 0.9155 |
| 0.0602 | 7.0433 | 2930 | 0.8577 | 0.6599 | 0.8577 | 0.9261 |
| 0.0602 | 7.0481 | 2932 | 0.8847 | 0.6362 | 0.8847 | 0.9406 |
| 0.0602 | 7.0529 | 2934 | 0.9274 | 0.6111 | 0.9274 | 0.9630 |
| 0.0602 | 7.0577 | 2936 | 0.9773 | 0.4649 | 0.9773 | 0.9886 |
| 0.0602 | 7.0625 | 2938 | 0.9874 | 0.4649 | 0.9874 | 0.9937 |
| 0.0602 | 7.0673 | 2940 | 0.9916 | 0.4649 | 0.9916 | 0.9958 |
| 0.0602 | 7.0721 | 2942 | 0.9596 | 0.4656 | 0.9596 | 0.9796 |
| 0.0602 | 7.0769 | 2944 | 0.9128 | 0.6111 | 0.9128 | 0.9554 |
| 0.0602 | 7.0817 | 2946 | 0.8586 | 0.6135 | 0.8586 | 0.9266 |
| 0.0602 | 7.0865 | 2948 | 0.8024 | 0.6384 | 0.8024 | 0.8958 |
| 0.0602 | 7.0913 | 2950 | 0.7800 | 0.6159 | 0.7800 | 0.8832 |
| 0.0602 | 7.0962 | 2952 | 0.7837 | 0.6384 | 0.7837 | 0.8852 |
| 0.0602 | 7.1010 | 2954 | 0.8022 | 0.6384 | 0.8022 | 0.8956 |
| 0.0602 | 7.1058 | 2956 | 0.8438 | 0.6384 | 0.8438 | 0.9186 |
| 0.0602 | 7.1106 | 2958 | 0.8845 | 0.6599 | 0.8845 | 0.9405 |
| 0.0602 | 7.1154 | 2960 | 0.9000 | 0.6599 | 0.9000 | 0.9487 |
| 0.0602 | 7.1202 | 2962 | 0.8988 | 0.6599 | 0.8988 | 0.9481 |
| 0.0602 | 7.125 | 2964 | 0.8824 | 0.6599 | 0.8824 | 0.9394 |
| 0.0602 | 7.1298 | 2966 | 0.8529 | 0.6599 | 0.8529 | 0.9236 |
| 0.0602 | 7.1346 | 2968 | 0.8381 | 0.6599 | 0.8381 | 0.9155 |
| 0.0602 | 7.1394 | 2970 | 0.8227 | 0.6384 | 0.8227 | 0.9070 |
| 0.0602 | 7.1442 | 2972 | 0.8217 | 0.6384 | 0.8217 | 0.9065 |
| 0.0602 | 7.1490 | 2974 | 0.8422 | 0.6599 | 0.8422 | 0.9177 |
| 0.0602 | 7.1538 | 2976 | 0.8817 | 0.6548 | 0.8817 | 0.9390 |
| 0.0602 | 7.1587 | 2978 | 0.9277 | 0.6316 | 0.9277 | 0.9632 |
| 0.0602 | 7.1635 | 2980 | 0.9718 | 0.6173 | 0.9718 | 0.9858 |
| 0.0602 | 7.1683 | 2982 | 1.0071 | 0.5921 | 1.0071 | 1.0035 |
| 0.0602 | 7.1731 | 2984 | 1.0006 | 0.6173 | 1.0006 | 1.0003 |
| 0.0602 | 7.1779 | 2986 | 1.0061 | 0.6173 | 1.0061 | 1.0030 |
| 0.0602 | 7.1827 | 2988 | 0.9869 | 0.6173 | 0.9869 | 0.9934 |
| 0.0602 | 7.1875 | 2990 | 0.9627 | 0.6173 | 0.9627 | 0.9812 |
| 0.0602 | 7.1923 | 2992 | 0.9622 | 0.6173 | 0.9622 | 0.9809 |
| 0.0602 | 7.1971 | 2994 | 0.9878 | 0.6173 | 0.9878 | 0.9939 |
| 0.0602 | 7.2019 | 2996 | 1.0154 | 0.6173 | 1.0154 | 1.0077 |
| 0.0602 | 7.2067 | 2998 | 1.0154 | 0.6173 | 1.0154 | 1.0077 |
| 0.0507 | 7.2115 | 3000 | 0.9779 | 0.6173 | 0.9779 | 0.9889 |
| 0.0507 | 7.2163 | 3002 | 0.9504 | 0.6410 | 0.9504 | 0.9749 |
| 0.0507 | 7.2212 | 3004 | 0.9461 | 0.6548 | 0.9461 | 0.9727 |
| 0.0507 | 7.2260 | 3006 | 0.9589 | 0.6410 | 0.9589 | 0.9792 |
| 0.0507 | 7.2308 | 3008 | 0.9440 | 0.6548 | 0.9440 | 0.9716 |
| 0.0507 | 7.2356 | 3010 | 0.9127 | 0.6548 | 0.9127 | 0.9553 |
| 0.0507 | 7.2404 | 3012 | 0.9025 | 0.6548 | 0.9025 | 0.9500 |
| 0.0507 | 7.2452 | 3014 | 0.9184 | 0.6548 | 0.9184 | 0.9583 |
| 0.0507 | 7.25 | 3016 | 0.9602 | 0.6173 | 0.9602 | 0.9799 |
| 0.0507 | 7.2548 | 3018 | 1.0170 | 0.5188 | 1.0170 | 1.0085 |
| 0.0507 | 7.2596 | 3020 | 1.0624 | 0.5593 | 1.0624 | 1.0307 |
| 0.0507 | 7.2644 | 3022 | 1.1119 | 0.5795 | 1.1119 | 1.0545 |
| 0.0507 | 7.2692 | 3024 | 1.1546 | 0.4523 | 1.1546 | 1.0745 |
| 0.0507 | 7.2740 | 3026 | 1.1502 | 0.4523 | 1.1502 | 1.0725 |
| 0.0507 | 7.2788 | 3028 | 1.1135 | 0.5199 | 1.1135 | 1.0552 |
| 0.0507 | 7.2837 | 3030 | 1.0461 | 0.5593 | 1.0461 | 1.0228 |
| 0.0507 | 7.2885 | 3032 | 0.9631 | 0.5921 | 0.9631 | 0.9814 |
| 0.0507 | 7.2933 | 3034 | 0.8952 | 0.6548 | 0.8952 | 0.9461 |
| 0.0507 | 7.2981 | 3036 | 0.8464 | 0.6599 | 0.8464 | 0.9200 |
| 0.0507 | 7.3029 | 3038 | 0.8312 | 0.6384 | 0.8312 | 0.9117 |
| 0.0507 | 7.3077 | 3040 | 0.8473 | 0.6599 | 0.8473 | 0.9205 |
| 0.0507 | 7.3125 | 3042 | 0.8774 | 0.6599 | 0.8774 | 0.9367 |
| 0.0507 | 7.3173 | 3044 | 0.9124 | 0.6173 | 0.9124 | 0.9552 |
| 0.0507 | 7.3221 | 3046 | 0.9486 | 0.5921 | 0.9486 | 0.9740 |
| 0.0507 | 7.3269 | 3048 | 0.9590 | 0.5921 | 0.9590 | 0.9793 |
| 0.0507 | 7.3317 | 3050 | 0.9681 | 0.5921 | 0.9681 | 0.9839 |
| 0.0507 | 7.3365 | 3052 | 0.9925 | 0.5773 | 0.9925 | 0.9962 |
| 0.0507 | 7.3413 | 3054 | 0.9896 | 0.5350 | 0.9896 | 0.9948 |
| 0.0507 | 7.3462 | 3056 | 0.9591 | 0.5921 | 0.9591 | 0.9794 |
| 0.0507 | 7.3510 | 3058 | 0.9173 | 0.5960 | 0.9173 | 0.9577 |
| 0.0507 | 7.3558 | 3060 | 0.8991 | 0.5714 | 0.8991 | 0.9482 |
| 0.0507 | 7.3606 | 3062 | 0.8990 | 0.5714 | 0.8990 | 0.9481 |
| 0.0507 | 7.3654 | 3064 | 0.9093 | 0.5960 | 0.9093 | 0.9536 |
| 0.0507 | 7.3702 | 3066 | 0.9198 | 0.5960 | 0.9198 | 0.9591 |
| 0.0507 | 7.375 | 3068 | 0.9267 | 0.6216 | 0.9267 | 0.9627 |
| 0.0507 | 7.3798 | 3070 | 0.9255 | 0.6216 | 0.9255 | 0.9620 |
| 0.0507 | 7.3846 | 3072 | 0.9226 | 0.6216 | 0.9226 | 0.9605 |
| 0.0507 | 7.3894 | 3074 | 0.9324 | 0.6216 | 0.9324 | 0.9656 |
| 0.0507 | 7.3942 | 3076 | 0.9498 | 0.6216 | 0.9498 | 0.9746 |
| 0.0507 | 7.3990 | 3078 | 0.9787 | 0.5765 | 0.9787 | 0.9893 |
| 0.0507 | 7.4038 | 3080 | 0.9719 | 0.5765 | 0.9719 | 0.9858 |
| 0.0507 | 7.4087 | 3082 | 0.9590 | 0.6173 | 0.9590 | 0.9793 |
| 0.0507 | 7.4135 | 3084 | 0.9417 | 0.6173 | 0.9417 | 0.9704 |
| 0.0507 | 7.4183 | 3086 | 0.9375 | 0.5798 | 0.9375 | 0.9682 |
| 0.0507 | 7.4231 | 3088 | 0.9445 | 0.5798 | 0.9445 | 0.9719 |
| 0.0507 | 7.4279 | 3090 | 0.9596 | 0.5093 | 0.9596 | 0.9796 |
| 0.0507 | 7.4327 | 3092 | 0.9613 | 0.5093 | 0.9613 | 0.9805 |
| 0.0507 | 7.4375 | 3094 | 0.9557 | 0.5378 | 0.9557 | 0.9776 |
| 0.0507 | 7.4423 | 3096 | 0.9671 | 0.5378 | 0.9671 | 0.9834 |
| 0.0507 | 7.4471 | 3098 | 1.0107 | 0.5354 | 1.0107 | 1.0053 |
| 0.0507 | 7.4519 | 3100 | 1.0427 | 0.5426 | 1.0427 | 1.0211 |
| 0.0507 | 7.4567 | 3102 | 1.0481 | 0.5426 | 1.0481 | 1.0238 |
| 0.0507 | 7.4615 | 3104 | 1.0144 | 0.5354 | 1.0144 | 1.0072 |
| 0.0507 | 7.4663 | 3106 | 0.9551 | 0.5765 | 0.9551 | 0.9773 |
| 0.0507 | 7.4712 | 3108 | 0.8950 | 0.5985 | 0.8950 | 0.9460 |
| 0.0507 | 7.4760 | 3110 | 0.8676 | 0.5985 | 0.8676 | 0.9315 |
| 0.0507 | 7.4808 | 3112 | 0.8607 | 0.5985 | 0.8607 | 0.9277 |
| 0.0507 | 7.4856 | 3114 | 0.8717 | 0.5985 | 0.8717 | 0.9337 |
| 0.0507 | 7.4904 | 3116 | 0.8746 | 0.6239 | 0.8746 | 0.9352 |
| 0.0507 | 7.4952 | 3118 | 0.8810 | 0.6458 | 0.8810 | 0.9386 |
| 0.0507 | 7.5 | 3120 | 0.8740 | 0.625 | 0.8740 | 0.9349 |
| 0.0507 | 7.5048 | 3122 | 0.8599 | 0.625 | 0.8599 | 0.9273 |
| 0.0507 | 7.5096 | 3124 | 0.8474 | 0.6176 | 0.8474 | 0.9205 |
| 0.0507 | 7.5144 | 3126 | 0.8457 | 0.6176 | 0.8457 | 0.9196 |
| 0.0507 | 7.5192 | 3128 | 0.8507 | 0.625 | 0.8507 | 0.9223 |
| 0.0507 | 7.5240 | 3130 | 0.8605 | 0.625 | 0.8605 | 0.9276 |
| 0.0507 | 7.5288 | 3132 | 0.8770 | 0.625 | 0.8770 | 0.9365 |
| 0.0507 | 7.5337 | 3134 | 0.8805 | 0.6458 | 0.8805 | 0.9383 |
| 0.0507 | 7.5385 | 3136 | 0.8802 | 0.6458 | 0.8802 | 0.9382 |
| 0.0507 | 7.5433 | 3138 | 0.8784 | 0.6458 | 0.8784 | 0.9372 |
| 0.0507 | 7.5481 | 3140 | 0.8843 | 0.6458 | 0.8843 | 0.9404 |
| 0.0507 | 7.5529 | 3142 | 0.8852 | 0.6458 | 0.8852 | 0.9409 |
| 0.0507 | 7.5577 | 3144 | 0.8648 | 0.6458 | 0.8648 | 0.9299 |
| 0.0507 | 7.5625 | 3146 | 0.8365 | 0.6239 | 0.8365 | 0.9146 |
| 0.0507 | 7.5673 | 3148 | 0.8106 | 0.6384 | 0.8106 | 0.9003 |
| 0.0507 | 7.5721 | 3150 | 0.7940 | 0.6384 | 0.7940 | 0.8911 |
| 0.0507 | 7.5769 | 3152 | 0.7965 | 0.6384 | 0.7965 | 0.8925 |
| 0.0507 | 7.5817 | 3154 | 0.8167 | 0.6384 | 0.8167 | 0.9037 |
| 0.0507 | 7.5865 | 3156 | 0.8427 | 0.5985 | 0.8427 | 0.9180 |
| 0.0507 | 7.5913 | 3158 | 0.8743 | 0.6216 | 0.8743 | 0.9351 |
| 0.0507 | 7.5962 | 3160 | 0.9141 | 0.6216 | 0.9141 | 0.9561 |
| 0.0507 | 7.6010 | 3162 | 0.9356 | 0.6216 | 0.9356 | 0.9672 |
| 0.0507 | 7.6058 | 3164 | 0.9689 | 0.6014 | 0.9689 | 0.9843 |
| 0.0507 | 7.6106 | 3166 | 0.9946 | 0.6189 | 0.9946 | 0.9973 |
| 0.0507 | 7.6154 | 3168 | 0.9979 | 0.5874 | 0.9979 | 0.9989 |
| 0.0507 | 7.6202 | 3170 | 0.9825 | 0.5874 | 0.9825 | 0.9912 |
| 0.0507 | 7.625 | 3172 | 0.9530 | 0.6014 | 0.9530 | 0.9762 |
| 0.0507 | 7.6298 | 3174 | 0.9255 | 0.6216 | 0.9255 | 0.9621 |
| 0.0507 | 7.6346 | 3176 | 0.8925 | 0.6216 | 0.8925 | 0.9447 |
| 0.0507 | 7.6394 | 3178 | 0.8803 | 0.6216 | 0.8803 | 0.9382 |
| 0.0507 | 7.6442 | 3180 | 0.8883 | 0.6216 | 0.8883 | 0.9425 |
| 0.0507 | 7.6490 | 3182 | 0.9017 | 0.6216 | 0.9017 | 0.9496 |
| 0.0507 | 7.6538 | 3184 | 0.8999 | 0.6216 | 0.8999 | 0.9486 |
| 0.0507 | 7.6587 | 3186 | 0.8980 | 0.6216 | 0.8980 | 0.9477 |
| 0.0507 | 7.6635 | 3188 | 0.9075 | 0.6216 | 0.9075 | 0.9526 |
| 0.0507 | 7.6683 | 3190 | 0.9160 | 0.6014 | 0.9160 | 0.9571 |
| 0.0507 | 7.6731 | 3192 | 0.8995 | 0.6014 | 0.8995 | 0.9484 |
| 0.0507 | 7.6779 | 3194 | 0.8685 | 0.6599 | 0.8685 | 0.9319 |
| 0.0507 | 7.6827 | 3196 | 0.8605 | 0.6599 | 0.8605 | 0.9276 |
| 0.0507 | 7.6875 | 3198 | 0.8607 | 0.6362 | 0.8607 | 0.9277 |
| 0.0507 | 7.6923 | 3200 | 0.8755 | 0.6216 | 0.8755 | 0.9357 |
| 0.0507 | 7.6971 | 3202 | 0.9020 | 0.6216 | 0.9020 | 0.9498 |
| 0.0507 | 7.7019 | 3204 | 0.9344 | 0.6014 | 0.9344 | 0.9666 |
| 0.0507 | 7.7067 | 3206 | 0.9317 | 0.6014 | 0.9317 | 0.9652 |
| 0.0507 | 7.7115 | 3208 | 0.9308 | 0.6014 | 0.9308 | 0.9648 |
| 0.0507 | 7.7163 | 3210 | 0.9546 | 0.5978 | 0.9546 | 0.9770 |
| 0.0507 | 7.7212 | 3212 | 0.9827 | 0.6040 | 0.9827 | 0.9913 |
| 0.0507 | 7.7260 | 3214 | 0.9982 | 0.6040 | 0.9982 | 0.9991 |
| 0.0507 | 7.7308 | 3216 | 1.0045 | 0.6240 | 1.0045 | 1.0022 |
| 0.0507 | 7.7356 | 3218 | 0.9813 | 0.6173 | 0.9813 | 0.9906 |
| 0.0507 | 7.7404 | 3220 | 0.9791 | 0.6173 | 0.9791 | 0.9895 |
| 0.0507 | 7.7452 | 3222 | 0.9958 | 0.5921 | 0.9958 | 0.9979 |
| 0.0507 | 7.75 | 3224 | 1.0110 | 0.6006 | 1.0110 | 1.0055 |
| 0.0507 | 7.7548 | 3226 | 0.9969 | 0.5773 | 0.9969 | 0.9984 |
| 0.0507 | 7.7596 | 3228 | 0.9866 | 0.6173 | 0.9866 | 0.9933 |
| 0.0507 | 7.7644 | 3230 | 0.9730 | 0.6173 | 0.9730 | 0.9864 |
| 0.0507 | 7.7692 | 3232 | 0.9843 | 0.6173 | 0.9843 | 0.9921 |
| 0.0507 | 7.7740 | 3234 | 1.0009 | 0.6029 | 1.0009 | 1.0004 |
| 0.0507 | 7.7788 | 3236 | 1.0339 | 0.5817 | 1.0339 | 1.0168 |
| 0.0507 | 7.7837 | 3238 | 1.0661 | 0.5512 | 1.0661 | 1.0325 |
| 0.0507 | 7.7885 | 3240 | 1.0857 | 0.5512 | 1.0857 | 1.0420 |
| 0.0507 | 7.7933 | 3242 | 1.0873 | 0.5593 | 1.0873 | 1.0428 |
| 0.0507 | 7.7981 | 3244 | 1.0646 | 0.5887 | 1.0646 | 1.0318 |
| 0.0507 | 7.8029 | 3246 | 1.0182 | 0.5591 | 1.0182 | 1.0091 |
| 0.0507 | 7.8077 | 3248 | 0.9668 | 0.5921 | 0.9668 | 0.9832 |
| 0.0507 | 7.8125 | 3250 | 0.9279 | 0.5921 | 0.9279 | 0.9633 |
| 0.0507 | 7.8173 | 3252 | 0.9041 | 0.6111 | 0.9041 | 0.9509 |
| 0.0507 | 7.8221 | 3254 | 0.8955 | 0.6111 | 0.8955 | 0.9463 |
| 0.0507 | 7.8269 | 3256 | 0.9086 | 0.6111 | 0.9086 | 0.9532 |
| 0.0507 | 7.8317 | 3258 | 0.9242 | 0.5874 | 0.9242 | 0.9614 |
| 0.0507 | 7.8365 | 3260 | 0.9333 | 0.6287 | 0.9333 | 0.9661 |
| 0.0507 | 7.8413 | 3262 | 0.9572 | 0.6165 | 0.9572 | 0.9784 |
| 0.0507 | 7.8462 | 3264 | 0.9744 | 0.6165 | 0.9744 | 0.9871 |
| 0.0507 | 7.8510 | 3266 | 0.9825 | 0.6369 | 0.9825 | 0.9912 |
| 0.0507 | 7.8558 | 3268 | 0.9785 | 0.6369 | 0.9785 | 0.9892 |
| 0.0507 | 7.8606 | 3270 | 0.9714 | 0.6369 | 0.9714 | 0.9856 |
| 0.0507 | 7.8654 | 3272 | 0.9746 | 0.6369 | 0.9746 | 0.9872 |
| 0.0507 | 7.8702 | 3274 | 0.9595 | 0.6488 | 0.9595 | 0.9795 |
| 0.0507 | 7.875 | 3276 | 0.9216 | 0.6679 | 0.9216 | 0.9600 |
| 0.0507 | 7.8798 | 3278 | 0.8795 | 0.6679 | 0.8795 | 0.9378 |
| 0.0507 | 7.8846 | 3280 | 0.8480 | 0.6385 | 0.8480 | 0.9209 |
| 0.0507 | 7.8894 | 3282 | 0.8301 | 0.6599 | 0.8301 | 0.9111 |
| 0.0507 | 7.8942 | 3284 | 0.8185 | 0.6599 | 0.8185 | 0.9047 |
| 0.0507 | 7.8990 | 3286 | 0.8237 | 0.6599 | 0.8237 | 0.9076 |
| 0.0507 | 7.9038 | 3288 | 0.8422 | 0.6599 | 0.8422 | 0.9177 |
| 0.0507 | 7.9087 | 3290 | 0.8483 | 0.6599 | 0.8483 | 0.9210 |
| 0.0507 | 7.9135 | 3292 | 0.8600 | 0.6599 | 0.8600 | 0.9274 |
| 0.0507 | 7.9183 | 3294 | 0.8792 | 0.6548 | 0.8792 | 0.9377 |
| 0.0507 | 7.9231 | 3296 | 0.8890 | 0.6548 | 0.8890 | 0.9428 |
| 0.0507 | 7.9279 | 3298 | 0.8933 | 0.6548 | 0.8933 | 0.9452 |
| 0.0507 | 7.9327 | 3300 | 0.9171 | 0.6548 | 0.9171 | 0.9576 |
| 0.0507 | 7.9375 | 3302 | 0.9263 | 0.6548 | 0.9263 | 0.9624 |
| 0.0507 | 7.9423 | 3304 | 0.9182 | 0.6548 | 0.9182 | 0.9582 |
| 0.0507 | 7.9471 | 3306 | 0.9038 | 0.6548 | 0.9038 | 0.9507 |
| 0.0507 | 7.9519 | 3308 | 0.8735 | 0.6599 | 0.8735 | 0.9346 |
| 0.0507 | 7.9567 | 3310 | 0.8545 | 0.6599 | 0.8545 | 0.9244 |
| 0.0507 | 7.9615 | 3312 | 0.8529 | 0.6599 | 0.8529 | 0.9235 |
| 0.0507 | 7.9663 | 3314 | 0.8427 | 0.6599 | 0.8427 | 0.9180 |
| 0.0507 | 7.9712 | 3316 | 0.8148 | 0.6384 | 0.8148 | 0.9027 |
| 0.0507 | 7.9760 | 3318 | 0.7998 | 0.6384 | 0.7998 | 0.8943 |
| 0.0507 | 7.9808 | 3320 | 0.7891 | 0.6384 | 0.7891 | 0.8883 |
| 0.0507 | 7.9856 | 3322 | 0.7954 | 0.6384 | 0.7954 | 0.8919 |
| 0.0507 | 7.9904 | 3324 | 0.8200 | 0.6384 | 0.8200 | 0.9056 |
| 0.0507 | 7.9952 | 3326 | 0.8502 | 0.6384 | 0.8502 | 0.9221 |
| 0.0507 | 8.0 | 3328 | 0.8890 | 0.6599 | 0.8890 | 0.9428 |
| 0.0507 | 8.0048 | 3330 | 0.9296 | 0.6599 | 0.9296 | 0.9642 |
| 0.0507 | 8.0096 | 3332 | 0.9633 | 0.6053 | 0.9633 | 0.9815 |
| 0.0507 | 8.0144 | 3334 | 0.9986 | 0.5697 | 0.9986 | 0.9993 |
| 0.0507 | 8.0192 | 3336 | 1.0210 | 0.5697 | 1.0210 | 1.0104 |
| 0.0507 | 8.0240 | 3338 | 1.0219 | 0.5697 | 1.0219 | 1.0109 |
| 0.0507 | 8.0288 | 3340 | 0.9979 | 0.5697 | 0.9979 | 0.9989 |
| 0.0507 | 8.0337 | 3342 | 0.9613 | 0.6053 | 0.9613 | 0.9805 |
| 0.0507 | 8.0385 | 3344 | 0.9306 | 0.6599 | 0.9306 | 0.9647 |
| 0.0507 | 8.0433 | 3346 | 0.9059 | 0.6599 | 0.9059 | 0.9518 |
| 0.0507 | 8.0481 | 3348 | 0.8766 | 0.6599 | 0.8766 | 0.9363 |
| 0.0507 | 8.0529 | 3350 | 0.8743 | 0.6599 | 0.8743 | 0.9351 |
| 0.0507 | 8.0577 | 3352 | 0.8813 | 0.6599 | 0.8813 | 0.9388 |
| 0.0507 | 8.0625 | 3354 | 0.8773 | 0.6599 | 0.8773 | 0.9367 |
| 0.0507 | 8.0673 | 3356 | 0.8717 | 0.6599 | 0.8717 | 0.9336 |
| 0.0507 | 8.0721 | 3358 | 0.8587 | 0.6599 | 0.8587 | 0.9267 |
| 0.0507 | 8.0769 | 3360 | 0.8580 | 0.6599 | 0.8580 | 0.9263 |
| 0.0507 | 8.0817 | 3362 | 0.8683 | 0.6599 | 0.8683 | 0.9318 |
| 0.0507 | 8.0865 | 3364 | 0.8832 | 0.6599 | 0.8832 | 0.9398 |
| 0.0507 | 8.0913 | 3366 | 0.8976 | 0.6599 | 0.8976 | 0.9474 |
| 0.0507 | 8.0962 | 3368 | 0.9121 | 0.6599 | 0.9121 | 0.9551 |
| 0.0507 | 8.1010 | 3370 | 0.9187 | 0.6599 | 0.9187 | 0.9585 |
| 0.0507 | 8.1058 | 3372 | 0.9188 | 0.6195 | 0.9188 | 0.9585 |
| 0.0507 | 8.1106 | 3374 | 0.9371 | 0.6053 | 0.9371 | 0.9680 |
| 0.0507 | 8.1154 | 3376 | 0.9419 | 0.5798 | 0.9419 | 0.9705 |
| 0.0507 | 8.1202 | 3378 | 0.9455 | 0.5798 | 0.9455 | 0.9724 |
| 0.0507 | 8.125 | 3380 | 0.9248 | 0.5798 | 0.9248 | 0.9617 |
| 0.0507 | 8.1298 | 3382 | 0.9177 | 0.5798 | 0.9177 | 0.9580 |
| 0.0507 | 8.1346 | 3384 | 0.9237 | 0.5798 | 0.9237 | 0.9611 |
| 0.0507 | 8.1394 | 3386 | 0.9237 | 0.5798 | 0.9237 | 0.9611 |
| 0.0507 | 8.1442 | 3388 | 0.9298 | 0.5798 | 0.9298 | 0.9643 |
| 0.0507 | 8.1490 | 3390 | 0.9226 | 0.6053 | 0.9226 | 0.9605 |
| 0.0507 | 8.1538 | 3392 | 0.9049 | 0.6599 | 0.9049 | 0.9512 |
| 0.0507 | 8.1587 | 3394 | 0.9000 | 0.6599 | 0.9000 | 0.9487 |
| 0.0507 | 8.1635 | 3396 | 0.8820 | 0.6599 | 0.8820 | 0.9392 |
| 0.0507 | 8.1683 | 3398 | 0.8638 | 0.6599 | 0.8638 | 0.9294 |
| 0.0507 | 8.1731 | 3400 | 0.8571 | 0.6599 | 0.8571 | 0.9258 |
| 0.0507 | 8.1779 | 3402 | 0.8655 | 0.6599 | 0.8655 | 0.9303 |
| 0.0507 | 8.1827 | 3404 | 0.8757 | 0.6599 | 0.8757 | 0.9358 |
| 0.0507 | 8.1875 | 3406 | 0.9012 | 0.6599 | 0.9012 | 0.9493 |
| 0.0507 | 8.1923 | 3408 | 0.9091 | 0.6599 | 0.9091 | 0.9535 |
| 0.0507 | 8.1971 | 3410 | 0.8972 | 0.6599 | 0.8972 | 0.9472 |
| 0.0507 | 8.2019 | 3412 | 0.8857 | 0.6599 | 0.8857 | 0.9411 |
| 0.0507 | 8.2067 | 3414 | 0.8627 | 0.6599 | 0.8627 | 0.9288 |
| 0.0507 | 8.2115 | 3416 | 0.8532 | 0.6599 | 0.8532 | 0.9237 |
| 0.0507 | 8.2163 | 3418 | 0.8610 | 0.6599 | 0.8610 | 0.9279 |
| 0.0507 | 8.2212 | 3420 | 0.8863 | 0.6599 | 0.8863 | 0.9414 |
| 0.0507 | 8.2260 | 3422 | 0.9181 | 0.6341 | 0.9181 | 0.9582 |
| 0.0507 | 8.2308 | 3424 | 0.9446 | 0.6209 | 0.9446 | 0.9719 |
| 0.0507 | 8.2356 | 3426 | 0.9531 | 0.6209 | 0.9531 | 0.9763 |
| 0.0507 | 8.2404 | 3428 | 0.9413 | 0.6341 | 0.9413 | 0.9702 |
| 0.0507 | 8.2452 | 3430 | 0.9262 | 0.6341 | 0.9262 | 0.9624 |
| 0.0507 | 8.25 | 3432 | 0.9202 | 0.6341 | 0.9202 | 0.9593 |
| 0.0507 | 8.2548 | 3434 | 0.9067 | 0.6341 | 0.9067 | 0.9522 |
| 0.0507 | 8.2596 | 3436 | 0.8978 | 0.6385 | 0.8978 | 0.9475 |
| 0.0507 | 8.2644 | 3438 | 0.8858 | 0.6599 | 0.8858 | 0.9412 |
| 0.0507 | 8.2692 | 3440 | 0.8885 | 0.6385 | 0.8885 | 0.9426 |
| 0.0507 | 8.2740 | 3442 | 0.8836 | 0.6385 | 0.8836 | 0.9400 |
| 0.0507 | 8.2788 | 3444 | 0.8671 | 0.6385 | 0.8671 | 0.9312 |
| 0.0507 | 8.2837 | 3446 | 0.8554 | 0.6599 | 0.8554 | 0.9249 |
| 0.0507 | 8.2885 | 3448 | 0.8608 | 0.6599 | 0.8608 | 0.9278 |
| 0.0507 | 8.2933 | 3450 | 0.8670 | 0.6599 | 0.8670 | 0.9311 |
| 0.0507 | 8.2981 | 3452 | 0.8661 | 0.6599 | 0.8661 | 0.9307 |
| 0.0507 | 8.3029 | 3454 | 0.8703 | 0.6599 | 0.8703 | 0.9329 |
| 0.0507 | 8.3077 | 3456 | 0.8735 | 0.6599 | 0.8735 | 0.9346 |
| 0.0507 | 8.3125 | 3458 | 0.8760 | 0.6599 | 0.8760 | 0.9359 |
| 0.0507 | 8.3173 | 3460 | 0.8841 | 0.6599 | 0.8841 | 0.9402 |
| 0.0507 | 8.3221 | 3462 | 0.8709 | 0.6599 | 0.8709 | 0.9332 |
| 0.0507 | 8.3269 | 3464 | 0.8513 | 0.6599 | 0.8513 | 0.9227 |
| 0.0507 | 8.3317 | 3466 | 0.8469 | 0.6599 | 0.8469 | 0.9203 |
| 0.0507 | 8.3365 | 3468 | 0.8525 | 0.6599 | 0.8525 | 0.9233 |
| 0.0507 | 8.3413 | 3470 | 0.8605 | 0.6599 | 0.8605 | 0.9277 |
| 0.0507 | 8.3462 | 3472 | 0.8601 | 0.6599 | 0.8601 | 0.9274 |
| 0.0507 | 8.3510 | 3474 | 0.8673 | 0.6599 | 0.8673 | 0.9313 |
| 0.0507 | 8.3558 | 3476 | 0.8892 | 0.6599 | 0.8892 | 0.9430 |
| 0.0507 | 8.3606 | 3478 | 0.9169 | 0.6599 | 0.9169 | 0.9576 |
| 0.0507 | 8.3654 | 3480 | 0.9184 | 0.6385 | 0.9184 | 0.9583 |
| 0.0507 | 8.3702 | 3482 | 0.9307 | 0.6385 | 0.9307 | 0.9647 |
| 0.0507 | 8.375 | 3484 | 0.9371 | 0.6341 | 0.9371 | 0.9680 |
| 0.0507 | 8.3798 | 3486 | 0.9247 | 0.6341 | 0.9247 | 0.9616 |
| 0.0507 | 8.3846 | 3488 | 0.9067 | 0.6599 | 0.9067 | 0.9522 |
| 0.0507 | 8.3894 | 3490 | 0.8944 | 0.6599 | 0.8944 | 0.9457 |
| 0.0507 | 8.3942 | 3492 | 0.8759 | 0.6599 | 0.8759 | 0.9359 |
| 0.0507 | 8.3990 | 3494 | 0.8572 | 0.6599 | 0.8572 | 0.9259 |
| 0.0507 | 8.4038 | 3496 | 0.8399 | 0.6599 | 0.8399 | 0.9164 |
| 0.0507 | 8.4087 | 3498 | 0.8392 | 0.6599 | 0.8392 | 0.9161 |
| 0.0474 | 8.4135 | 3500 | 0.8480 | 0.6599 | 0.8480 | 0.9209 |
| 0.0474 | 8.4183 | 3502 | 0.8573 | 0.6599 | 0.8573 | 0.9259 |
| 0.0474 | 8.4231 | 3504 | 0.8675 | 0.6599 | 0.8675 | 0.9314 |
| 0.0474 | 8.4279 | 3506 | 0.8741 | 0.6599 | 0.8741 | 0.9349 |
| 0.0474 | 8.4327 | 3508 | 0.8849 | 0.6548 | 0.8849 | 0.9407 |
| 0.0474 | 8.4375 | 3510 | 0.8931 | 0.6548 | 0.8931 | 0.9450 |
| 0.0474 | 8.4423 | 3512 | 0.8975 | 0.6548 | 0.8975 | 0.9473 |
| 0.0474 | 8.4471 | 3514 | 0.8973 | 0.6548 | 0.8973 | 0.9473 |
| 0.0474 | 8.4519 | 3516 | 0.8833 | 0.6548 | 0.8833 | 0.9398 |
| 0.0474 | 8.4567 | 3518 | 0.8746 | 0.6548 | 0.8746 | 0.9352 |
| 0.0474 | 8.4615 | 3520 | 0.8725 | 0.6548 | 0.8725 | 0.9341 |
| 0.0474 | 8.4663 | 3522 | 0.8896 | 0.6548 | 0.8896 | 0.9432 |
| 0.0474 | 8.4712 | 3524 | 0.9066 | 0.6341 | 0.9066 | 0.9522 |
| 0.0474 | 8.4760 | 3526 | 0.9053 | 0.6341 | 0.9053 | 0.9514 |
| 0.0474 | 8.4808 | 3528 | 0.8923 | 0.6548 | 0.8923 | 0.9446 |
| 0.0474 | 8.4856 | 3530 | 0.8816 | 0.6548 | 0.8816 | 0.9389 |
| 0.0474 | 8.4904 | 3532 | 0.8804 | 0.6548 | 0.8804 | 0.9383 |
| 0.0474 | 8.4952 | 3534 | 0.8694 | 0.6548 | 0.8694 | 0.9324 |
| 0.0474 | 8.5 | 3536 | 0.8527 | 0.6548 | 0.8527 | 0.9234 |
| 0.0474 | 8.5048 | 3538 | 0.8461 | 0.6548 | 0.8461 | 0.9198 |
| 0.0474 | 8.5096 | 3540 | 0.8452 | 0.6599 | 0.8452 | 0.9193 |
| 0.0474 | 8.5144 | 3542 | 0.8510 | 0.6548 | 0.8510 | 0.9225 |
| 0.0474 | 8.5192 | 3544 | 0.8616 | 0.6548 | 0.8616 | 0.9282 |
| 0.0474 | 8.5240 | 3546 | 0.8686 | 0.6548 | 0.8686 | 0.9320 |
| 0.0474 | 8.5288 | 3548 | 0.8691 | 0.6548 | 0.8691 | 0.9323 |
| 0.0474 | 8.5337 | 3550 | 0.8746 | 0.6548 | 0.8746 | 0.9352 |
| 0.0474 | 8.5385 | 3552 | 0.8714 | 0.6548 | 0.8714 | 0.9335 |
| 0.0474 | 8.5433 | 3554 | 0.8612 | 0.6548 | 0.8612 | 0.9280 |
| 0.0474 | 8.5481 | 3556 | 0.8507 | 0.6599 | 0.8507 | 0.9223 |
| 0.0474 | 8.5529 | 3558 | 0.8477 | 0.6548 | 0.8477 | 0.9207 |
| 0.0474 | 8.5577 | 3560 | 0.8416 | 0.6599 | 0.8416 | 0.9174 |
| 0.0474 | 8.5625 | 3562 | 0.8296 | 0.6599 | 0.8296 | 0.9108 |
| 0.0474 | 8.5673 | 3564 | 0.8245 | 0.6599 | 0.8245 | 0.9080 |
| 0.0474 | 8.5721 | 3566 | 0.8172 | 0.6599 | 0.8172 | 0.9040 |
| 0.0474 | 8.5769 | 3568 | 0.7990 | 0.6599 | 0.7990 | 0.8939 |
| 0.0474 | 8.5817 | 3570 | 0.7961 | 0.6599 | 0.7961 | 0.8923 |
| 0.0474 | 8.5865 | 3572 | 0.7962 | 0.6599 | 0.7962 | 0.8923 |
| 0.0474 | 8.5913 | 3574 | 0.7840 | 0.6599 | 0.7840 | 0.8855 |
| 0.0474 | 8.5962 | 3576 | 0.7847 | 0.6599 | 0.7847 | 0.8858 |
| 0.0474 | 8.6010 | 3578 | 0.7861 | 0.6599 | 0.7861 | 0.8866 |
| 0.0474 | 8.6058 | 3580 | 0.8010 | 0.6599 | 0.8010 | 0.8950 |
| 0.0474 | 8.6106 | 3582 | 0.8116 | 0.6599 | 0.8116 | 0.9009 |
| 0.0474 | 8.6154 | 3584 | 0.8150 | 0.6599 | 0.8150 | 0.9028 |
| 0.0474 | 8.6202 | 3586 | 0.8220 | 0.6599 | 0.8220 | 0.9066 |
| 0.0474 | 8.625 | 3588 | 0.8266 | 0.6548 | 0.8266 | 0.9092 |
| 0.0474 | 8.6298 | 3590 | 0.8466 | 0.6548 | 0.8466 | 0.9201 |
| 0.0474 | 8.6346 | 3592 | 0.8641 | 0.6341 | 0.8641 | 0.9295 |
| 0.0474 | 8.6394 | 3594 | 0.8762 | 0.6341 | 0.8762 | 0.9360 |
| 0.0474 | 8.6442 | 3596 | 0.8792 | 0.6341 | 0.8792 | 0.9377 |
| 0.0474 | 8.6490 | 3598 | 0.8721 | 0.6341 | 0.8721 | 0.9339 |
| 0.0474 | 8.6538 | 3600 | 0.8585 | 0.6548 | 0.8585 | 0.9266 |
| 0.0474 | 8.6587 | 3602 | 0.8541 | 0.6548 | 0.8541 | 0.9242 |
| 0.0474 | 8.6635 | 3604 | 0.8600 | 0.6548 | 0.8600 | 0.9274 |
| 0.0474 | 8.6683 | 3606 | 0.8706 | 0.6548 | 0.8706 | 0.9331 |
| 0.0474 | 8.6731 | 3608 | 0.8789 | 0.6548 | 0.8789 | 0.9375 |
| 0.0474 | 8.6779 | 3610 | 0.8930 | 0.6548 | 0.8930 | 0.9450 |
| 0.0474 | 8.6827 | 3612 | 0.8968 | 0.6548 | 0.8968 | 0.9470 |
| 0.0474 | 8.6875 | 3614 | 0.9019 | 0.6548 | 0.9019 | 0.9497 |
| 0.0474 | 8.6923 | 3616 | 0.9067 | 0.6548 | 0.9067 | 0.9522 |
| 0.0474 | 8.6971 | 3618 | 0.9077 | 0.6548 | 0.9077 | 0.9527 |
| 0.0474 | 8.7019 | 3620 | 0.9084 | 0.6548 | 0.9084 | 0.9531 |
| 0.0474 | 8.7067 | 3622 | 0.9056 | 0.6548 | 0.9056 | 0.9516 |
| 0.0474 | 8.7115 | 3624 | 0.9076 | 0.6548 | 0.9076 | 0.9527 |
| 0.0474 | 8.7163 | 3626 | 0.8996 | 0.6548 | 0.8996 | 0.9485 |
| 0.0474 | 8.7212 | 3628 | 0.8936 | 0.6548 | 0.8936 | 0.9453 |
| 0.0474 | 8.7260 | 3630 | 0.8839 | 0.6548 | 0.8839 | 0.9402 |
| 0.0474 | 8.7308 | 3632 | 0.8881 | 0.6548 | 0.8881 | 0.9424 |
| 0.0474 | 8.7356 | 3634 | 0.8875 | 0.6548 | 0.8875 | 0.9421 |
| 0.0474 | 8.7404 | 3636 | 0.8875 | 0.6548 | 0.8875 | 0.9421 |
| 0.0474 | 8.7452 | 3638 | 0.8839 | 0.6548 | 0.8839 | 0.9401 |
| 0.0474 | 8.75 | 3640 | 0.8877 | 0.6548 | 0.8877 | 0.9422 |
| 0.0474 | 8.7548 | 3642 | 0.8893 | 0.6548 | 0.8893 | 0.9430 |
| 0.0474 | 8.7596 | 3644 | 0.8904 | 0.6548 | 0.8904 | 0.9436 |
| 0.0474 | 8.7644 | 3646 | 0.8993 | 0.6548 | 0.8993 | 0.9483 |
| 0.0474 | 8.7692 | 3648 | 0.9032 | 0.6548 | 0.9032 | 0.9504 |
| 0.0474 | 8.7740 | 3650 | 0.9075 | 0.6548 | 0.9075 | 0.9526 |
| 0.0474 | 8.7788 | 3652 | 0.9090 | 0.6548 | 0.9090 | 0.9534 |
| 0.0474 | 8.7837 | 3654 | 0.8967 | 0.6548 | 0.8967 | 0.9469 |
| 0.0474 | 8.7885 | 3656 | 0.8740 | 0.6548 | 0.8740 | 0.9349 |
| 0.0474 | 8.7933 | 3658 | 0.8597 | 0.6548 | 0.8597 | 0.9272 |
| 0.0474 | 8.7981 | 3660 | 0.8579 | 0.6548 | 0.8579 | 0.9262 |
| 0.0474 | 8.8029 | 3662 | 0.8538 | 0.6599 | 0.8538 | 0.9240 |
| 0.0474 | 8.8077 | 3664 | 0.8633 | 0.6548 | 0.8633 | 0.9291 |
| 0.0474 | 8.8125 | 3666 | 0.8801 | 0.6548 | 0.8801 | 0.9382 |
| 0.0474 | 8.8173 | 3668 | 0.8971 | 0.6548 | 0.8971 | 0.9472 |
| 0.0474 | 8.8221 | 3670 | 0.9086 | 0.6548 | 0.9086 | 0.9532 |
| 0.0474 | 8.8269 | 3672 | 0.9148 | 0.6548 | 0.9148 | 0.9565 |
| 0.0474 | 8.8317 | 3674 | 0.9260 | 0.6548 | 0.9260 | 0.9623 |
| 0.0474 | 8.8365 | 3676 | 0.9185 | 0.6548 | 0.9185 | 0.9584 |
| 0.0474 | 8.8413 | 3678 | 0.9048 | 0.6548 | 0.9048 | 0.9512 |
| 0.0474 | 8.8462 | 3680 | 0.9036 | 0.6548 | 0.9036 | 0.9506 |
| 0.0474 | 8.8510 | 3682 | 0.9094 | 0.6548 | 0.9094 | 0.9536 |
| 0.0474 | 8.8558 | 3684 | 0.9056 | 0.6548 | 0.9056 | 0.9516 |
| 0.0474 | 8.8606 | 3686 | 0.8974 | 0.6548 | 0.8974 | 0.9473 |
| 0.0474 | 8.8654 | 3688 | 0.8830 | 0.6548 | 0.8830 | 0.9397 |
| 0.0474 | 8.8702 | 3690 | 0.8643 | 0.6599 | 0.8643 | 0.9297 |
| 0.0474 | 8.875 | 3692 | 0.8478 | 0.6599 | 0.8478 | 0.9208 |
| 0.0474 | 8.8798 | 3694 | 0.8427 | 0.6599 | 0.8427 | 0.9180 |
| 0.0474 | 8.8846 | 3696 | 0.8488 | 0.6599 | 0.8488 | 0.9213 |
| 0.0474 | 8.8894 | 3698 | 0.8543 | 0.6599 | 0.8543 | 0.9243 |
| 0.0474 | 8.8942 | 3700 | 0.8542 | 0.6599 | 0.8542 | 0.9242 |
| 0.0474 | 8.8990 | 3702 | 0.8518 | 0.6599 | 0.8518 | 0.9229 |
| 0.0474 | 8.9038 | 3704 | 0.8589 | 0.6599 | 0.8589 | 0.9268 |
| 0.0474 | 8.9087 | 3706 | 0.8649 | 0.6599 | 0.8649 | 0.9300 |
| 0.0474 | 8.9135 | 3708 | 0.8758 | 0.6599 | 0.8758 | 0.9359 |
| 0.0474 | 8.9183 | 3710 | 0.8992 | 0.6341 | 0.8992 | 0.9482 |
| 0.0474 | 8.9231 | 3712 | 0.9192 | 0.6341 | 0.9192 | 0.9588 |
| 0.0474 | 8.9279 | 3714 | 0.9487 | 0.6209 | 0.9487 | 0.9740 |
| 0.0474 | 8.9327 | 3716 | 0.9747 | 0.6209 | 0.9747 | 0.9872 |
| 0.0474 | 8.9375 | 3718 | 1.0011 | 0.6185 | 1.0011 | 1.0006 |
| 0.0474 | 8.9423 | 3720 | 1.0151 | 0.6232 | 1.0151 | 1.0075 |
| 0.0474 | 8.9471 | 3722 | 1.0220 | 0.6232 | 1.0220 | 1.0109 |
| 0.0474 | 8.9519 | 3724 | 1.0116 | 0.6185 | 1.0116 | 1.0058 |
| 0.0474 | 8.9567 | 3726 | 0.9922 | 0.5831 | 0.9922 | 0.9961 |
| 0.0474 | 8.9615 | 3728 | 0.9675 | 0.5831 | 0.9675 | 0.9836 |
| 0.0474 | 8.9663 | 3730 | 0.9437 | 0.6410 | 0.9437 | 0.9714 |
| 0.0474 | 8.9712 | 3732 | 0.9330 | 0.6410 | 0.9330 | 0.9659 |
| 0.0474 | 8.9760 | 3734 | 0.9265 | 0.6410 | 0.9265 | 0.9625 |
| 0.0474 | 8.9808 | 3736 | 0.9279 | 0.6410 | 0.9279 | 0.9633 |
| 0.0474 | 8.9856 | 3738 | 0.9273 | 0.6410 | 0.9273 | 0.9630 |
| 0.0474 | 8.9904 | 3740 | 0.9326 | 0.6173 | 0.9326 | 0.9657 |
| 0.0474 | 8.9952 | 3742 | 0.9393 | 0.6410 | 0.9393 | 0.9692 |
| 0.0474 | 9.0 | 3744 | 0.9459 | 0.6410 | 0.9459 | 0.9726 |
| 0.0474 | 9.0048 | 3746 | 0.9432 | 0.6410 | 0.9432 | 0.9712 |
| 0.0474 | 9.0096 | 3748 | 0.9412 | 0.6209 | 0.9412 | 0.9701 |
| 0.0474 | 9.0144 | 3750 | 0.9312 | 0.6209 | 0.9312 | 0.9650 |
| 0.0474 | 9.0192 | 3752 | 0.9109 | 0.6341 | 0.9109 | 0.9544 |
| 0.0474 | 9.0240 | 3754 | 0.8844 | 0.6599 | 0.8844 | 0.9404 |
| 0.0474 | 9.0288 | 3756 | 0.8568 | 0.6599 | 0.8568 | 0.9257 |
| 0.0474 | 9.0337 | 3758 | 0.8431 | 0.6599 | 0.8431 | 0.9182 |
| 0.0474 | 9.0385 | 3760 | 0.8423 | 0.6599 | 0.8423 | 0.9178 |
| 0.0474 | 9.0433 | 3762 | 0.8439 | 0.6599 | 0.8439 | 0.9187 |
| 0.0474 | 9.0481 | 3764 | 0.8460 | 0.6599 | 0.8460 | 0.9198 |
| 0.0474 | 9.0529 | 3766 | 0.8499 | 0.6599 | 0.8499 | 0.9219 |
| 0.0474 | 9.0577 | 3768 | 0.8564 | 0.6599 | 0.8564 | 0.9254 |
| 0.0474 | 9.0625 | 3770 | 0.8588 | 0.6599 | 0.8588 | 0.9267 |
| 0.0474 | 9.0673 | 3772 | 0.8649 | 0.6599 | 0.8649 | 0.9300 |
| 0.0474 | 9.0721 | 3774 | 0.8683 | 0.6599 | 0.8683 | 0.9318 |
| 0.0474 | 9.0769 | 3776 | 0.8699 | 0.6599 | 0.8699 | 0.9327 |
| 0.0474 | 9.0817 | 3778 | 0.8787 | 0.6599 | 0.8787 | 0.9374 |
| 0.0474 | 9.0865 | 3780 | 0.8973 | 0.6599 | 0.8973 | 0.9473 |
| 0.0474 | 9.0913 | 3782 | 0.9201 | 0.6548 | 0.9201 | 0.9592 |
| 0.0474 | 9.0962 | 3784 | 0.9430 | 0.6410 | 0.9430 | 0.9711 |
| 0.0474 | 9.1010 | 3786 | 0.9554 | 0.6014 | 0.9554 | 0.9774 |
| 0.0474 | 9.1058 | 3788 | 0.9544 | 0.6014 | 0.9544 | 0.9769 |
| 0.0474 | 9.1106 | 3790 | 0.9459 | 0.6014 | 0.9459 | 0.9726 |
| 0.0474 | 9.1154 | 3792 | 0.9384 | 0.6014 | 0.9384 | 0.9687 |
| 0.0474 | 9.1202 | 3794 | 0.9260 | 0.6410 | 0.9260 | 0.9623 |
| 0.0474 | 9.125 | 3796 | 0.9137 | 0.6458 | 0.9137 | 0.9559 |
| 0.0474 | 9.1298 | 3798 | 0.9049 | 0.6599 | 0.9049 | 0.9513 |
| 0.0474 | 9.1346 | 3800 | 0.8966 | 0.6599 | 0.8966 | 0.9469 |
| 0.0474 | 9.1394 | 3802 | 0.8937 | 0.6599 | 0.8937 | 0.9453 |
| 0.0474 | 9.1442 | 3804 | 0.8956 | 0.6599 | 0.8956 | 0.9464 |
| 0.0474 | 9.1490 | 3806 | 0.9001 | 0.6599 | 0.9001 | 0.9487 |
| 0.0474 | 9.1538 | 3808 | 0.9039 | 0.6458 | 0.9039 | 0.9507 |
| 0.0474 | 9.1587 | 3810 | 0.9148 | 0.6458 | 0.9148 | 0.9565 |
| 0.0474 | 9.1635 | 3812 | 0.9304 | 0.6410 | 0.9304 | 0.9646 |
| 0.0474 | 9.1683 | 3814 | 0.9397 | 0.6173 | 0.9397 | 0.9694 |
| 0.0474 | 9.1731 | 3816 | 0.9427 | 0.6173 | 0.9427 | 0.9709 |
| 0.0474 | 9.1779 | 3818 | 0.9372 | 0.6173 | 0.9372 | 0.9681 |
| 0.0474 | 9.1827 | 3820 | 0.9239 | 0.6173 | 0.9239 | 0.9612 |
| 0.0474 | 9.1875 | 3822 | 0.9104 | 0.6458 | 0.9104 | 0.9542 |
| 0.0474 | 9.1923 | 3824 | 0.8995 | 0.6458 | 0.8995 | 0.9484 |
| 0.0474 | 9.1971 | 3826 | 0.8941 | 0.6458 | 0.8941 | 0.9456 |
| 0.0474 | 9.2019 | 3828 | 0.8885 | 0.6384 | 0.8885 | 0.9426 |
| 0.0474 | 9.2067 | 3830 | 0.8893 | 0.6599 | 0.8893 | 0.9430 |
| 0.0474 | 9.2115 | 3832 | 0.8998 | 0.6458 | 0.8998 | 0.9486 |
| 0.0474 | 9.2163 | 3834 | 0.9178 | 0.6173 | 0.9178 | 0.9580 |
| 0.0474 | 9.2212 | 3836 | 0.9383 | 0.6173 | 0.9383 | 0.9687 |
| 0.0474 | 9.2260 | 3838 | 0.9532 | 0.6173 | 0.9532 | 0.9763 |
| 0.0474 | 9.2308 | 3840 | 0.9598 | 0.6173 | 0.9598 | 0.9797 |
| 0.0474 | 9.2356 | 3842 | 0.9634 | 0.6173 | 0.9634 | 0.9815 |
| 0.0474 | 9.2404 | 3844 | 0.9578 | 0.6173 | 0.9578 | 0.9787 |
| 0.0474 | 9.2452 | 3846 | 0.9541 | 0.6173 | 0.9541 | 0.9768 |
| 0.0474 | 9.25 | 3848 | 0.9589 | 0.6173 | 0.9589 | 0.9792 |
| 0.0474 | 9.2548 | 3850 | 0.9611 | 0.6173 | 0.9611 | 0.9804 |
| 0.0474 | 9.2596 | 3852 | 0.9673 | 0.5978 | 0.9673 | 0.9835 |
| 0.0474 | 9.2644 | 3854 | 0.9721 | 0.5978 | 0.9721 | 0.9859 |
| 0.0474 | 9.2692 | 3856 | 0.9665 | 0.5978 | 0.9665 | 0.9831 |
| 0.0474 | 9.2740 | 3858 | 0.9647 | 0.5978 | 0.9647 | 0.9822 |
| 0.0474 | 9.2788 | 3860 | 0.9667 | 0.6173 | 0.9667 | 0.9832 |
| 0.0474 | 9.2837 | 3862 | 0.9588 | 0.6173 | 0.9588 | 0.9792 |
| 0.0474 | 9.2885 | 3864 | 0.9475 | 0.6173 | 0.9475 | 0.9734 |
| 0.0474 | 9.2933 | 3866 | 0.9388 | 0.6173 | 0.9388 | 0.9689 |
| 0.0474 | 9.2981 | 3868 | 0.9269 | 0.6173 | 0.9269 | 0.9628 |
| 0.0474 | 9.3029 | 3870 | 0.9190 | 0.6410 | 0.9190 | 0.9586 |
| 0.0474 | 9.3077 | 3872 | 0.9151 | 0.6410 | 0.9151 | 0.9566 |
| 0.0474 | 9.3125 | 3874 | 0.9126 | 0.6410 | 0.9126 | 0.9553 |
| 0.0474 | 9.3173 | 3876 | 0.9150 | 0.6173 | 0.9150 | 0.9566 |
| 0.0474 | 9.3221 | 3878 | 0.9159 | 0.6410 | 0.9159 | 0.9570 |
| 0.0474 | 9.3269 | 3880 | 0.9188 | 0.6410 | 0.9188 | 0.9585 |
| 0.0474 | 9.3317 | 3882 | 0.9219 | 0.6410 | 0.9219 | 0.9602 |
| 0.0474 | 9.3365 | 3884 | 0.9252 | 0.6410 | 0.9252 | 0.9619 |
| 0.0474 | 9.3413 | 3886 | 0.9242 | 0.6410 | 0.9242 | 0.9613 |
| 0.0474 | 9.3462 | 3888 | 0.9205 | 0.6410 | 0.9205 | 0.9594 |
| 0.0474 | 9.3510 | 3890 | 0.9109 | 0.6410 | 0.9109 | 0.9544 |
| 0.0474 | 9.3558 | 3892 | 0.9013 | 0.6410 | 0.9013 | 0.9493 |
| 0.0474 | 9.3606 | 3894 | 0.8984 | 0.6410 | 0.8984 | 0.9478 |
| 0.0474 | 9.3654 | 3896 | 0.9005 | 0.6410 | 0.9005 | 0.9489 |
| 0.0474 | 9.3702 | 3898 | 0.9044 | 0.6410 | 0.9044 | 0.9510 |
| 0.0474 | 9.375 | 3900 | 0.9109 | 0.6410 | 0.9109 | 0.9544 |
| 0.0474 | 9.3798 | 3902 | 0.9221 | 0.6410 | 0.9221 | 0.9602 |
| 0.0474 | 9.3846 | 3904 | 0.9355 | 0.6410 | 0.9355 | 0.9672 |
| 0.0474 | 9.3894 | 3906 | 0.9489 | 0.6173 | 0.9489 | 0.9741 |
| 0.0474 | 9.3942 | 3908 | 0.9612 | 0.6173 | 0.9612 | 0.9804 |
| 0.0474 | 9.3990 | 3910 | 0.9753 | 0.6173 | 0.9753 | 0.9876 |
| 0.0474 | 9.4038 | 3912 | 0.9863 | 0.5619 | 0.9863 | 0.9931 |
| 0.0474 | 9.4087 | 3914 | 0.9908 | 0.5619 | 0.9908 | 0.9954 |
| 0.0474 | 9.4135 | 3916 | 0.9964 | 0.5619 | 0.9964 | 0.9982 |
| 0.0474 | 9.4183 | 3918 | 0.9954 | 0.5619 | 0.9954 | 0.9977 |
| 0.0474 | 9.4231 | 3920 | 0.9914 | 0.5619 | 0.9914 | 0.9957 |
| 0.0474 | 9.4279 | 3922 | 0.9907 | 0.5619 | 0.9907 | 0.9954 |
| 0.0474 | 9.4327 | 3924 | 0.9821 | 0.6029 | 0.9821 | 0.9910 |
| 0.0474 | 9.4375 | 3926 | 0.9727 | 0.6173 | 0.9727 | 0.9862 |
| 0.0474 | 9.4423 | 3928 | 0.9643 | 0.6173 | 0.9643 | 0.9820 |
| 0.0474 | 9.4471 | 3930 | 0.9544 | 0.6410 | 0.9544 | 0.9770 |
| 0.0474 | 9.4519 | 3932 | 0.9439 | 0.6410 | 0.9439 | 0.9715 |
| 0.0474 | 9.4567 | 3934 | 0.9325 | 0.6410 | 0.9325 | 0.9657 |
| 0.0474 | 9.4615 | 3936 | 0.9227 | 0.6410 | 0.9227 | 0.9606 |
| 0.0474 | 9.4663 | 3938 | 0.9170 | 0.6410 | 0.9170 | 0.9576 |
| 0.0474 | 9.4712 | 3940 | 0.9143 | 0.6410 | 0.9143 | 0.9562 |
| 0.0474 | 9.4760 | 3942 | 0.9112 | 0.6410 | 0.9112 | 0.9546 |
| 0.0474 | 9.4808 | 3944 | 0.9106 | 0.6410 | 0.9106 | 0.9543 |
| 0.0474 | 9.4856 | 3946 | 0.9073 | 0.6410 | 0.9073 | 0.9525 |
| 0.0474 | 9.4904 | 3948 | 0.9022 | 0.6410 | 0.9022 | 0.9499 |
| 0.0474 | 9.4952 | 3950 | 0.8950 | 0.6599 | 0.8950 | 0.9460 |
| 0.0474 | 9.5 | 3952 | 0.8880 | 0.6599 | 0.8880 | 0.9423 |
| 0.0474 | 9.5048 | 3954 | 0.8790 | 0.6599 | 0.8790 | 0.9375 |
| 0.0474 | 9.5096 | 3956 | 0.8706 | 0.6599 | 0.8706 | 0.9330 |
| 0.0474 | 9.5144 | 3958 | 0.8659 | 0.6599 | 0.8659 | 0.9306 |
| 0.0474 | 9.5192 | 3960 | 0.8633 | 0.6599 | 0.8633 | 0.9292 |
| 0.0474 | 9.5240 | 3962 | 0.8629 | 0.6599 | 0.8629 | 0.9289 |
| 0.0474 | 9.5288 | 3964 | 0.8672 | 0.6599 | 0.8672 | 0.9312 |
| 0.0474 | 9.5337 | 3966 | 0.8685 | 0.6599 | 0.8685 | 0.9319 |
| 0.0474 | 9.5385 | 3968 | 0.8671 | 0.6599 | 0.8671 | 0.9312 |
| 0.0474 | 9.5433 | 3970 | 0.8699 | 0.6599 | 0.8699 | 0.9327 |
| 0.0474 | 9.5481 | 3972 | 0.8747 | 0.6599 | 0.8747 | 0.9353 |
| 0.0474 | 9.5529 | 3974 | 0.8817 | 0.6599 | 0.8817 | 0.9390 |
| 0.0474 | 9.5577 | 3976 | 0.8876 | 0.6599 | 0.8876 | 0.9421 |
| 0.0474 | 9.5625 | 3978 | 0.8952 | 0.6458 | 0.8952 | 0.9461 |
| 0.0474 | 9.5673 | 3980 | 0.9036 | 0.6458 | 0.9036 | 0.9506 |
| 0.0474 | 9.5721 | 3982 | 0.9107 | 0.6410 | 0.9107 | 0.9543 |
| 0.0474 | 9.5769 | 3984 | 0.9197 | 0.6410 | 0.9197 | 0.9590 |
| 0.0474 | 9.5817 | 3986 | 0.9257 | 0.6410 | 0.9257 | 0.9621 |
| 0.0474 | 9.5865 | 3988 | 0.9317 | 0.6410 | 0.9317 | 0.9652 |
| 0.0474 | 9.5913 | 3990 | 0.9347 | 0.6410 | 0.9347 | 0.9668 |
| 0.0474 | 9.5962 | 3992 | 0.9388 | 0.6410 | 0.9388 | 0.9689 |
| 0.0474 | 9.6010 | 3994 | 0.9411 | 0.6410 | 0.9411 | 0.9701 |
| 0.0474 | 9.6058 | 3996 | 0.9406 | 0.6410 | 0.9406 | 0.9698 |
| 0.0474 | 9.6106 | 3998 | 0.9413 | 0.6410 | 0.9413 | 0.9702 |
| 0.038 | 9.6154 | 4000 | 0.9446 | 0.6410 | 0.9446 | 0.9719 |
| 0.038 | 9.6202 | 4002 | 0.9500 | 0.6410 | 0.9500 | 0.9747 |
| 0.038 | 9.625 | 4004 | 0.9507 | 0.6410 | 0.9507 | 0.9751 |
| 0.038 | 9.6298 | 4006 | 0.9495 | 0.6410 | 0.9495 | 0.9744 |
| 0.038 | 9.6346 | 4008 | 0.9437 | 0.6410 | 0.9437 | 0.9714 |
| 0.038 | 9.6394 | 4010 | 0.9338 | 0.6410 | 0.9338 | 0.9664 |
| 0.038 | 9.6442 | 4012 | 0.9236 | 0.6410 | 0.9236 | 0.9610 |
| 0.038 | 9.6490 | 4014 | 0.9143 | 0.6458 | 0.9143 | 0.9562 |
| 0.038 | 9.6538 | 4016 | 0.9051 | 0.6458 | 0.9051 | 0.9514 |
| 0.038 | 9.6587 | 4018 | 0.9003 | 0.6458 | 0.9003 | 0.9488 |
| 0.038 | 9.6635 | 4020 | 0.8974 | 0.6458 | 0.8974 | 0.9473 |
| 0.038 | 9.6683 | 4022 | 0.8949 | 0.6458 | 0.8949 | 0.9460 |
| 0.038 | 9.6731 | 4024 | 0.8956 | 0.6458 | 0.8956 | 0.9464 |
| 0.038 | 9.6779 | 4026 | 0.8962 | 0.6458 | 0.8962 | 0.9467 |
| 0.038 | 9.6827 | 4028 | 0.8976 | 0.6458 | 0.8976 | 0.9474 |
| 0.038 | 9.6875 | 4030 | 0.9004 | 0.6458 | 0.9004 | 0.9489 |
| 0.038 | 9.6923 | 4032 | 0.9047 | 0.6458 | 0.9047 | 0.9511 |
| 0.038 | 9.6971 | 4034 | 0.9092 | 0.6410 | 0.9092 | 0.9535 |
| 0.038 | 9.7019 | 4036 | 0.9118 | 0.6410 | 0.9118 | 0.9549 |
| 0.038 | 9.7067 | 4038 | 0.9168 | 0.6410 | 0.9168 | 0.9575 |
| 0.038 | 9.7115 | 4040 | 0.9191 | 0.6410 | 0.9191 | 0.9587 |
| 0.038 | 9.7163 | 4042 | 0.9196 | 0.6410 | 0.9196 | 0.9590 |
| 0.038 | 9.7212 | 4044 | 0.9183 | 0.6410 | 0.9183 | 0.9583 |
| 0.038 | 9.7260 | 4046 | 0.9161 | 0.6410 | 0.9161 | 0.9571 |
| 0.038 | 9.7308 | 4048 | 0.9155 | 0.6410 | 0.9155 | 0.9568 |
| 0.038 | 9.7356 | 4050 | 0.9132 | 0.6410 | 0.9132 | 0.9556 |
| 0.038 | 9.7404 | 4052 | 0.9103 | 0.6410 | 0.9103 | 0.9541 |
| 0.038 | 9.7452 | 4054 | 0.9074 | 0.6410 | 0.9074 | 0.9526 |
| 0.038 | 9.75 | 4056 | 0.9044 | 0.6410 | 0.9044 | 0.9510 |
| 0.038 | 9.7548 | 4058 | 0.9005 | 0.6410 | 0.9005 | 0.9490 |
| 0.038 | 9.7596 | 4060 | 0.8974 | 0.6410 | 0.8974 | 0.9473 |
| 0.038 | 9.7644 | 4062 | 0.8971 | 0.6548 | 0.8971 | 0.9471 |
| 0.038 | 9.7692 | 4064 | 0.8985 | 0.6410 | 0.8985 | 0.9479 |
| 0.038 | 9.7740 | 4066 | 0.8989 | 0.6410 | 0.8989 | 0.9481 |
| 0.038 | 9.7788 | 4068 | 0.8987 | 0.6410 | 0.8987 | 0.9480 |
| 0.038 | 9.7837 | 4070 | 0.8978 | 0.6410 | 0.8978 | 0.9475 |
| 0.038 | 9.7885 | 4072 | 0.8956 | 0.6410 | 0.8956 | 0.9464 |
| 0.038 | 9.7933 | 4074 | 0.8913 | 0.6548 | 0.8913 | 0.9441 |
| 0.038 | 9.7981 | 4076 | 0.8888 | 0.6599 | 0.8888 | 0.9427 |
| 0.038 | 9.8029 | 4078 | 0.8862 | 0.6599 | 0.8862 | 0.9414 |
| 0.038 | 9.8077 | 4080 | 0.8852 | 0.6599 | 0.8852 | 0.9409 |
| 0.038 | 9.8125 | 4082 | 0.8839 | 0.6599 | 0.8839 | 0.9401 |
| 0.038 | 9.8173 | 4084 | 0.8841 | 0.6599 | 0.8841 | 0.9402 |
| 0.038 | 9.8221 | 4086 | 0.8851 | 0.6599 | 0.8851 | 0.9408 |
| 0.038 | 9.8269 | 4088 | 0.8854 | 0.6599 | 0.8854 | 0.9410 |
| 0.038 | 9.8317 | 4090 | 0.8848 | 0.6599 | 0.8848 | 0.9406 |
| 0.038 | 9.8365 | 4092 | 0.8838 | 0.6599 | 0.8838 | 0.9401 |
| 0.038 | 9.8413 | 4094 | 0.8820 | 0.6599 | 0.8820 | 0.9391 |
| 0.038 | 9.8462 | 4096 | 0.8809 | 0.6599 | 0.8809 | 0.9386 |
| 0.038 | 9.8510 | 4098 | 0.8808 | 0.6599 | 0.8808 | 0.9385 |
| 0.038 | 9.8558 | 4100 | 0.8807 | 0.6599 | 0.8807 | 0.9384 |
| 0.038 | 9.8606 | 4102 | 0.8811 | 0.6599 | 0.8811 | 0.9387 |
| 0.038 | 9.8654 | 4104 | 0.8818 | 0.6599 | 0.8818 | 0.9390 |
| 0.038 | 9.8702 | 4106 | 0.8829 | 0.6599 | 0.8829 | 0.9396 |
| 0.038 | 9.875 | 4108 | 0.8847 | 0.6599 | 0.8847 | 0.9406 |
| 0.038 | 9.8798 | 4110 | 0.8864 | 0.6599 | 0.8864 | 0.9415 |
| 0.038 | 9.8846 | 4112 | 0.8887 | 0.6458 | 0.8887 | 0.9427 |
| 0.038 | 9.8894 | 4114 | 0.8910 | 0.6458 | 0.8910 | 0.9439 |
| 0.038 | 9.8942 | 4116 | 0.8932 | 0.6410 | 0.8932 | 0.9451 |
| 0.038 | 9.8990 | 4118 | 0.8955 | 0.6410 | 0.8955 | 0.9463 |
| 0.038 | 9.9038 | 4120 | 0.8972 | 0.6410 | 0.8972 | 0.9472 |
| 0.038 | 9.9087 | 4122 | 0.8985 | 0.6410 | 0.8985 | 0.9479 |
| 0.038 | 9.9135 | 4124 | 0.8990 | 0.6410 | 0.8990 | 0.9482 |
| 0.038 | 9.9183 | 4126 | 0.8994 | 0.6410 | 0.8994 | 0.9483 |
| 0.038 | 9.9231 | 4128 | 0.8998 | 0.6410 | 0.8998 | 0.9486 |
| 0.038 | 9.9279 | 4130 | 0.9001 | 0.6410 | 0.9001 | 0.9487 |
| 0.038 | 9.9327 | 4132 | 0.8997 | 0.6410 | 0.8997 | 0.9485 |
| 0.038 | 9.9375 | 4134 | 0.8992 | 0.6410 | 0.8992 | 0.9483 |
| 0.038 | 9.9423 | 4136 | 0.8988 | 0.6410 | 0.8988 | 0.9480 |
| 0.038 | 9.9471 | 4138 | 0.8979 | 0.6410 | 0.8979 | 0.9476 |
| 0.038 | 9.9519 | 4140 | 0.8973 | 0.6410 | 0.8973 | 0.9473 |
| 0.038 | 9.9567 | 4142 | 0.8968 | 0.6410 | 0.8968 | 0.9470 |
| 0.038 | 9.9615 | 4144 | 0.8967 | 0.6410 | 0.8967 | 0.9470 |
| 0.038 | 9.9663 | 4146 | 0.8967 | 0.6410 | 0.8967 | 0.9469 |
| 0.038 | 9.9712 | 4148 | 0.8967 | 0.6410 | 0.8967 | 0.9469 |
| 0.038 | 9.9760 | 4150 | 0.8965 | 0.6410 | 0.8965 | 0.9469 |
| 0.038 | 9.9808 | 4152 | 0.8965 | 0.6410 | 0.8965 | 0.9468 |
| 0.038 | 9.9856 | 4154 | 0.8964 | 0.6410 | 0.8964 | 0.9468 |
| 0.038 | 9.9904 | 4156 | 0.8964 | 0.6410 | 0.8964 | 0.9468 |
| 0.038 | 9.9952 | 4158 | 0.8964 | 0.6410 | 0.8964 | 0.9468 |
| 0.038 | 10.0 | 4160 | 0.8964 | 0.6410 | 0.8964 | 0.9468 |
### Framework versions
- Transformers 4.44.2
- Pytorch 2.4.0+cu118
- Datasets 2.21.0
- Tokenizers 0.19.1
|
DevQuasar/OLMoE-1B-7B-0924-Instruct-GGUF
|
DevQuasar
| 2024-12-04T14:20:18Z | 27 | 0 | null |
[
"gguf",
"text-generation",
"base_model:allenai/OLMoE-1B-7B-0924-Instruct",
"base_model:quantized:allenai/OLMoE-1B-7B-0924-Instruct",
"endpoints_compatible",
"region:us",
"conversational"
] |
text-generation
| 2024-09-15T16:02:00Z |
---
base_model:
- allenai/OLMoE-1B-7B-0924-Instruct
pipeline_tag: text-generation
---
I'm doing this to 'Make knowledge free for everyone', using my personal time and resources.
If you want to support my efforts, please visit my ko-fi page: https://ko-fi.com/devquasar
Also feel free to visit my website https://devquasar.com/
|
DevQuasar/OLMoE-1B-7B-0924-GGUF
|
DevQuasar
| 2024-12-04T14:20:02Z | 7 | 0 | null |
[
"gguf",
"text-generation",
"base_model:allenai/OLMoE-1B-7B-0924",
"base_model:quantized:allenai/OLMoE-1B-7B-0924",
"endpoints_compatible",
"region:us"
] |
text-generation
| 2024-09-15T15:56:11Z |
---
base_model:
- allenai/OLMoE-1B-7B-0924
pipeline_tag: text-generation
---
I'm doing this to 'Make knowledge free for everyone', using my personal time and resources.
If you want to support my efforts, please visit my ko-fi page: https://ko-fi.com/devquasar
Also feel free to visit my website https://devquasar.com/
|
mradermacher/Virtuoso-Small-i1-GGUF
|
mradermacher
| 2024-12-04T14:18:42Z | 37 | 2 |
transformers
|
[
"transformers",
"gguf",
"en",
"base_model:arcee-ai/Virtuoso-Small",
"base_model:quantized:arcee-ai/Virtuoso-Small",
"license:apache-2.0",
"endpoints_compatible",
"region:us",
"imatrix",
"conversational"
] | null | 2024-12-04T12:48:45Z |
---
base_model: arcee-ai/Virtuoso-Small
language:
- en
library_name: transformers
license: apache-2.0
quantized_by: mradermacher
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: nicoboss -->
weighted/imatrix quants of https://huggingface.co/arcee-ai/Virtuoso-Small
<!-- provided-files -->
static quants are available at https://huggingface.co/mradermacher/Virtuoso-Small-GGUF
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including how to concatenate multi-part files.
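As a quick, hedged illustration (not part of the original README): once one of the quant files from the table below has been downloaded locally, it can be loaded with the `llama-cpp-python` bindings roughly like this. The local filename and context size are assumptions.

```python
# Minimal sketch, assuming llama-cpp-python is installed and the quant file
# (here the i1-Q4_K_M entry from the table below) is present locally.
from llama_cpp import Llama

llm = Llama(
    model_path="Virtuoso-Small.i1-Q4_K_M.gguf",  # filename from the quant table
    n_ctx=4096,                                  # context length; adjust to your hardware
)

out = llm("Write one sentence about quantized language models.", max_tokens=64)
print(out["choices"][0]["text"])
```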
## Provided Quants
(sorted by size, not necessarily quality. IQ-quants are often preferable over similarly sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/Virtuoso-Small-i1-GGUF/resolve/main/Virtuoso-Small.i1-IQ1_S.gguf) | i1-IQ1_S | 3.7 | for the desperate |
| [GGUF](https://huggingface.co/mradermacher/Virtuoso-Small-i1-GGUF/resolve/main/Virtuoso-Small.i1-IQ1_M.gguf) | i1-IQ1_M | 4.0 | mostly desperate |
| [GGUF](https://huggingface.co/mradermacher/Virtuoso-Small-i1-GGUF/resolve/main/Virtuoso-Small.i1-IQ2_XXS.gguf) | i1-IQ2_XXS | 4.4 | |
| [GGUF](https://huggingface.co/mradermacher/Virtuoso-Small-i1-GGUF/resolve/main/Virtuoso-Small.i1-IQ2_XS.gguf) | i1-IQ2_XS | 4.8 | |
| [GGUF](https://huggingface.co/mradermacher/Virtuoso-Small-i1-GGUF/resolve/main/Virtuoso-Small.i1-IQ2_S.gguf) | i1-IQ2_S | 5.1 | |
| [GGUF](https://huggingface.co/mradermacher/Virtuoso-Small-i1-GGUF/resolve/main/Virtuoso-Small.i1-IQ2_M.gguf) | i1-IQ2_M | 5.5 | |
| [GGUF](https://huggingface.co/mradermacher/Virtuoso-Small-i1-GGUF/resolve/main/Virtuoso-Small.i1-Q2_K_S.gguf) | i1-Q2_K_S | 5.5 | very low quality |
| [GGUF](https://huggingface.co/mradermacher/Virtuoso-Small-i1-GGUF/resolve/main/Virtuoso-Small.i1-Q2_K.gguf) | i1-Q2_K | 5.9 | IQ3_XXS probably better |
| [GGUF](https://huggingface.co/mradermacher/Virtuoso-Small-i1-GGUF/resolve/main/Virtuoso-Small.i1-IQ3_XXS.gguf) | i1-IQ3_XXS | 6.0 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/Virtuoso-Small-i1-GGUF/resolve/main/Virtuoso-Small.i1-IQ3_XS.gguf) | i1-IQ3_XS | 6.5 | |
| [GGUF](https://huggingface.co/mradermacher/Virtuoso-Small-i1-GGUF/resolve/main/Virtuoso-Small.i1-Q3_K_S.gguf) | i1-Q3_K_S | 6.8 | IQ3_XS probably better |
| [GGUF](https://huggingface.co/mradermacher/Virtuoso-Small-i1-GGUF/resolve/main/Virtuoso-Small.i1-IQ3_S.gguf) | i1-IQ3_S | 6.8 | beats Q3_K* |
| [GGUF](https://huggingface.co/mradermacher/Virtuoso-Small-i1-GGUF/resolve/main/Virtuoso-Small.i1-IQ3_M.gguf) | i1-IQ3_M | 7.0 | |
| [GGUF](https://huggingface.co/mradermacher/Virtuoso-Small-i1-GGUF/resolve/main/Virtuoso-Small.i1-Q3_K_M.gguf) | i1-Q3_K_M | 7.4 | IQ3_S probably better |
| [GGUF](https://huggingface.co/mradermacher/Virtuoso-Small-i1-GGUF/resolve/main/Virtuoso-Small.i1-Q3_K_L.gguf) | i1-Q3_K_L | 8.0 | IQ3_M probably better |
| [GGUF](https://huggingface.co/mradermacher/Virtuoso-Small-i1-GGUF/resolve/main/Virtuoso-Small.i1-IQ4_XS.gguf) | i1-IQ4_XS | 8.2 | |
| [GGUF](https://huggingface.co/mradermacher/Virtuoso-Small-i1-GGUF/resolve/main/Virtuoso-Small.i1-Q4_0_4_4.gguf) | i1-Q4_0_4_4 | 8.6 | fast on arm, low quality |
| [GGUF](https://huggingface.co/mradermacher/Virtuoso-Small-i1-GGUF/resolve/main/Virtuoso-Small.i1-Q4_0_4_8.gguf) | i1-Q4_0_4_8 | 8.6 | fast on arm+i8mm, low quality |
| [GGUF](https://huggingface.co/mradermacher/Virtuoso-Small-i1-GGUF/resolve/main/Virtuoso-Small.i1-Q4_0_8_8.gguf) | i1-Q4_0_8_8 | 8.6 | fast on arm+sve, low quality |
| [GGUF](https://huggingface.co/mradermacher/Virtuoso-Small-i1-GGUF/resolve/main/Virtuoso-Small.i1-Q4_0.gguf) | i1-Q4_0 | 8.6 | fast, low quality |
| [GGUF](https://huggingface.co/mradermacher/Virtuoso-Small-i1-GGUF/resolve/main/Virtuoso-Small.i1-Q4_K_S.gguf) | i1-Q4_K_S | 8.7 | optimal size/speed/quality |
| [GGUF](https://huggingface.co/mradermacher/Virtuoso-Small-i1-GGUF/resolve/main/Virtuoso-Small.i1-Q4_K_M.gguf) | i1-Q4_K_M | 9.1 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/Virtuoso-Small-i1-GGUF/resolve/main/Virtuoso-Small.i1-Q5_K_S.gguf) | i1-Q5_K_S | 10.4 | |
| [GGUF](https://huggingface.co/mradermacher/Virtuoso-Small-i1-GGUF/resolve/main/Virtuoso-Small.i1-Q5_K_M.gguf) | i1-Q5_K_M | 10.6 | |
| [GGUF](https://huggingface.co/mradermacher/Virtuoso-Small-i1-GGUF/resolve/main/Virtuoso-Small.i1-Q6_K.gguf) | i1-Q6_K | 12.2 | practically like static Q6_K |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.
<!-- end -->
|
gokulsrinivasagan/distilbert_lda_20_v1_sst2
|
gokulsrinivasagan
| 2024-12-04T14:18:19Z | 107 | 0 |
transformers
|
[
"transformers",
"tensorboard",
"safetensors",
"distilbert",
"text-classification",
"generated_from_trainer",
"en",
"dataset:glue",
"base_model:gokulsrinivasagan/distilbert_lda_20_v1",
"base_model:finetune:gokulsrinivasagan/distilbert_lda_20_v1",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-classification
| 2024-11-26T20:47:37Z |
---
library_name: transformers
language:
- en
base_model: gokulsrinivasagan/distilbert_lda_20_v1
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- accuracy
model-index:
- name: distilbert_lda_20_v1_sst2
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: GLUE SST2
type: glue
args: sst2
metrics:
- name: Accuracy
type: accuracy
value: 0.8451834862385321
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert_lda_20_v1_sst2
This model is a fine-tuned version of [gokulsrinivasagan/distilbert_lda_20_v1](https://huggingface.co/gokulsrinivasagan/distilbert_lda_20_v1) on the GLUE SST2 dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3649
- Accuracy: 0.8452
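
As an illustration only (not part of the original card), running inference with this checkpoint could look like the sketch below; the example sentence is an assumption and the label names come from the model config.

```python
# Hedged sketch: sentiment classification with the fine-tuned SST-2 checkpoint.
from transformers import pipeline

clf = pipeline(
    "text-classification",
    model="gokulsrinivasagan/distilbert_lda_20_v1_sst2",
)
print(clf("A moving and beautifully shot film."))
# Output shape: [{'label': ..., 'score': ...}]
```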
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 256
- eval_batch_size: 256
- seed: 10
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 50
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.3686 | 1.0 | 264 | 0.3649 | 0.8452 |
| 0.2092 | 2.0 | 528 | 0.3841 | 0.8532 |
| 0.143 | 3.0 | 792 | 0.4609 | 0.8383 |
| 0.1028 | 4.0 | 1056 | 0.4837 | 0.8475 |
| 0.0783 | 5.0 | 1320 | 0.5645 | 0.8406 |
| 0.0607 | 6.0 | 1584 | 0.6181 | 0.8406 |
### Framework versions
- Transformers 4.46.3
- Pytorch 2.2.1+cu118
- Datasets 2.17.0
- Tokenizers 0.20.3
|
MarketLLM/krx_Qwen2.5-7B-it_market_base_001
|
MarketLLM
| 2024-12-04T14:17:31Z | 8 | 0 | null |
[
"safetensors",
"qwen2",
"krx",
"license:apache-2.0",
"region:us"
] | null | 2024-12-04T11:17:41Z |
---
license: apache-2.0
tags:
- krx
---
|
DevQuasar/LongCite-glm4-9b-GGUF
|
DevQuasar
| 2024-12-04T14:17:24Z | 25 | 2 | null |
[
"gguf",
"text-generation",
"base_model:THUDM/LongCite-glm4-9b",
"base_model:quantized:THUDM/LongCite-glm4-9b",
"endpoints_compatible",
"region:us",
"conversational"
] |
text-generation
| 2024-09-13T22:50:35Z |
---
base_model:
- THUDM/LongCite-glm4-9b
pipeline_tag: text-generation
---
I'm doing this to 'Make knowledge free for everyone', using my personal time and resources.
If you want to support my efforts, please visit my ko-fi page: https://ko-fi.com/devquasar
Also feel free to visit my website https://devquasar.com/
|
miasetya/fine_tuned_t5_small_model_sec_5_v5
|
miasetya
| 2024-12-04T14:15:57Z | 114 | 0 |
transformers
|
[
"transformers",
"safetensors",
"t5",
"text2text-generation",
"generated_from_trainer",
"base_model:google-t5/t5-small",
"base_model:finetune:google-t5/t5-small",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text2text-generation
| 2024-12-04T14:15:37Z |
---
library_name: transformers
license: apache-2.0
base_model: t5-small
tags:
- generated_from_trainer
metrics:
- rouge
model-index:
- name: fine_tuned_t5_small_model_sec_5_v5
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# fine_tuned_t5_small_model_sec_5_v5
This model is a fine-tuned version of [t5-small](https://huggingface.co/t5-small) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 2.8447
- Rouge1: 0.3701
- Rouge2: 0.1378
- Rougel: 0.2427
- Rougelsum: 0.2427
- Gen Len: 78.0105
- Bert F1: 0.8699
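
Purely as a hedged illustration (not taken from the card), generation with this checkpoint might look like the sketch below; the 80-token cap is an assumption chosen to roughly match the reported average generation length.

```python
# Hedged sketch: generate a summary with the fine-tuned T5-small checkpoint.
from transformers import pipeline

generator = pipeline(
    "text2text-generation",
    model="miasetya/fine_tuned_t5_small_model_sec_5_v5",
)
section = "Replace this string with the document section to be summarized."
print(generator(section, max_new_tokens=80)[0]["generated_text"])
```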
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 5
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum | Gen Len | Bert F1 |
|:-------------:|:------:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:-------:|:-------:|
| 3.8604 | 0.4211 | 40 | 3.2077 | 0.3522 | 0.1312 | 0.2314 | 0.2321 | 60.8684 | 0.8715 |
| 3.4406 | 0.8421 | 80 | 3.0222 | 0.3615 | 0.1333 | 0.2327 | 0.2331 | 70.9421 | 0.8725 |
| 3.2592 | 1.2632 | 120 | 2.9491 | 0.3796 | 0.1401 | 0.2392 | 0.2395 | 81.9421 | 0.8728 |
| 3.1817 | 1.6842 | 160 | 2.9082 | 0.3791 | 0.135 | 0.2382 | 0.2383 | 83.7579 | 0.873 |
| 3.1808 | 2.1053 | 200 | 2.8893 | 0.3817 | 0.1403 | 0.2433 | 0.2436 | 82.9211 | 0.8741 |
| 3.1333 | 2.5263 | 240 | 2.8745 | 0.3737 | 0.1365 | 0.2427 | 0.2431 | 80.8632 | 0.8735 |
| 3.1758 | 2.9474 | 280 | 2.8623 | 0.3751 | 0.142 | 0.245 | 0.2452 | 79.6526 | 0.8744 |
| 3.0898 | 3.3684 | 320 | 2.8559 | 0.3739 | 0.1407 | 0.2441 | 0.2443 | 80.1684 | 0.8741 |
| 3.1227 | 3.7895 | 360 | 2.8499 | 0.3739 | 0.1406 | 0.2458 | 0.2456 | 78.8789 | 0.8743 |
| 3.0641 | 4.2105 | 400 | 2.8467 | 0.3678 | 0.1368 | 0.2418 | 0.2418 | 78.2053 | 0.8691 |
| 3.0768 | 4.6316 | 440 | 2.8447 | 0.3701 | 0.1378 | 0.2427 | 0.2427 | 78.0105 | 0.8699 |
### Framework versions
- Transformers 4.46.3
- Pytorch 2.4.0
- Datasets 3.1.0
- Tokenizers 0.20.3
|
DevQuasar/Llama-3.1-SuperNova-Lite-GGUF
|
DevQuasar
| 2024-12-04T14:15:44Z | 8 | 0 | null |
[
"gguf",
"text-generation",
"base_model:arcee-ai/Llama-3.1-SuperNova-Lite",
"base_model:quantized:arcee-ai/Llama-3.1-SuperNova-Lite",
"endpoints_compatible",
"region:us",
"conversational"
] |
text-generation
| 2024-09-10T20:38:26Z |
---
base_model:
- arcee-ai/Llama-3.1-SuperNova-Lite
pipeline_tag: text-generation
---
I'm doing this to 'Make knowledge free for everyone', using my personal time and resources.
If you want to support my efforts, please visit my ko-fi page: https://ko-fi.com/devquasar
Also feel free to visit my website https://devquasar.com/
|
DevQuasar/Mistral-7B-Instruct-v0.3_brainstorm-v3.1-GGUF
|
DevQuasar
| 2024-12-04T14:13:59Z | 11 | 1 | null |
[
"gguf",
"text-generation",
"base_model:DevQuasar/Mistral-7B-Instruct-v0.3_brainstorm-v3.1",
"base_model:quantized:DevQuasar/Mistral-7B-Instruct-v0.3_brainstorm-v3.1",
"license:apache-2.0",
"endpoints_compatible",
"region:us",
"conversational"
] |
text-generation
| 2024-08-31T16:54:28Z |
---
license: apache-2.0
base_model:
- DevQuasar/Mistral-7B-Instruct-v0.3_brainstorm-v3.1
pipeline_tag: text-generation
---
I'm doing this to 'Make knowledge free for everyone', using my personal time and resources.
If you want to support my efforts, please visit my ko-fi page: https://ko-fi.com/devquasar
Also feel free to visit my website https://devquasar.com/
|
ilhamiuturkkan/fake-news-model-3000-samples
|
ilhamiuturkkan
| 2024-12-04T14:12:46Z | 117 | 0 |
transformers
|
[
"transformers",
"tensorboard",
"safetensors",
"distilbert",
"text-classification",
"generated_from_trainer",
"base_model:distilbert/distilbert-base-uncased",
"base_model:finetune:distilbert/distilbert-base-uncased",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-classification
| 2024-12-04T13:59:53Z |
---
library_name: transformers
license: apache-2.0
base_model: distilbert-base-uncased
tags:
- generated_from_trainer
metrics:
- accuracy
- f1
model-index:
- name: fake-news-model-3000-samples
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# fake-news-model-3000-samples
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0007
- Accuracy: 0.9995
- F1: 0.9995
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 2
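
For readers who want to reproduce a run with these settings, a minimal sketch of how they map onto the `transformers` Trainer is shown below. The two-row toy dataset and its label convention are assumptions, since the card does not specify the training data.

```python
# Hedged sketch only: the listed hyperparameters expressed as TrainingArguments.
# The toy dataset stands in for the (unspecified) fake-news training data.
from datasets import Dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    DataCollatorWithPadding,
    Trainer,
    TrainingArguments,
)

model_name = "distilbert-base-uncased"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

# Placeholder data; label 0 = real, 1 = fake is an assumption.
raw = Dataset.from_dict({
    "text": ["Scientists publish peer-reviewed climate study.",
             "Celebrity endorses miracle cure, doctors stunned."],
    "label": [0, 1],
})
train_ds = raw.map(lambda b: tokenizer(b["text"], truncation=True), batched=True)

args = TrainingArguments(
    output_dir="fake-news-model-3000-samples",
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    num_train_epochs=2,
    seed=42,
    lr_scheduler_type="linear",
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=train_ds,
    eval_dataset=train_ds,  # reuses the toy set; replace with a real eval split
    data_collator=DataCollatorWithPadding(tokenizer),
)
trainer.train()
```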
### Training results
### Framework versions
- Transformers 4.46.2
- Pytorch 2.5.1+cu121
- Datasets 3.1.0
- Tokenizers 0.20.3
|
gokulsrinivasagan/distilbert_lda_20_v1_rte
|
gokulsrinivasagan
| 2024-12-04T14:10:14Z | 121 | 0 |
transformers
|
[
"transformers",
"tensorboard",
"safetensors",
"distilbert",
"text-classification",
"generated_from_trainer",
"en",
"dataset:glue",
"base_model:gokulsrinivasagan/distilbert_lda_20_v1",
"base_model:finetune:gokulsrinivasagan/distilbert_lda_20_v1",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-classification
| 2024-11-26T20:46:11Z |
---
library_name: transformers
language:
- en
base_model: gokulsrinivasagan/distilbert_lda_20_v1
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- accuracy
model-index:
- name: distilbert_lda_20_v1_rte
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: GLUE RTE
type: glue
args: rte
metrics:
- name: Accuracy
type: accuracy
value: 0.5415162454873647
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert_lda_20_v1_rte
This model is a fine-tuned version of [gokulsrinivasagan/distilbert_lda_20_v1](https://huggingface.co/gokulsrinivasagan/distilbert_lda_20_v1) on the GLUE RTE dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6842
- Accuracy: 0.5415
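
As a hedged example (not from the original card), one way to try this checkpoint on a GLUE RTE sentence pair is sketched below; the use of the `datasets` library and the pair-input format are assumptions.

```python
# Hedged sketch: classify one GLUE RTE validation pair with this checkpoint.
from datasets import load_dataset
from transformers import pipeline

rte = load_dataset("glue", "rte", split="validation")
clf = pipeline(
    "text-classification",
    model="gokulsrinivasagan/distilbert_lda_20_v1_rte",
)

example = rte[0]
pred = clf({"text": example["sentence1"], "text_pair": example["sentence2"]})
print(example["sentence1"], "||", example["sentence2"], "=>", pred)
```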
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 256
- eval_batch_size: 256
- seed: 10
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 50
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.7258 | 1.0 | 10 | 0.6862 | 0.5343 |
| 0.6879 | 2.0 | 20 | 0.6842 | 0.5415 |
| 0.6517 | 3.0 | 30 | 0.6895 | 0.5090 |
| 0.5932 | 4.0 | 40 | 0.7368 | 0.5307 |
| 0.5085 | 5.0 | 50 | 0.8123 | 0.5307 |
| 0.3794 | 6.0 | 60 | 0.9527 | 0.5199 |
| 0.2539 | 7.0 | 70 | 1.1710 | 0.5235 |
### Framework versions
- Transformers 4.46.3
- Pytorch 2.2.1+cu118
- Datasets 2.17.0
- Tokenizers 0.20.3
|
gokulsrinivasagan/distilbert_lda_100_v1_wnli
|
gokulsrinivasagan
| 2024-12-04T14:09:54Z | 122 | 0 |
transformers
|
[
"transformers",
"tensorboard",
"safetensors",
"distilbert",
"text-classification",
"generated_from_trainer",
"en",
"dataset:glue",
"base_model:gokulsrinivasagan/distilbert_lda_100_v1",
"base_model:finetune:gokulsrinivasagan/distilbert_lda_100_v1",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-classification
| 2024-11-26T21:00:18Z |
---
library_name: transformers
language:
- en
base_model: gokulsrinivasagan/distilbert_lda_100_v1
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- accuracy
model-index:
- name: distilbert_lda_100_v1_wnli
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: GLUE WNLI
type: glue
args: wnli
metrics:
- name: Accuracy
type: accuracy
value: 0.5633802816901409
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert_lda_100_v1_wnli
This model is a fine-tuned version of [gokulsrinivasagan/distilbert_lda_100_v1](https://huggingface.co/gokulsrinivasagan/distilbert_lda_100_v1) on the GLUE WNLI dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6910
- Accuracy: 0.5634
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 256
- eval_batch_size: 256
- seed: 10
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 50
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.7643 | 1.0 | 3 | 0.7202 | 0.4507 |
| 0.699 | 2.0 | 6 | 0.6910 | 0.5634 |
| 0.7004 | 3.0 | 9 | 0.6951 | 0.4789 |
| 0.6965 | 4.0 | 12 | 0.6969 | 0.4366 |
| 0.6938 | 5.0 | 15 | 0.7141 | 0.4507 |
| 0.6974 | 6.0 | 18 | 0.7016 | 0.4366 |
| 0.6953 | 7.0 | 21 | 0.7069 | 0.4366 |
### Framework versions
- Transformers 4.46.3
- Pytorch 2.2.1+cu118
- Datasets 2.17.0
- Tokenizers 0.20.3
|
mav23/zephyr-7b-sft-full-GGUF
|
mav23
| 2024-12-04T14:08:21Z | 279 | 0 | null |
[
"gguf",
"alignment-handbook",
"generated_from_trainer",
"trl",
"sft",
"dataset:HuggingFaceH4/ultrachat_200k",
"base_model:mistralai/Mistral-7B-v0.1",
"base_model:quantized:mistralai/Mistral-7B-v0.1",
"license:apache-2.0",
"endpoints_compatible",
"region:us",
"conversational"
] | null | 2024-12-04T13:21:49Z |
---
license: apache-2.0
base_model: mistralai/Mistral-7B-v0.1
tags:
- alignment-handbook
- generated_from_trainer
- trl
- sft
- generated_from_trainer
datasets:
- HuggingFaceH4/ultrachat_200k
model-index:
- name: zephyr-7b-sft-full
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# zephyr-7b-sft-full
This model is a fine-tuned version of [mistralai/Mistral-7B-v0.1](https://huggingface.co/mistralai/Mistral-7B-v0.1) on the HuggingFaceH4/ultrachat_200k dataset.
It achieves the following results on the evaluation set:
- Loss: 0.9353
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- distributed_type: multi-GPU
- num_devices: 8
- total_train_batch_size: 128
- total_eval_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 1
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 0.9075 | 1.0 | 1090 | 0.9353 |
### Framework versions
- Transformers 4.36.2
- Pytorch 2.1.2+cu121
- Datasets 2.14.6
- Tokenizers 0.15.0
|
mradermacher/Virtuoso-Small-GGUF
|
mradermacher
| 2024-12-04T14:08:08Z | 38 | 1 |
transformers
|
[
"transformers",
"gguf",
"en",
"base_model:arcee-ai/Virtuoso-Small",
"base_model:quantized:arcee-ai/Virtuoso-Small",
"license:apache-2.0",
"endpoints_compatible",
"region:us",
"conversational"
] | null | 2024-12-04T12:09:47Z |
---
base_model: arcee-ai/Virtuoso-Small
language:
- en
library_name: transformers
license: apache-2.0
quantized_by: mradermacher
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: nicoboss -->
static quants of https://huggingface.co/arcee-ai/Virtuoso-Small
<!-- provided-files -->
weighted/imatrix quants are available at https://huggingface.co/mradermacher/Virtuoso-Small-i1-GGUF
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including on how to concatenate multi-part files.
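As a minimal sketch (assuming the `llama-cpp-python` bindings and that you have already downloaded the Q4_K_M file listed below; the context size is an arbitrary choice):
```python
# Load a single-file quant with llama-cpp-python and run one chat turn.
from llama_cpp import Llama

llm = Llama(
    model_path="Virtuoso-Small.Q4_K_M.gguf",  # path to the downloaded quant
    n_ctx=4096,                               # context window; adjust as needed
)

out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Summarize what a GGUF file is in one sentence."}]
)
print(out["choices"][0]["message"]["content"])
```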
## Provided Quants
(sorted by size, not necessarily quality. IQ-quants are often preferable over similar sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/Virtuoso-Small-GGUF/resolve/main/Virtuoso-Small.Q2_K.gguf) | Q2_K | 5.9 | |
| [GGUF](https://huggingface.co/mradermacher/Virtuoso-Small-GGUF/resolve/main/Virtuoso-Small.Q3_K_S.gguf) | Q3_K_S | 6.8 | |
| [GGUF](https://huggingface.co/mradermacher/Virtuoso-Small-GGUF/resolve/main/Virtuoso-Small.Q3_K_M.gguf) | Q3_K_M | 7.4 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/Virtuoso-Small-GGUF/resolve/main/Virtuoso-Small.Q3_K_L.gguf) | Q3_K_L | 8.0 | |
| [GGUF](https://huggingface.co/mradermacher/Virtuoso-Small-GGUF/resolve/main/Virtuoso-Small.IQ4_XS.gguf) | IQ4_XS | 8.3 | |
| [GGUF](https://huggingface.co/mradermacher/Virtuoso-Small-GGUF/resolve/main/Virtuoso-Small.Q4_0_4_4.gguf) | Q4_0_4_4 | 8.6 | fast on arm, low quality |
| [GGUF](https://huggingface.co/mradermacher/Virtuoso-Small-GGUF/resolve/main/Virtuoso-Small.Q4_K_S.gguf) | Q4_K_S | 8.7 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/Virtuoso-Small-GGUF/resolve/main/Virtuoso-Small.Q4_K_M.gguf) | Q4_K_M | 9.1 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/Virtuoso-Small-GGUF/resolve/main/Virtuoso-Small.Q5_K_S.gguf) | Q5_K_S | 10.4 | |
| [GGUF](https://huggingface.co/mradermacher/Virtuoso-Small-GGUF/resolve/main/Virtuoso-Small.Q5_K_M.gguf) | Q5_K_M | 10.6 | |
| [GGUF](https://huggingface.co/mradermacher/Virtuoso-Small-GGUF/resolve/main/Virtuoso-Small.Q6_K.gguf) | Q6_K | 12.2 | very good quality |
| [GGUF](https://huggingface.co/mradermacher/Virtuoso-Small-GGUF/resolve/main/Virtuoso-Small.Q8_0.gguf) | Q8_0 | 15.8 | fast, best quality |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.
<!-- end -->
|
Wisdom-math/wisdom-qwen2-70b
|
Wisdom-math
| 2024-12-04T14:07:47Z | 6 | 0 | null |
[
"safetensors",
"qwen2",
"arxiv:2107.02027",
"region:us"
] | null | 2024-09-02T06:41:13Z |
# 🧙🏼WISDOM
<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT EDIT! -->
> WISDOM: PROGRESSIVE CURRICULUM SYNTHESIS MAKES LLMS BETTER MATHEMATICAL REASONER
🤗[Datasets&Models@HF](https://huggingface.co/Wisdom-math)
| 🐱 [Code@GitHub](https://anonymous.4open.science/r/Wisdom-math-377B)
<div align="center">
<img src="https://anonymous.4open.science/r/Wisdom-math-377B/imgs/main.jpg">
<em> Figure 1: The overall workflow of _WISDOM_, which leverages Progressive Curriculum Synthesis to generate questions and responses with Deepseek Coder V2 and GPT-4o, including weak teacher guiding, critical expert teaching, experts consistency voting, and hard instruction evolving. </em>
</div>
## Main Results on the smaller models
| **Method** | **Base** | **GSM8K** | **MATH** | **College**† | **Olympiad** | **TabMWP** | **TheoremQA** | **AMC2023** | **AIME2024** |
|------------------------|----------------|-----------|----------|--------------|--------------|------------|---------------|-------------|------------|
| **Mathstral** | Mistral-7B | **83.3** | 54.3 | 36.7 | **22.4** | **82.8** | 26.3 | 12/40 | **1**/30 |
| **KPMath-Plus** | Mistral-7B | 82.1 | 46.8 | – | – | 66.4 | – | – | – |
| **DART-Math** | Mistral-7B | 81.3 | 45.0 | 28.3 | 14.5 | 65.8 | 20.5 | 7/40 | 0/30 |
| **MAmmoTH2** | Mistral-7B | 67.4 | 34.2 | 31.0 | 9.8 | 26.8 | 26.7 | 6/40 | 1/30 |
| **MathScale** | Mistral-7B | 58.5 | 33.2 | 22.0 | 7.8 | 73.3 | 18.1 | 6/40 | 1/30 |
| **_WISDOM_** | Mistral-7B | 80.0 | **56.4** | **41.6** | 21.9 | 72.3 | **27.6** | **15**/40 | **1**/30 |
| **Method** | **Base** | **GSM8K** | **MATH** | **College**† | **Olympiad** | **TabMWP** | **TheoremQA** | **AMC2023** | **AIME2024** |
|------------------------|----------------|-----------|----------|--------------|--------------|------------|---------------|-------------|------------|
| **Llama3-instruct** | Llama3-8B | 78.2 | 27.2 | 22.8 | 5.6 | 75.3 | 18.9 | 5/40 | 0/30 |
| **MetaMath** | Llama3-8B | 80.5 | 32.6 | 19.3 | 6.7 | 54.1 | 13.3 | 6/40 | 0/30 |
| **DART-Math** | Llama3-8B | 81.8 | 46.9 | 28.4 | 15.9 | 66.3 | 20.5 | 8/40 | **1**/30 |
| **MAmmoTH2** | Llama3-8B | 69.6 | 33.4 | 32.3 | 8.1 | 43.8 | **29.7** | 7/40 | 0/30 |
| **MathScale** | Llama3-8B | 70.8 | 34.6 | 22.5 | 9.0 | 74.3 | 18.9 | 2/40 | 1/30 |
| _**WISDOM**_ | Llama3-8B | **83.2** | **59.7** | **42.2** | **25.6** | **83.0** | 28.6 | **17**/40 | **1**/30 |
| **Method** | **Base** | **GSM8K** | **MATH** | **College**† | **Olympiad** | **TabMWP** | **TheoremQA** | **AMC2023** | **AIME2024** |
|-----------------------|----------------|-----------|----------|--------------|--------------|------------|---------------|-----------|--------------|
| **DSMath-instruct** | DSMath-7B | 82.0 | 46.3 | 38.1 | 13.6 | 76.7 | 31.9 | 7/40 | 1/30 |
| **MetaMath** | DSMath-7B | 76.5 | 37.2 | 27.3 | 10.7 | 67.1 | 13.9 | 10/40 | 0/30 |
| **KPMath-Plus** | DSMath-7B | 83.9 | 48.8 | – | – | 78.7 | – | – | – |
| **DART-Math** | DSMath-7B | **87.5** | 53.9 | 40.7 | 20.0 | 82.9 | 31.5 | 8/40 | 0/30 |
| **NuminaMath** | DSMath-7B | 77.1 | 53.7 | 32.4 | 24.0 | 77.7 | 29.4 | **12**/40 | 1/30 |
| **MathScale** | DSMath-7B | 62.7 | 33.4 | 23.0 | 8.1 | 71.3 | 24.5 | 4/40 | 0/30 |
| **WISDOM** | DSMath-7B | 83.3 | **62.4** | **45.0** | **28.9** | **85.7** | **34.9** | 11/40 | **2**/30 |
## Main Results on the bigger models
| **Method** | **Base** | **GSM8K** | **MATH** | **College**† | **Olympiad** | **TabMWP** | **TheoremQA** | **AMC2023** | **AIME2024** |
|------------------------|----------------|-----------|----------|--------------|--------------|------------|---------------|-------------|--------------|
| **GPT-4o-0513** | – | 95.8 | 76.6 | – | – | – | – | – | 2/30 |
| **GPT-4-1106-preview** | – | 91.4 | 64.3 | – | – | – | – | – | 1/30 |
| **Claude-3-Opus** | – | 95.0 | 60.1 | – | – | – | – | – | 2/30 |
| **DeepSeek Coder V2** | – | 94.9 | 75.7 | – | – | – | – | – | **4**/30 |
| **Llama3-instruct** | Llama3-70B | 93.1 | 50.4 | 40.3 | 17.6 | 89.9 | 34.1 | 8/40 | 2/30 |
| **Qwen2-instruct** | Qwen2-72B | 93.6 | 69.3 | 46.8 | 35.3 | 92.4 | 42.0 | 17/40 | **4**/30 |
| **DART-Math** | Llama3-70B | 89.8 | 55.7 | 37.9 | 21.0 | 80.9 | 28.2 | 13/40 | 1/30 |
| **KPMath-Plus** | Qwen1.5-72B | 87.0 | 58.3 | – | – | 76.7 | – | – | – |
| **MetaMath** | Llama3-70B | 88.0 | 44.9 | 31.9 | 11.6 | – | 21.9 | – | – |
| **NuminaMath** | Qwen2-72B | 91.5 | 66.9 | 42.1 | 33.6 | 86.7 | 29.0 | 13/40 | **4**/30 |
| _**WISDOM**_ | Llama3-70B | 94.1 | 68.2 | 43.4 | 34.4 | 91.8 | 41.4 | 22/40 | 3/30 |
| _**WISDOM**_ | Qwen2-72B | **94.2** | **76.1** | **47.6** | **39.1** | **94.5** | **45.4** | **23/40** | 2/30 |
† Short for College MATH.
<em>Table 1: Main results on in-domain benchmarks, GSM8K and MATH, and out-of-domain benchmarks, including College MATH, Olympiad, TabMWP, TheoremQA, AMC2023, and AIME2024. We select currently well-performing LLMs and evaluate their test accuracy on these benchmarks. Since KPMath-Plus is not open-sourced, its results are quoted from the corresponding paper.</em>
## **Introduction of Paper**
We introduce _WISDOM_, which draws inspiration from the human learning process and employs curriculum learning to gradually synthesize high-quality CoT data, progressing from easy to hard.
## **Template**
All models were trained using the [Alpaca](https://github.com/tatsu-lab/stanford_alpaca) template.
```
Below is an instruction that describes a task. Write a response that appropriately completes the request.\n### Instruction:\n{question}\n\n### Response:
```
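For example, a minimal sketch of applying this template with 🤗 Transformers (the checkpoint name below is this repository; the question and generation settings are illustrative):
```python
# Format a math question with the Alpaca template above and generate a response.
from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "Wisdom-math/wisdom-qwen2-70b"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto", torch_dtype="auto")

question = "What is the sum of the first 100 positive integers?"
prompt = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n"
    f"### Instruction:\n{question}\n\n### Response:"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=512)
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```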
## **Training Setup**
### **Data Contamination**
We applied a 10-gram hash deduplication method to the questions in both our in-domain and out-of-domain benchmarks, with the condition that the longest-common-sequence ratio exceed 0.6; any detected duplicates were removed.
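A rough sketch of this filter (hypothetical helper, not the released implementation; `SequenceMatcher.ratio()` is used here as an approximation of the longest-common-sequence ratio):
```python
# Flag a question as contaminated if it shares a 10-gram hash with any benchmark
# question and their longest-common-sequence ratio exceeds 0.6.
from difflib import SequenceMatcher

def ngram_hashes(text, n=10):
    tokens = text.lower().split()
    return {hash(" ".join(tokens[i:i + n])) for i in range(len(tokens) - n + 1)}

def lcs_ratio(a, b):
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def is_contaminated(question, benchmark_questions, threshold=0.6):
    q_hashes = ngram_hashes(question)
    return any(
        q_hashes & ngram_hashes(bench) and lcs_ratio(question, bench) > threshold
        for bench in benchmark_questions
    )
```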
### **Training details**
We employed [Llama-factory](https://github.com/hiyouga/LLaMA-Factory) to fine-tune the entire suite of models and used [sequence packing](https://arxiv.org/abs/2107.02027) to accelerate training.
The training was conducted on 88 NVIDIA A800 GPUs with a batch size of 1, gradient accumulation of 2, a sequence length of 8192, and bf16 precision.
We optimized the models with the AdamW optimizer, using a cosine learning-rate schedule with a warmup ratio of 0.03, and trained each model for 3 epochs.
The learning rates were adjusted slightly for different models: Mistral 7B at 1e-5, DeepSeekMath-7B at 5e-5, Llama3-8B at 4e-5, and both Llama3-70B and Qwen2-72B at 2e-5.
|
gokulsrinivasagan/distilbert_lda_100_v1_stsb
|
gokulsrinivasagan
| 2024-12-04T14:07:40Z | 125 | 0 |
transformers
|
[
"transformers",
"tensorboard",
"safetensors",
"distilbert",
"text-classification",
"generated_from_trainer",
"en",
"dataset:glue",
"base_model:gokulsrinivasagan/distilbert_lda_100_v1",
"base_model:finetune:gokulsrinivasagan/distilbert_lda_100_v1",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-classification
| 2024-11-26T20:57:20Z |
---
library_name: transformers
language:
- en
base_model: gokulsrinivasagan/distilbert_lda_100_v1
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- spearmanr
model-index:
- name: distilbert_lda_100_v1_stsb
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: GLUE STSB
type: glue
args: stsb
metrics:
- name: Spearmanr
type: spearmanr
value: 0.7703935491023064
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert_lda_100_v1_stsb
This model is a fine-tuned version of [gokulsrinivasagan/distilbert_lda_100_v1](https://huggingface.co/gokulsrinivasagan/distilbert_lda_100_v1) on the GLUE STSB dataset.
It achieves the following results on the evaluation set:
- Loss: 0.9153
- Pearson: 0.7758
- Spearmanr: 0.7704
- Combined Score: 0.7731
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 256
- eval_batch_size: 256
- seed: 10
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 50
### Training results
| Training Loss | Epoch | Step | Validation Loss | Pearson | Spearmanr | Combined Score |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:---------:|:--------------:|
| 2.622 | 1.0 | 23 | 2.5298 | 0.1011 | 0.0891 | 0.0951 |
| 1.8404 | 2.0 | 46 | 2.3343 | 0.4642 | 0.4648 | 0.4645 |
| 1.3143 | 3.0 | 69 | 1.2509 | 0.6736 | 0.6670 | 0.6703 |
| 0.8809 | 4.0 | 92 | 1.3874 | 0.7172 | 0.7254 | 0.7213 |
| 0.6317 | 5.0 | 115 | 1.5835 | 0.7091 | 0.7238 | 0.7164 |
| 0.5139 | 6.0 | 138 | 1.2793 | 0.7443 | 0.7470 | 0.7456 |
| 0.3919 | 7.0 | 161 | 1.0238 | 0.7576 | 0.7535 | 0.7556 |
| 0.3125 | 8.0 | 184 | 1.4519 | 0.7331 | 0.7349 | 0.7340 |
| 0.281 | 9.0 | 207 | 1.2564 | 0.7390 | 0.7374 | 0.7382 |
| 0.2395 | 10.0 | 230 | 0.9153 | 0.7758 | 0.7704 | 0.7731 |
| 0.2219 | 11.0 | 253 | 1.2411 | 0.7509 | 0.7509 | 0.7509 |
| 0.1923 | 12.0 | 276 | 1.5144 | 0.7429 | 0.7444 | 0.7436 |
| 0.1688 | 13.0 | 299 | 1.0667 | 0.7518 | 0.7468 | 0.7493 |
| 0.1494 | 14.0 | 322 | 1.2371 | 0.7502 | 0.7483 | 0.7493 |
| 0.1498 | 15.0 | 345 | 1.1066 | 0.7473 | 0.7433 | 0.7453 |
### Framework versions
- Transformers 4.46.3
- Pytorch 2.2.1+cu118
- Datasets 2.17.0
- Tokenizers 0.20.3
|
gokulsrinivasagan/distilbert_lda_100_v1_sst2
|
gokulsrinivasagan
| 2024-12-04T14:03:31Z | 120 | 0 |
transformers
|
[
"transformers",
"tensorboard",
"safetensors",
"distilbert",
"text-classification",
"generated_from_trainer",
"en",
"dataset:glue",
"base_model:gokulsrinivasagan/distilbert_lda_100_v1",
"base_model:finetune:gokulsrinivasagan/distilbert_lda_100_v1",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-classification
| 2024-11-26T20:50:03Z |
---
library_name: transformers
language:
- en
base_model: gokulsrinivasagan/distilbert_lda_100_v1
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- accuracy
model-index:
- name: distilbert_lda_100_v1_sst2
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: GLUE SST2
type: glue
args: sst2
metrics:
- name: Accuracy
type: accuracy
value: 0.8268348623853211
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert_lda_100_v1_sst2
This model is a fine-tuned version of [gokulsrinivasagan/distilbert_lda_100_v1](https://huggingface.co/gokulsrinivasagan/distilbert_lda_100_v1) on the GLUE SST2 dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3907
- Accuracy: 0.8268
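A minimal usage sketch (the label names depend on the exported config and may appear as `LABEL_0`/`LABEL_1`):
```python
# Run the fine-tuned SST-2 sentiment classifier via the pipeline API.
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="gokulsrinivasagan/distilbert_lda_100_v1_sst2",
)
print(classifier("a gripping, beautifully shot film"))
```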
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 256
- eval_batch_size: 256
- seed: 10
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 50
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.3896 | 1.0 | 264 | 0.3907 | 0.8268 |
| 0.2205 | 2.0 | 528 | 0.4041 | 0.8349 |
| 0.1574 | 3.0 | 792 | 0.5309 | 0.8165 |
| 0.1151 | 4.0 | 1056 | 0.5299 | 0.8211 |
| 0.0891 | 5.0 | 1320 | 0.5801 | 0.8372 |
| 0.0677 | 6.0 | 1584 | 0.6953 | 0.8234 |
### Framework versions
- Transformers 4.46.3
- Pytorch 2.2.1+cu118
- Datasets 2.17.0
- Tokenizers 0.20.3
|
2point5p/krx-gemma-2-9b-it-X-All-1
|
2point5p
| 2024-12-04T13:59:27Z | 6 | 0 |
transformers
|
[
"transformers",
"safetensors",
"gemma2",
"text-generation",
"text-generation-inference",
"unsloth",
"trl",
"krx",
"conversational",
"en",
"base_model:unsloth/gemma-2-9b-it",
"base_model:finetune:unsloth/gemma-2-9b-it",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-generation
| 2024-12-04T03:03:03Z |
---
base_model: unsloth/gemma-2-9b-it
tags:
- text-generation-inference
- transformers
- unsloth
- gemma2
- trl
- krx
license: apache-2.0
language:
- en
---
# Uploaded model
- **Developed by:** 2point5p
- **License:** apache-2.0
- **Finetuned from model :** unsloth/gemma-2-9b-it
This gemma2 model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Huggingface's TRL library.
[<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
|
JohnFante/Florence-2-base-ft-2-feat
|
JohnFante
| 2024-12-04T13:59:19Z | 104 | 0 |
transformers
|
[
"transformers",
"safetensors",
"florence2",
"text-generation",
"custom_code",
"arxiv:1910.09700",
"autotrain_compatible",
"region:us"
] |
text-generation
| 2024-12-04T11:49:07Z |
---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
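A minimal, assumption-heavy sketch for a Florence-2-style checkpoint (the task prompt `<CAPTION>` and the image URL are placeholders and may not match this fine-tune's actual task):
```python
# Load the model with remote code enabled and run a single caption-style query.
import requests
from PIL import Image
from transformers import AutoModelForCausalLM, AutoProcessor

model_id = "JohnFante/Florence-2-base-ft-2-feat"
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)
processor = AutoProcessor.from_pretrained(model_id, trust_remote_code=True)

image = Image.open(requests.get("https://example.com/sample.jpg", stream=True).raw)
inputs = processor(text="<CAPTION>", images=image, return_tensors="pt")
generated_ids = model.generate(
    input_ids=inputs["input_ids"],
    pixel_values=inputs["pixel_values"],
    max_new_tokens=128,
)
print(processor.batch_decode(generated_ids, skip_special_tokens=False)[0])
```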
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
|
DevQuasar/mathstral-7B-v0.1-GGUF
|
DevQuasar
| 2024-12-04T13:58:22Z | 20 | 1 | null |
[
"gguf",
"text-generation",
"base_model:mistralai/Mathstral-7B-v0.1",
"base_model:quantized:mistralai/Mathstral-7B-v0.1",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
text-generation
| 2024-07-16T18:18:47Z |
---
license: apache-2.0
base_model:
- mistralai/Mathstral-7B-v0.1
pipeline_tag: text-generation
---
Quantized version of [mistralai/mathstral-7B-v0.1](https://huggingface.co/mistralai/mathstral-7B-v0.1)
I'm doing this to 'Make knowledge free for everyone', using my personal time and resources.
If you want to support my efforts please visit my ko-fi page: https://ko-fi.com/devquasar
Also feel free to visit my website https://devquasar.com/
|
gokulsrinivasagan/distilbert_lda_100_v1_rte
|
gokulsrinivasagan
| 2024-12-04T13:56:15Z | 107 | 0 |
transformers
|
[
"transformers",
"tensorboard",
"safetensors",
"distilbert",
"text-classification",
"generated_from_trainer",
"en",
"dataset:glue",
"base_model:gokulsrinivasagan/distilbert_lda_100_v1",
"base_model:finetune:gokulsrinivasagan/distilbert_lda_100_v1",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-classification
| 2024-11-26T20:48:23Z |
---
library_name: transformers
language:
- en
base_model: gokulsrinivasagan/distilbert_lda_100_v1
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- accuracy
model-index:
- name: distilbert_lda_100_v1_rte
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: GLUE RTE
type: glue
args: rte
metrics:
- name: Accuracy
type: accuracy
value: 0.5451263537906137
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert_lda_100_v1_rte
This model is a fine-tuned version of [gokulsrinivasagan/distilbert_lda_100_v1](https://huggingface.co/gokulsrinivasagan/distilbert_lda_100_v1) on the GLUE RTE dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6952
- Accuracy: 0.5451
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 256
- eval_batch_size: 256
- seed: 10
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 50
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.6947 | 1.0 | 10 | 0.6952 | 0.5451 |
| 0.6851 | 2.0 | 20 | 0.7430 | 0.4838 |
| 0.6787 | 3.0 | 30 | 0.7019 | 0.5343 |
| 0.634 | 4.0 | 40 | 0.7290 | 0.5054 |
| 0.5534 | 5.0 | 50 | 0.8035 | 0.4874 |
| 0.4215 | 6.0 | 60 | 1.0177 | 0.4693 |
### Framework versions
- Transformers 4.46.3
- Pytorch 2.2.1+cu118
- Datasets 2.17.0
- Tokenizers 0.20.3
|
gokulsrinivasagan/distilbert_lda_100_v1_qqp
|
gokulsrinivasagan
| 2024-12-04T13:55:19Z | 122 | 0 |
transformers
|
[
"transformers",
"tensorboard",
"safetensors",
"distilbert",
"text-classification",
"generated_from_trainer",
"en",
"dataset:glue",
"base_model:gokulsrinivasagan/distilbert_lda_100_v1",
"base_model:finetune:gokulsrinivasagan/distilbert_lda_100_v1",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-classification
| 2024-11-26T19:59:55Z |
---
library_name: transformers
language:
- en
base_model: gokulsrinivasagan/distilbert_lda_100_v1
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- accuracy
- f1
model-index:
- name: distilbert_lda_100_v1_qqp
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: GLUE QQP
type: glue
args: qqp
metrics:
- name: Accuracy
type: accuracy
value: 0.8599554786049963
- name: F1
type: f1
value: 0.8240412704332154
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert_lda_100_v1_qqp
This model is a fine-tuned version of [gokulsrinivasagan/distilbert_lda_100_v1](https://huggingface.co/gokulsrinivasagan/distilbert_lda_100_v1) on the GLUE QQP dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3133
- Accuracy: 0.8600
- F1: 0.8240
- Combined Score: 0.8420
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 256
- eval_batch_size: 256
- seed: 10
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 50
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | Combined Score |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|:--------------:|
| 0.4043 | 1.0 | 1422 | 0.3275 | 0.8511 | 0.7992 | 0.8252 |
| 0.2918 | 2.0 | 2844 | 0.3133 | 0.8600 | 0.8240 | 0.8420 |
| 0.2305 | 3.0 | 4266 | 0.3147 | 0.8715 | 0.8340 | 0.8527 |
| 0.179 | 4.0 | 5688 | 0.3178 | 0.8760 | 0.8279 | 0.8520 |
| 0.1389 | 5.0 | 7110 | 0.3525 | 0.8805 | 0.8365 | 0.8585 |
| 0.1067 | 6.0 | 8532 | 0.3905 | 0.8783 | 0.8409 | 0.8596 |
| 0.086 | 7.0 | 9954 | 0.4037 | 0.8788 | 0.8427 | 0.8608 |
### Framework versions
- Transformers 4.46.3
- Pytorch 2.2.1+cu118
- Datasets 2.17.0
- Tokenizers 0.20.3
|
davidbzyk/Angionator-Qwen2.5-14b-Coder-merged_16bit
|
davidbzyk
| 2024-12-04T13:46:18Z | 5 | 0 |
transformers
|
[
"transformers",
"safetensors",
"qwen2",
"text-generation",
"text-generation-inference",
"unsloth",
"trl",
"conversational",
"en",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-generation
| 2024-12-04T13:36:24Z |
---
base_model: unsloth/qwen2.5-coder-14b-instruct-bnb-4bit
tags:
- text-generation-inference
- transformers
- unsloth
- qwen2
- trl
license: apache-2.0
language:
- en
---
# Uploaded model
- **Developed by:** davidbzyk
- **License:** apache-2.0
- **Finetuned from model :** unsloth/qwen2.5-coder-14b-instruct-bnb-4bit
This qwen2 model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Huggingface's TRL library.
[<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
|
mradermacher/UNA-SimpleSmaug-34b-v1beta-GGUF
|
mradermacher
| 2024-12-04T13:45:03Z | 67 | 0 |
transformers
|
[
"transformers",
"gguf",
"UNA",
"simple-math",
"juanako",
"en",
"dataset:fblgit/simple-math",
"dataset:jondurbin/bagel-v0.3",
"base_model:fblgit/UNA-SimpleSmaug-34b-v1beta",
"base_model:quantized:fblgit/UNA-SimpleSmaug-34b-v1beta",
"license:apache-2.0",
"endpoints_compatible",
"region:us",
"conversational"
] | null | 2024-12-04T10:27:27Z |
---
base_model: fblgit/UNA-SimpleSmaug-34b-v1beta
datasets:
- fblgit/simple-math
- jondurbin/bagel-v0.3
language:
- en
library_name: transformers
license: apache-2.0
quantized_by: mradermacher
tags:
- UNA
- simple-math
- juanako
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: nicoboss -->
static quants of https://huggingface.co/fblgit/UNA-SimpleSmaug-34b-v1beta
<!-- provided-files -->
weighted/imatrix quants are available at https://huggingface.co/mradermacher/UNA-SimpleSmaug-34b-v1beta-i1-GGUF
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including on how to concatenate multi-part files.
## Provided Quants
(sorted by size, not necessarily quality. IQ-quants are often preferable over similar sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/UNA-SimpleSmaug-34b-v1beta-GGUF/resolve/main/UNA-SimpleSmaug-34b-v1beta.Q2_K.gguf) | Q2_K | 12.9 | |
| [GGUF](https://huggingface.co/mradermacher/UNA-SimpleSmaug-34b-v1beta-GGUF/resolve/main/UNA-SimpleSmaug-34b-v1beta.Q3_K_S.gguf) | Q3_K_S | 15.1 | |
| [GGUF](https://huggingface.co/mradermacher/UNA-SimpleSmaug-34b-v1beta-GGUF/resolve/main/UNA-SimpleSmaug-34b-v1beta.Q3_K_M.gguf) | Q3_K_M | 16.8 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/UNA-SimpleSmaug-34b-v1beta-GGUF/resolve/main/UNA-SimpleSmaug-34b-v1beta.Q3_K_L.gguf) | Q3_K_L | 18.2 | |
| [GGUF](https://huggingface.co/mradermacher/UNA-SimpleSmaug-34b-v1beta-GGUF/resolve/main/UNA-SimpleSmaug-34b-v1beta.IQ4_XS.gguf) | IQ4_XS | 18.7 | |
| [GGUF](https://huggingface.co/mradermacher/UNA-SimpleSmaug-34b-v1beta-GGUF/resolve/main/UNA-SimpleSmaug-34b-v1beta.Q4_K_S.gguf) | Q4_K_S | 19.7 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/UNA-SimpleSmaug-34b-v1beta-GGUF/resolve/main/UNA-SimpleSmaug-34b-v1beta.Q4_K_M.gguf) | Q4_K_M | 20.8 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/UNA-SimpleSmaug-34b-v1beta-GGUF/resolve/main/UNA-SimpleSmaug-34b-v1beta.Q5_K_S.gguf) | Q5_K_S | 23.8 | |
| [GGUF](https://huggingface.co/mradermacher/UNA-SimpleSmaug-34b-v1beta-GGUF/resolve/main/UNA-SimpleSmaug-34b-v1beta.Q5_K_M.gguf) | Q5_K_M | 24.4 | |
| [GGUF](https://huggingface.co/mradermacher/UNA-SimpleSmaug-34b-v1beta-GGUF/resolve/main/UNA-SimpleSmaug-34b-v1beta.Q6_K.gguf) | Q6_K | 28.3 | very good quality |
| [GGUF](https://huggingface.co/mradermacher/UNA-SimpleSmaug-34b-v1beta-GGUF/resolve/main/UNA-SimpleSmaug-34b-v1beta.Q8_0.gguf) | Q8_0 | 36.6 | fast, best quality |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.
<!-- end -->
|
jaeyong2/bge-m3-Thai
|
jaeyong2
| 2024-12-04T13:44:10Z | 7 | 0 | null |
[
"safetensors",
"xlm-roberta",
"th",
"base_model:BAAI/bge-m3",
"base_model:finetune:BAAI/bge-m3",
"license:mit",
"region:us"
] | null | 2024-12-04T13:40:13Z |
---
license: mit
language:
- th
base_model:
- BAAI/bge-m3
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
## Train
- H/W : colab A100 40GB
- Data : jaeyong2/Thai-emb-PreView (step : 70729)
```
!torchrun --nproc_per_node 1 \
-m FlagEmbedding.finetune.embedder.encoder_only.m3 \
--output_dir "/content/drive/My Drive/bge_thai" \
--model_name_or_path BAAI/bge-m3 \
--train_data ./train.jsonl \
--learning_rate 1e-5 \
--bf16 \
--num_train_epochs 1 \
--per_device_train_batch_size 1 \
--dataloader_drop_last True \
--temperature 0.02 \
--query_max_len 2048 \
--passage_max_len 512 \
--train_group_size 2 \
--negatives_cross_device \
--logging_steps 10 \
--save_steps 1000 \
--query_instruction_for_retrieval ""
```
## Evaluation
Code :
```
import torch
import numpy as np
from sklearn.metrics import pairwise_distances
from tqdm import tqdm
import datasets
from FlagEmbedding import BGEM3FlagModel

# Load the fine-tuned model (assumed: the checkpoint published in this repository)
fine_tuned_model = BGEM3FlagModel("jaeyong2/bge-m3-Thai", use_fp16=True)

def get_embedding(text, model):
    # Encode a single text and return its dense embedding
    with torch.no_grad():
        embedding = model.encode(text)['dense_vecs']
    return embedding

dataset = datasets.load_dataset("jaeyong2/Thai-emb-PreView")
validation_dataset = dataset["test"].select(range(1000))

def evaluate(validation_dataset):
    correct_count = 0
    for item in tqdm(validation_dataset):
        query_embedding = get_embedding(item["context"], fine_tuned_model)
        document_embedding = get_embedding(item["Title"], fine_tuned_model)
        negative_embedding = get_embedding(item["Fake Title"], fine_tuned_model)
        # Compute query-document similarity (cosine distance)
        positive_distances = pairwise_distances(query_embedding.reshape(1, -1), document_embedding.reshape(1, -1), metric="cosine")
        negative_distances = pairwise_distances(query_embedding.reshape(1, -1), negative_embedding.reshape(1, -1), metric="cosine")
        if positive_distances < negative_distances:
            correct_count += 1
    accuracy = correct_count / len(validation_dataset)
    return accuracy

results = evaluate(validation_dataset)
print(f"Validation Results: {results}")
```
Accuracy
- BAAI/bge-m3 : 0.961
- jaeyong2/bge-m3-Thai : 0.997
### License
- BAAI/bge-m3 : https://choosealicense.com/licenses/mit/
|