youssefkhalil320 committed
Commit 10bdefb · verified · 1 Parent(s): f815ada

Upload folder using huggingface_hub

checkpoint-105000/1_Pooling/config.json ADDED
@@ -0,0 +1,10 @@
+ {
+ "word_embedding_dimension": 384,
+ "pooling_mode_cls_token": false,
+ "pooling_mode_mean_tokens": true,
+ "pooling_mode_max_tokens": false,
+ "pooling_mode_mean_sqrt_len_tokens": false,
+ "pooling_mode_weightedmean_tokens": false,
+ "pooling_mode_lasttoken": false,
+ "include_prompt": true
+ }
checkpoint-105000/README.md ADDED
@@ -0,0 +1,1440 @@
+ ---
+ base_model: sentence-transformers/all-MiniLM-L6-v2
+ datasets:
+ - youssefkhalil320/pairs_three_scores_v5
+ language:
+ - en
+ library_name: sentence-transformers
+ license: apache-2.0
+ pipeline_tag: sentence-similarity
+ tags:
+ - sentence-transformers
+ - sentence-similarity
+ - feature-extraction
+ - generated_from_trainer
+ - dataset_size:80000003
+ - loss:CoSENTLoss
+ widget:
+ - source_sentence: durable pvc swim ring
+   sentences:
+   - flaky croissant
+   - urban shoes
+   - warm drinks mug
+ - source_sentence: iso mak retard capsules
+   sentences:
+   - savory baguette
+   - shea butter body cream
+   - softwheeled cruiser
+ - source_sentence: love sandra potty
+   sentences:
+   - utensil holder
+   - olive pants
+   - headwear
+ - source_sentence: dusky hair brush
+   sentences:
+   - back compartment laptop
+   - rubber feet platter
+   - honed blade knife
+ - source_sentence: nkd skn
+   sentences:
+   - fruit fragrances nail polish remover
+   - panini salmon
+   - hand drawing bag
+ ---
+
+ # all-MiniLM-L6-v8-pair_score
+
+ This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [sentence-transformers/all-MiniLM-L6-v2](https://huggingface.co/sentence-transformers/all-MiniLM-L6-v2) on the [pairs_three_scores_v5](https://huggingface.co/datasets/youssefkhalil320/pairs_three_scores_v5) dataset. It maps sentences & paragraphs to a 384-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
+
+ ## Model Details
+
+ ### Model Description
+ - **Model Type:** Sentence Transformer
+ - **Base model:** [sentence-transformers/all-MiniLM-L6-v2](https://huggingface.co/sentence-transformers/all-MiniLM-L6-v2) <!-- at revision c9745ed1d9f207416be6d2e6f8de32d1f16199bf -->
+ - **Maximum Sequence Length:** 256 tokens
+ - **Output Dimensionality:** 384 dimensions
+ - **Similarity Function:** Cosine Similarity
+ - **Training Dataset:**
+     - [pairs_three_scores_v5](https://huggingface.co/datasets/youssefkhalil320/pairs_three_scores_v5)
+ - **Language:** en
+ - **License:** apache-2.0
+
+ ### Model Sources
+
+ - **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
+ - **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
+ - **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)
+
+ ### Full Model Architecture
+
+ ```
+ SentenceTransformer(
+   (0): Transformer({'max_seq_length': 256, 'do_lower_case': False}) with Transformer model: BertModel
+   (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
+   (2): Normalize()
+ )
+ ```
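The three modules above correspond to mean pooling over the token embeddings followed by L2 normalization. For illustration only (this snippet is not taken from the repository), the same computation can be sketched manually with 🤗 Transformers; the base model id is the one named in the card, and the example phrase comes from the widget:

```python
import torch
import torch.nn.functional as F
from transformers import AutoTokenizer, AutoModel

# Base model named in the card; the fine-tuned checkpoint would normally be loaded instead.
tokenizer = AutoTokenizer.from_pretrained("sentence-transformers/all-MiniLM-L6-v2")
model = AutoModel.from_pretrained("sentence-transformers/all-MiniLM-L6-v2")

def mean_pool(token_embeddings: torch.Tensor, attention_mask: torch.Tensor) -> torch.Tensor:
    # pooling_mode_mean_tokens: average token embeddings, ignoring padding positions.
    mask = attention_mask.unsqueeze(-1).expand(token_embeddings.size()).float()
    return (token_embeddings * mask).sum(dim=1) / mask.sum(dim=1).clamp(min=1e-9)

batch = tokenizer(["durable pvc swim ring"], padding=True, truncation=True,
                  max_length=256, return_tensors="pt")
with torch.no_grad():
    out = model(**batch)

embeddings = mean_pool(out.last_hidden_state, batch["attention_mask"])  # shape (1, 384)
embeddings = F.normalize(embeddings, p=2, dim=1)  # Normalize module: unit-length vectors
```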
+
+ ## Usage
+
+ ### Direct Usage (Sentence Transformers)
+
+ First install the Sentence Transformers library:
+
+ ```bash
+ pip install -U sentence-transformers
+ ```
+
+ Then you can load this model and run inference.
+ ```python
+ from sentence_transformers import SentenceTransformer
+
+ # Download from the 🤗 Hub
+ model = SentenceTransformer("sentence_transformers_model_id")
+ # Run inference
+ sentences = [
+     'nkd skn',
+     'hand drawing bag',
+     'panini salmon',
+ ]
+ embeddings = model.encode(sentences)
+ print(embeddings.shape)
+ # [3, 384]
+
+ # Get the similarity scores for the embeddings
+ similarities = model.similarity(embeddings, embeddings)
+ print(similarities.shape)
+ # [3, 3]
+ ```
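Since the training pairs are short product and feature phrases, a natural application is ranking candidate phrases against a query. A minimal sketch using the same placeholder model id as above, with phrases taken from the widget examples:

```python
from sentence_transformers import SentenceTransformer

# Placeholder id, as in the snippet above; replace with the actual repo id or a local checkpoint path.
model = SentenceTransformer("sentence_transformers_model_id")

query = "durable pvc swim ring"
candidates = ["flaky croissant", "urban shoes", "warm drinks mug"]

# Encode the query and candidates, then rank candidates by cosine similarity.
query_emb = model.encode([query])
cand_embs = model.encode(candidates)
scores = model.similarity(query_emb, cand_embs)[0]  # one similarity score per candidate

for cand, score in sorted(zip(candidates, scores.tolist()), key=lambda x: x[1], reverse=True):
    print(f"{score:.3f}  {cand}")
```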
+
+ <!--
+ ### Direct Usage (Transformers)
+
+ <details><summary>Click to see the direct usage in Transformers</summary>
+
+ </details>
+ -->
+
+ <!--
+ ### Downstream Usage (Sentence Transformers)
+
+ You can finetune this model on your own dataset.
+
+ <details><summary>Click to expand</summary>
+
+ </details>
+ -->
+
+ <!--
+ ### Out-of-Scope Use
+
+ *List how the model may foreseeably be misused and address what users ought not to do with the model.*
+ -->
+
+ <!--
+ ## Bias, Risks and Limitations
+
+ *What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
+ -->
+
+ <!--
+ ### Recommendations
+
+ *What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
+ -->
+
+ ## Training Details
+
+ ### Training Dataset
+
+ #### pairs_three_scores_v5
+
+ * Dataset: [pairs_three_scores_v5](https://huggingface.co/datasets/youssefkhalil320/pairs_three_scores_v5) at [3d8c457](https://huggingface.co/datasets/youssefkhalil320/pairs_three_scores_v5/tree/3d8c45703846bd2adfaaf422abafbc389b283de1)
+ * Size: 80,000,003 training samples
+ * Columns: <code>sentence1</code>, <code>sentence2</code>, and <code>score</code>
+ * Approximate statistics based on the first 1000 samples:
+   |         | sentence1 | sentence2 | score |
+   |:--------|:----------|:----------|:------|
+   | type    | string    | string    | float |
+   | details | <ul><li>min: 3 tokens</li><li>mean: 6.06 tokens</li><li>max: 12 tokens</li></ul> | <ul><li>min: 3 tokens</li><li>mean: 5.71 tokens</li><li>max: 13 tokens</li></ul> | <ul><li>min: 0.0</li><li>mean: 0.11</li><li>max: 1.0</li></ul> |
+ * Samples:
+   | sentence1 | sentence2 | score |
+   |:----------|:----------|:------|
+   | <code>vanilla hair cream</code> | <code>free of paraben hair mask</code> | <code>0.5</code> |
+   | <code>nourishing shampoo</code> | <code>cumin lemon tea</code> | <code>0.0</code> |
+   | <code>safe materials pacifier</code> | <code>facial serum</code> | <code>0.5</code> |
+ * Loss: [<code>CoSENTLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cosentloss) with these parameters:
+   ```json
+   {
+       "scale": 20.0,
+       "similarity_fct": "pairwise_cos_sim"
+   }
+   ```
+
+ ### Evaluation Dataset
+
+ #### pairs_three_scores_v5
+
+ * Dataset: [pairs_three_scores_v5](https://huggingface.co/datasets/youssefkhalil320/pairs_three_scores_v5) at [3d8c457](https://huggingface.co/datasets/youssefkhalil320/pairs_three_scores_v5/tree/3d8c45703846bd2adfaaf422abafbc389b283de1)
+ * Size: 20,000,001 evaluation samples
+ * Columns: <code>sentence1</code>, <code>sentence2</code>, and <code>score</code>
+ * Approximate statistics based on the first 1000 samples:
+   |         | sentence1 | sentence2 | score |
+   |:--------|:----------|:----------|:------|
+   | type    | string    | string    | float |
+   | details | <ul><li>min: 3 tokens</li><li>mean: 6.21 tokens</li><li>max: 12 tokens</li></ul> | <ul><li>min: 3 tokens</li><li>mean: 5.75 tokens</li><li>max: 12 tokens</li></ul> | <ul><li>min: 0.0</li><li>mean: 0.11</li><li>max: 1.0</li></ul> |
+ * Samples:
+   | sentence1 | sentence2 | score |
+   |:----------|:----------|:------|
+   | <code>teddy bear toy</code> | <code>long lasting cat food</code> | <code>0.0</code> |
+   | <code>eva hair treatment</code> | <code>fresh pineapple</code> | <code>0.0</code> |
+   | <code>soft wave hair conditioner</code> | <code>hybrid seat bike</code> | <code>0.0</code> |
+ * Loss: [<code>CoSENTLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cosentloss) with these parameters:
+   ```json
+   {
+       "scale": 20.0,
+       "similarity_fct": "pairwise_cos_sim"
+   }
+   ```
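For orientation only, here is a rough sketch of how a comparable run could be set up with the Sentence Transformers trainer API. The column names (`sentence1`, `sentence2`, `score`) and the CoSENTLoss scale come from the tables above, and the hyperparameter values mirror the next section; the output directory and the evaluation split name are assumptions, not details taken from the actual training script.

```python
from datasets import load_dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
)
from sentence_transformers.losses import CoSENTLoss

# Base model and dataset named in this card.
model = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")
train_ds = load_dataset("youssefkhalil320/pairs_three_scores_v5", split="train")
eval_ds = load_dataset("youssefkhalil320/pairs_three_scores_v5", split="test")  # split name is an assumption

# CoSENTLoss with the parameters reported above (scale=20.0, pairwise cosine similarity).
loss = CoSENTLoss(model, scale=20.0)

# Mirrors the non-default hyperparameters listed in the next section.
args = SentenceTransformerTrainingArguments(
    output_dir="all-MiniLM-L6-v8-pair_score",  # assumed output directory
    num_train_epochs=1,
    per_device_train_batch_size=128,
    per_device_eval_batch_size=128,
    learning_rate=2e-5,
    warmup_ratio=0.1,
    fp16=True,
    eval_strategy="steps",
)

trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=train_ds,
    eval_dataset=eval_ds,
    loss=loss,
)
trainer.train()
```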
+
+ ### Training Hyperparameters
+ #### Non-Default Hyperparameters
+
+ - `eval_strategy`: steps
+ - `per_device_train_batch_size`: 128
+ - `per_device_eval_batch_size`: 128
+ - `learning_rate`: 2e-05
+ - `num_train_epochs`: 1
+ - `warmup_ratio`: 0.1
+ - `fp16`: True
+
211
+ #### All Hyperparameters
212
+ <details><summary>Click to expand</summary>
213
+
214
+ - `overwrite_output_dir`: False
215
+ - `do_predict`: False
216
+ - `eval_strategy`: steps
217
+ - `prediction_loss_only`: True
218
+ - `per_device_train_batch_size`: 128
219
+ - `per_device_eval_batch_size`: 128
220
+ - `per_gpu_train_batch_size`: None
221
+ - `per_gpu_eval_batch_size`: None
222
+ - `gradient_accumulation_steps`: 1
223
+ - `eval_accumulation_steps`: None
224
+ - `torch_empty_cache_steps`: None
225
+ - `learning_rate`: 2e-05
226
+ - `weight_decay`: 0.0
227
+ - `adam_beta1`: 0.9
228
+ - `adam_beta2`: 0.999
229
+ - `adam_epsilon`: 1e-08
230
+ - `max_grad_norm`: 1.0
231
+ - `num_train_epochs`: 1
232
+ - `max_steps`: -1
233
+ - `lr_scheduler_type`: linear
234
+ - `lr_scheduler_kwargs`: {}
235
+ - `warmup_ratio`: 0.1
236
+ - `warmup_steps`: 0
237
+ - `log_level`: passive
238
+ - `log_level_replica`: warning
239
+ - `log_on_each_node`: True
240
+ - `logging_nan_inf_filter`: True
241
+ - `save_safetensors`: True
242
+ - `save_on_each_node`: False
243
+ - `save_only_model`: False
244
+ - `restore_callback_states_from_checkpoint`: False
245
+ - `no_cuda`: False
246
+ - `use_cpu`: False
247
+ - `use_mps_device`: False
248
+ - `seed`: 42
249
+ - `data_seed`: None
250
+ - `jit_mode_eval`: False
251
+ - `use_ipex`: False
252
+ - `bf16`: False
253
+ - `fp16`: True
254
+ - `fp16_opt_level`: O1
255
+ - `half_precision_backend`: auto
256
+ - `bf16_full_eval`: False
257
+ - `fp16_full_eval`: False
258
+ - `tf32`: None
259
+ - `local_rank`: 0
260
+ - `ddp_backend`: None
261
+ - `tpu_num_cores`: None
262
+ - `tpu_metrics_debug`: False
263
+ - `debug`: []
264
+ - `dataloader_drop_last`: False
265
+ - `dataloader_num_workers`: 0
266
+ - `dataloader_prefetch_factor`: None
267
+ - `past_index`: -1
268
+ - `disable_tqdm`: False
269
+ - `remove_unused_columns`: True
270
+ - `label_names`: None
271
+ - `load_best_model_at_end`: False
272
+ - `ignore_data_skip`: False
273
+ - `fsdp`: []
274
+ - `fsdp_min_num_params`: 0
275
+ - `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
276
+ - `fsdp_transformer_layer_cls_to_wrap`: None
277
+ - `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
278
+ - `deepspeed`: None
279
+ - `label_smoothing_factor`: 0.0
280
+ - `optim`: adamw_torch
281
+ - `optim_args`: None
282
+ - `adafactor`: False
283
+ - `group_by_length`: False
284
+ - `length_column_name`: length
285
+ - `ddp_find_unused_parameters`: None
286
+ - `ddp_bucket_cap_mb`: None
287
+ - `ddp_broadcast_buffers`: False
288
+ - `dataloader_pin_memory`: True
289
+ - `dataloader_persistent_workers`: False
290
+ - `skip_memory_metrics`: True
291
+ - `use_legacy_prediction_loop`: False
292
+ - `push_to_hub`: False
293
+ - `resume_from_checkpoint`: None
294
+ - `hub_model_id`: None
295
+ - `hub_strategy`: every_save
296
+ - `hub_private_repo`: False
297
+ - `hub_always_push`: False
298
+ - `gradient_checkpointing`: False
299
+ - `gradient_checkpointing_kwargs`: None
300
+ - `include_inputs_for_metrics`: False
301
+ - `eval_do_concat_batches`: True
302
+ - `fp16_backend`: auto
303
+ - `push_to_hub_model_id`: None
304
+ - `push_to_hub_organization`: None
305
+ - `mp_parameters`:
306
+ - `auto_find_batch_size`: False
307
+ - `full_determinism`: False
308
+ - `torchdynamo`: None
309
+ - `ray_scope`: last
310
+ - `ddp_timeout`: 1800
311
+ - `torch_compile`: False
312
+ - `torch_compile_backend`: None
313
+ - `torch_compile_mode`: None
314
+ - `dispatch_batches`: None
315
+ - `split_batches`: None
316
+ - `include_tokens_per_second`: False
317
+ - `include_num_input_tokens_seen`: False
318
+ - `neftune_noise_alpha`: None
319
+ - `optim_target_modules`: None
320
+ - `batch_eval_metrics`: False
321
+ - `eval_on_start`: False
322
+ - `use_liger_kernel`: False
323
+ - `eval_use_gather_object`: False
324
+ - `batch_sampler`: batch_sampler
325
+ - `multi_dataset_batch_sampler`: proportional
326
+
327
+ </details>
328
+
329
+ ### Training Logs
330
+ <details><summary>Click to expand</summary>
331
+
332
+ | Epoch | Step | Training Loss |
333
+ |:------:|:------:|:-------------:|
334
+ | 0.0002 | 100 | 10.8792 |
335
+ | 0.0003 | 200 | 10.9284 |
336
+ | 0.0005 | 300 | 10.6466 |
337
+ | 0.0006 | 400 | 10.841 |
338
+ | 0.0008 | 500 | 10.8094 |
339
+ | 0.0010 | 600 | 10.4323 |
340
+ | 0.0011 | 700 | 10.3032 |
341
+ | 0.0013 | 800 | 10.4006 |
342
+ | 0.0014 | 900 | 10.4743 |
343
+ | 0.0016 | 1000 | 10.2334 |
344
+ | 0.0018 | 1100 | 10.0135 |
345
+ | 0.0019 | 1200 | 9.7874 |
346
+ | 0.0021 | 1300 | 9.7419 |
347
+ | 0.0022 | 1400 | 9.7412 |
348
+ | 0.0024 | 1500 | 9.4585 |
349
+ | 0.0026 | 1600 | 9.5339 |
350
+ | 0.0027 | 1700 | 9.4345 |
351
+ | 0.0029 | 1800 | 9.1733 |
352
+ | 0.0030 | 1900 | 8.9952 |
353
+ | 0.0032 | 2000 | 8.9669 |
354
+ | 0.0034 | 2100 | 8.8152 |
355
+ | 0.0035 | 2200 | 8.7936 |
356
+ | 0.0037 | 2300 | 8.6771 |
357
+ | 0.0038 | 2400 | 8.4648 |
358
+ | 0.0040 | 2500 | 8.5764 |
359
+ | 0.0042 | 2600 | 8.4587 |
360
+ | 0.0043 | 2700 | 8.2966 |
361
+ | 0.0045 | 2800 | 8.2329 |
362
+ | 0.0046 | 2900 | 8.1415 |
363
+ | 0.0048 | 3000 | 8.0404 |
364
+ | 0.0050 | 3100 | 7.9698 |
365
+ | 0.0051 | 3200 | 7.9205 |
366
+ | 0.0053 | 3300 | 7.8314 |
367
+ | 0.0054 | 3400 | 7.8369 |
368
+ | 0.0056 | 3500 | 7.6403 |
369
+ | 0.0058 | 3600 | 7.5842 |
370
+ | 0.0059 | 3700 | 7.5812 |
371
+ | 0.0061 | 3800 | 7.4335 |
372
+ | 0.0062 | 3900 | 7.4917 |
373
+ | 0.0064 | 4000 | 7.3204 |
374
+ | 0.0066 | 4100 | 7.2971 |
375
+ | 0.0067 | 4200 | 7.2233 |
376
+ | 0.0069 | 4300 | 7.2081 |
377
+ | 0.0070 | 4400 | 7.1364 |
378
+ | 0.0072 | 4500 | 7.0663 |
379
+ | 0.0074 | 4600 | 6.9601 |
380
+ | 0.0075 | 4700 | 6.9546 |
381
+ | 0.0077 | 4800 | 6.9019 |
382
+ | 0.0078 | 4900 | 6.8801 |
383
+ | 0.0080 | 5000 | 6.7734 |
384
+ | 0.0082 | 5100 | 6.7648 |
385
+ | 0.0083 | 5200 | 6.7498 |
386
+ | 0.0085 | 5300 | 6.6872 |
387
+ | 0.0086 | 5400 | 6.6264 |
388
+ | 0.0088 | 5500 | 6.579 |
389
+ | 0.0090 | 5600 | 6.6001 |
390
+ | 0.0091 | 5700 | 6.5971 |
391
+ | 0.0093 | 5800 | 6.4694 |
392
+ | 0.0094 | 5900 | 6.3983 |
393
+ | 0.0096 | 6000 | 6.4477 |
394
+ | 0.0098 | 6100 | 6.4308 |
395
+ | 0.0099 | 6200 | 6.4248 |
396
+ | 0.0101 | 6300 | 6.2642 |
397
+ | 0.0102 | 6400 | 6.2763 |
398
+ | 0.0104 | 6500 | 6.3878 |
399
+ | 0.0106 | 6600 | 6.2601 |
400
+ | 0.0107 | 6700 | 6.1789 |
401
+ | 0.0109 | 6800 | 6.1773 |
402
+ | 0.0110 | 6900 | 6.1439 |
403
+ | 0.0112 | 7000 | 6.1863 |
404
+ | 0.0114 | 7100 | 6.0513 |
405
+ | 0.0115 | 7200 | 6.0671 |
406
+ | 0.0117 | 7300 | 6.0212 |
407
+ | 0.0118 | 7400 | 6.0043 |
408
+ | 0.0120 | 7500 | 6.0166 |
409
+ | 0.0122 | 7600 | 5.9754 |
410
+ | 0.0123 | 7700 | 5.9211 |
411
+ | 0.0125 | 7800 | 5.7867 |
412
+ | 0.0126 | 7900 | 5.8534 |
413
+ | 0.0128 | 8000 | 5.7708 |
414
+ | 0.0130 | 8100 | 5.8328 |
415
+ | 0.0131 | 8200 | 5.7417 |
416
+ | 0.0133 | 8300 | 5.8097 |
417
+ | 0.0134 | 8400 | 5.7578 |
418
+ | 0.0136 | 8500 | 5.643 |
419
+ | 0.0138 | 8600 | 5.6401 |
420
+ | 0.0139 | 8700 | 5.6627 |
421
+ | 0.0141 | 8800 | 5.6167 |
422
+ | 0.0142 | 8900 | 5.6539 |
423
+ | 0.0144 | 9000 | 5.4513 |
424
+ | 0.0146 | 9100 | 5.4132 |
425
+ | 0.0147 | 9200 | 5.4714 |
426
+ | 0.0149 | 9300 | 5.4786 |
427
+ | 0.0150 | 9400 | 5.3928 |
428
+ | 0.0152 | 9500 | 5.4774 |
429
+ | 0.0154 | 9600 | 5.2881 |
430
+ | 0.0155 | 9700 | 5.3699 |
431
+ | 0.0157 | 9800 | 5.1483 |
432
+ | 0.0158 | 9900 | 5.3051 |
433
+ | 0.0160 | 10000 | 5.2546 |
434
+ | 0.0162 | 10100 | 5.2314 |
435
+ | 0.0163 | 10200 | 5.1783 |
436
+ | 0.0165 | 10300 | 5.2074 |
437
+ | 0.0166 | 10400 | 5.2825 |
438
+ | 0.0168 | 10500 | 5.1715 |
439
+ | 0.0170 | 10600 | 5.087 |
440
+ | 0.0171 | 10700 | 5.082 |
441
+ | 0.0173 | 10800 | 4.9111 |
442
+ | 0.0174 | 10900 | 5.0213 |
443
+ | 0.0176 | 11000 | 4.9898 |
444
+ | 0.0178 | 11100 | 4.7734 |
445
+ | 0.0179 | 11200 | 4.9511 |
446
+ | 0.0181 | 11300 | 5.0481 |
447
+ | 0.0182 | 11400 | 4.8441 |
448
+ | 0.0184 | 11500 | 4.873 |
449
+ | 0.0186 | 11600 | 4.9988 |
450
+ | 0.0187 | 11700 | 4.7653 |
451
+ | 0.0189 | 11800 | 4.804 |
452
+ | 0.0190 | 11900 | 4.8288 |
453
+ | 0.0192 | 12000 | 4.7053 |
454
+ | 0.0194 | 12100 | 4.6887 |
455
+ | 0.0195 | 12200 | 4.7832 |
456
+ | 0.0197 | 12300 | 4.6817 |
457
+ | 0.0198 | 12400 | 4.6252 |
458
+ | 0.0200 | 12500 | 4.5936 |
459
+ | 0.0202 | 12600 | 4.7452 |
460
+ | 0.0203 | 12700 | 4.5321 |
461
+ | 0.0205 | 12800 | 4.4964 |
462
+ | 0.0206 | 12900 | 4.4421 |
463
+ | 0.0208 | 13000 | 4.3782 |
464
+ | 0.0210 | 13100 | 4.5169 |
465
+ | 0.0211 | 13200 | 4.533 |
466
+ | 0.0213 | 13300 | 4.3725 |
467
+ | 0.0214 | 13400 | 4.2911 |
468
+ | 0.0216 | 13500 | 4.2261 |
469
+ | 0.0218 | 13600 | 4.2467 |
470
+ | 0.0219 | 13700 | 4.1558 |
471
+ | 0.0221 | 13800 | 4.2794 |
472
+ | 0.0222 | 13900 | 4.2383 |
473
+ | 0.0224 | 14000 | 4.1654 |
474
+ | 0.0226 | 14100 | 4.158 |
475
+ | 0.0227 | 14200 | 4.1299 |
476
+ | 0.0229 | 14300 | 4.1902 |
477
+ | 0.0230 | 14400 | 3.7853 |
478
+ | 0.0232 | 14500 | 4.0514 |
479
+ | 0.0234 | 14600 | 4.1655 |
480
+ | 0.0235 | 14700 | 4.051 |
481
+ | 0.0237 | 14800 | 4.078 |
482
+ | 0.0238 | 14900 | 4.1193 |
483
+ | 0.0240 | 15000 | 4.1536 |
484
+ | 0.0242 | 15100 | 3.935 |
485
+ | 0.0243 | 15200 | 3.9535 |
486
+ | 0.0245 | 15300 | 3.7051 |
487
+ | 0.0246 | 15400 | 3.8329 |
488
+ | 0.0248 | 15500 | 3.9412 |
489
+ | 0.0250 | 15600 | 3.6668 |
490
+ | 0.0251 | 15700 | 3.7758 |
491
+ | 0.0253 | 15800 | 3.8805 |
492
+ | 0.0254 | 15900 | 3.8848 |
493
+ | 0.0256 | 16000 | 3.75 |
494
+ | 0.0258 | 16100 | 3.5685 |
495
+ | 0.0259 | 16200 | 3.7016 |
496
+ | 0.0261 | 16300 | 4.0955 |
497
+ | 0.0262 | 16400 | 3.7577 |
498
+ | 0.0264 | 16500 | 3.7485 |
499
+ | 0.0266 | 16600 | 3.8263 |
500
+ | 0.0267 | 16700 | 3.6922 |
501
+ | 0.0269 | 16800 | 3.6568 |
502
+ | 0.0270 | 16900 | 3.7317 |
503
+ | 0.0272 | 17000 | 3.5089 |
504
+ | 0.0274 | 17100 | 3.7377 |
505
+ | 0.0275 | 17200 | 3.6206 |
506
+ | 0.0277 | 17300 | 3.3702 |
507
+ | 0.0278 | 17400 | 3.5126 |
508
+ | 0.0280 | 17500 | 3.4841 |
509
+ | 0.0282 | 17600 | 3.1464 |
510
+ | 0.0283 | 17700 | 3.7012 |
511
+ | 0.0285 | 17800 | 3.5802 |
512
+ | 0.0286 | 17900 | 3.4952 |
513
+ | 0.0288 | 18000 | 3.1174 |
514
+ | 0.0290 | 18100 | 3.3134 |
515
+ | 0.0291 | 18200 | 3.3578 |
516
+ | 0.0293 | 18300 | 3.0209 |
517
+ | 0.0294 | 18400 | 3.3796 |
518
+ | 0.0296 | 18500 | 3.2287 |
519
+ | 0.0298 | 18600 | 3.1537 |
520
+ | 0.0299 | 18700 | 2.9073 |
521
+ | 0.0301 | 18800 | 3.3444 |
522
+ | 0.0302 | 18900 | 3.1341 |
523
+ | 0.0304 | 19000 | 2.8862 |
524
+ | 0.0306 | 19100 | 3.2033 |
525
+ | 0.0307 | 19200 | 3.2764 |
526
+ | 0.0309 | 19300 | 3.0725 |
527
+ | 0.0310 | 19400 | 3.0436 |
528
+ | 0.0312 | 19500 | 3.3493 |
529
+ | 0.0314 | 19600 | 3.0141 |
530
+ | 0.0315 | 19700 | 2.779 |
531
+ | 0.0317 | 19800 | 3.3543 |
532
+ | 0.0318 | 19900 | 3.1526 |
533
+ | 0.0320 | 20000 | 2.7896 |
534
+ | 0.0322 | 20100 | 2.9398 |
535
+ | 0.0323 | 20200 | 3.1254 |
536
+ | 0.0325 | 20300 | 2.8832 |
537
+ | 0.0326 | 20400 | 3.0542 |
538
+ | 0.0328 | 20500 | 2.9722 |
539
+ | 0.0330 | 20600 | 2.9321 |
540
+ | 0.0331 | 20700 | 2.6448 |
541
+ | 0.0333 | 20800 | 3.4006 |
542
+ | 0.0334 | 20900 | 3.0022 |
543
+ | 0.0336 | 21000 | 2.6366 |
544
+ | 0.0338 | 21100 | 3.0112 |
545
+ | 0.0339 | 21200 | 2.7856 |
546
+ | 0.0341 | 21300 | 3.0967 |
547
+ | 0.0342 | 21400 | 2.8754 |
548
+ | 0.0344 | 21500 | 3.1269 |
549
+ | 0.0346 | 21600 | 2.8235 |
550
+ | 0.0347 | 21700 | 2.4912 |
551
+ | 0.0349 | 21800 | 2.5079 |
552
+ | 0.0350 | 21900 | 3.2942 |
553
+ | 0.0352 | 22000 | 2.4184 |
554
+ | 0.0354 | 22100 | 2.782 |
555
+ | 0.0355 | 22200 | 2.7652 |
556
+ | 0.0357 | 22300 | 3.113 |
557
+ | 0.0358 | 22400 | 2.7451 |
558
+ | 0.0360 | 22500 | 2.7473 |
559
+ | 0.0362 | 22600 | 2.5116 |
560
+ | 0.0363 | 22700 | 2.8531 |
561
+ | 0.0365 | 22800 | 2.9171 |
562
+ | 0.0366 | 22900 | 2.7954 |
563
+ | 0.0368 | 23000 | 2.5376 |
564
+ | 0.0370 | 23100 | 3.2488 |
565
+ | 0.0371 | 23200 | 2.6131 |
566
+ | 0.0373 | 23300 | 3.1343 |
567
+ | 0.0374 | 23400 | 2.3159 |
568
+ | 0.0376 | 23500 | 2.4225 |
569
+ | 0.0378 | 23600 | 2.5034 |
570
+ | 0.0379 | 23700 | 3.0067 |
571
+ | 0.0381 | 23800 | 2.313 |
572
+ | 0.0382 | 23900 | 2.5363 |
573
+ | 0.0384 | 24000 | 2.7929 |
574
+ | 0.0386 | 24100 | 2.617 |
575
+ | 0.0387 | 24200 | 2.9711 |
576
+ | 0.0389 | 24300 | 2.7726 |
577
+ | 0.0390 | 24400 | 2.5849 |
578
+ | 0.0392 | 24500 | 2.3231 |
579
+ | 0.0394 | 24600 | 2.2477 |
580
+ | 0.0395 | 24700 | 2.5487 |
581
+ | 0.0397 | 24800 | 2.5175 |
582
+ | 0.0398 | 24900 | 2.6758 |
583
+ | 0.0400 | 25000 | 2.7313 |
584
+ | 0.0402 | 25100 | 2.4846 |
585
+ | 0.0403 | 25200 | 2.8697 |
586
+ | 0.0405 | 25300 | 2.5289 |
587
+ | 0.0406 | 25400 | 2.235 |
588
+ | 0.0408 | 25500 | 2.5028 |
589
+ | 0.0410 | 25600 | 2.6295 |
590
+ | 0.0411 | 25700 | 2.6159 |
591
+ | 0.0413 | 25800 | 2.4447 |
592
+ | 0.0414 | 25900 | 2.7233 |
593
+ | 0.0416 | 26000 | 2.5651 |
594
+ | 0.0418 | 26100 | 2.1317 |
595
+ | 0.0419 | 26200 | 2.6157 |
596
+ | 0.0421 | 26300 | 2.7385 |
597
+ | 0.0422 | 26400 | 2.4642 |
598
+ | 0.0424 | 26500 | 2.0621 |
599
+ | 0.0426 | 26600 | 2.3864 |
600
+ | 0.0427 | 26700 | 2.6951 |
601
+ | 0.0429 | 26800 | 2.2628 |
602
+ | 0.0430 | 26900 | 2.7538 |
603
+ | 0.0432 | 27000 | 2.6871 |
604
+ | 0.0434 | 27100 | 2.2453 |
605
+ | 0.0435 | 27200 | 1.6334 |
606
+ | 0.0437 | 27300 | 2.666 |
607
+ | 0.0438 | 27400 | 2.128 |
608
+ | 0.0440 | 27500 | 2.7573 |
609
+ | 0.0442 | 27600 | 2.5276 |
610
+ | 0.0443 | 27700 | 2.2438 |
611
+ | 0.0445 | 27800 | 2.3156 |
612
+ | 0.0446 | 27900 | 2.1735 |
613
+ | 0.0448 | 28000 | 2.1733 |
614
+ | 0.0450 | 28100 | 2.4094 |
615
+ | 0.0451 | 28200 | 2.8484 |
616
+ | 0.0453 | 28300 | 2.4507 |
617
+ | 0.0454 | 28400 | 2.6822 |
618
+ | 0.0456 | 28500 | 2.1191 |
619
+ | 0.0458 | 28600 | 2.0696 |
620
+ | 0.0459 | 28700 | 2.4027 |
621
+ | 0.0461 | 28800 | 1.7958 |
622
+ | 0.0462 | 28900 | 2.5874 |
623
+ | 0.0464 | 29000 | 2.2679 |
624
+ | 0.0466 | 29100 | 2.6394 |
625
+ | 0.0467 | 29200 | 1.7998 |
626
+ | 0.0469 | 29300 | 2.6834 |
627
+ | 0.0470 | 29400 | 2.1242 |
628
+ | 0.0472 | 29500 | 2.0039 |
629
+ | 0.0474 | 29600 | 2.018 |
630
+ | 0.0475 | 29700 | 2.9357 |
631
+ | 0.0477 | 29800 | 2.1914 |
632
+ | 0.0478 | 29900 | 2.0968 |
633
+ | 0.0480 | 30000 | 1.9762 |
634
+ | 0.0482 | 30100 | 2.1436 |
635
+ | 0.0483 | 30200 | 2.1919 |
636
+ | 0.0485 | 30300 | 1.9683 |
637
+ | 0.0486 | 30400 | 2.3543 |
638
+ | 0.0488 | 30500 | 2.0642 |
639
+ | 0.0490 | 30600 | 1.8447 |
640
+ | 0.0491 | 30700 | 2.3467 |
641
+ | 0.0493 | 30800 | 2.6461 |
642
+ | 0.0494 | 30900 | 2.028 |
643
+ | 0.0496 | 31000 | 1.4188 |
644
+ | 0.0498 | 31100 | 2.7219 |
645
+ | 0.0499 | 31200 | 2.2345 |
646
+ | 0.0501 | 31300 | 2.201 |
647
+ | 0.0502 | 31400 | 2.092 |
648
+ | 0.0504 | 31500 | 2.2871 |
649
+ | 0.0506 | 31600 | 2.0167 |
650
+ | 0.0507 | 31700 | 1.9175 |
651
+ | 0.0509 | 31800 | 2.2229 |
652
+ | 0.0510 | 31900 | 2.1196 |
653
+ | 0.0512 | 32000 | 2.2192 |
654
+ | 0.0514 | 32100 | 1.6462 |
655
+ | 0.0515 | 32200 | 2.099 |
656
+ | 0.0517 | 32300 | 2.0914 |
657
+ | 0.0518 | 32400 | 2.3295 |
658
+ | 0.0520 | 32500 | 2.256 |
659
+ | 0.0522 | 32600 | 1.7662 |
660
+ | 0.0523 | 32700 | 1.7234 |
661
+ | 0.0525 | 32800 | 1.984 |
662
+ | 0.0526 | 32900 | 2.1815 |
663
+ | 0.0528 | 33000 | 1.4987 |
664
+ | 0.0530 | 33100 | 2.0034 |
665
+ | 0.0531 | 33200 | 2.6008 |
666
+ | 0.0533 | 33300 | 2.4585 |
667
+ | 0.0534 | 33400 | 1.881 |
668
+ | 0.0536 | 33500 | 1.8738 |
669
+ | 0.0538 | 33600 | 1.9726 |
670
+ | 0.0539 | 33700 | 2.3734 |
671
+ | 0.0541 | 33800 | 1.6898 |
672
+ | 0.0542 | 33900 | 2.2171 |
673
+ | 0.0544 | 34000 | 1.4453 |
674
+ | 0.0546 | 34100 | 1.5057 |
675
+ | 0.0547 | 34200 | 2.1497 |
676
+ | 0.0549 | 34300 | 1.8618 |
677
+ | 0.0550 | 34400 | 1.7878 |
678
+ | 0.0552 | 34500 | 1.8199 |
679
+ | 0.0554 | 34600 | 2.1649 |
680
+ | 0.0555 | 34700 | 1.7906 |
681
+ | 0.0557 | 34800 | 1.6816 |
682
+ | 0.0558 | 34900 | 2.1464 |
683
+ | 0.0560 | 35000 | 2.0039 |
684
+ | 0.0562 | 35100 | 1.735 |
685
+ | 0.0563 | 35200 | 1.853 |
686
+ | 0.0565 | 35300 | 1.6068 |
687
+ | 0.0566 | 35400 | 1.6349 |
688
+ | 0.0568 | 35500 | 1.9571 |
689
+ | 0.0570 | 35600 | 1.5854 |
690
+ | 0.0571 | 35700 | 1.9756 |
691
+ | 0.0573 | 35800 | 1.9816 |
692
+ | 0.0574 | 35900 | 1.6758 |
693
+ | 0.0576 | 36000 | 2.2583 |
694
+ | 0.0578 | 36100 | 1.7584 |
695
+ | 0.0579 | 36200 | 1.9894 |
696
+ | 0.0581 | 36300 | 2.3922 |
697
+ | 0.0582 | 36400 | 2.0077 |
698
+ | 0.0584 | 36500 | 2.3684 |
699
+ | 0.0586 | 36600 | 2.1103 |
700
+ | 0.0587 | 36700 | 2.0728 |
701
+ | 0.0589 | 36800 | 1.9364 |
702
+ | 0.0590 | 36900 | 2.5203 |
703
+ | 0.0592 | 37000 | 1.8473 |
704
+ | 0.0594 | 37100 | 1.8076 |
705
+ | 0.0595 | 37200 | 2.0157 |
706
+ | 0.0597 | 37300 | 2.1587 |
707
+ | 0.0598 | 37400 | 1.9825 |
708
+ | 0.0600 | 37500 | 2.0693 |
709
+ | 0.0602 | 37600 | 1.5505 |
710
+ | 0.0603 | 37700 | 1.5472 |
711
+ | 0.0605 | 37800 | 2.0568 |
712
+ | 0.0606 | 37900 | 1.9219 |
713
+ | 0.0608 | 38000 | 2.091 |
714
+ | 0.0610 | 38100 | 2.0523 |
715
+ | 0.0611 | 38200 | 1.7628 |
716
+ | 0.0613 | 38300 | 1.8753 |
717
+ | 0.0614 | 38400 | 1.846 |
718
+ | 0.0616 | 38500 | 1.803 |
719
+ | 0.0618 | 38600 | 2.1226 |
720
+ | 0.0619 | 38700 | 2.0906 |
721
+ | 0.0621 | 38800 | 1.4321 |
722
+ | 0.0622 | 38900 | 2.5214 |
723
+ | 0.0624 | 39000 | 1.5412 |
724
+ | 0.0626 | 39100 | 1.4382 |
725
+ | 0.0627 | 39200 | 1.8417 |
726
+ | 0.0629 | 39300 | 2.1105 |
727
+ | 0.0630 | 39400 | 1.6347 |
728
+ | 0.0632 | 39500 | 2.0372 |
729
+ | 0.0634 | 39600 | 1.6222 |
730
+ | 0.0635 | 39700 | 1.8033 |
731
+ | 0.0637 | 39800 | 1.9847 |
732
+ | 0.0638 | 39900 | 2.1354 |
733
+ | 0.0640 | 40000 | 1.6792 |
734
+ | 0.0642 | 40100 | 2.1055 |
735
+ | 0.0643 | 40200 | 2.0657 |
736
+ | 0.0645 | 40300 | 1.9618 |
737
+ | 0.0646 | 40400 | 1.5807 |
738
+ | 0.0648 | 40500 | 1.6451 |
739
+ | 0.0650 | 40600 | 2.1299 |
740
+ | 0.0651 | 40700 | 1.9912 |
741
+ | 0.0653 | 40800 | 1.6392 |
742
+ | 0.0654 | 40900 | 1.8049 |
743
+ | 0.0656 | 41000 | 1.9832 |
744
+ | 0.0658 | 41100 | 2.0309 |
745
+ | 0.0659 | 41200 | 1.8362 |
746
+ | 0.0661 | 41300 | 2.2709 |
747
+ | 0.0662 | 41400 | 2.0785 |
748
+ | 0.0664 | 41500 | 1.5627 |
749
+ | 0.0666 | 41600 | 1.6058 |
750
+ | 0.0667 | 41700 | 1.7099 |
751
+ | 0.0669 | 41800 | 1.7096 |
752
+ | 0.0670 | 41900 | 1.6429 |
753
+ | 0.0672 | 42000 | 1.2514 |
754
+ | 0.0674 | 42100 | 1.5746 |
755
+ | 0.0675 | 42200 | 1.7186 |
756
+ | 0.0677 | 42300 | 1.8152 |
757
+ | 0.0678 | 42400 | 1.705 |
758
+ | 0.0680 | 42500 | 1.6779 |
759
+ | 0.0682 | 42600 | 1.8157 |
760
+ | 0.0683 | 42700 | 1.8464 |
761
+ | 0.0685 | 42800 | 1.748 |
762
+ | 0.0686 | 42900 | 1.6836 |
763
+ | 0.0688 | 43000 | 1.65 |
764
+ | 0.0690 | 43100 | 1.5632 |
765
+ | 0.0691 | 43200 | 2.0987 |
766
+ | 0.0693 | 43300 | 1.5783 |
767
+ | 0.0694 | 43400 | 1.8029 |
768
+ | 0.0696 | 43500 | 1.7154 |
769
+ | 0.0698 | 43600 | 1.663 |
770
+ | 0.0699 | 43700 | 1.4403 |
771
+ | 0.0701 | 43800 | 1.6513 |
772
+ | 0.0702 | 43900 | 2.2041 |
773
+ | 0.0704 | 44000 | 2.3908 |
774
+ | 0.0706 | 44100 | 1.7153 |
775
+ | 0.0707 | 44200 | 2.2112 |
776
+ | 0.0709 | 44300 | 1.8663 |
777
+ | 0.0710 | 44400 | 1.8206 |
778
+ | 0.0712 | 44500 | 2.2269 |
779
+ | 0.0714 | 44600 | 1.8159 |
780
+ | 0.0715 | 44700 | 1.9257 |
781
+ | 0.0717 | 44800 | 2.087 |
782
+ | 0.0718 | 44900 | 1.3623 |
783
+ | 0.0720 | 45000 | 1.5747 |
784
+ | 0.0722 | 45100 | 1.8051 |
785
+ | 0.0723 | 45200 | 2.3691 |
786
+ | 0.0725 | 45300 | 2.1125 |
787
+ | 0.0726 | 45400 | 1.566 |
788
+ | 0.0728 | 45500 | 1.5042 |
789
+ | 0.0730 | 45600 | 1.9469 |
790
+ | 0.0731 | 45700 | 1.9346 |
791
+ | 0.0733 | 45800 | 1.4362 |
792
+ | 0.0734 | 45900 | 1.9164 |
793
+ | 0.0736 | 46000 | 1.511 |
794
+ | 0.0738 | 46100 | 1.4523 |
795
+ | 0.0739 | 46200 | 1.1247 |
796
+ | 0.0741 | 46300 | 1.9694 |
797
+ | 0.0742 | 46400 | 2.1909 |
798
+ | 0.0744 | 46500 | 2.0247 |
799
+ | 0.0746 | 46600 | 1.2061 |
800
+ | 0.0747 | 46700 | 1.6151 |
801
+ | 0.0749 | 46800 | 1.6184 |
802
+ | 0.0750 | 46900 | 2.0375 |
803
+ | 0.0752 | 47000 | 1.8357 |
804
+ | 0.0754 | 47100 | 1.7605 |
805
+ | 0.0755 | 47200 | 2.1139 |
806
+ | 0.0757 | 47300 | 1.2971 |
807
+ | 0.0758 | 47400 | 1.7242 |
808
+ | 0.0760 | 47500 | 1.2726 |
809
+ | 0.0762 | 47600 | 1.9947 |
810
+ | 0.0763 | 47700 | 2.2796 |
811
+ | 0.0765 | 47800 | 1.6232 |
812
+ | 0.0766 | 47900 | 1.3513 |
813
+ | 0.0768 | 48000 | 1.291 |
814
+ | 0.0770 | 48100 | 1.5954 |
815
+ | 0.0771 | 48200 | 1.6232 |
816
+ | 0.0773 | 48300 | 1.8858 |
817
+ | 0.0774 | 48400 | 1.6235 |
818
+ | 0.0776 | 48500 | 1.9061 |
819
+ | 0.0778 | 48600 | 1.5919 |
820
+ | 0.0779 | 48700 | 1.8474 |
821
+ | 0.0781 | 48800 | 1.7112 |
822
+ | 0.0782 | 48900 | 1.8007 |
823
+ | 0.0784 | 49000 | 1.7499 |
824
+ | 0.0786 | 49100 | 1.4046 |
825
+ | 0.0787 | 49200 | 2.0843 |
826
+ | 0.0789 | 49300 | 1.52 |
827
+ | 0.0790 | 49400 | 1.8708 |
828
+ | 0.0792 | 49500 | 1.673 |
829
+ | 0.0794 | 49600 | 1.8457 |
830
+ | 0.0795 | 49700 | 1.5627 |
831
+ | 0.0797 | 49800 | 1.6497 |
832
+ | 0.0798 | 49900 | 1.5787 |
833
+ | 0.0800 | 50000 | 1.8507 |
834
+ | 0.0802 | 50100 | 1.4336 |
835
+ | 0.0803 | 50200 | 2.152 |
836
+ | 0.0805 | 50300 | 1.6311 |
837
+ | 0.0806 | 50400 | 1.7442 |
838
+ | 0.0808 | 50500 | 1.8063 |
839
+ | 0.0810 | 50600 | 1.4 |
840
+ | 0.0811 | 50700 | 1.6401 |
841
+ | 0.0813 | 50800 | 1.9426 |
842
+ | 0.0814 | 50900 | 2.0937 |
843
+ | 0.0816 | 51000 | 1.8187 |
844
+ | 0.0818 | 51100 | 2.1751 |
845
+ | 0.0819 | 51200 | 2.1703 |
846
+ | 0.0821 | 51300 | 1.4443 |
847
+ | 0.0822 | 51400 | 1.9266 |
848
+ | 0.0824 | 51500 | 1.8226 |
849
+ | 0.0826 | 51600 | 1.4394 |
850
+ | 0.0827 | 51700 | 1.052 |
851
+ | 0.0829 | 51800 | 1.0614 |
852
+ | 0.0830 | 51900 | 1.4591 |
853
+ | 0.0832 | 52000 | 1.6479 |
854
+ | 0.0834 | 52100 | 1.7548 |
855
+ | 0.0835 | 52200 | 1.6293 |
856
+ | 0.0837 | 52300 | 1.7183 |
857
+ | 0.0838 | 52400 | 1.2329 |
858
+ | 0.0840 | 52500 | 1.5292 |
859
+ | 0.0842 | 52600 | 1.6752 |
860
+ | 0.0843 | 52700 | 1.3228 |
861
+ | 0.0845 | 52800 | 1.485 |
862
+ | 0.0846 | 52900 | 1.4228 |
863
+ | 0.0848 | 53000 | 1.1385 |
864
+ | 0.0850 | 53100 | 1.1812 |
865
+ | 0.0851 | 53200 | 1.4763 |
866
+ | 0.0853 | 53300 | 1.9444 |
867
+ | 0.0854 | 53400 | 1.5316 |
868
+ | 0.0856 | 53500 | 1.6928 |
869
+ | 0.0858 | 53600 | 1.4466 |
870
+ | 0.0859 | 53700 | 1.438 |
871
+ | 0.0861 | 53800 | 1.1629 |
872
+ | 0.0862 | 53900 | 1.3017 |
873
+ | 0.0864 | 54000 | 1.6614 |
874
+ | 0.0866 | 54100 | 1.4535 |
875
+ | 0.0867 | 54200 | 1.7061 |
876
+ | 0.0869 | 54300 | 1.4681 |
877
+ | 0.0870 | 54400 | 1.3449 |
878
+ | 0.0872 | 54500 | 1.8814 |
879
+ | 0.0874 | 54600 | 1.5989 |
880
+ | 0.0875 | 54700 | 1.3711 |
881
+ | 0.0877 | 54800 | 1.3199 |
882
+ | 0.0878 | 54900 | 1.3713 |
883
+ | 0.0880 | 55000 | 1.441 |
884
+ | 0.0882 | 55100 | 1.268 |
885
+ | 0.0883 | 55200 | 1.1648 |
886
+ | 0.0885 | 55300 | 1.8108 |
887
+ | 0.0886 | 55400 | 1.4904 |
888
+ | 0.0888 | 55500 | 1.2555 |
889
+ | 0.0890 | 55600 | 1.2733 |
890
+ | 0.0891 | 55700 | 1.5194 |
891
+ | 0.0893 | 55800 | 1.7587 |
892
+ | 0.0894 | 55900 | 1.6183 |
893
+ | 0.0896 | 56000 | 1.3596 |
894
+ | 0.0898 | 56100 | 1.5248 |
895
+ | 0.0899 | 56200 | 1.5177 |
896
+ | 0.0901 | 56300 | 1.7579 |
897
+ | 0.0902 | 56400 | 1.5508 |
898
+ | 0.0904 | 56500 | 1.5965 |
899
+ | 0.0906 | 56600 | 1.5762 |
900
+ | 0.0907 | 56700 | 1.7441 |
901
+ | 0.0909 | 56800 | 2.0257 |
902
+ | 0.0910 | 56900 | 1.1371 |
903
+ | 0.0912 | 57000 | 1.8825 |
904
+ | 0.0914 | 57100 | 1.0455 |
905
+ | 0.0915 | 57200 | 1.5889 |
906
+ | 0.0917 | 57300 | 1.192 |
907
+ | 0.0918 | 57400 | 1.5374 |
908
+ | 0.0920 | 57500 | 1.6236 |
909
+ | 0.0922 | 57600 | 1.8945 |
910
+ | 0.0923 | 57700 | 1.607 |
911
+ | 0.0925 | 57800 | 1.8133 |
912
+ | 0.0926 | 57900 | 1.5777 |
913
+ | 0.0928 | 58000 | 1.5043 |
914
+ | 0.0930 | 58100 | 1.7681 |
915
+ | 0.0931 | 58200 | 1.623 |
916
+ | 0.0933 | 58300 | 2.2137 |
917
+ | 0.0934 | 58400 | 2.2447 |
918
+ | 0.0936 | 58500 | 2.3013 |
919
+ | 0.0938 | 58600 | 1.3105 |
920
+ | 0.0939 | 58700 | 1.4461 |
921
+ | 0.0941 | 58800 | 2.1321 |
922
+ | 0.0942 | 58900 | 1.7541 |
923
+ | 0.0944 | 59000 | 1.7894 |
924
+ | 0.0946 | 59100 | 1.693 |
925
+ | 0.0947 | 59200 | 1.7073 |
926
+ | 0.0949 | 59300 | 2.0305 |
927
+ | 0.0950 | 59400 | 1.3684 |
928
+ | 0.0952 | 59500 | 1.8754 |
929
+ | 0.0954 | 59600 | 2.0225 |
930
+ | 0.0955 | 59700 | 2.1975 |
931
+ | 0.0957 | 59800 | 1.7173 |
932
+ | 0.0958 | 59900 | 1.4302 |
933
+ | 0.0960 | 60000 | 1.2497 |
934
+ | 0.0962 | 60100 | 1.4058 |
935
+ | 0.0963 | 60200 | 1.0956 |
936
+ | 0.0965 | 60300 | 1.3731 |
937
+ | 0.0966 | 60400 | 1.2953 |
938
+ | 0.0968 | 60500 | 1.0987 |
939
+ | 0.0970 | 60600 | 1.5104 |
940
+ | 0.0971 | 60700 | 1.5224 |
941
+ | 0.0973 | 60800 | 1.3982 |
942
+ | 0.0974 | 60900 | 1.2785 |
943
+ | 0.0976 | 61000 | 1.6018 |
944
+ | 0.0978 | 61100 | 1.4968 |
945
+ | 0.0979 | 61200 | 1.2423 |
946
+ | 0.0981 | 61300 | 1.9973 |
947
+ | 0.0982 | 61400 | 1.2149 |
948
+ | 0.0984 | 61500 | 1.731 |
949
+ | 0.0986 | 61600 | 1.2889 |
950
+ | 0.0987 | 61700 | 1.856 |
951
+ | 0.0989 | 61800 | 0.8942 |
952
+ | 0.0990 | 61900 | 1.3371 |
953
+ | 0.0992 | 62000 | 1.5222 |
954
+ | 0.0994 | 62100 | 1.5435 |
955
+ | 0.0995 | 62200 | 1.1172 |
956
+ | 0.0997 | 62300 | 1.6024 |
957
+ | 0.0998 | 62400 | 1.3914 |
958
+ | 0.1000 | 62500 | 1.4714 |
959
+ | 0.1002 | 62600 | 1.2922 |
960
+ | 0.1003 | 62700 | 1.4263 |
961
+ | 0.1005 | 62800 | 1.4586 |
962
+ | 0.1006 | 62900 | 1.6312 |
963
+ | 0.1008 | 63000 | 1.9607 |
964
+ | 0.1010 | 63100 | 1.5771 |
965
+ | 0.1011 | 63200 | 1.6721 |
966
+ | 0.1013 | 63300 | 1.8461 |
967
+ | 0.1014 | 63400 | 1.5256 |
968
+ | 0.1016 | 63500 | 1.9736 |
969
+ | 0.1018 | 63600 | 1.4735 |
970
+ | 0.1019 | 63700 | 1.4619 |
971
+ | 0.1021 | 63800 | 1.6571 |
972
+ | 0.1022 | 63900 | 1.5888 |
973
+ | 0.1024 | 64000 | 2.0457 |
974
+ | 0.1026 | 64100 | 1.7843 |
975
+ | 0.1027 | 64200 | 1.5116 |
976
+ | 0.1029 | 64300 | 1.6682 |
977
+ | 0.1030 | 64400 | 1.2137 |
978
+ | 0.1032 | 64500 | 1.1308 |
979
+ | 0.1034 | 64600 | 2.031 |
980
+ | 0.1035 | 64700 | 1.6903 |
981
+ | 0.1037 | 64800 | 1.3365 |
982
+ | 0.1038 | 64900 | 1.5736 |
983
+ | 0.1040 | 65000 | 1.7264 |
984
+ | 0.1042 | 65100 | 1.1781 |
985
+ | 0.1043 | 65200 | 1.2503 |
986
+ | 0.1045 | 65300 | 0.9432 |
987
+ | 0.1046 | 65400 | 1.264 |
988
+ | 0.1048 | 65500 | 1.2086 |
989
+ | 0.1050 | 65600 | 1.8692 |
990
+ | 0.1051 | 65700 | 1.2745 |
991
+ | 0.1053 | 65800 | 1.6839 |
992
+ | 0.1054 | 65900 | 1.4509 |
993
+ | 0.1056 | 66000 | 1.1615 |
994
+ | 0.1058 | 66100 | 1.4458 |
995
+ | 0.1059 | 66200 | 1.8329 |
996
+ | 0.1061 | 66300 | 1.567 |
997
+ | 0.1062 | 66400 | 1.6746 |
998
+ | 0.1064 | 66500 | 1.65 |
999
+ | 0.1066 | 66600 | 1.5497 |
1000
+ | 0.1067 | 66700 | 1.4009 |
1001
+ | 0.1069 | 66800 | 2.058 |
1002
+ | 0.1070 | 66900 | 1.6306 |
1003
+ | 0.1072 | 67000 | 1.4377 |
1004
+ | 0.1074 | 67100 | 1.4501 |
1005
+ | 0.1075 | 67200 | 1.2648 |
1006
+ | 0.1077 | 67300 | 1.3186 |
1007
+ | 0.1078 | 67400 | 1.1313 |
1008
+ | 0.1080 | 67500 | 2.2523 |
1009
+ | 0.1082 | 67600 | 1.9146 |
1010
+ | 0.1083 | 67700 | 1.7334 |
1011
+ | 0.1085 | 67800 | 1.7195 |
1012
+ | 0.1086 | 67900 | 1.4661 |
1013
+ | 0.1088 | 68000 | 1.3503 |
1014
+ | 0.1090 | 68100 | 1.0129 |
1015
+ | 0.1091 | 68200 | 1.6036 |
1016
+ | 0.1093 | 68300 | 0.9312 |
1017
+ | 0.1094 | 68400 | 1.5817 |
1018
+ | 0.1096 | 68500 | 1.2024 |
1019
+ | 0.1098 | 68600 | 0.985 |
1020
+ | 0.1099 | 68700 | 1.1712 |
1021
+ | 0.1101 | 68800 | 1.5874 |
1022
+ | 0.1102 | 68900 | 1.8551 |
1023
+ | 0.1104 | 69000 | 1.232 |
1024
+ | 0.1106 | 69100 | 1.4688 |
1025
+ | 0.1107 | 69200 | 1.1107 |
1026
+ | 0.1109 | 69300 | 1.6495 |
1027
+ | 0.1110 | 69400 | 1.6278 |
1028
+ | 0.1112 | 69500 | 1.7135 |
1029
+ | 0.1114 | 69600 | 1.5108 |
1030
+ | 0.1115 | 69700 | 1.4056 |
1031
+ | 0.1117 | 69800 | 0.9324 |
1032
+ | 0.1118 | 69900 | 1.3613 |
1033
+ | 0.1120 | 70000 | 1.5283 |
1034
+ | 0.1122 | 70100 | 1.3809 |
1035
+ | 0.1123 | 70200 | 1.5552 |
1036
+ | 0.1125 | 70300 | 1.4567 |
1037
+ | 0.1126 | 70400 | 1.4404 |
1038
+ | 0.1128 | 70500 | 1.1805 |
1039
+ | 0.1130 | 70600 | 2.514 |
1040
+ | 0.1131 | 70700 | 1.4821 |
1041
+ | 0.1133 | 70800 | 1.5156 |
1042
+ | 0.1134 | 70900 | 1.5925 |
1043
+ | 0.1136 | 71000 | 1.9517 |
1044
+ | 0.1138 | 71100 | 1.2685 |
1045
+ | 0.1139 | 71200 | 1.6314 |
1046
+ | 0.1141 | 71300 | 1.5252 |
1047
+ | 0.1142 | 71400 | 1.5176 |
1048
+ | 0.1144 | 71500 | 1.3461 |
1049
+ | 0.1146 | 71600 | 1.3832 |
1050
+ | 0.1147 | 71700 | 1.2962 |
1051
+ | 0.1149 | 71800 | 1.5179 |
1052
+ | 0.1150 | 71900 | 1.1041 |
1053
+ | 0.1152 | 72000 | 1.5031 |
1054
+ | 0.1154 | 72100 | 1.5412 |
1055
+ | 0.1155 | 72200 | 1.2971 |
1056
+ | 0.1157 | 72300 | 1.0979 |
1057
+ | 0.1158 | 72400 | 1.307 |
1058
+ | 0.1160 | 72500 | 1.3418 |
1059
+ | 0.1162 | 72600 | 1.7298 |
1060
+ | 0.1163 | 72700 | 1.68 |
1061
+ | 0.1165 | 72800 | 1.3106 |
1062
+ | 0.1166 | 72900 | 1.0954 |
1063
+ | 0.1168 | 73000 | 1.5994 |
1064
+ | 0.1170 | 73100 | 1.5953 |
1065
+ | 0.1171 | 73200 | 1.9498 |
1066
+ | 0.1173 | 73300 | 0.9937 |
1067
+ | 0.1174 | 73400 | 1.4753 |
1068
+ | 0.1176 | 73500 | 1.417 |
1069
+ | 0.1178 | 73600 | 1.596 |
1070
+ | 0.1179 | 73700 | 1.8794 |
1071
+ | 0.1181 | 73800 | 1.3118 |
1072
+ | 0.1182 | 73900 | 1.732 |
1073
+ | 0.1184 | 74000 | 1.4504 |
1074
+ | 0.1186 | 74100 | 1.0878 |
1075
+ | 0.1187 | 74200 | 1.2488 |
1076
+ | 0.1189 | 74300 | 1.3887 |
1077
+ | 0.1190 | 74400 | 1.2265 |
1078
+ | 0.1192 | 74500 | 1.4668 |
1079
+ | 0.1194 | 74600 | 1.6258 |
1080
+ | 0.1195 | 74700 | 1.9551 |
1081
+ | 0.1197 | 74800 | 1.1811 |
1082
+ | 0.1198 | 74900 | 1.2119 |
1083
+ | 0.1200 | 75000 | 1.4051 |
1084
+ | 0.1202 | 75100 | 1.2587 |
1085
+ | 0.1203 | 75200 | 1.4563 |
1086
+ | 0.1205 | 75300 | 1.5581 |
1087
+ | 0.1206 | 75400 | 1.5457 |
1088
+ | 0.1208 | 75500 | 1.2675 |
1089
+ | 0.1210 | 75600 | 1.0948 |
1090
+ | 0.1211 | 75700 | 1.2045 |
1091
+ | 0.1213 | 75800 | 1.5964 |
1092
+ | 0.1214 | 75900 | 1.0517 |
1093
+ | 0.1216 | 76000 | 1.2883 |
1094
+ | 0.1218 | 76100 | 1.2276 |
1095
+ | 0.1219 | 76200 | 1.2463 |
1096
+ | 0.1221 | 76300 | 1.241 |
1097
+ | 0.1222 | 76400 | 1.8648 |
1098
+ | 0.1224 | 76500 | 1.4848 |
1099
+ | 0.1226 | 76600 | 1.413 |
1100
+ | 0.1227 | 76700 | 1.594 |
1101
+ | 0.1229 | 76800 | 1.3682 |
1102
+ | 0.1230 | 76900 | 1.159 |
1103
+ | 0.1232 | 77000 | 1.4702 |
1104
+ | 0.1234 | 77100 | 1.3251 |
1105
+ | 0.1235 | 77200 | 1.0538 |
1106
+ | 0.1237 | 77300 | 1.1708 |
1107
+ | 0.1238 | 77400 | 1.2864 |
1108
+ | 0.1240 | 77500 | 1.6501 |
1109
+ | 0.1242 | 77600 | 1.0104 |
1110
+ | 0.1243 | 77700 | 1.7969 |
1111
+ | 0.1245 | 77800 | 1.0293 |
1112
+ | 0.1246 | 77900 | 1.5593 |
1113
+ | 0.1248 | 78000 | 0.9902 |
1114
+ | 0.1250 | 78100 | 1.058 |
1115
+ | 0.1251 | 78200 | 1.4039 |
1116
+ | 0.1253 | 78300 | 1.008 |
1117
+ | 0.1254 | 78400 | 1.4593 |
1118
+ | 0.1256 | 78500 | 1.563 |
1119
+ | 0.1258 | 78600 | 1.1569 |
1120
+ | 0.1259 | 78700 | 1.3886 |
1121
+ | 0.1261 | 78800 | 1.061 |
1122
+ | 0.1262 | 78900 | 1.2085 |
1123
+ | 0.1264 | 79000 | 1.8553 |
1124
+ | 0.1266 | 79100 | 1.7144 |
1125
+ | 0.1267 | 79200 | 1.2216 |
1126
+ | 0.1269 | 79300 | 1.1646 |
1127
+ | 0.1270 | 79400 | 1.7768 |
1128
+ | 0.1272 | 79500 | 1.1314 |
1129
+ | 0.1274 | 79600 | 1.2374 |
1130
+ | 0.1275 | 79700 | 1.2681 |
1131
+ | 0.1277 | 79800 | 1.2624 |
1132
+ | 0.1278 | 79900 | 1.6775 |
1133
+ | 0.1280 | 80000 | 1.3587 |
1134
+ | 0.1282 | 80100 | 1.7402 |
1135
+ | 0.1283 | 80200 | 1.5349 |
1136
+ | 0.1285 | 80300 | 0.8546 |
1137
+ | 0.1286 | 80400 | 1.3903 |
1138
+ | 0.1288 | 80500 | 1.0712 |
1139
+ | 0.1290 | 80600 | 1.6633 |
1140
+ | 0.1291 | 80700 | 1.4125 |
1141
+ | 0.1293 | 80800 | 0.6973 |
1142
+ | 0.1294 | 80900 | 1.1729 |
1143
+ | 0.1296 | 81000 | 1.2217 |
1144
+ | 0.1298 | 81100 | 1.3184 |
1145
+ | 0.1299 | 81200 | 1.2718 |
1146
+ | 0.1301 | 81300 | 1.1913 |
1147
+ | 0.1302 | 81400 | 1.4728 |
1148
+ | 0.1304 | 81500 | 1.1221 |
1149
+ | 0.1306 | 81600 | 1.235 |
1150
+ | 0.1307 | 81700 | 1.3497 |
1151
+ | 0.1309 | 81800 | 1.2361 |
1152
+ | 0.1310 | 81900 | 2.0015 |
1153
+ | 0.1312 | 82000 | 1.2259 |
1154
+ | 0.1314 | 82100 | 0.9236 |
1155
+ | 0.1315 | 82200 | 1.5339 |
1156
+ | 0.1317 | 82300 | 1.2036 |
1157
+ | 0.1318 | 82400 | 1.2631 |
1158
+ | 0.1320 | 82500 | 1.0858 |
1159
+ | 0.1322 | 82600 | 1.635 |
1160
+ | 0.1323 | 82700 | 1.285 |
1161
+ | 0.1325 | 82800 | 1.1209 |
1162
+ | 0.1326 | 82900 | 1.4032 |
1163
+ | 0.1328 | 83000 | 1.1279 |
1164
+ | 0.1330 | 83100 | 1.5145 |
1165
+ | 0.1331 | 83200 | 1.4923 |
1166
+ | 0.1333 | 83300 | 0.9845 |
1167
+ | 0.1334 | 83400 | 1.3847 |
1168
+ | 0.1336 | 83500 | 1.0149 |
1169
+ | 0.1338 | 83600 | 1.2644 |
1170
+ | 0.1339 | 83700 | 1.2981 |
1171
+ | 0.1341 | 83800 | 1.6903 |
1172
+ | 0.1342 | 83900 | 1.2846 |
1173
+ | 0.1344 | 84000 | 1.4647 |
1174
+ | 0.1346 | 84100 | 1.1213 |
1175
+ | 0.1347 | 84200 | 1.1379 |
1176
+ | 0.1349 | 84300 | 1.2793 |
1177
+ | 0.1350 | 84400 | 1.343 |
1178
+ | 0.1352 | 84500 | 1.8342 |
1179
+ | 0.1354 | 84600 | 1.0487 |
1180
+ | 0.1355 | 84700 | 1.1531 |
1181
+ | 0.1357 | 84800 | 0.8552 |
1182
+ | 0.1358 | 84900 | 1.1422 |
1183
+ | 0.1360 | 85000 | 1.0918 |
1184
+ | 0.1362 | 85100 | 1.2873 |
1185
+ | 0.1363 | 85200 | 1.547 |
1186
+ | 0.1365 | 85300 | 1.5094 |
1187
+ | 0.1366 | 85400 | 1.051 |
1188
+ | 0.1368 | 85500 | 0.9952 |
1189
+ | 0.1370 | 85600 | 1.1978 |
1190
+ | 0.1371 | 85700 | 1.5221 |
1191
+ | 0.1373 | 85800 | 1.3841 |
1192
+ | 0.1374 | 85900 | 1.3999 |
1193
+ | 0.1376 | 86000 | 1.5574 |
1194
+ | 0.1378 | 86100 | 1.3267 |
1195
+ | 0.1379 | 86200 | 1.358 |
1196
+ | 0.1381 | 86300 | 1.5441 |
1197
+ | 0.1382 | 86400 | 1.4124 |
1198
+ | 0.1384 | 86500 | 0.8352 |
1199
+ | 0.1386 | 86600 | 1.2549 |
1200
+ | 0.1387 | 86700 | 1.4328 |
1201
+ | 0.1389 | 86800 | 1.2577 |
1202
+ | 0.1390 | 86900 | 1.4417 |
1203
+ | 0.1392 | 87000 | 1.1927 |
1204
+ | 0.1394 | 87100 | 1.4435 |
1205
+ | 0.1395 | 87200 | 1.3579 |
1206
+ | 0.1397 | 87300 | 1.3883 |
1207
+ | 0.1398 | 87400 | 1.2645 |
1208
+ | 0.1400 | 87500 | 1.1366 |
1209
+ | 0.1402 | 87600 | 1.4566 |
1210
+ | 0.1403 | 87700 | 1.447 |
1211
+ | 0.1405 | 87800 | 1.0701 |
1212
+ | 0.1406 | 87900 | 1.3449 |
1213
+ | 0.1408 | 88000 | 1.4331 |
1214
+ | 0.1410 | 88100 | 1.3965 |
1215
+ | 0.1411 | 88200 | 1.347 |
1216
+ | 0.1413 | 88300 | 1.0262 |
1217
+ | 0.1414 | 88400 | 1.0787 |
1218
+ | 0.1416 | 88500 | 1.3829 |
1219
+ | 0.1418 | 88600 | 1.2001 |
1220
+ | 0.1419 | 88700 | 1.2407 |
1221
+ | 0.1421 | 88800 | 1.6291 |
1222
+ | 0.1422 | 88900 | 1.1502 |
1223
+ | 0.1424 | 89000 | 1.2155 |
1224
+ | 0.1426 | 89100 | 1.3381 |
1225
+ | 0.1427 | 89200 | 0.819 |
1226
+ | 0.1429 | 89300 | 1.0402 |
1227
+ | 0.1430 | 89400 | 1.1062 |
1228
+ | 0.1432 | 89500 | 1.6693 |
1229
+ | 0.1434 | 89600 | 1.1991 |
1230
+ | 0.1435 | 89700 | 1.3535 |
1231
+ | 0.1437 | 89800 | 1.6776 |
1232
+ | 0.1438 | 89900 | 1.2221 |
1233
+ | 0.1440 | 90000 | 1.0253 |
1234
+ | 0.1442 | 90100 | 1.0469 |
1235
+ | 0.1443 | 90200 | 1.2465 |
1236
+ | 0.1445 | 90300 | 1.4068 |
1237
+ | 0.1446 | 90400 | 1.5961 |
1238
+ | 0.1448 | 90500 | 1.0579 |
1239
+ | 0.1450 | 90600 | 0.941 |
1240
+ | 0.1451 | 90700 | 1.1861 |
1241
+ | 0.1453 | 90800 | 1.4697 |
1242
+ | 0.1454 | 90900 | 0.6486 |
1243
+ | 0.1456 | 91000 | 1.3865 |
1244
+ | 0.1458 | 91100 | 1.1494 |
1245
+ | 0.1459 | 91200 | 1.3623 |
1246
+ | 0.1461 | 91300 | 1.2193 |
1247
+ | 0.1462 | 91400 | 1.3003 |
1248
+ | 0.1464 | 91500 | 1.2608 |
1249
+ | 0.1466 | 91600 | 1.2544 |
1250
+ | 0.1467 | 91700 | 1.332 |
1251
+ | 0.1469 | 91800 | 1.3548 |
1252
+ | 0.1470 | 91900 | 1.54 |
1253
+ | 0.1472 | 92000 | 1.3125 |
1254
+ | 0.1474 | 92100 | 0.897 |
1255
+ | 0.1475 | 92200 | 1.1594 |
1256
+ | 0.1477 | 92300 | 0.9194 |
1257
+ | 0.1478 | 92400 | 1.2209 |
1258
+ | 0.1480 | 92500 | 1.0027 |
1259
+ | 0.1482 | 92600 | 1.4675 |
1260
+ | 0.1483 | 92700 | 1.3982 |
1261
+ | 0.1485 | 92800 | 0.8595 |
1262
+ | 0.1486 | 92900 | 1.572 |
1263
+ | 0.1488 | 93000 | 1.2832 |
1264
+ | 0.1490 | 93100 | 1.2838 |
1265
+ | 0.1491 | 93200 | 1.6535 |
1266
+ | 0.1493 | 93300 | 1.5996 |
1267
+ | 0.1494 | 93400 | 1.058 |
1268
+ | 0.1496 | 93500 | 1.3316 |
1269
+ | 0.1498 | 93600 | 0.8627 |
1270
+ | 0.1499 | 93700 | 1.4411 |
1271
+ | 0.1501 | 93800 | 0.9331 |
1272
+ | 0.1502 | 93900 | 1.0032 |
1273
+ | 0.1504 | 94000 | 1.2341 |
1274
+ | 0.1506 | 94100 | 1.3369 |
1275
+ | 0.1507 | 94200 | 1.2324 |
1276
+ | 0.1509 | 94300 | 1.6952 |
1277
+ | 0.1510 | 94400 | 1.2401 |
1278
+ | 0.1512 | 94500 | 1.2998 |
1279
+ | 0.1514 | 94600 | 1.1458 |
1280
+ | 0.1515 | 94700 | 1.0211 |
1281
+ | 0.1517 | 94800 | 0.9866 |
1282
+ | 0.1518 | 94900 | 1.3636 |
1283
+ | 0.1520 | 95000 | 1.1485 |
1284
+ | 0.1522 | 95100 | 0.7671 |
1285
+ | 0.1523 | 95200 | 1.0069 |
1286
+ | 0.1525 | 95300 | 1.1276 |
1287
+ | 0.1526 | 95400 | 1.4477 |
1288
+ | 0.1528 | 95500 | 0.9887 |
1289
+ | 0.1530 | 95600 | 1.065 |
1290
+ | 0.1531 | 95700 | 0.982 |
1291
+ | 0.1533 | 95800 | 1.1166 |
1292
+ | 0.1534 | 95900 | 1.3949 |
1293
+ | 0.1536 | 96000 | 1.4164 |
1294
+ | 0.1538 | 96100 | 1.7997 |
1295
+ | 0.1539 | 96200 | 1.3941 |
1296
+ | 0.1541 | 96300 | 1.0592 |
1297
+ | 0.1542 | 96400 | 1.1661 |
1298
+ | 0.1544 | 96500 | 1.5968 |
1299
+ | 0.1546 | 96600 | 1.2586 |
1300
+ | 0.1547 | 96700 | 1.5164 |
1301
+ | 0.1549 | 96800 | 1.5942 |
1302
+ | 0.1550 | 96900 | 0.6635 |
1303
+ | 0.1552 | 97000 | 1.3037 |
1304
+ | 0.1554 | 97100 | 1.3557 |
1305
+ | 0.1555 | 97200 | 1.0864 |
1306
+ | 0.1557 | 97300 | 1.3139 |
1307
+ | 0.1558 | 97400 | 0.7139 |
1308
+ | 0.1560 | 97500 | 1.1084 |
1309
+ | 0.1562 | 97600 | 1.2294 |
1310
+ | 0.1563 | 97700 | 0.9581 |
1311
+ | 0.1565 | 97800 | 1.2983 |
1312
+ | 0.1566 | 97900 | 1.8281 |
1313
+ | 0.1568 | 98000 | 1.2914 |
1314
+ | 0.1570 | 98100 | 0.8656 |
1315
+ | 0.1571 | 98200 | 1.3438 |
1316
+ | 0.1573 | 98300 | 1.465 |
1317
+ | 0.1574 | 98400 | 1.2253 |
1318
+ | 0.1576 | 98500 | 1.3481 |
1319
+ | 0.1578 | 98600 | 1.5131 |
1320
+ | 0.1579 | 98700 | 1.4852 |
1321
+ | 0.1581 | 98800 | 1.1317 |
1322
+ | 0.1582 | 98900 | 1.0395 |
1323
+ | 0.1584 | 99000 | 0.9256 |
1324
+ | 0.1586 | 99100 | 0.9774 |
1325
+ | 0.1587 | 99200 | 0.9756 |
1326
+ | 0.1589 | 99300 | 1.4885 |
1327
+ | 0.1590 | 99400 | 1.2373 |
1328
+ | 0.1592 | 99500 | 1.3868 |
1329
+ | 0.1594 | 99600 | 0.9238 |
1330
+ | 0.1595 | 99700 | 1.0793 |
1331
+ | 0.1597 | 99800 | 1.2405 |
1332
+ | 0.1598 | 99900 | 1.2417 |
1333
+ | 0.1600 | 100000 | 1.1264 |
1334
+ | 0.1602 | 100100 | 1.3042 |
1335
+ | 0.1603 | 100200 | 1.7169 |
1336
+ | 0.1605 | 100300 | 1.0939 |
1337
+ | 0.1606 | 100400 | 1.4 |
1338
+ | 0.1608 | 100500 | 1.1289 |
1339
+ | 0.1610 | 100600 | 1.26 |
1340
+ | 0.1611 | 100700 | 0.815 |
1341
+ | 0.1613 | 100800 | 0.9622 |
1342
+ | 0.1614 | 100900 | 1.0715 |
1343
+ | 0.1616 | 101000 | 1.4498 |
1344
+ | 0.1618 | 101100 | 1.2484 |
1345
+ | 0.1619 | 101200 | 1.5755 |
1346
+ | 0.1621 | 101300 | 1.3742 |
1347
+ | 0.1622 | 101400 | 1.6062 |
1348
+ | 0.1624 | 101500 | 1.6763 |
1349
+ | 0.1626 | 101600 | 1.5295 |
1350
+ | 0.1627 | 101700 | 1.3866 |
1351
+ | 0.1629 | 101800 | 1.1005 |
1352
+ | 0.1630 | 101900 | 0.818 |
1353
+ | 0.1632 | 102000 | 1.6994 |
1354
+ | 0.1634 | 102100 | 0.7468 |
1355
+ | 0.1635 | 102200 | 1.1504 |
1356
+ | 0.1637 | 102300 | 1.023 |
1357
+ | 0.1638 | 102400 | 1.1705 |
1358
+ | 0.1640 | 102500 | 1.2671 |
1359
+ | 0.1642 | 102600 | 1.1874 |
1360
+ | 0.1643 | 102700 | 1.0913 |
1361
+ | 0.1645 | 102800 | 1.3353 |
1362
+ | 0.1646 | 102900 | 1.1726 |
1363
+ | 0.1648 | 103000 | 0.9484 |
1364
+ | 0.1650 | 103100 | 1.1276 |
1365
+ | 0.1651 | 103200 | 1.6352 |
1366
+ | 0.1653 | 103300 | 1.1789 |
1367
+ | 0.1654 | 103400 | 1.2853 |
1368
+ | 0.1656 | 103500 | 1.3151 |
1369
+ | 0.1658 | 103600 | 1.1619 |
1370
+ | 0.1659 | 103700 | 1.2232 |
1371
+ | 0.1661 | 103800 | 0.8593 |
1372
+ | 0.1662 | 103900 | 0.8925 |
1373
+ | 0.1664 | 104000 | 1.3056 |
1374
+ | 0.1666 | 104100 | 1.7856 |
1375
+ | 0.1667 | 104200 | 0.7826 |
1376
+ | 0.1669 | 104300 | 0.8696 |
1377
+ | 0.1670 | 104400 | 1.0999 |
1378
+ | 0.1672 | 104500 | 0.9611 |
1379
+ | 0.1674 | 104600 | 1.2425 |
1380
+ | 0.1675 | 104700 | 0.8884 |
1381
+ | 0.1677 | 104800 | 1.0689 |
1382
+ | 0.1678 | 104900 | 1.0076 |
1383
+ | 0.1680 | 105000 | 1.3108 |
1384
+
1385
+ </details>
1386
+
1387
+ ### Framework Versions
1388
+ - Python: 3.8.10
1389
+ - Sentence Transformers: 3.1.1
1390
+ - Transformers: 4.45.2
1391
+ - PyTorch: 2.4.1+cu118
1392
+ - Accelerate: 1.0.1
1393
+ - Datasets: 3.0.1
1394
+ - Tokenizers: 0.20.3
1395
+
1396
+ ## Citation
1397
+
1398
+ ### BibTeX
1399
+
1400
+ #### Sentence Transformers
1401
+ ```bibtex
1402
+ @inproceedings{reimers-2019-sentence-bert,
1403
+ title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
1404
+ author = "Reimers, Nils and Gurevych, Iryna",
1405
+ booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
1406
+ month = "11",
1407
+ year = "2019",
1408
+ publisher = "Association for Computational Linguistics",
1409
+ url = "https://arxiv.org/abs/1908.10084",
1410
+ }
1411
+ ```
1412
+
1413
+ #### CoSENTLoss
1414
+ ```bibtex
1415
+ @online{kexuefm-8847,
1416
+ title={CoSENT: A more efficient sentence vector scheme than Sentence-BERT},
1417
+ author={Su Jianlin},
1418
+ year={2022},
1419
+ month={Jan},
1420
+ url={https://kexue.fm/archives/8847},
1421
+ }
1422
+ ```
1423
+
1424
+ <!--
1425
+ ## Glossary
1426
+
1427
+ *Clearly define terms in order to be accessible across audiences.*
1428
+ -->
1429
+
1430
+ <!--
1431
+ ## Model Card Authors
1432
+
1433
+ *Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
1434
+ -->
1435
+
1436
+ <!--
1437
+ ## Model Card Contact
1438
+
1439
+ *Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
1440
+ -->
checkpoint-105000/config.json ADDED
@@ -0,0 +1,26 @@
+ {
+ "_name_or_path": "sentence-transformers/all-MiniLM-L6-v2",
+ "architectures": [
+ "BertModel"
+ ],
+ "attention_probs_dropout_prob": 0.1,
+ "classifier_dropout": null,
+ "gradient_checkpointing": false,
+ "hidden_act": "gelu",
+ "hidden_dropout_prob": 0.1,
+ "hidden_size": 384,
+ "initializer_range": 0.02,
+ "intermediate_size": 1536,
+ "layer_norm_eps": 1e-12,
+ "max_position_embeddings": 512,
+ "model_type": "bert",
+ "num_attention_heads": 12,
+ "num_hidden_layers": 6,
+ "pad_token_id": 0,
+ "position_embedding_type": "absolute",
+ "torch_dtype": "float32",
+ "transformers_version": "4.45.2",
+ "type_vocab_size": 2,
+ "use_cache": true,
+ "vocab_size": 30522
+ }
checkpoint-105000/config_sentence_transformers.json ADDED
@@ -0,0 +1,10 @@
+ {
+ "__version__": {
+ "sentence_transformers": "3.1.1",
+ "transformers": "4.45.2",
+ "pytorch": "2.4.1+cu118"
+ },
+ "prompts": {},
+ "default_prompt_name": null,
+ "similarity_fn_name": null
+ }
checkpoint-105000/model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:d7521108161229c69c10b071c6fb3c53d3059eb2ddcc26dfb6a215dac53a9860
+ size 90864192
checkpoint-105000/modules.json ADDED
@@ -0,0 +1,20 @@
+ [
+ {
+ "idx": 0,
+ "name": "0",
+ "path": "",
+ "type": "sentence_transformers.models.Transformer"
+ },
+ {
+ "idx": 1,
+ "name": "1",
+ "path": "1_Pooling",
+ "type": "sentence_transformers.models.Pooling"
+ },
+ {
+ "idx": 2,
+ "name": "2",
+ "path": "2_Normalize",
+ "type": "sentence_transformers.models.Normalize"
+ }
+ ]
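These entries describe the module pipeline that the library rebuilds when the checkpoint directory is loaded. As a quick sanity check, a minimal sketch (the local path is an assumption about where the checkpoint folder was downloaded):

```python
from sentence_transformers import SentenceTransformer

# Path to the checkpoint folder is an assumption; adjust to wherever it was downloaded.
model = SentenceTransformer("checkpoint-105000")

# SentenceTransformer is an nn.Sequential, so its children mirror modules.json:
# 0 -> Transformer (BERT backbone), 1 -> Pooling (mean), 2 -> Normalize.
for name, module in model.named_children():
    print(name, type(module).__name__)
```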
checkpoint-105000/optimizer.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:c43dd30db075b1d69060eecbf761abfb6679dc660238ff82e77dcd659c5147d8
+ size 180607738
checkpoint-105000/rng_state.pth ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:46c92391d3d05318980565381a01de2dc3740bacf95163c16184f482a66edcc2
+ size 14244
checkpoint-105000/scheduler.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:60cefc58e2dc8dd36b80d6052135b1059059a54387569fab807bde2b7b1f4ef7
+ size 1064
checkpoint-105000/sentence_bert_config.json ADDED
@@ -0,0 +1,4 @@
+ {
+ "max_seq_length": 256,
+ "do_lower_case": false
+ }
checkpoint-105000/special_tokens_map.json ADDED
@@ -0,0 +1,37 @@
+ {
+ "cls_token": {
+ "content": "[CLS]",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ },
+ "mask_token": {
+ "content": "[MASK]",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ },
+ "pad_token": {
+ "content": "[PAD]",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ },
+ "sep_token": {
+ "content": "[SEP]",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ },
+ "unk_token": {
+ "content": "[UNK]",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ }
+ }
checkpoint-105000/tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
 
checkpoint-105000/tokenizer_config.json ADDED
@@ -0,0 +1,64 @@
+ {
+ "added_tokens_decoder": {
+ "0": {
+ "content": "[PAD]",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "100": {
+ "content": "[UNK]",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "101": {
+ "content": "[CLS]",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "102": {
+ "content": "[SEP]",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "103": {
+ "content": "[MASK]",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ }
+ },
+ "clean_up_tokenization_spaces": false,
+ "cls_token": "[CLS]",
+ "do_basic_tokenize": true,
+ "do_lower_case": true,
+ "mask_token": "[MASK]",
+ "max_length": 128,
+ "model_max_length": 256,
+ "never_split": null,
+ "pad_to_multiple_of": null,
+ "pad_token": "[PAD]",
+ "pad_token_type_id": 0,
+ "padding_side": "right",
+ "sep_token": "[SEP]",
+ "stride": 0,
+ "strip_accents": null,
+ "tokenize_chinese_chars": true,
+ "tokenizer_class": "BertTokenizer",
+ "truncation_side": "right",
+ "truncation_strategy": "longest_first",
+ "unk_token": "[UNK]"
+ }
checkpoint-105000/trainer_state.json ADDED
The diff for this file is too large to render. See raw diff
 
checkpoint-105000/training_args.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:14eb8a69d6ba5b2bb8d2148b585526b13da2e45effc438997c1d2d513d64b838
+ size 5496
checkpoint-105000/vocab.txt ADDED
The diff for this file is too large to render. See raw diff