---
base_model: sentence-transformers/all-MiniLM-L6-v2
datasets:
  - youssefkhalil320/pairs_three_scores_v5
language:
  - en
library_name: sentence-transformers
license: apache-2.0
pipeline_tag: sentence-similarity
tags:
  - sentence-transformers
  - sentence-similarity
  - feature-extraction
  - generated_from_trainer
  - dataset_size:80000003
  - loss:CoSENTLoss
widget:
  - source_sentence: durable pvc swim ring
    sentences:
      - flaky croissant
      - urban shoes
      - warm drinks mug
  - source_sentence: iso mak retard capsules
    sentences:
      - savory baguette
      - shea butter body cream
      - softwheeled cruiser
  - source_sentence: love sandra potty
    sentences:
      - utensil holder
      - olive pants
      - headwear
  - source_sentence: dusky hair brush
    sentences:
      - back compartment laptop
      - rubber feet platter
      - honed blade knife
  - source_sentence: nkd skn
    sentences:
      - fruit fragrances nail polish remover
      - panini salmon
      - hand drawing bag
---

all-MiniLM-L6-v8-pair_score

This is a sentence-transformers model finetuned from sentence-transformers/all-MiniLM-L6-v2 on the pairs_three_scores_v5 dataset. It maps sentences & paragraphs to a 384-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

Model Details

Model Description

  • Model Type: Sentence Transformer
  • Base model: sentence-transformers/all-MiniLM-L6-v2
  • Maximum Sequence Length: 256 tokens
  • Output Dimensionality: 384 dimensions
  • Similarity Function: Cosine Similarity
  • Training Dataset: youssefkhalil320/pairs_three_scores_v5
  • Language: en
  • License: apache-2.0

Model Sources

  • Documentation: https://www.sbert.net
  • Repository: https://github.com/UKPLab/sentence-transformers

Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 256, 'do_lower_case': False}) with Transformer model: BertModel 
  (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
)
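
The modules above are a BERT encoder, mean pooling over token embeddings, and L2 normalization. For reference, the same pipeline can be reproduced with the plain transformers library. This is a minimal sketch; "sentence_transformers_model_id" is the same placeholder used in the usage example below and must be replaced with the actual Hub id.

import torch
import torch.nn.functional as F
from transformers import AutoTokenizer, AutoModel

# Mean pooling: average the token embeddings, ignoring padding positions.
def mean_pooling(model_output, attention_mask):
    token_embeddings = model_output[0]  # last_hidden_state
    mask = attention_mask.unsqueeze(-1).expand(token_embeddings.size()).float()
    return (token_embeddings * mask).sum(1) / mask.sum(1).clamp(min=1e-9)

tokenizer = AutoTokenizer.from_pretrained("sentence_transformers_model_id")  # placeholder id
model = AutoModel.from_pretrained("sentence_transformers_model_id")          # placeholder id

sentences = ["durable pvc swim ring", "urban shoes"]
encoded = tokenizer(sentences, padding=True, truncation=True, max_length=256, return_tensors="pt")
with torch.no_grad():
    output = model(**encoded)

embeddings = mean_pooling(output, encoded["attention_mask"])
embeddings = F.normalize(embeddings, p=2, dim=1)  # mirrors the Normalize() module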

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("sentence_transformers_model_id")  # placeholder; replace with this model's Hub id
# Run inference
sentences = [
    'nkd skn',
    'hand drawing bag',
    'panini salmon',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# (3, 384)

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# torch.Size([3, 3])
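
Because the embeddings are L2-normalized by the final Normalize() module, cosine similarity is a natural ranking score. Here is a minimal semantic-search sketch building on the snippet above; the query and corpus strings are illustrative, taken from the widget examples:

# Rank a small corpus against a query by similarity
query_embedding = model.encode(["shea butter body cream"])
corpus = ["fruit fragrances nail polish remover", "warm drinks mug", "hand drawing bag"]
corpus_embeddings = model.encode(corpus)

scores = model.similarity(query_embedding, corpus_embeddings)  # shape: [1, 3]
best = scores.argmax().item()
print(corpus[best], scores[0, best].item())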

Training Details

Training Dataset

pairs_three_scores_v5

  • Dataset: pairs_three_scores_v5 at 3d8c457
  • Size: 80,000,003 training samples
  • Columns: sentence1, sentence2, and score
  • Approximate statistics based on the first 1000 samples:
    • sentence1: string; min: 3 tokens, mean: 6.06 tokens, max: 12 tokens
    • sentence2: string; min: 3 tokens, mean: 5.71 tokens, max: 13 tokens
    • score: float; min: 0.0, mean: 0.11, max: 1.0
  • Samples:
    • sentence1: vanilla hair cream | sentence2: free of paraben hair mask | score: 0.5
    • sentence1: nourishing shampoo | sentence2: cumin lemon tea | score: 0.0
    • sentence1: safe materials pacifier | sentence2: facial serum | score: 0.5
  • Loss: CoSENTLoss with these parameters:
    {
        "scale": 20.0,
        "similarity_fct": "pairwise_cos_sim"
    }
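
CoSENT ranks pairs of pairs: whenever one pair has a higher gold score than another, its cosine similarity is pushed to be higher as well, with scale 20.0 acting as a temperature on the similarity differences. As a minimal sketch, the loss is constructed directly from the model in Sentence Transformers:

from sentence_transformers import SentenceTransformer
from sentence_transformers.losses import CoSENTLoss

model = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")
# scale=20.0 and pairwise cosine similarity match the parameters listed above
loss = CoSENTLoss(model, scale=20.0)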
    

Evaluation Dataset

pairs_three_scores_v5

  • Dataset: pairs_three_scores_v5 at 3d8c457
  • Size: 20,000,001 evaluation samples
  • Columns: sentence1, sentence2, and score
  • Approximate statistics based on the first 1000 samples:
    • sentence1: string; min: 3 tokens, mean: 6.21 tokens, max: 12 tokens
    • sentence2: string; min: 3 tokens, mean: 5.75 tokens, max: 12 tokens
    • score: float; min: 0.0, mean: 0.11, max: 1.0
  • Samples:
    • sentence1: teddy bear toy | sentence2: long lasting cat food | score: 0.0
    • sentence1: eva hair treatment | sentence2: fresh pineapple | score: 0.0
    • sentence1: soft wave hair conditioner | sentence2: hybrid seat bike | score: 0.0
  • Loss: CoSENTLoss with these parameters:
    {
        "scale": 20.0,
        "similarity_fct": "pairwise_cos_sim"
    }
    

Training Hyperparameters

Non-Default Hyperparameters

  • eval_strategy: steps
  • per_device_train_batch_size: 128
  • per_device_eval_batch_size: 128
  • learning_rate: 2e-05
  • num_train_epochs: 1
  • warmup_ratio: 0.1
  • fp16: True
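
These values map directly onto SentenceTransformerTrainingArguments. Below is a hedged sketch of a comparable training run; the dataset loading call and the split names are assumptions based on the dataset description above, not the author's exact script:

from datasets import load_dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
)
from sentence_transformers.losses import CoSENTLoss

model = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")
# Assumes the dataset exposes sentence1 / sentence2 / score columns, as documented above
dataset = load_dataset("youssefkhalil320/pairs_three_scores_v5")

args = SentenceTransformerTrainingArguments(
    output_dir="all-MiniLM-L6-v8-pair_score",
    eval_strategy="steps",
    per_device_train_batch_size=128,
    per_device_eval_batch_size=128,
    learning_rate=2e-5,
    num_train_epochs=1,
    warmup_ratio=0.1,
    fp16=True,
)

trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=dataset["train"],
    eval_dataset=dataset["test"],  # split name is an assumption
    loss=CoSENTLoss(model, scale=20.0),
)
trainer.train()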

All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: steps
  • prediction_loss_only: True
  • per_device_train_batch_size: 128
  • per_device_eval_batch_size: 128
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 1
  • eval_accumulation_steps: None
  • torch_empty_cache_steps: None
  • learning_rate: 2e-05
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1.0
  • num_train_epochs: 1
  • max_steps: -1
  • lr_scheduler_type: linear
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.1
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: False
  • fp16: True
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: None
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: False
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: False
  • hub_always_push: False
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • dispatch_batches: None
  • split_batches: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • eval_on_start: False
  • use_liger_kernel: False
  • eval_use_gather_object: False
  • batch_sampler: batch_sampler
  • multi_dataset_batch_sampler: proportional

Training Logs

Epoch Step Training Loss
0.0002 100 10.8792
0.0003 200 10.9284
0.0005 300 10.6466
0.0006 400 10.841
0.0008 500 10.8094
0.0010 600 10.4323
0.0011 700 10.3032
0.0013 800 10.4006
0.0014 900 10.4743
0.0016 1000 10.2334
0.0018 1100 10.0135
0.0019 1200 9.7874
0.0021 1300 9.7419
0.0022 1400 9.7412
0.0024 1500 9.4585
0.0026 1600 9.5339
0.0027 1700 9.4345
0.0029 1800 9.1733
0.0030 1900 8.9952
0.0032 2000 8.9669
0.0034 2100 8.8152
0.0035 2200 8.7936
0.0037 2300 8.6771
0.0038 2400 8.4648
0.0040 2500 8.5764
0.0042 2600 8.4587
0.0043 2700 8.2966
0.0045 2800 8.2329
0.0046 2900 8.1415
0.0048 3000 8.0404
0.0050 3100 7.9698
0.0051 3200 7.9205
0.0053 3300 7.8314
0.0054 3400 7.8369
0.0056 3500 7.6403
0.0058 3600 7.5842
0.0059 3700 7.5812
0.0061 3800 7.4335
0.0062 3900 7.4917
0.0064 4000 7.3204
0.0066 4100 7.2971
0.0067 4200 7.2233
0.0069 4300 7.2081
0.0070 4400 7.1364
0.0072 4500 7.0663
0.0074 4600 6.9601
0.0075 4700 6.9546
0.0077 4800 6.9019
0.0078 4900 6.8801
0.0080 5000 6.7734
0.0082 5100 6.7648
0.0083 5200 6.7498
0.0085 5300 6.6872
0.0086 5400 6.6264
0.0088 5500 6.579
0.0090 5600 6.6001
0.0091 5700 6.5971
0.0093 5800 6.4694
0.0094 5900 6.3983
0.0096 6000 6.4477
0.0098 6100 6.4308
0.0099 6200 6.4248
0.0101 6300 6.2642
0.0102 6400 6.2763
0.0104 6500 6.3878
0.0106 6600 6.2601
0.0107 6700 6.1789
0.0109 6800 6.1773
0.0110 6900 6.1439
0.0112 7000 6.1863
0.0114 7100 6.0513
0.0115 7200 6.0671
0.0117 7300 6.0212
0.0118 7400 6.0043
0.0120 7500 6.0166
0.0122 7600 5.9754
0.0123 7700 5.9211
0.0125 7800 5.7867
0.0126 7900 5.8534
0.0128 8000 5.7708
0.0130 8100 5.8328
0.0131 8200 5.7417
0.0133 8300 5.8097
0.0134 8400 5.7578
0.0136 8500 5.643
0.0138 8600 5.6401
0.0139 8700 5.6627
0.0141 8800 5.6167
0.0142 8900 5.6539
0.0144 9000 5.4513
0.0146 9100 5.4132
0.0147 9200 5.4714
0.0149 9300 5.4786
0.0150 9400 5.3928
0.0152 9500 5.4774
0.0154 9600 5.2881
0.0155 9700 5.3699
0.0157 9800 5.1483
0.0158 9900 5.3051
0.0160 10000 5.2546
0.0162 10100 5.2314
0.0163 10200 5.1783
0.0165 10300 5.2074
0.0166 10400 5.2825
0.0168 10500 5.1715
0.0170 10600 5.087
0.0171 10700 5.082
0.0173 10800 4.9111
0.0174 10900 5.0213
0.0176 11000 4.9898
0.0178 11100 4.7734
0.0179 11200 4.9511
0.0181 11300 5.0481
0.0182 11400 4.8441
0.0184 11500 4.873
0.0186 11600 4.9988
0.0187 11700 4.7653
0.0189 11800 4.804
0.0190 11900 4.8288
0.0192 12000 4.7053
0.0194 12100 4.6887
0.0195 12200 4.7832
0.0197 12300 4.6817
0.0198 12400 4.6252
0.0200 12500 4.5936
0.0202 12600 4.7452
0.0203 12700 4.5321
0.0205 12800 4.4964
0.0206 12900 4.4421
0.0208 13000 4.3782
0.0210 13100 4.5169
0.0211 13200 4.533
0.0213 13300 4.3725
0.0214 13400 4.2911
0.0216 13500 4.2261
0.0218 13600 4.2467
0.0219 13700 4.1558
0.0221 13800 4.2794
0.0222 13900 4.2383
0.0224 14000 4.1654
0.0226 14100 4.158
0.0227 14200 4.1299
0.0229 14300 4.1902
0.0230 14400 3.7853
0.0232 14500 4.0514
0.0234 14600 4.1655
0.0235 14700 4.051
0.0237 14800 4.078
0.0238 14900 4.1193
0.0240 15000 4.1536
0.0242 15100 3.935
0.0243 15200 3.9535
0.0245 15300 3.7051
0.0246 15400 3.8329
0.0248 15500 3.9412
0.0250 15600 3.6668
0.0251 15700 3.7758
0.0253 15800 3.8805
0.0254 15900 3.8848
0.0256 16000 3.75
0.0258 16100 3.5685
0.0259 16200 3.7016
0.0261 16300 4.0955
0.0262 16400 3.7577
0.0264 16500 3.7485
0.0266 16600 3.8263
0.0267 16700 3.6922
0.0269 16800 3.6568
0.0270 16900 3.7317
0.0272 17000 3.5089
0.0274 17100 3.7377
0.0275 17200 3.6206
0.0277 17300 3.3702
0.0278 17400 3.5126
0.0280 17500 3.4841
0.0282 17600 3.1464
0.0283 17700 3.7012
0.0285 17800 3.5802
0.0286 17900 3.4952
0.0288 18000 3.1174
0.0290 18100 3.3134
0.0291 18200 3.3578
0.0293 18300 3.0209
0.0294 18400 3.3796
0.0296 18500 3.2287
0.0298 18600 3.1537
0.0299 18700 2.9073
0.0301 18800 3.3444
0.0302 18900 3.1341
0.0304 19000 2.8862
0.0306 19100 3.2033
0.0307 19200 3.2764
0.0309 19300 3.0725
0.0310 19400 3.0436
0.0312 19500 3.3493
0.0314 19600 3.0141
0.0315 19700 2.779
0.0317 19800 3.3543
0.0318 19900 3.1526
0.0320 20000 2.7896
0.0322 20100 2.9398
0.0323 20200 3.1254
0.0325 20300 2.8832
0.0326 20400 3.0542
0.0328 20500 2.9722
0.0330 20600 2.9321
0.0331 20700 2.6448
0.0333 20800 3.4006
0.0334 20900 3.0022
0.0336 21000 2.6366
0.0338 21100 3.0112
0.0339 21200 2.7856
0.0341 21300 3.0967
0.0342 21400 2.8754
0.0344 21500 3.1269
0.0346 21600 2.8235
0.0347 21700 2.4912
0.0349 21800 2.5079
0.0350 21900 3.2942
0.0352 22000 2.4184
0.0354 22100 2.782
0.0355 22200 2.7652
0.0357 22300 3.113
0.0358 22400 2.7451
0.0360 22500 2.7473
0.0362 22600 2.5116
0.0363 22700 2.8531
0.0365 22800 2.9171
0.0366 22900 2.7954
0.0368 23000 2.5376
0.0370 23100 3.2488
0.0371 23200 2.6131
0.0373 23300 3.1343
0.0374 23400 2.3159
0.0376 23500 2.4225
0.0378 23600 2.5034
0.0379 23700 3.0067
0.0381 23800 2.313
0.0382 23900 2.5363
0.0384 24000 2.7929
0.0386 24100 2.617
0.0387 24200 2.9711
0.0389 24300 2.7726
0.0390 24400 2.5849
0.0392 24500 2.3231
0.0394 24600 2.2477
0.0395 24700 2.5487
0.0397 24800 2.5175
0.0398 24900 2.6758
0.0400 25000 2.7313
0.0402 25100 2.4846
0.0403 25200 2.8697
0.0405 25300 2.5289
0.0406 25400 2.235
0.0408 25500 2.5028
0.0410 25600 2.6295
0.0411 25700 2.6159
0.0413 25800 2.4447
0.0414 25900 2.7233
0.0416 26000 2.5651
0.0418 26100 2.1317
0.0419 26200 2.6157
0.0421 26300 2.7385
0.0422 26400 2.4642
0.0424 26500 2.0621
0.0426 26600 2.3864
0.0427 26700 2.6951
0.0429 26800 2.2628
0.0430 26900 2.7538
0.0432 27000 2.6871
0.0434 27100 2.2453
0.0435 27200 1.6334
0.0437 27300 2.666
0.0438 27400 2.128
0.0440 27500 2.7573
0.0442 27600 2.5276
0.0443 27700 2.2438
0.0445 27800 2.3156
0.0446 27900 2.1735
0.0448 28000 2.1733
0.0450 28100 2.4094
0.0451 28200 2.8484
0.0453 28300 2.4507
0.0454 28400 2.6822
0.0456 28500 2.1191
0.0458 28600 2.0696
0.0459 28700 2.4027
0.0461 28800 1.7958
0.0462 28900 2.5874
0.0464 29000 2.2679
0.0466 29100 2.6394
0.0467 29200 1.7998
0.0469 29300 2.6834
0.0470 29400 2.1242
0.0472 29500 2.0039
0.0474 29600 2.018
0.0475 29700 2.9357
0.0477 29800 2.1914
0.0478 29900 2.0968
0.0480 30000 1.9762
0.0482 30100 2.1436
0.0483 30200 2.1919
0.0485 30300 1.9683
0.0486 30400 2.3543
0.0488 30500 2.0642
0.0490 30600 1.8447
0.0491 30700 2.3467
0.0493 30800 2.6461
0.0494 30900 2.028
0.0496 31000 1.4188
0.0498 31100 2.7219
0.0499 31200 2.2345
0.0501 31300 2.201
0.0502 31400 2.092
0.0504 31500 2.2871
0.0506 31600 2.0167
0.0507 31700 1.9175
0.0509 31800 2.2229
0.0510 31900 2.1196
0.0512 32000 2.2192
0.0514 32100 1.6462
0.0515 32200 2.099
0.0517 32300 2.0914
0.0518 32400 2.3295
0.0520 32500 2.256
0.0522 32600 1.7662
0.0523 32700 1.7234
0.0525 32800 1.984
0.0526 32900 2.1815
0.0528 33000 1.4987
0.0530 33100 2.0034
0.0531 33200 2.6008
0.0533 33300 2.4585
0.0534 33400 1.881
0.0536 33500 1.8738
0.0538 33600 1.9726
0.0539 33700 2.3734
0.0541 33800 1.6898
0.0542 33900 2.2171
0.0544 34000 1.4453
0.0546 34100 1.5057
0.0547 34200 2.1497
0.0549 34300 1.8618
0.0550 34400 1.7878
0.0552 34500 1.8199
0.0554 34600 2.1649
0.0555 34700 1.7906
0.0557 34800 1.6816
0.0558 34900 2.1464
0.0560 35000 2.0039
0.0562 35100 1.735
0.0563 35200 1.853
0.0565 35300 1.6068
0.0566 35400 1.6349
0.0568 35500 1.9571
0.0570 35600 1.5854
0.0571 35700 1.9756
0.0573 35800 1.9816
0.0574 35900 1.6758
0.0576 36000 2.2583
0.0578 36100 1.7584
0.0579 36200 1.9894
0.0581 36300 2.3922
0.0582 36400 2.0077
0.0584 36500 2.3684
0.0586 36600 2.1103
0.0587 36700 2.0728
0.0589 36800 1.9364
0.0590 36900 2.5203
0.0592 37000 1.8473
0.0594 37100 1.8076
0.0595 37200 2.0157
0.0597 37300 2.1587
0.0598 37400 1.9825
0.0600 37500 2.0693
0.0602 37600 1.5505
0.0603 37700 1.5472
0.0605 37800 2.0568
0.0606 37900 1.9219
0.0608 38000 2.091
0.0610 38100 2.0523
0.0611 38200 1.7628
0.0613 38300 1.8753
0.0614 38400 1.846
0.0616 38500 1.803
0.0618 38600 2.1226
0.0619 38700 2.0906
0.0621 38800 1.4321
0.0622 38900 2.5214
0.0624 39000 1.5412
0.0626 39100 1.4382
0.0627 39200 1.8417
0.0629 39300 2.1105
0.0630 39400 1.6347
0.0632 39500 2.0372
0.0634 39600 1.6222
0.0635 39700 1.8033
0.0637 39800 1.9847
0.0638 39900 2.1354
0.0640 40000 1.6792
0.0642 40100 2.1055
0.0643 40200 2.0657
0.0645 40300 1.9618
0.0646 40400 1.5807
0.0648 40500 1.6451
0.0650 40600 2.1299
0.0651 40700 1.9912
0.0653 40800 1.6392
0.0654 40900 1.8049
0.0656 41000 1.9832
0.0658 41100 2.0309
0.0659 41200 1.8362
0.0661 41300 2.2709
0.0662 41400 2.0785
0.0664 41500 1.5627
0.0666 41600 1.6058
0.0667 41700 1.7099
0.0669 41800 1.7096
0.0670 41900 1.6429
0.0672 42000 1.2514
0.0674 42100 1.5746
0.0675 42200 1.7186
0.0677 42300 1.8152
0.0678 42400 1.705
0.0680 42500 1.6779
0.0682 42600 1.8157
0.0683 42700 1.8464
0.0685 42800 1.748
0.0686 42900 1.6836
0.0688 43000 1.65
0.0690 43100 1.5632
0.0691 43200 2.0987
0.0693 43300 1.5783
0.0694 43400 1.8029
0.0696 43500 1.7154
0.0698 43600 1.663
0.0699 43700 1.4403
0.0701 43800 1.6513
0.0702 43900 2.2041
0.0704 44000 2.3908
0.0706 44100 1.7153
0.0707 44200 2.2112
0.0709 44300 1.8663
0.0710 44400 1.8206
0.0712 44500 2.2269
0.0714 44600 1.8159
0.0715 44700 1.9257
0.0717 44800 2.087
0.0718 44900 1.3623
0.0720 45000 1.5747
0.0722 45100 1.8051
0.0723 45200 2.3691
0.0725 45300 2.1125
0.0726 45400 1.566
0.0728 45500 1.5042
0.0730 45600 1.9469
0.0731 45700 1.9346
0.0733 45800 1.4362
0.0734 45900 1.9164
0.0736 46000 1.511
0.0738 46100 1.4523
0.0739 46200 1.1247
0.0741 46300 1.9694
0.0742 46400 2.1909
0.0744 46500 2.0247
0.0746 46600 1.2061
0.0747 46700 1.6151
0.0749 46800 1.6184
0.0750 46900 2.0375
0.0752 47000 1.8357
0.0754 47100 1.7605
0.0755 47200 2.1139
0.0757 47300 1.2971
0.0758 47400 1.7242
0.0760 47500 1.2726
0.0762 47600 1.9947
0.0763 47700 2.2796
0.0765 47800 1.6232
0.0766 47900 1.3513
0.0768 48000 1.291
0.0770 48100 1.5954
0.0771 48200 1.6232
0.0773 48300 1.8858
0.0774 48400 1.6235
0.0776 48500 1.9061
0.0778 48600 1.5919
0.0779 48700 1.8474
0.0781 48800 1.7112
0.0782 48900 1.8007
0.0784 49000 1.7499
0.0786 49100 1.4046
0.0787 49200 2.0843
0.0789 49300 1.52
0.0790 49400 1.8708
0.0792 49500 1.673
0.0794 49600 1.8457
0.0795 49700 1.5627
0.0797 49800 1.6497
0.0798 49900 1.5787
0.0800 50000 1.8507
0.0802 50100 1.4336
0.0803 50200 2.152
0.0805 50300 1.6311
0.0806 50400 1.7442
0.0808 50500 1.8063
0.0810 50600 1.4
0.0811 50700 1.6401
0.0813 50800 1.9426
0.0814 50900 2.0937
0.0816 51000 1.8187
0.0818 51100 2.1751
0.0819 51200 2.1703
0.0821 51300 1.4443
0.0822 51400 1.9266
0.0824 51500 1.8226
0.0826 51600 1.4394
0.0827 51700 1.052
0.0829 51800 1.0614
0.0830 51900 1.4591
0.0832 52000 1.6479
0.0834 52100 1.7548
0.0835 52200 1.6293
0.0837 52300 1.7183
0.0838 52400 1.2329
0.0840 52500 1.5292
0.0842 52600 1.6752
0.0843 52700 1.3228
0.0845 52800 1.485
0.0846 52900 1.4228
0.0848 53000 1.1385
0.0850 53100 1.1812
0.0851 53200 1.4763
0.0853 53300 1.9444
0.0854 53400 1.5316
0.0856 53500 1.6928
0.0858 53600 1.4466
0.0859 53700 1.438
0.0861 53800 1.1629
0.0862 53900 1.3017
0.0864 54000 1.6614
0.0866 54100 1.4535
0.0867 54200 1.7061
0.0869 54300 1.4681
0.0870 54400 1.3449
0.0872 54500 1.8814
0.0874 54600 1.5989
0.0875 54700 1.3711
0.0877 54800 1.3199
0.0878 54900 1.3713
0.0880 55000 1.441
0.0882 55100 1.268
0.0883 55200 1.1648
0.0885 55300 1.8108
0.0886 55400 1.4904
0.0888 55500 1.2555
0.0890 55600 1.2733
0.0891 55700 1.5194
0.0893 55800 1.7587
0.0894 55900 1.6183
0.0896 56000 1.3596
0.0898 56100 1.5248
0.0899 56200 1.5177
0.0901 56300 1.7579
0.0902 56400 1.5508
0.0904 56500 1.5965
0.0906 56600 1.5762
0.0907 56700 1.7441
0.0909 56800 2.0257
0.0910 56900 1.1371
0.0912 57000 1.8825
0.0914 57100 1.0455
0.0915 57200 1.5889
0.0917 57300 1.192
0.0918 57400 1.5374
0.0920 57500 1.6236
0.0922 57600 1.8945
0.0923 57700 1.607
0.0925 57800 1.8133
0.0926 57900 1.5777
0.0928 58000 1.5043
0.0930 58100 1.7681
0.0931 58200 1.623
0.0933 58300 2.2137
0.0934 58400 2.2447
0.0936 58500 2.3013
0.0938 58600 1.3105
0.0939 58700 1.4461
0.0941 58800 2.1321
0.0942 58900 1.7541
0.0944 59000 1.7894
0.0946 59100 1.693
0.0947 59200 1.7073
0.0949 59300 2.0305
0.0950 59400 1.3684
0.0952 59500 1.8754
0.0954 59600 2.0225
0.0955 59700 2.1975
0.0957 59800 1.7173
0.0958 59900 1.4302
0.0960 60000 1.2497
0.0962 60100 1.4058
0.0963 60200 1.0956
0.0965 60300 1.3731
0.0966 60400 1.2953
0.0968 60500 1.0987
0.0970 60600 1.5104
0.0971 60700 1.5224
0.0973 60800 1.3982
0.0974 60900 1.2785
0.0976 61000 1.6018
0.0978 61100 1.4968
0.0979 61200 1.2423
0.0981 61300 1.9973
0.0982 61400 1.2149
0.0984 61500 1.731
0.0986 61600 1.2889
0.0987 61700 1.856
0.0989 61800 0.8942
0.0990 61900 1.3371
0.0992 62000 1.5222
0.0994 62100 1.5435
0.0995 62200 1.1172
0.0997 62300 1.6024
0.0998 62400 1.3914
0.1000 62500 1.4714
0.1002 62600 1.2922
0.1003 62700 1.4263
0.1005 62800 1.4586
0.1006 62900 1.6312
0.1008 63000 1.9607
0.1010 63100 1.5771
0.1011 63200 1.6721
0.1013 63300 1.8461
0.1014 63400 1.5256
0.1016 63500 1.9736
0.1018 63600 1.4735
0.1019 63700 1.4619
0.1021 63800 1.6571
0.1022 63900 1.5888
0.1024 64000 2.0457
0.1026 64100 1.7843
0.1027 64200 1.5116
0.1029 64300 1.6682
0.1030 64400 1.2137
0.1032 64500 1.1308
0.1034 64600 2.031
0.1035 64700 1.6903
0.1037 64800 1.3365
0.1038 64900 1.5736
0.1040 65000 1.7264
0.1042 65100 1.1781
0.1043 65200 1.2503
0.1045 65300 0.9432
0.1046 65400 1.264
0.1048 65500 1.2086
0.1050 65600 1.8692
0.1051 65700 1.2745
0.1053 65800 1.6839
0.1054 65900 1.4509
0.1056 66000 1.1615
0.1058 66100 1.4458
0.1059 66200 1.8329
0.1061 66300 1.567
0.1062 66400 1.6746
0.1064 66500 1.65
0.1066 66600 1.5497
0.1067 66700 1.4009
0.1069 66800 2.058
0.1070 66900 1.6306
0.1072 67000 1.4377
0.1074 67100 1.4501
0.1075 67200 1.2648
0.1077 67300 1.3186
0.1078 67400 1.1313
0.1080 67500 2.2523
0.1082 67600 1.9146
0.1083 67700 1.7334
0.1085 67800 1.7195
0.1086 67900 1.4661
0.1088 68000 1.3503
0.1090 68100 1.0129
0.1091 68200 1.6036
0.1093 68300 0.9312
0.1094 68400 1.5817
0.1096 68500 1.2024
0.1098 68600 0.985
0.1099 68700 1.1712
0.1101 68800 1.5874
0.1102 68900 1.8551
0.1104 69000 1.232
0.1106 69100 1.4688
0.1107 69200 1.1107
0.1109 69300 1.6495
0.1110 69400 1.6278
0.1112 69500 1.7135
0.1114 69600 1.5108
0.1115 69700 1.4056
0.1117 69800 0.9324
0.1118 69900 1.3613
0.1120 70000 1.5283
0.1122 70100 1.3809
0.1123 70200 1.5552
0.1125 70300 1.4567
0.1126 70400 1.4404
0.1128 70500 1.1805
0.1130 70600 2.514
0.1131 70700 1.4821
0.1133 70800 1.5156
0.1134 70900 1.5925
0.1136 71000 1.9517
0.1138 71100 1.2685
0.1139 71200 1.6314
0.1141 71300 1.5252
0.1142 71400 1.5176
0.1144 71500 1.3461
0.1146 71600 1.3832
0.1147 71700 1.2962
0.1149 71800 1.5179
0.1150 71900 1.1041
0.1152 72000 1.5031
0.1154 72100 1.5412
0.1155 72200 1.2971
0.1157 72300 1.0979
0.1158 72400 1.307
0.1160 72500 1.3418
0.1162 72600 1.7298
0.1163 72700 1.68
0.1165 72800 1.3106
0.1166 72900 1.0954
0.1168 73000 1.5994
0.1170 73100 1.5953
0.1171 73200 1.9498
0.1173 73300 0.9937
0.1174 73400 1.4753
0.1176 73500 1.417
0.1178 73600 1.596
0.1179 73700 1.8794
0.1181 73800 1.3118
0.1182 73900 1.732
0.1184 74000 1.4504
0.1186 74100 1.0878
0.1187 74200 1.2488
0.1189 74300 1.3887
0.1190 74400 1.2265
0.1192 74500 1.4668
0.1194 74600 1.6258
0.1195 74700 1.9551
0.1197 74800 1.1811
0.1198 74900 1.2119
0.1200 75000 1.4051
0.1202 75100 1.2587
0.1203 75200 1.4563
0.1205 75300 1.5581
0.1206 75400 1.5457
0.1208 75500 1.2675
0.1210 75600 1.0948
0.1211 75700 1.2045
0.1213 75800 1.5964
0.1214 75900 1.0517
0.1216 76000 1.2883
0.1218 76100 1.2276
0.1219 76200 1.2463
0.1221 76300 1.241
0.1222 76400 1.8648
0.1224 76500 1.4848
0.1226 76600 1.413
0.1227 76700 1.594
0.1229 76800 1.3682
0.1230 76900 1.159
0.1232 77000 1.4702
0.1234 77100 1.3251
0.1235 77200 1.0538
0.1237 77300 1.1708
0.1238 77400 1.2864
0.1240 77500 1.6501
0.1242 77600 1.0104
0.1243 77700 1.7969
0.1245 77800 1.0293
0.1246 77900 1.5593
0.1248 78000 0.9902
0.1250 78100 1.058
0.1251 78200 1.4039
0.1253 78300 1.008
0.1254 78400 1.4593
0.1256 78500 1.563
0.1258 78600 1.1569
0.1259 78700 1.3886
0.1261 78800 1.061
0.1262 78900 1.2085
0.1264 79000 1.8553
0.1266 79100 1.7144
0.1267 79200 1.2216
0.1269 79300 1.1646
0.1270 79400 1.7768
0.1272 79500 1.1314
0.1274 79600 1.2374
0.1275 79700 1.2681
0.1277 79800 1.2624
0.1278 79900 1.6775
0.1280 80000 1.3587
0.1282 80100 1.7402
0.1283 80200 1.5349
0.1285 80300 0.8546
0.1286 80400 1.3903
0.1288 80500 1.0712
0.1290 80600 1.6633
0.1291 80700 1.4125
0.1293 80800 0.6973
0.1294 80900 1.1729
0.1296 81000 1.2217
0.1298 81100 1.3184
0.1299 81200 1.2718
0.1301 81300 1.1913
0.1302 81400 1.4728
0.1304 81500 1.1221
0.1306 81600 1.235
0.1307 81700 1.3497
0.1309 81800 1.2361
0.1310 81900 2.0015
0.1312 82000 1.2259
0.1314 82100 0.9236
0.1315 82200 1.5339
0.1317 82300 1.2036
0.1318 82400 1.2631
0.1320 82500 1.0858
0.1322 82600 1.635
0.1323 82700 1.285
0.1325 82800 1.1209
0.1326 82900 1.4032
0.1328 83000 1.1279
0.1330 83100 1.5145
0.1331 83200 1.4923
0.1333 83300 0.9845
0.1334 83400 1.3847
0.1336 83500 1.0149
0.1338 83600 1.2644
0.1339 83700 1.2981
0.1341 83800 1.6903
0.1342 83900 1.2846
0.1344 84000 1.4647
0.1346 84100 1.1213
0.1347 84200 1.1379
0.1349 84300 1.2793
0.1350 84400 1.343
0.1352 84500 1.8342
0.1354 84600 1.0487
0.1355 84700 1.1531
0.1357 84800 0.8552
0.1358 84900 1.1422
0.1360 85000 1.0918
0.1362 85100 1.2873
0.1363 85200 1.547
0.1365 85300 1.5094
0.1366 85400 1.051
0.1368 85500 0.9952
0.1370 85600 1.1978
0.1371 85700 1.5221
0.1373 85800 1.3841
0.1374 85900 1.3999
0.1376 86000 1.5574
0.1378 86100 1.3267
0.1379 86200 1.358
0.1381 86300 1.5441
0.1382 86400 1.4124
0.1384 86500 0.8352
0.1386 86600 1.2549
0.1387 86700 1.4328
0.1389 86800 1.2577
0.1390 86900 1.4417
0.1392 87000 1.1927
0.1394 87100 1.4435
0.1395 87200 1.3579
0.1397 87300 1.3883
0.1398 87400 1.2645
0.1400 87500 1.1366
0.1402 87600 1.4566
0.1403 87700 1.447
0.1405 87800 1.0701
0.1406 87900 1.3449
0.1408 88000 1.4331
0.1410 88100 1.3965
0.1411 88200 1.347
0.1413 88300 1.0262
0.1414 88400 1.0787
0.1416 88500 1.3829
0.1418 88600 1.2001
0.1419 88700 1.2407
0.1421 88800 1.6291
0.1422 88900 1.1502
0.1424 89000 1.2155
0.1426 89100 1.3381
0.1427 89200 0.819
0.1429 89300 1.0402
0.1430 89400 1.1062
0.1432 89500 1.6693
0.1434 89600 1.1991
0.1435 89700 1.3535
0.1437 89800 1.6776
0.1438 89900 1.2221
0.1440 90000 1.0253
0.1442 90100 1.0469
0.1443 90200 1.2465
0.1445 90300 1.4068
0.1446 90400 1.5961
0.1448 90500 1.0579
0.1450 90600 0.941
0.1451 90700 1.1861
0.1453 90800 1.4697
0.1454 90900 0.6486
0.1456 91000 1.3865
0.1458 91100 1.1494
0.1459 91200 1.3623
0.1461 91300 1.2193
0.1462 91400 1.3003
0.1464 91500 1.2608
0.1466 91600 1.2544
0.1467 91700 1.332
0.1469 91800 1.3548
0.1470 91900 1.54
0.1472 92000 1.3125
0.1474 92100 0.897
0.1475 92200 1.1594
0.1477 92300 0.9194
0.1478 92400 1.2209
0.1480 92500 1.0027
0.1482 92600 1.4675
0.1483 92700 1.3982
0.1485 92800 0.8595
0.1486 92900 1.572
0.1488 93000 1.2832
0.1490 93100 1.2838
0.1491 93200 1.6535
0.1493 93300 1.5996
0.1494 93400 1.058
0.1496 93500 1.3316
0.1498 93600 0.8627
0.1499 93700 1.4411
0.1501 93800 0.9331
0.1502 93900 1.0032
0.1504 94000 1.2341
0.1506 94100 1.3369
0.1507 94200 1.2324
0.1509 94300 1.6952
0.1510 94400 1.2401
0.1512 94500 1.2998
0.1514 94600 1.1458
0.1515 94700 1.0211
0.1517 94800 0.9866
0.1518 94900 1.3636
0.1520 95000 1.1485
0.1522 95100 0.7671
0.1523 95200 1.0069
0.1525 95300 1.1276
0.1526 95400 1.4477
0.1528 95500 0.9887
0.1530 95600 1.065
0.1531 95700 0.982
0.1533 95800 1.1166
0.1534 95900 1.3949
0.1536 96000 1.4164
0.1538 96100 1.7997
0.1539 96200 1.3941
0.1541 96300 1.0592
0.1542 96400 1.1661
0.1544 96500 1.5968
0.1546 96600 1.2586
0.1547 96700 1.5164
0.1549 96800 1.5942
0.1550 96900 0.6635
0.1552 97000 1.3037
0.1554 97100 1.3557
0.1555 97200 1.0864
0.1557 97300 1.3139
0.1558 97400 0.7139
0.1560 97500 1.1084
0.1562 97600 1.2294
0.1563 97700 0.9581
0.1565 97800 1.2983
0.1566 97900 1.8281
0.1568 98000 1.2914
0.1570 98100 0.8656
0.1571 98200 1.3438
0.1573 98300 1.465
0.1574 98400 1.2253
0.1576 98500 1.3481
0.1578 98600 1.5131
0.1579 98700 1.4852
0.1581 98800 1.1317
0.1582 98900 1.0395
0.1584 99000 0.9256
0.1586 99100 0.9774
0.1587 99200 0.9756
0.1589 99300 1.4885
0.1590 99400 1.2373
0.1592 99500 1.3868
0.1594 99600 0.9238
0.1595 99700 1.0793
0.1597 99800 1.2405
0.1598 99900 1.2417
0.1600 100000 1.1264
0.1602 100100 1.3042
0.1603 100200 1.7169
0.1605 100300 1.0939
0.1606 100400 1.4
0.1608 100500 1.1289

Framework Versions

  • Python: 3.8.10
  • Sentence Transformers: 3.1.1
  • Transformers: 4.45.2
  • PyTorch: 2.4.1+cu118
  • Accelerate: 1.0.1
  • Datasets: 3.0.1
  • Tokenizers: 0.20.3
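
To recreate a comparable environment, the versions above can be pinned at install time; a sketch (newer versions will likely also work):

pip install "sentence-transformers==3.1.1" "transformers==4.45.2" "accelerate==1.0.1" "datasets==3.0.1"
# PyTorch 2.4.1 with CUDA 11.8 comes from the PyTorch wheel index:
pip install torch==2.4.1 --index-url https://download.pytorch.org/whl/cu118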

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}

CoSENTLoss

@online{kexuefm-8847,
    title={CoSENT: A more efficient sentence vector scheme than Sentence-BERT},
    author={Su Jianlin},
    year={2022},
    month={Jan},
    url={https://kexue.fm/archives/8847},
}