all-MiniLM-L6-v8-pair_score

This is a sentence-transformers model finetuned from sentence-transformers/all-MiniLM-L6-v2 on the pairs_three_scores_v5 dataset. It maps sentences & paragraphs to a 384-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

Model Details

Model Description

  • Model Type: Sentence Transformer
  • Base model: sentence-transformers/all-MiniLM-L6-v2
  • Maximum Sequence Length: 256 tokens
  • Output Dimensionality: 384 dimensions
  • Similarity Function: Cosine Similarity
  • Training Dataset: pairs_three_scores_v5

Model Sources

  • Documentation: Sentence Transformers Documentation (https://www.sbert.net)
  • Repository: Sentence Transformers on GitHub (https://github.com/UKPLab/sentence-transformers)
  • Hugging Face: Sentence Transformers on Hugging Face (https://huggingface.co/models?library=sentence-transformers)

Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 256, 'do_lower_case': False}) with Transformer model: BertModel 
  (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
)
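For intuition, the three modules can be reproduced by hand with the Transformers library. The sketch below is illustrative and assumes the standard AutoTokenizer/AutoModel API, loading the base checkpoint rather than this model's weights: it mean-pools the token embeddings over the attention mask and L2-normalizes the result, matching the Pooling and Normalize settings above.

import torch
import torch.nn.functional as F
from transformers import AutoTokenizer, AutoModel

# Base checkpoint used for illustration; the finetuned weights load the same way.
tokenizer = AutoTokenizer.from_pretrained("sentence-transformers/all-MiniLM-L6-v2")
bert = AutoModel.from_pretrained("sentence-transformers/all-MiniLM-L6-v2")

batch = tokenizer(["activated charcoal shampoo"], padding=True, truncation=True,
                  max_length=256, return_tensors="pt")
with torch.no_grad():
    token_embeddings = bert(**batch).last_hidden_state  # (1, seq_len, 384)

# Mean pooling over non-padding tokens (pooling_mode_mean_tokens=True)
mask = batch["attention_mask"].unsqueeze(-1).float()
embedding = (token_embeddings * mask).sum(dim=1) / mask.sum(dim=1)

# L2 normalization, so the dot product equals cosine similarity
embedding = F.normalize(embedding, p=2, dim=1)
print(embedding.shape)  # torch.Size([1, 384])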

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("youssefkhalil320/all-MiniLM-L6-v8-pair_score")
# Run inference
sentences = [
    'activated charcoal shampoo',
    'belgian chocolate with cranberries',
    'saw palmetto oil conditioner',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 384]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
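
Continuing from the snippet above, the same two calls support a simple semantic-search pattern: encode a query and a corpus, then rank the corpus by cosine similarity. The query and corpus strings here are illustrative.

# Rank a small corpus against a query; the embeddings are L2-normalized by
# the model, so cosine similarity is the natural ranking metric.
query_embedding = model.encode(["sulfate free shampoo"])
corpus_embeddings = model.encode([
    "activated charcoal shampoo",
    "belgian chocolate with cranberries",
    "saw palmetto oil conditioner",
])
scores = model.similarity(query_embedding, corpus_embeddings)  # shape [1, 3]
best = int(scores.argmax())
print(best, float(scores[0, best]))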

Training Details

Training Dataset

pairs_three_scores_v5

  • Dataset: pairs_three_scores_v5 at 3d8c457
  • Size: 80,000,003 training samples
  • Columns: sentence1, sentence2, and score
  • Approximate statistics based on the first 1000 samples:

    |         | sentence1                                        | sentence2                                        | score                          |
    |:--------|:-------------------------------------------------|:-------------------------------------------------|:-------------------------------|
    | type    | string                                           | string                                           | float                          |
    | details | min: 3 tokens, mean: 6.06 tokens, max: 12 tokens | min: 3 tokens, mean: 5.71 tokens, max: 13 tokens | min: 0.0, mean: 0.11, max: 1.0 |
  • Samples:

    | sentence1               | sentence2                 | score |
    |:------------------------|:--------------------------|:------|
    | vanilla hair cream      | free of paraben hair mask | 0.5   |
    | nourishing shampoo      | cumin lemon tea           | 0.0   |
    | safe materials pacifier | facial serum              | 0.5   |
  • Loss: CoSENTLoss (see the sketch after this block) with these parameters:
    {
        "scale": 20.0,
        "similarity_fct": "pairwise_cos_sim"
    }
    
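As a reference for the objective, here is a minimal, self-contained sketch of CoSENT (illustrative, not the sentence-transformers implementation): for every pair of training examples (i, j) whose gold scores satisfy score_i > score_j, the loss penalizes the j-th pairwise cosine similarity exceeding the i-th, scaled by 20, via log(1 + Σ exp(scale · (sim_j − sim_i))).

import torch

def cosent_loss(cos_sims: torch.Tensor, labels: torch.Tensor, scale: float = 20.0) -> torch.Tensor:
    """cos_sims: (N,) cosine similarity of each (sentence1, sentence2) pair.
    labels: (N,) gold scores for the same pairs."""
    diffs = scale * (cos_sims[None, :] - cos_sims[:, None])  # diffs[i, j] = scale * (sim_j - sim_i)
    mask = labels[:, None] > labels[None, :]                 # keep pairs with score_i > score_j
    # log(1 + sum(exp(x))) computed stably as logsumexp over [0, x_1, ..., x_k]
    logits = torch.cat([torch.zeros(1), diffs[mask]])
    return torch.logsumexp(logits, dim=0)

print(cosent_loss(torch.tensor([0.9, 0.2, 0.4]), torch.tensor([1.0, 0.0, 0.5])))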

Evaluation Dataset

pairs_three_scores_v5

  • Dataset: pairs_three_scores_v5 at 3d8c457
  • Size: 20,000,001 evaluation samples
  • Columns: sentence1, sentence2, and score
  • Approximate statistics based on the first 1000 samples:

    |         | sentence1                                        | sentence2                                        | score                          |
    |:--------|:-------------------------------------------------|:-------------------------------------------------|:-------------------------------|
    | type    | string                                           | string                                           | float                          |
    | details | min: 3 tokens, mean: 6.21 tokens, max: 12 tokens | min: 3 tokens, mean: 5.75 tokens, max: 12 tokens | min: 0.0, mean: 0.11, max: 1.0 |
  • Samples:

    | sentence1                  | sentence2             | score |
    |:---------------------------|:----------------------|:------|
    | teddy bear toy             | long lasting cat food | 0.0   |
    | eva hair treatment         | fresh pineapple       | 0.0   |
    | soft wave hair conditioner | hybrid seat bike      | 0.0   |
  • Loss: CoSENTLoss with these parameters:
    {
        "scale": 20.0,
        "similarity_fct": "pairwise_cos_sim"
    }
    

Training Hyperparameters

Non-Default Hyperparameters

  • eval_strategy: steps
  • per_device_train_batch_size: 128
  • per_device_eval_batch_size: 128
  • learning_rate: 2e-05
  • num_train_epochs: 1
  • warmup_ratio: 0.1
  • fp16: True
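
For context, the sketch below shows a trainer configuration consistent with these non-default values. It is an assumption about the setup rather than the actual training script, and the dataset loading (columns sentence1, sentence2, score) is left out.

from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
)
from sentence_transformers.losses import CoSENTLoss

model = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")
loss = CoSENTLoss(model, scale=20.0)  # pairwise_cos_sim is the default similarity function

args = SentenceTransformerTrainingArguments(
    output_dir="all-MiniLM-L6-v8-pair_score",
    num_train_epochs=1,
    per_device_train_batch_size=128,
    per_device_eval_batch_size=128,
    learning_rate=2e-5,
    warmup_ratio=0.1,
    fp16=True,
    eval_strategy="steps",
)

# train_dataset / eval_dataset would be the pairs_three_scores_v5 splits:
# trainer = SentenceTransformerTrainer(model=model, args=args, loss=loss,
#                                      train_dataset=train_dataset,
#                                      eval_dataset=eval_dataset)
# trainer.train()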

All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: steps
  • prediction_loss_only: True
  • per_device_train_batch_size: 128
  • per_device_eval_batch_size: 128
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 1
  • eval_accumulation_steps: None
  • torch_empty_cache_steps: None
  • learning_rate: 2e-05
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1.0
  • num_train_epochs: 1
  • max_steps: -1
  • lr_scheduler_type: linear
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.1
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: False
  • fp16: True
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: None
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: False
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: False
  • hub_always_push: False
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • dispatch_batches: None
  • split_batches: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • eval_on_start: False
  • use_liger_kernel: False
  • eval_use_gather_object: False
  • batch_sampler: batch_sampler
  • multi_dataset_batch_sampler: proportional

Training Logs

Epoch Step Training Loss Validation Loss
0.8682 542600 0.9424 -
0.8683 542700 0.968 -
0.8685 542800 0.9711 -
0.8686 542900 0.8668 -
0.8688 543000 0.7776 -
0.8690 543100 0.5938 -
0.8691 543200 0.7416 -
0.8693 543300 1.2082 -
0.8694 543400 0.5481 -
0.8696 543500 0.9383 -
0.8698 543600 0.699 -
0.8699 543700 0.6582 -
0.8701 543800 0.9237 -
0.8702 543900 0.5977 -
0.8704 544000 0.632 -
0.8706 544100 0.819 -
0.8707 544200 0.5877 -
0.8709 544300 0.725 -
0.8710 544400 0.6097 -
0.8712 544500 0.4883 -
0.8714 544600 0.7353 -
0.8715 544700 0.8596 -
0.8717 544800 0.4478 -
0.8718 544900 0.49 -
0.8720 545000 0.6203 -
0.8722 545100 0.6758 -
0.8723 545200 0.959 -
0.8725 545300 1.4645 -
0.8726 545400 0.7873 -
0.8728 545500 0.6609 -
0.8730 545600 0.8173 -
0.8731 545700 1.1416 -
0.8733 545800 0.8406 -
0.8734 545900 0.8781 -
0.8736 546000 0.6507 -
0.8738 546100 1.0332 -
0.8739 546200 0.9044 -
0.8741 546300 0.8617 -
0.8742 546400 0.7377 -
0.8744 546500 0.7668 -
0.8746 546600 0.6856 -
0.8747 546700 1.0069 -
0.8749 546800 0.5628 -
0.8750 546900 0.6661 -
0.8752 547000 0.8508 -
0.8754 547100 1.0595 -
0.8755 547200 0.8156 -
0.8757 547300 0.4535 -
0.8758 547400 0.7963 -
0.8760 547500 0.8964 -
0.8762 547600 0.6343 -
0.8763 547700 0.9795 -
0.8765 547800 1.0885 -
0.8766 547900 0.6333 -
0.8768 548000 1.0065 -
0.8770 548100 0.8155 -
0.8771 548200 0.8595 -
0.8773 548300 1.2973 -
0.8774 548400 0.9477 -
0.8776 548500 1.092 -
0.8778 548600 0.9252 -
0.8779 548700 0.6214 -
0.8781 548800 0.6302 -
0.8782 548900 0.4542 -
0.8784 549000 0.6626 -
0.8786 549100 0.6616 -
0.8787 549200 0.5897 -
0.8789 549300 0.5641 -
0.8790 549400 0.8449 -
0.8792 549500 0.7743 -
0.8794 549600 0.8134 -
0.8795 549700 1.0455 -
0.8797 549800 0.6805 -
0.8798 549900 0.6526 -
0.8800 550000 0.5727 -
0.8802 550100 0.5923 -
0.8803 550200 0.565 -
0.8805 550300 0.7745 -
0.8806 550400 0.757 -
0.8808 550500 0.624 -
0.8810 550600 0.9387 -
0.8811 550700 0.9718 -
0.8813 550800 0.9184 -
0.8814 550900 1.166 -
0.8816 551000 0.6725 -
0.8818 551100 0.9767 -
0.8819 551200 0.517 -
0.8821 551300 0.9126 -
0.8822 551400 1.3649 -
0.8824 551500 0.7633 -
0.8826 551600 0.6029 -
0.8827 551700 0.4923 -
0.8829 551800 0.7787 -
0.8830 551900 0.7437 -
0.8832 552000 0.7627 -
0.8834 552100 0.5648 -
0.8835 552200 0.5564 -
0.8837 552300 0.7376 -
0.8838 552400 0.8405 -
0.8840 552500 1.062 -
0.8842 552600 0.7844 -
0.8843 552700 0.6866 -
0.8845 552800 0.8532 -
0.8846 552900 0.4204 -
0.8848 553000 0.7487 -
0.8850 553100 0.5511 -
0.8851 553200 0.8468 -
0.8853 553300 0.9251 -
0.8854 553400 1.0542 -
0.8856 553500 0.9342 -
0.8858 553600 1.0533 -
0.8859 553700 0.7185 -
0.8861 553800 0.5811 -
0.8862 553900 0.8484 -
0.8864 554000 0.5215 -
0.8866 554100 0.4963 -
0.8867 554200 0.6594 -
0.8869 554300 0.5277 -
0.8870 554400 0.5139 -
0.8872 554500 0.9343 -
0.8874 554600 0.9857 -
0.8875 554700 0.7193 -
0.8877 554800 0.9394 -
0.8878 554900 0.91 -
0.8880 555000 0.601 -
0.8882 555100 0.4681 -
0.8883 555200 0.7671 -
0.8885 555300 0.4485 -
0.8886 555400 0.8482 -
0.8888 555500 0.8826 -
0.8890 555600 0.3257 -
0.8891 555700 0.6727 -
0.8893 555800 0.7098 -
0.8894 555900 0.7871 -
0.8896 556000 0.843 -
0.8898 556100 1.0637 -
0.8899 556200 0.4367 -
0.8901 556300 0.8833 -
0.8902 556400 0.6529 -
0.8904 556500 1.0443 -
0.8906 556600 1.1076 -
0.8907 556700 0.7762 -
0.8909 556800 0.5598 -
0.8910 556900 1.127 -
0.8912 557000 0.5471 -
0.8914 557100 0.869 -
0.8915 557200 0.8392 -
0.8917 557300 0.7547 -
0.8918 557400 0.5796 -
0.8920 557500 0.7446 -
0.8922 557600 0.9913 -
0.8923 557700 0.6937 -
0.8925 557800 0.5643 -
0.8926 557900 0.8484 -
0.8928 558000 0.8971 -
0.8930 558100 1.1094 -
0.8931 558200 0.6835 -
0.8933 558300 0.8818 -
0.8934 558400 1.1366 -
0.8936 558500 0.8887 -
0.8938 558600 0.5754 -
0.8939 558700 0.7777 -
0.8941 558800 0.9033 -
0.8942 558900 0.5612 -
0.8944 559000 0.7949 -
0.8946 559100 1.1081 -
0.8947 559200 0.6708 -
0.8949 559300 0.7044 -
0.8950 559400 0.6862 -
0.8952 559500 0.8251 -
0.8954 559600 0.7551 -
0.8955 559700 1.0697 -
0.8957 559800 0.9638 -
0.8958 559900 0.9433 -
0.8960 560000 0.7142 -
0.8962 560100 0.6868 -
0.8963 560200 1.0483 -
0.8965 560300 0.9234 -
0.8966 560400 1.0903 -
0.8968 560500 0.8217 -
0.8970 560600 0.6451 -
0.8971 560700 0.7644 -
0.8973 560800 0.518 -
0.8974 560900 0.6178 -
0.8976 561000 0.6031 -
0.8978 561100 0.7723 -
0.8979 561200 0.3514 -
0.8981 561300 0.8751 -
0.8982 561400 0.7787 -
0.8984 561500 0.7343 -
0.8986 561600 0.674 -
0.8987 561700 0.8175 -
0.8989 561800 0.7348 -
0.8990 561900 0.8423 -
0.8992 562000 1.004 -
0.8994 562100 0.9213 -
0.8995 562200 1.0421 -
0.8997 562300 0.658 -
0.8998 562400 0.8207 -
0.9000 562500 0.5178 -
0.9002 562600 0.7567 -
0.9003 562700 0.8287 -
0.9005 562800 0.5924 -
0.9006 562900 0.9151 -
0.9008 563000 0.9069 -
0.9010 563100 0.7985 -
0.9011 563200 0.7565 -
0.9013 563300 0.823 -
0.9014 563400 0.5373 -
0.9016 563500 0.5473 -
0.9018 563600 0.7323 -
0.9019 563700 0.691 -
0.9021 563800 0.8818 -
0.9022 563900 0.7351 -
0.9024 564000 1.0715 -
0.9026 564100 0.722 -
0.9027 564200 0.6988 -
0.9029 564300 1.1843 -
0.9030 564400 1.0299 -
0.9032 564500 1.119 -
0.9034 564600 0.6954 -
0.9035 564700 0.8773 -
0.9037 564800 0.7873 -
0.9038 564900 0.7503 -
0.9040 565000 1.1296 -
0.9042 565100 0.777 -
0.9043 565200 0.8982 -
0.9045 565300 0.9224 -
0.9046 565400 0.8596 -
0.9048 565500 0.5435 -
0.9050 565600 0.8324 -
0.9051 565700 0.4121 -
0.9053 565800 0.9148 -
0.9054 565900 0.4525 -
0.9056 566000 0.7476 -
0.9058 566100 0.8766 -
0.9059 566200 1.1123 -
0.9061 566300 0.8109 -
0.9062 566400 0.8251 -
0.9064 566500 0.9638 -
0.9066 566600 0.7842 -
0.9067 566700 0.8727 -
0.9069 566800 1.1777 -
0.9070 566900 1.435 -
0.9072 567000 0.7354 -
0.9074 567100 0.796 -
0.9075 567200 0.8451 -
0.9077 567300 0.479 -
0.9078 567400 0.5299 -
0.9080 567500 0.7735 -
0.9082 567600 1.1211 -
0.9083 567700 0.9364 -
0.9085 567800 0.5533 -
0.9086 567900 0.9091 -
0.9088 568000 0.7493 -
0.9090 568100 1.0247 -
0.9091 568200 0.4836 -
0.9093 568300 0.9966 -
0.9094 568400 0.8997 -
0.9096 568500 0.7764 -
0.9098 568600 0.7193 -
0.9099 568700 0.6184 -
0.9101 568800 0.9031 -
0.9102 568900 0.7061 -
0.9104 569000 1.0852 -
0.9106 569100 1.0778 -
0.9107 569200 0.6463 -
0.9109 569300 0.5569 -
0.9110 569400 0.5566 -
0.9112 569500 0.6489 -
0.9114 569600 0.8065 -
0.9115 569700 0.8123 -
0.9117 569800 0.5946 -
0.9118 569900 0.8424 -
0.9120 570000 0.8987 -
0.9122 570100 0.7266 -
0.9123 570200 0.7386 -
0.9125 570300 0.7266 -
0.9126 570400 0.8668 -
0.9128 570500 0.7228 -
0.9130 570600 0.6914 -
0.9131 570700 0.8578 -
0.9133 570800 0.7019 -
0.9134 570900 0.5146 -
0.9136 571000 0.8236 -
0.9138 571100 0.8977 -
0.9139 571200 0.6571 -
0.9141 571300 0.6818 -
0.9142 571400 0.6226 -
0.9144 571500 1.034 -
0.9146 571600 0.7306 -
0.9147 571700 1.4244 -
0.9149 571800 0.8388 -
0.9150 571900 0.5627 -
0.9152 572000 0.8637 -
0.9154 572100 1.0419 -
0.9155 572200 1.3043 -
0.9157 572300 0.4563 -
0.9158 572400 0.6826 -
0.9160 572500 0.801 -
0.9162 572600 0.82 -
0.9163 572700 0.4677 -
0.9165 572800 0.6236 -
0.9166 572900 0.7783 -
0.9168 573000 0.8284 -
0.9170 573100 0.8253 -
0.9171 573200 1.039 -
0.9173 573300 0.9098 -
0.9174 573400 1.068 -
0.9176 573500 0.6172 -
0.9178 573600 0.6679 -
0.9179 573700 0.8135 -
0.9181 573800 1.1141 -
0.9182 573900 0.8077 -
0.9184 574000 0.7952 -
0.9186 574100 0.8684 -
0.9187 574200 0.518 -
0.9189 574300 0.6675 -
0.9190 574400 0.9315 -
0.9192 574500 0.6984 -
0.9194 574600 0.7571 -
0.9195 574700 0.9037 -
0.9197 574800 0.5965 -
0.9198 574900 0.8542 -
0.9200 575000 0.8226 -
0.9202 575100 0.6057 -
0.9203 575200 0.7658 -
0.9205 575300 0.9765 -
0.9206 575400 0.9145 -
0.9208 575500 0.3843 -
0.9210 575600 0.8603 -
0.9211 575700 0.9048 -
0.9213 575800 0.7786 -
0.9214 575900 0.8639 -
0.9216 576000 0.8909 -
0.9218 576100 0.6091 -
0.9219 576200 0.4416 -
0.9221 576300 0.4569 -
0.9222 576400 0.6638 -
0.9224 576500 0.9033 -
0.9226 576600 0.5351 -
0.9227 576700 0.8799 -
0.9229 576800 1.212 -
0.9230 576900 0.7717 -
0.9232 577000 0.9058 -
0.9234 577100 0.9647 -
0.9235 577200 0.7648 -
0.9237 577300 0.8776 -
0.9238 577400 0.4155 -
0.9240 577500 0.5997 -
0.9242 577600 0.9836 -
0.9243 577700 0.7584 -
0.9245 577800 0.7656 -
0.9246 577900 0.7135 -
0.9248 578000 0.8408 -
0.9250 578100 0.9118 -
0.9251 578200 0.587 -
0.9253 578300 0.9372 -
0.9254 578400 0.674 -
0.9256 578500 0.7524 -
0.9258 578600 0.7039 -
0.9259 578700 0.7397 -
0.9261 578800 0.739 -
0.9262 578900 0.6249 -
0.9264 579000 0.7223 -
0.9266 579100 0.8787 -
0.9267 579200 0.6817 -
0.9269 579300 0.4517 -
0.9270 579400 0.9203 -
0.9272 579500 1.0586 -
0.9274 579600 0.4509 -
0.9275 579700 0.6122 -
0.9277 579800 0.8044 -
0.9278 579900 0.4963 -
0.9280 580000 0.5926 -
0.9282 580100 0.8616 -
0.9283 580200 0.79 -
0.9285 580300 1.1544 -
0.9286 580400 0.6989 -
0.9288 580500 1.3349 -
0.9290 580600 1.2488 -
0.9291 580700 1.171 -
0.9293 580800 0.5529 -
0.9294 580900 0.7977 -
0.9296 581000 0.6397 -
0.9298 581100 1.2556 -
0.9299 581200 0.8389 -
0.9301 581300 0.967 -
0.9302 581400 0.9108 -
0.9304 581500 0.927 -
0.9306 581600 0.8314 -
0.9307 581700 0.8189 -
0.9309 581800 0.5584 -
0.9310 581900 0.8506 -
0.9312 582000 0.9845 -
0.9314 582100 0.8159 -
0.9315 582200 0.6512 -
0.9317 582300 0.7216 -
0.9318 582400 0.7841 -
0.9320 582500 0.852 -
0.9322 582600 0.7754 -
0.9323 582700 0.6775 -
0.9325 582800 0.4598 -
0.9326 582900 0.625 -
0.9328 583000 1.1821 -
0.9330 583100 0.6845 -
0.9331 583200 0.8293 -
0.9333 583300 0.7485 -
0.9334 583400 1.0008 -
0.9336 583500 0.7762 -
0.9338 583600 0.5416 -
0.9339 583700 1.2784 -
0.9341 583800 0.9202 -
0.9342 583900 0.7189 -
0.9344 584000 1.0549 -
0.9346 584100 0.9661 -
0.9347 584200 0.5341 -
0.9349 584300 1.4547 -
0.9350 584400 1.0324 -
0.9352 584500 0.8276 -
0.9354 584600 0.3868 -
0.9355 584700 1.0488 -
0.9357 584800 0.9561 -
0.9358 584900 0.9193 -
0.9360 585000 0.9144 -
0.9362 585100 0.7702 -
0.9363 585200 0.798 -
0.9365 585300 0.5793 -
0.9366 585400 0.7867 -
0.9368 585500 0.8352 -
0.9370 585600 0.6128 -
0.9371 585700 0.734 -
0.9373 585800 0.5431 -
0.9374 585900 0.8416 -
0.9376 586000 0.8711 -
0.9378 586100 0.9059 -
0.9379 586200 0.5545 -
0.9381 586300 0.9609 -
0.9382 586400 0.579 -
0.9384 586500 1.1916 -
0.9386 586600 0.6305 -
0.9387 586700 0.9855 -
0.9389 586800 0.774 -
0.9390 586900 0.6012 -
0.9392 587000 0.7495 -
0.9394 587100 0.6666 -
0.9395 587200 0.8473 -
0.9397 587300 1.0324 -
0.9398 587400 0.6129 -
0.9400 587500 0.8905 -
0.9402 587600 0.6067 -
0.9403 587700 1.0607 -
0.9405 587800 0.6369 -
0.9406 587900 0.6892 -
0.9408 588000 0.6671 -
0.9410 588100 0.7971 -
0.9411 588200 0.7133 -
0.9413 588300 0.46 -
0.9414 588400 0.9073 -
0.9416 588500 0.9276 -
0.9418 588600 1.0273 -
0.9419 588700 0.6709 -
0.9421 588800 0.4284 -
0.9422 588900 0.8745 -
0.9424 589000 0.8677 -
0.9426 589100 0.867 -
0.9427 589200 0.6087 -
0.9429 589300 0.6777 -
0.9430 589400 0.6672 -
0.9432 589500 0.9492 -
0.9434 589600 0.6848 -
0.9435 589700 0.8975 -
0.9437 589800 0.3949 -
0.9438 589900 0.7469 -
0.9440 590000 0.7412 -
0.9442 590100 0.526 -
0.9443 590200 0.4228 -
0.9445 590300 0.9338 -
0.9446 590400 0.6516 -
0.9448 590500 0.9419 -
0.9450 590600 0.755 -
0.9451 590700 0.7699 -
0.9453 590800 0.8904 -
0.9454 590900 0.5596 -
0.9456 591000 0.9401 -
0.9458 591100 0.9583 -
0.9459 591200 0.6807 -
0.9461 591300 0.6972 -
0.9462 591400 0.7217 -
0.9464 591500 0.7406 -
0.9466 591600 0.5819 -
0.9467 591700 0.8508 -
0.9469 591800 0.5315 -
0.9470 591900 0.606 -
0.9472 592000 0.7971 -
0.9474 592100 1.0728 -
0.9475 592200 0.7283 -
0.9477 592300 0.5131 -
0.9478 592400 0.4695 -
0.9480 592500 0.2959 -
0.9482 592600 0.858 -
0.9483 592700 0.5761 -
0.9485 592800 0.9089 -
0.9486 592900 0.6238 -
0.9488 593000 0.5633 -
0.9490 593100 1.0323 -
0.9491 593200 0.6684 -
0.9493 593300 0.8563 -
0.9494 593400 0.7163 -
0.9496 593500 0.7814 -
0.9498 593600 0.4761 -
0.9499 593700 0.5203 -
0.9501 593800 0.9119 -
0.9502 593900 0.8535 -
0.9504 594000 1.0054 -
0.9506 594100 0.8794 -
0.9507 594200 0.6925 -
0.9509 594300 1.0048 -
0.9510 594400 0.7008 -
0.9512 594500 0.7092 -
0.9514 594600 0.803 -
0.9515 594700 0.7868 -
0.9517 594800 0.6047 -
0.9518 594900 0.6654 -
0.9520 595000 0.7418 -
0.9522 595100 1.0645 -
0.9523 595200 0.6193 -
0.9525 595300 0.7615 -
0.9526 595400 0.8291 -
0.9528 595500 0.8298 -
0.9530 595600 0.9187 -
0.9531 595700 0.6942 -
0.9533 595800 0.912 -
0.9534 595900 1.0213 -
0.9536 596000 0.9347 -
0.9538 596100 1.1183 -
0.9539 596200 0.78 -
0.9541 596300 0.8976 -
0.9542 596400 1.0957 -
0.9544 596500 0.8133 -
0.9546 596600 0.6568 -
0.9547 596700 0.8911 -
0.9549 596800 0.5183 -
0.9550 596900 0.7212 -
0.9552 597000 0.888 -
0.9554 597100 0.7661 -
0.9555 597200 0.6028 -
0.9557 597300 1.0602 -
0.9558 597400 0.7299 -
0.9560 597500 0.9885 -
0.9562 597600 0.8964 -
0.9563 597700 0.6961 -
0.9565 597800 0.6989 -
0.9566 597900 1.1453 -
0.9568 598000 0.4009 -
0.9570 598100 0.7645 -
0.9571 598200 0.9124 -
0.9573 598300 0.7354 -
0.9574 598400 0.803 -
0.9576 598500 1.2859 -
0.9578 598600 0.9726 -
0.9579 598700 0.5849 -
0.9581 598800 1.1357 -
0.9582 598900 0.904 -
0.9584 599000 0.6113 -
0.9586 599100 1.0399 -
0.9587 599200 0.8404 -
0.9589 599300 0.945 -
0.9590 599400 0.6225 -
0.9592 599500 0.8617 -
0.9594 599600 0.8782 -
0.9595 599700 0.9332 -
0.9597 599800 0.9949 -
0.9598 599900 0.7016 -
0.9600 600000 0.5833 0.7694
0.9602 600100 0.5462 -
0.9603 600200 0.8458 -
0.9605 600300 0.8256 -
0.9606 600400 0.8134 -
0.9608 600500 0.7465 -
0.9610 600600 1.0022 -
0.9611 600700 0.7794 -
0.9613 600800 0.8742 -
0.9614 600900 0.6161 -
0.9616 601000 1.1433 -
0.9618 601100 0.6988 -
0.9619 601200 0.8715 -
0.9621 601300 0.6198 -
0.9622 601400 0.896 -
0.9624 601500 0.5527 -
0.9626 601600 1.1485 -
0.9627 601700 0.8266 -
0.9629 601800 0.6972 -
0.9630 601900 0.5653 -
0.9632 602000 0.6448 -
0.9634 602100 0.9891 -
0.9635 602200 0.8991 -
0.9637 602300 0.8615 -
0.9638 602400 0.8568 -
0.9640 602500 0.7636 -
0.9642 602600 0.714 -
0.9643 602700 0.5237 -
0.9645 602800 1.1789 -
0.9646 602900 0.5586 -
0.9648 603000 0.5008 -
0.9650 603100 0.8864 -
0.9651 603200 0.8781 -
0.9653 603300 1.0112 -
0.9654 603400 0.9674 -
0.9656 603500 0.5763 -
0.9658 603600 0.4001 -
0.9659 603700 0.69 -
0.9661 603800 0.8321 -
0.9662 603900 0.8196 -
0.9664 604000 0.7085 -
0.9666 604100 0.8921 -
0.9667 604200 0.8983 -
0.9669 604300 1.0145 -
0.9670 604400 1.1885 -
0.9672 604500 0.7833 -
0.9674 604600 1.033 -
0.9675 604700 1.0585 -
0.9677 604800 0.856 -
0.9678 604900 0.4847 -
0.9680 605000 0.7013 -
0.9682 605100 0.7934 -
0.9683 605200 1.1386 -
0.9685 605300 0.6487 -
0.9686 605400 1.0657 -
0.9688 605500 0.432 -
0.9690 605600 0.822 -
0.9691 605700 1.0284 -
0.9693 605800 0.4082 -
0.9694 605900 0.9734 -
0.9696 606000 0.733 -
0.9698 606100 0.608 -
0.9699 606200 0.9526 -
0.9701 606300 0.837 -
0.9702 606400 0.8188 -
0.9704 606500 0.9309 -
0.9706 606600 0.7929 -
0.9707 606700 0.5051 -
0.9709 606800 0.9299 -
0.9710 606900 0.8015 -
0.9712 607000 0.6867 -
0.9714 607100 1.1677 -
0.9715 607200 0.7181 -
0.9717 607300 0.9442 -
0.9718 607400 0.663 -
0.9720 607500 0.7396 -
0.9722 607600 0.8251 -
0.9723 607700 0.6575 -
0.9725 607800 0.6674 -
0.9726 607900 0.7778 -
0.9728 608000 0.6021 -
0.9730 608100 0.9309 -
0.9731 608200 0.8329 -
0.9733 608300 0.9359 -
0.9734 608400 0.7212 -
0.9736 608500 1.0956 -
0.9738 608600 0.6235 -
0.9739 608700 0.6951 -
0.9741 608800 0.7357 -
0.9742 608900 0.427 -
0.9744 609000 1.3058 -
0.9746 609100 0.6824 -
0.9747 609200 0.7743 -
0.9749 609300 0.6551 -
0.9750 609400 0.5327 -
0.9752 609500 0.7648 -
0.9754 609600 0.6966 -
0.9755 609700 0.9422 -
0.9757 609800 1.1221 -
0.9758 609900 0.8919 -
0.9760 610000 0.5507 -
0.9762 610100 0.7228 -
0.9763 610200 0.7117 -
0.9765 610300 0.5439 -
0.9766 610400 1.0969 -
0.9768 610500 0.8394 -
0.9770 610600 1.4258 -
0.9771 610700 0.7213 -
0.9773 610800 0.8785 -
0.9774 610900 0.7981 -
0.9776 611000 0.526 -
0.9778 611100 0.6145 -
0.9779 611200 0.626 -
0.9781 611300 0.6958 -
0.9782 611400 0.7504 -
0.9784 611500 0.7285 -
0.9786 611600 1.0159 -
0.9787 611700 0.5826 -
0.9789 611800 0.7113 -
0.9790 611900 1.166 -
0.9792 612000 1.1578 -
0.9794 612100 0.7783 -
0.9795 612200 0.5356 -
0.9797 612300 0.9754 -
0.9798 612400 0.6884 -
0.9800 612500 0.6951 -
0.9802 612600 0.6126 -
0.9803 612700 0.5493 -
0.9805 612800 0.6776 -
0.9806 612900 0.5393 -
0.9808 613000 0.5629 -
0.9810 613100 0.7929 -
0.9811 613200 0.8572 -
0.9813 613300 1.056 -
0.9814 613400 0.6643 -
0.9816 613500 0.6809 -
0.9818 613600 0.8654 -
0.9819 613700 0.9761 -
0.9821 613800 1.0267 -
0.9822 613900 0.6882 -
0.9824 614000 0.6095 -
0.9826 614100 0.6508 -
0.9827 614200 0.8784 -
0.9829 614300 0.6203 -
0.9830 614400 1.0917 -
0.9832 614500 0.6585 -
0.9834 614600 0.5119 -
0.9835 614700 0.9765 -
0.9837 614800 0.84 -
0.9838 614900 0.6817 -
0.9840 615000 0.8435 -
0.9842 615100 0.6928 -
0.9843 615200 0.6534 -
0.9845 615300 0.5802 -
0.9846 615400 0.8526 -
0.9848 615500 0.841 -
0.9850 615600 0.8053 -
0.9851 615700 0.631 -
0.9853 615800 0.6311 -
0.9854 615900 0.9212 -
0.9856 616000 0.6748 -
0.9858 616100 0.6688 -
0.9859 616200 0.5771 -
0.9861 616300 0.753 -
0.9862 616400 0.7481 -
0.9864 616500 0.842 -
0.9866 616600 0.7109 -
0.9867 616700 0.9474 -
0.9869 616800 0.6522 -
0.9870 616900 0.5251 -
0.9872 617000 0.6909 -
0.9874 617100 0.8574 -
0.9875 617200 0.5703 -
0.9877 617300 0.9685 -
0.9878 617400 0.8947 -
0.9880 617500 0.5895 -
0.9882 617600 1.0236 -
0.9883 617700 0.5926 -
0.9885 617800 0.7436 -
0.9886 617900 0.6056 -
0.9888 618000 0.7208 -
0.9890 618100 0.9684 -
0.9891 618200 0.6403 -
0.9893 618300 0.8872 -
0.9894 618400 0.7158 -
0.9896 618500 0.6708 -
0.9898 618600 0.8817 -
0.9899 618700 0.8722 -
0.9901 618800 0.6972 -
0.9902 618900 0.752 -
0.9904 619000 1.6841 -
0.9906 619100 1.0315 -
0.9907 619200 0.5925 -
0.9909 619300 1.2046 -
0.9910 619400 0.9529 -
0.9912 619500 0.512 -
0.9914 619600 0.9372 -
0.9915 619700 0.8461 -
0.9917 619800 1.0018 -
0.9918 619900 0.8104 -
0.9920 620000 0.9701 -
0.9922 620100 0.9382 -
0.9923 620200 0.7666 -
0.9925 620300 0.5209 -
0.9926 620400 0.5529 -
0.9928 620500 0.8119 -
0.9930 620600 0.7313 -
0.9931 620700 0.7657 -
0.9933 620800 0.7837 -
0.9934 620900 0.7026 -
0.9936 621000 0.7149 -
0.9938 621100 0.6568 -
0.9939 621200 0.7321 -
0.9941 621300 0.7595 -
0.9942 621400 0.6011 -
0.9944 621500 1.2311 -
0.9946 621600 0.4925 -
0.9947 621700 0.8688 -
0.9949 621800 0.4481 -
0.9950 621900 1.0283 -
0.9952 622000 1.2286 -
0.9954 622100 0.873 -
0.9955 622200 0.7679 -
0.9957 622300 0.8617 -
0.9958 622400 0.6354 -
0.9960 622500 0.5432 -
0.9962 622600 1.113 -
0.9963 622700 0.8108 -
0.9965 622800 0.9604 -
0.9966 622900 0.6366 -
0.9968 623000 0.7617 -
0.9970 623100 0.7081 -
0.9971 623200 0.7325 -
0.9973 623300 0.6241 -
0.9974 623400 0.4382 -
0.9976 623500 0.3651 -
0.9978 623600 0.6324 -
0.9979 623700 0.5758 -
0.9981 623800 0.7779 -
0.9982 623900 0.7489 -
0.9984 624000 0.6391 -
0.9986 624100 0.673 -
0.9987 624200 0.7025 -
0.9989 624300 0.871 -
0.9990 624400 1.0238 -
0.9992 624500 0.5088 -
0.9994 624600 0.7578 -
0.9995 624700 0.8879 -
0.9997 624800 1.1698 -
0.9998 624900 1.0531 -
1.0000 625000 0.838 -

Framework Versions

  • Python: 3.8.10
  • Sentence Transformers: 3.1.1
  • Transformers: 4.45.2
  • PyTorch: 2.4.1+cu118
  • Accelerate: 1.0.1
  • Datasets: 3.0.1
  • Tokenizers: 0.20.3

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}

CoSENTLoss

@online{kexuefm-8847,
    title={CoSENT: A more efficient sentence vector scheme than Sentence-BERT},
    author={Su Jianlin},
    year={2022},
    month={Jan},
    url={https://kexue.fm/archives/8847},
}