---
language:
  - es
license: apache-2.0
tags:
  - sentence-transformers
  - sentence-similarity
  - feature-extraction
  - generated_from_trainer
  - dataset_size:14907
  - loss:MatryoshkaLoss
  - loss:MultipleNegativesRankingLoss
base_model: jinaai/jina-embeddings-v3
widget:
  - source_sentence: >-
      ¿Qué característica especial tenía la escultura del 'Torico' creada por
      Pedro Blesa?
    sentences:
      - >
        Después de dorar el conejo en la receta de Conejo escabechado, en la
        misma sartén se rehogan los ajos, con el laurel y la pimienta.
      - >-
        Rafael Barcelón se encargaba del servicio de electricidad en
        Valdeconejos en 1951.
      - >-
        La escultura del 'Torico' creada por Pedro Blesa era un anaglifo,
        visible en 3D con gafas especiales.
  - source_sentence: >-
      ¿Por qué cantidad adquirió Francisco Santacruz la mina Escuadra en la
      subasta pública?
    sentences:
      - >-
        Después de la temporada 1986-87, el equipo descendió, lo que provocó su
        desaparición del campeonato en la temporada 1987-88.
      - '''Al bies'' significa en diagonal.'
      - >-
        Francisco Santacruz adquirió la mina Escuadra por la cantidad de 931
        pesetas.
  - source_sentence: >-
      ¿Quién se desempeñaba como fiscal en el ayuntamiento de Escucha en el año
      1916?
    sentences:
      - El autor mencionado para la receta Sopas de ajo es Teo Martin Lafuente.
      - >-
        En Escucha en 1916, D. Joaquín Latorre del Río se desempeñaba como
        fiscal.
      - Felipe Mallén era el farmacéutico en Valdeconejos en 1928.
  - source_sentence: >-
      ¿Qué información transmiten los 'toques' en la caña de un pozo durante las
      operaciones mineras?
    sentences:
      - >-
        Juan Pedro Martín encontró fragmentos de carbón de piedra en el paraje
        de El Horcajo.
      - Se publicó en 1970 por Ediciones Cultura y Acción. CNT.
      - >-
        Los 'toques' son señales que se hacen en la caña del pozo para las
        distintas operaciones 1: alto 2: arriba 3: abajo 1+2: despacio arriba
        1+3: despacio abajo 4+2: personal arriba 4+3: personal abajo 4+1+2:
        señalista en jaula arriba 4+1+3: señalista en jaula abajo 5: jaula libre
        6: maniobra
  - source_sentence: ¿En qué año se demarcó y reconoció la mina 'El Pilar'?
    sentences:
      - >-
        Según la quinta demanda del SOMM, todas compañías mineras debían
        entregar a todos sus obreros un libramiento de liquidación mensual
      - '''Tontiar'' significa cuando dos jóvenes empiezan con un noviazgo.'
      - La mina 'El Pilar' se demarcó y reconoció en 1857.
pipeline_tag: sentence-similarity
library_name: sentence-transformers
metrics:
  - cosine_accuracy@1
  - cosine_accuracy@3
  - cosine_accuracy@5
  - cosine_accuracy@10
  - cosine_precision@1
  - cosine_precision@3
  - cosine_precision@5
  - cosine_precision@10
  - cosine_recall@1
  - cosine_recall@3
  - cosine_recall@5
  - cosine_recall@10
  - cosine_ndcg@10
  - cosine_mrr@10
  - cosine_map@100
model-index:
  - name: Lampistero
    results:
      - task:
          type: information-retrieval
          name: Information Retrieval
        dataset:
          name: dim 1024
          type: dim_1024
        metrics:
          - type: cosine_accuracy@1
            value: 0.7700663850331925
            name: Cosine Accuracy@1
          - type: cosine_accuracy@3
            value: 0.8925769462884732
            name: Cosine Accuracy@3
          - type: cosine_accuracy@5
            value: 0.9155099577549789
            name: Cosine Accuracy@5
          - type: cosine_accuracy@10
            value: 0.9330114665057333
            name: Cosine Accuracy@10
          - type: cosine_precision@1
            value: 0.7700663850331925
            name: Cosine Precision@1
          - type: cosine_precision@3
            value: 0.2975256487628244
            name: Cosine Precision@3
          - type: cosine_precision@5
            value: 0.18310199155099577
            name: Cosine Precision@5
          - type: cosine_precision@10
            value: 0.09330114665057333
            name: Cosine Precision@10
          - type: cosine_recall@1
            value: 0.7700663850331925
            name: Cosine Recall@1
          - type: cosine_recall@3
            value: 0.8925769462884732
            name: Cosine Recall@3
          - type: cosine_recall@5
            value: 0.9155099577549789
            name: Cosine Recall@5
          - type: cosine_recall@10
            value: 0.9330114665057333
            name: Cosine Recall@10
          - type: cosine_ndcg@10
            value: 0.8578914781807897
            name: Cosine Ndcg@10
          - type: cosine_mrr@10
            value: 0.8330619976817926
            name: Cosine Mrr@10
          - type: cosine_map@100
            value: 0.8343424106284848
            name: Cosine Map@100
      - task:
          type: information-retrieval
          name: Information Retrieval
        dataset:
          name: dim 768
          type: dim_768
        metrics:
          - type: cosine_accuracy@1
            value: 0.7694628847314424
            name: Cosine Accuracy@1
          - type: cosine_accuracy@3
            value: 0.8889559444779722
            name: Cosine Accuracy@3
          - type: cosine_accuracy@5
            value: 0.9124924562462281
            name: Cosine Accuracy@5
          - type: cosine_accuracy@10
            value: 0.9330114665057333
            name: Cosine Accuracy@10
          - type: cosine_precision@1
            value: 0.7694628847314424
            name: Cosine Precision@1
          - type: cosine_precision@3
            value: 0.29631864815932407
            name: Cosine Precision@3
          - type: cosine_precision@5
            value: 0.1824984912492456
            name: Cosine Precision@5
          - type: cosine_precision@10
            value: 0.09330114665057332
            name: Cosine Precision@10
          - type: cosine_recall@1
            value: 0.7694628847314424
            name: Cosine Recall@1
          - type: cosine_recall@3
            value: 0.8889559444779722
            name: Cosine Recall@3
          - type: cosine_recall@5
            value: 0.9124924562462281
            name: Cosine Recall@5
          - type: cosine_recall@10
            value: 0.9330114665057333
            name: Cosine Recall@10
          - type: cosine_ndcg@10
            value: 0.8571049923900239
            name: Cosine Ndcg@10
          - type: cosine_mrr@10
            value: 0.8320899311243306
            name: Cosine Mrr@10
          - type: cosine_map@100
            value: 0.8333457816447034
            name: Cosine Map@100
      - task:
          type: information-retrieval
          name: Information Retrieval
        dataset:
          name: dim 512
          type: dim_512
        metrics:
          - type: cosine_accuracy@1
            value: 0.7682558841279421
            name: Cosine Accuracy@1
          - type: cosine_accuracy@3
            value: 0.8865419432709717
            name: Cosine Accuracy@3
          - type: cosine_accuracy@5
            value: 0.9112854556427278
            name: Cosine Accuracy@5
          - type: cosine_accuracy@10
            value: 0.9305974652987327
            name: Cosine Accuracy@10
          - type: cosine_precision@1
            value: 0.7682558841279421
            name: Cosine Precision@1
          - type: cosine_precision@3
            value: 0.2955139810903239
            name: Cosine Precision@3
          - type: cosine_precision@5
            value: 0.18225709112854557
            name: Cosine Precision@5
          - type: cosine_precision@10
            value: 0.09305974652987326
            name: Cosine Precision@10
          - type: cosine_recall@1
            value: 0.7682558841279421
            name: Cosine Recall@1
          - type: cosine_recall@3
            value: 0.8865419432709717
            name: Cosine Recall@3
          - type: cosine_recall@5
            value: 0.9112854556427278
            name: Cosine Recall@5
          - type: cosine_recall@10
            value: 0.9305974652987327
            name: Cosine Recall@10
          - type: cosine_ndcg@10
            value: 0.8555277012951626
            name: Cosine Ndcg@10
          - type: cosine_mrr@10
            value: 0.8307227155597702
            name: Cosine Mrr@10
          - type: cosine_map@100
            value: 0.8321030396467847
            name: Cosine Map@100
      - task:
          type: information-retrieval
          name: Information Retrieval
        dataset:
          name: dim 256
          type: dim_256
        metrics:
          - type: cosine_accuracy@1
            value: 0.764031382015691
            name: Cosine Accuracy@1
          - type: cosine_accuracy@3
            value: 0.8901629450814725
            name: Cosine Accuracy@3
          - type: cosine_accuracy@5
            value: 0.9082679541339771
            name: Cosine Accuracy@5
          - type: cosine_accuracy@10
            value: 0.9299939649969825
            name: Cosine Accuracy@10
          - type: cosine_precision@1
            value: 0.764031382015691
            name: Cosine Precision@1
          - type: cosine_precision@3
            value: 0.2967209816938242
            name: Cosine Precision@3
          - type: cosine_precision@5
            value: 0.1816535908267954
            name: Cosine Precision@5
          - type: cosine_precision@10
            value: 0.09299939649969825
            name: Cosine Precision@10
          - type: cosine_recall@1
            value: 0.764031382015691
            name: Cosine Recall@1
          - type: cosine_recall@3
            value: 0.8901629450814725
            name: Cosine Recall@3
          - type: cosine_recall@5
            value: 0.9082679541339771
            name: Cosine Recall@5
          - type: cosine_recall@10
            value: 0.9299939649969825
            name: Cosine Recall@10
          - type: cosine_ndcg@10
            value: 0.8535167149096011
            name: Cosine Ndcg@10
          - type: cosine_mrr@10
            value: 0.8282907530342651
            name: Cosine Mrr@10
          - type: cosine_map@100
            value: 0.8296119986031772
            name: Cosine Map@100
      - task:
          type: information-retrieval
          name: Information Retrieval
        dataset:
          name: dim 128
          type: dim_128
        metrics:
          - type: cosine_accuracy@1
            value: 0.7447193723596862
            name: Cosine Accuracy@1
          - type: cosine_accuracy@3
            value: 0.8768859384429692
            name: Cosine Accuracy@3
          - type: cosine_accuracy@5
            value: 0.9028364514182257
            name: Cosine Accuracy@5
          - type: cosine_accuracy@10
            value: 0.9215449607724804
            name: Cosine Accuracy@10
          - type: cosine_precision@1
            value: 0.7447193723596862
            name: Cosine Precision@1
          - type: cosine_precision@3
            value: 0.2922953128143231
            name: Cosine Precision@3
          - type: cosine_precision@5
            value: 0.1805672902836451
            name: Cosine Precision@5
          - type: cosine_precision@10
            value: 0.09215449607724803
            name: Cosine Precision@10
          - type: cosine_recall@1
            value: 0.7447193723596862
            name: Cosine Recall@1
          - type: cosine_recall@3
            value: 0.8768859384429692
            name: Cosine Recall@3
          - type: cosine_recall@5
            value: 0.9028364514182257
            name: Cosine Recall@5
          - type: cosine_recall@10
            value: 0.9215449607724804
            name: Cosine Recall@10
          - type: cosine_ndcg@10
            value: 0.8402664516336745
            name: Cosine Ndcg@10
          - type: cosine_mrr@10
            value: 0.8133905221714518
            name: Cosine Mrr@10
          - type: cosine_map@100
            value: 0.8148588407289652
            name: Cosine Map@100
      - task:
          type: information-retrieval
          name: Information Retrieval
        dataset:
          name: dim 64
          type: dim_64
        metrics:
          - type: cosine_accuracy@1
            value: 0.7103198551599276
            name: Cosine Accuracy@1
          - type: cosine_accuracy@3
            value: 0.8491249245624622
            name: Cosine Accuracy@3
          - type: cosine_accuracy@5
            value: 0.8780929390464696
            name: Cosine Accuracy@5
          - type: cosine_accuracy@10
            value: 0.899818949909475
            name: Cosine Accuracy@10
          - type: cosine_precision@1
            value: 0.7103198551599276
            name: Cosine Precision@1
          - type: cosine_precision@3
            value: 0.2830416415208208
            name: Cosine Precision@3
          - type: cosine_precision@5
            value: 0.1756185878092939
            name: Cosine Precision@5
          - type: cosine_precision@10
            value: 0.08998189499094747
            name: Cosine Precision@10
          - type: cosine_recall@1
            value: 0.7103198551599276
            name: Cosine Recall@1
          - type: cosine_recall@3
            value: 0.8491249245624622
            name: Cosine Recall@3
          - type: cosine_recall@5
            value: 0.8780929390464696
            name: Cosine Recall@5
          - type: cosine_recall@10
            value: 0.899818949909475
            name: Cosine Recall@10
          - type: cosine_ndcg@10
            value: 0.8119294706592789
            name: Cosine Ndcg@10
          - type: cosine_mrr@10
            value: 0.7829293234091058
            name: Cosine Mrr@10
          - type: cosine_map@100
            value: 0.7850878407159746
            name: Cosine Map@100
---

Lampistero

This is a sentence-transformers model finetuned from jinaai/jina-embeddings-v3 on the json dataset. It maps sentences & paragraphs to a 1024-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

Model Details

Model Description

  • Model Type: Sentence Transformer
  • Base model: jinaai/jina-embeddings-v3
  • Maximum Sequence Length: 8194 tokens
  • Output Dimensionality: 1024 dimensions
  • Similarity Function: Cosine Similarity
  • Training Dataset:
    • json
  • Language: es
  • License: apache-2.0

Model Sources

  • Documentation: https://sbert.net
  • Repository: https://github.com/UKPLab/sentence-transformers
  • Hugging Face: https://huggingface.co/models?library=sentence-transformers

Full Model Architecture

SentenceTransformer(
  (transformer): Transformer(
    (auto_model): XLMRobertaLoRA(
      (roberta): XLMRobertaModel(
        (embeddings): XLMRobertaEmbeddings(
          (word_embeddings): ParametrizedEmbedding(
            250002, 1024, padding_idx=1
            (parametrizations): ModuleDict(
              (weight): ParametrizationList(
                (0): LoRAParametrization()
              )
            )
          )
          (token_type_embeddings): ParametrizedEmbedding(
            1, 1024
            (parametrizations): ModuleDict(
              (weight): ParametrizationList(
                (0): LoRAParametrization()
              )
            )
          )
        )
        (emb_drop): Dropout(p=0.1, inplace=False)
        (emb_ln): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
        (encoder): XLMRobertaEncoder(
          (layers): ModuleList(
            (0-23): 24 x Block(
              (mixer): MHA(
                (rotary_emb): RotaryEmbedding()
                (Wqkv): ParametrizedLinearResidual(
                  in_features=1024, out_features=3072, bias=True
                  (parametrizations): ModuleDict(
                    (weight): ParametrizationList(
                      (0): LoRAParametrization()
                    )
                  )
                )
                (inner_attn): FlashSelfAttention(
                  (drop): Dropout(p=0.1, inplace=False)
                )
                (inner_cross_attn): FlashCrossAttention(
                  (drop): Dropout(p=0.1, inplace=False)
                )
                (out_proj): ParametrizedLinear(
                  in_features=1024, out_features=1024, bias=True
                  (parametrizations): ModuleDict(
                    (weight): ParametrizationList(
                      (0): LoRAParametrization()
                    )
                  )
                )
              )
              (dropout1): Dropout(p=0.1, inplace=False)
              (drop_path1): StochasticDepth(p=0.0, mode=row)
              (norm1): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
              (mlp): Mlp(
                (fc1): ParametrizedLinear(
                  in_features=1024, out_features=4096, bias=True
                  (parametrizations): ModuleDict(
                    (weight): ParametrizationList(
                      (0): LoRAParametrization()
                    )
                  )
                )
                (fc2): ParametrizedLinear(
                  in_features=4096, out_features=1024, bias=True
                  (parametrizations): ModuleDict(
                    (weight): ParametrizationList(
                      (0): LoRAParametrization()
                    )
                  )
                )
              )
              (dropout2): Dropout(p=0.1, inplace=False)
              (drop_path2): StochasticDepth(p=0.0, mode=row)
              (norm2): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
            )
          )
        )
        (pooler): XLMRobertaPooler(
          (dense): ParametrizedLinear(
            in_features=1024, out_features=1024, bias=True
            (parametrizations): ModuleDict(
              (weight): ParametrizationList(
                (0): LoRAParametrization()
              )
            )
          )
          (activation): Tanh()
        )
      )
    )
  )
  (pooler): Pooling({'word_embedding_dimension': 1024, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (normalizer): Normalize()
)

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub; trust_remote_code is needed for the custom
# jina-embeddings-v3 modeling code
model = SentenceTransformer("csanz91/lampistero_rag_embeddings_2", trust_remote_code=True)
# Run inference
sentences = [
    "¿En qué año se demarcó y reconoció la mina 'El Pilar'?",
    "La mina 'El Pilar' se demarcó y reconoció en 1857.",
    'Según la quinta demanda del SOMM, todas compañías mineras debían entregar a todos sus obreros un libramiento de liquidación mensual',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 1024]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
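
Because the model was trained with MatryoshkaLoss, the 1024-dimensional embeddings can be truncated to any of the trained sizes (1024, 768, 512, 256, 128, 64) with only a modest quality drop (see Evaluation below). A minimal sketch, reusing the repo id above:

from sentence_transformers import SentenceTransformer

# truncate_dim keeps only the leading components of each embedding;
# 256 is one of the Matryoshka dimensions this model was trained with.
model = SentenceTransformer(
    "csanz91/lampistero_rag_embeddings_2",
    truncate_dim=256,
    trust_remote_code=True,
)
embeddings = model.encode(["¿En qué año se demarcó y reconoció la mina 'El Pilar'?"])
print(embeddings.shape)
# (1, 256)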

Evaluation

Metrics

Information Retrieval (dataset: dim_1024)

Metric Value
cosine_accuracy@1 0.7701
cosine_accuracy@3 0.8926
cosine_accuracy@5 0.9155
cosine_accuracy@10 0.933
cosine_precision@1 0.7701
cosine_precision@3 0.2975
cosine_precision@5 0.1831
cosine_precision@10 0.0933
cosine_recall@1 0.7701
cosine_recall@3 0.8926
cosine_recall@5 0.9155
cosine_recall@10 0.933
cosine_ndcg@10 0.8579
cosine_mrr@10 0.8331
cosine_map@100 0.8343

Information Retrieval (dataset: dim_768)

Metric Value
cosine_accuracy@1 0.7695
cosine_accuracy@3 0.889
cosine_accuracy@5 0.9125
cosine_accuracy@10 0.933
cosine_precision@1 0.7695
cosine_precision@3 0.2963
cosine_precision@5 0.1825
cosine_precision@10 0.0933
cosine_recall@1 0.7695
cosine_recall@3 0.889
cosine_recall@5 0.9125
cosine_recall@10 0.933
cosine_ndcg@10 0.8571
cosine_mrr@10 0.8321
cosine_map@100 0.8333

Information Retrieval (dataset: dim_512)

Metric Value
cosine_accuracy@1 0.7683
cosine_accuracy@3 0.8865
cosine_accuracy@5 0.9113
cosine_accuracy@10 0.9306
cosine_precision@1 0.7683
cosine_precision@3 0.2955
cosine_precision@5 0.1823
cosine_precision@10 0.0931
cosine_recall@1 0.7683
cosine_recall@3 0.8865
cosine_recall@5 0.9113
cosine_recall@10 0.9306
cosine_ndcg@10 0.8555
cosine_mrr@10 0.8307
cosine_map@100 0.8321

Information Retrieval (dataset: dim_256)

Metric Value
cosine_accuracy@1 0.764
cosine_accuracy@3 0.8902
cosine_accuracy@5 0.9083
cosine_accuracy@10 0.93
cosine_precision@1 0.764
cosine_precision@3 0.2967
cosine_precision@5 0.1817
cosine_precision@10 0.093
cosine_recall@1 0.764
cosine_recall@3 0.8902
cosine_recall@5 0.9083
cosine_recall@10 0.93
cosine_ndcg@10 0.8535
cosine_mrr@10 0.8283
cosine_map@100 0.8296

Information Retrieval (dataset: dim_128)

Metric Value
cosine_accuracy@1 0.7447
cosine_accuracy@3 0.8769
cosine_accuracy@5 0.9028
cosine_accuracy@10 0.9215
cosine_precision@1 0.7447
cosine_precision@3 0.2923
cosine_precision@5 0.1806
cosine_precision@10 0.0922
cosine_recall@1 0.7447
cosine_recall@3 0.8769
cosine_recall@5 0.9028
cosine_recall@10 0.9215
cosine_ndcg@10 0.8403
cosine_mrr@10 0.8134
cosine_map@100 0.8149

Information Retrieval (dataset: dim_64)

Metric Value
cosine_accuracy@1 0.7103
cosine_accuracy@3 0.8491
cosine_accuracy@5 0.8781
cosine_accuracy@10 0.8998
cosine_precision@1 0.7103
cosine_precision@3 0.283
cosine_precision@5 0.1756
cosine_precision@10 0.09
cosine_recall@1 0.7103
cosine_recall@3 0.8491
cosine_recall@5 0.8781
cosine_recall@10 0.8998
cosine_ndcg@10 0.8119
cosine_mrr@10 0.7829
cosine_map@100 0.7851
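
These six tables correspond to the Matryoshka dimensions (1024 down to 64) and match the output format of sentence-transformers' InformationRetrievalEvaluator. A minimal sketch of reproducing one table; the queries, corpus, and relevant_docs mappings are illustrative placeholders, not the actual evaluation split:

from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import InformationRetrievalEvaluator

model = SentenceTransformer("csanz91/lampistero_rag_embeddings_2", trust_remote_code=True)

# Map query ids to queries, doc ids to passages, and each query id to the
# set of relevant doc ids.
queries = {"q1": "¿En qué año se demarcó y reconoció la mina 'El Pilar'?"}
corpus = {"d1": "La mina 'El Pilar' se demarcó y reconoció en 1857."}
relevant_docs = {"q1": {"d1"}}

evaluator = InformationRetrievalEvaluator(queries, corpus, relevant_docs, name="dim_1024")
results = evaluator(model)
print(results["dim_1024_cosine_ndcg@10"])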

Training Details

Training Dataset

json

  • Dataset: json
  • Size: 14,907 training samples
  • Columns: query and answer
  • Approximate statistics based on the first 1000 samples:
    • query: string, min 9 / mean 26.09 / max 66 tokens
    • answer: string, min 4 / mean 34.02 / max 405 tokens
  • Samples:
    • query: ¿Qué tipos de palas se utilizan para cargar el carbón y el mineral?
      answer: Se utiliza una pala convencional y una pala hidráulica, esta última descarga sobre un páncer, puede hacerlo lateralmente y se desplaza sobre ruedas u oruga.
    • query: Tras el cierre de la tejería de Florencio Salvador, ¿de dónde procedieron finalmente los ladrillos para las doscientas diez viviendas construidas en Utrillas?
      answer: Los ladrillos y material para las doscientas diez viviendas construidas en Utrillas procedieron finalmente de Letux, Zaragoza.
    • query: ¿Cuál es el formato de los juegos infantiles que se están preparando para el verano en Escucha en 2021?
      answer: Los juegos infantiles que se están preparando para el verano en Escucha en 2021 están en formato revista.
  • Loss: MatryoshkaLoss with these parameters:
    {
        "loss": "MultipleNegativesRankingLoss",
        "matryoshka_dims": [
            1024,
            768,
            512,
            256,
            128,
            64
        ],
        "matryoshka_weights": [
            1,
            1,
            1,
            1,
            1,
            1
        ],
        "n_dims_per_step": -1
    }
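
This configuration nests MultipleNegativesRankingLoss (in-batch negatives) inside MatryoshkaLoss, so the ranking loss is applied at every listed embedding size with equal weight. A minimal sketch of the equivalent construction, assuming the base model above:

from sentence_transformers import SentenceTransformer
from sentence_transformers.losses import MatryoshkaLoss, MultipleNegativesRankingLoss

model = SentenceTransformer("jinaai/jina-embeddings-v3", trust_remote_code=True)

# The inner loss treats the answers paired with other queries in the batch
# as negatives for each query.
inner_loss = MultipleNegativesRankingLoss(model)
# The wrapper recomputes that loss on embeddings truncated to each dimension.
loss = MatryoshkaLoss(
    model,
    inner_loss,
    matryoshka_dims=[1024, 768, 512, 256, 128, 64],
)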
    

Training Hyperparameters

Non-Default Hyperparameters

  • eval_strategy: epoch
  • per_device_train_batch_size: 64
  • per_device_eval_batch_size: 16
  • gradient_accumulation_steps: 32
  • learning_rate: 2e-05
  • num_train_epochs: 8
  • lr_scheduler_type: cosine
  • warmup_ratio: 0.1
  • tf32: True
  • load_best_model_at_end: True
  • optim: adamw_torch_fused
  • batch_sampler: no_duplicates
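
A minimal sketch of these settings expressed as sentence-transformers training arguments; output_dir and save_strategy are assumptions not listed above:

from sentence_transformers import SentenceTransformerTrainingArguments
from sentence_transformers.training_args import BatchSamplers

args = SentenceTransformerTrainingArguments(
    output_dir="lampistero_rag_embeddings_2",  # placeholder output path
    eval_strategy="epoch",
    save_strategy="epoch",  # assumed: must match eval_strategy for load_best_model_at_end
    per_device_train_batch_size=64,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=32,
    learning_rate=2e-5,
    num_train_epochs=8,
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,
    tf32=True,
    load_best_model_at_end=True,
    optim="adamw_torch_fused",
    batch_sampler=BatchSamplers.NO_DUPLICATES,
)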

All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: epoch
  • prediction_loss_only: True
  • per_device_train_batch_size: 64
  • per_device_eval_batch_size: 16
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 32
  • eval_accumulation_steps: None
  • torch_empty_cache_steps: None
  • learning_rate: 2e-05
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1.0
  • num_train_epochs: 8
  • max_steps: -1
  • lr_scheduler_type: cosine
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.1
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: False
  • fp16: False
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: True
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: True
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • tp_size: 0
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch_fused
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: None
  • hub_always_push: False
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • include_for_metrics: []
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • eval_on_start: False
  • use_liger_kernel: False
  • eval_use_gather_object: False
  • average_tokens_across_devices: False
  • prompts: None
  • batch_sampler: no_duplicates
  • multi_dataset_batch_sampler: proportional

Training Logs

Epoch Step Training Loss dim_1024_cosine_ndcg@10 dim_768_cosine_ndcg@10 dim_512_cosine_ndcg@10 dim_256_cosine_ndcg@10 dim_128_cosine_ndcg@10 dim_64_cosine_ndcg@10
1.0 8 - 0.7841 0.7835 0.7836 0.7791 0.7665 0.7226
1.2747 10 58.1187 - - - - - -
2.0 16 - 0.8348 0.8366 0.8345 0.8301 0.8184 0.7861
2.5494 20 24.4181 - - - - - -
3.0 24 - 0.8521 0.8504 0.8503 0.8457 0.8319 0.8007
3.8240 30 16.1488 - - - - - -
4.0 32 - 0.8561 0.8548 0.8555 0.8509 0.8387 0.8073
5.0 40 13.4897 0.8585 0.8556 0.8545 0.8528 0.8397 0.8111
6.0 48 - 0.8578 0.8563 0.8550 0.8535 0.8410 0.8110
6.2747 50 13.7469 - - - - - -
7.0 56 - 0.8579 0.8571 0.8555 0.8535 0.8403 0.8119

Framework Versions

  • Python: 3.12.10
  • Sentence Transformers: 4.1.0
  • Transformers: 4.51.3
  • PyTorch: 2.7.0+cu126
  • Accelerate: 1.7.0
  • Datasets: 3.6.0
  • Tokenizers: 0.21.1

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}

MatryoshkaLoss

@misc{kusupati2024matryoshka,
    title={Matryoshka Representation Learning},
    author={Aditya Kusupati and Gantavya Bhatt and Aniket Rege and Matthew Wallingford and Aditya Sinha and Vivek Ramanujan and William Howard-Snyder and Kaifeng Chen and Sham Kakade and Prateek Jain and Ali Farhadi},
    year={2024},
    eprint={2205.13147},
    archivePrefix={arXiv},
    primaryClass={cs.LG}
}

MultipleNegativesRankingLoss

@misc{henderson2017efficient,
    title={Efficient Natural Language Response Suggestion for Smart Reply},
    author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
    year={2017},
    eprint={1705.00652},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}