---
base_model: bobox/DeBERTa-ST-AllLayers-v3-checkpoints-tmp
datasets:
- sentence-transformers/all-nli
- tals/vitaminc
- nyu-mll/glue
- allenai/scitail
- sentence-transformers/xsum
- sentence-transformers/sentence-compression
- allenai/sciq
- allenai/qasc
- sentence-transformers/msmarco-msmarco-distilbert-base-v3
- sentence-transformers/natural-questions
- sentence-transformers/trivia-qa
- sentence-transformers/quora-duplicates
- sentence-transformers/gooaq
language:
- en
library_name: sentence-transformers
metrics:
- pearson_cosine
- spearman_cosine
- pearson_manhattan
- spearman_manhattan
- pearson_euclidean
- spearman_euclidean
- pearson_dot
- spearman_dot
- pearson_max
- spearman_max
- cosine_accuracy
- cosine_accuracy_threshold
- cosine_f1
- cosine_f1_threshold
- cosine_precision
- cosine_recall
- cosine_ap
- dot_accuracy
- dot_accuracy_threshold
- dot_f1
- dot_f1_threshold
- dot_precision
- dot_recall
- dot_ap
- manhattan_accuracy
- manhattan_accuracy_threshold
- manhattan_f1
- manhattan_f1_threshold
- manhattan_precision
- manhattan_recall
- manhattan_ap
- euclidean_accuracy
- euclidean_accuracy_threshold
- euclidean_f1
- euclidean_f1_threshold
- euclidean_precision
- euclidean_recall
- euclidean_ap
- max_accuracy
- max_accuracy_threshold
- max_f1
- max_f1_threshold
- max_precision
- max_recall
- max_ap
pipeline_tag: sentence-similarity
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- generated_from_trainer
- dataset_size:165061
- loss:AdaptiveLayerLoss
- loss:GISTEmbedLoss
- loss:OnlineContrastiveLoss
- loss:MultipleNegativesSymmetricRankingLoss
- loss:MultipleNegativesRankingLoss
widget:
- source_sentence: how to move money from one ira account to another?
  sentences:
  - Apart from getting EPF withdrawal form from your employer, you can download it
    from the EPFO portal. If your Universal Account Number (UAN) is linked to your
    Aadhar then you can use the Aadhar-based withdrawal form. Otherwise, you can apply
    for the withdrawal through UAN's portal.
  - PJ Tucker is the worst starter on the Rockets, mainly because he is used as a
    minutes eater. He has the third-worst player efficiency rating on the team. He
    just doesn't compare in value to the other Houston starters.
  - Sign the paperwork and submit it to the new bank or brokerage. Contact your original
    trustee to request a direct transfer of your IRA funds to the IRA at the new institution.
    If you want to maintain your current investments, your trustee may be able to
    make an in-kind transfer, leaving your investments in place.
- source_sentence: The number of COVID-19 cases was more than 241,500 worldwide by
    March 20 , 2020 .
  sentences:
  - More than 5,500 people have died from the disease and over 72,000 have recovered
    .
  - more than 6,500 deaths have been attributed to COVID-19 .
  - As of 20 March , more than 242,000 cases of COVID-19 have been reported in over
    170 countries and territories , resulting in more than 9,900 deaths and 87,000
    recoveries.  On 13 March , the WHO announced that Europe had become the new epicentre
    of the pandemic .
- source_sentence: 'Reese Witherspoon says she''s ``deeply embarrassed'''' by her
    arrest, Kim Kardashian thinks she would get married again and Farrah Abraham brings
    her dad and 3-year-old daughter to her sex tape negotiation:'
  sentences:
  - Scouts sell $3.5 million in popcorn
  - 'Reese Witherspoon is ``deeply embarrassed'''' by arrest, Kim Kardashian thinks
    she would get married again:'
  - Horse diagnosed with disease
- source_sentence: who played snow queen in once upon a time
  sentences:
  - Terracotta Army The figures, dating from approximately the late third century
    BCE,[1] were discovered in 1974 by local farmers in Lintong District, Xi'an, Shaanxi
    province. The figures vary in height according to their roles, with the tallest
    being the generals. The figures include warriors, chariots and horses. Estimates
    from 2007 were that the three pits containing the Terracotta Army held more than
    8,000 soldiers, 130 chariots with 520 horses and 150 cavalry horses, the majority
    of which remained buried in the pits nearby Qin Shi Huang's mausoleum.[2] Other
    terracotta non-military figures were found in other pits, including officials,
    acrobats, strongmen and musicians.
  - 'Elizabeth Mitchell Elizabeth Mitchell (born Elizabeth Joanna Robertson: March
    27, 1970) is an American actress known for her role as Dr. Juliet Burke on the
    ABC series Lost.[1] She also had lead roles on the TV series V and Revolution,
    as well as the Snow Queen on Once Upon a Time and as Deb Carpenter on Dead of
    Summer. Mitchell has starred in such films as The Santa Clause 2 & 3: The Escape
    Clause, Gia and The Purge: Election Year.'
  - First Amendment to the United States Constitution The First Amendment (Amendment
    I) to the United States Constitution prevents Congress from making any law respecting
    an establishment of religion, prohibiting the free exercise of religion, or abridging
    the freedom of speech, the freedom of the press, the right to peaceably assemble,
    or to petition for a governmental redress of grievances. It was adopted on December
    15, 1791, as one of the ten amendments that constitute the Bill of Rights.
- source_sentence: what are light bulbs powered by?
  sentences:
  - an incandescent light bulb converts electricity into light by sending electricity
    through a filament. Incandescence is light from heat energy.. light bulbs are
    powered by heat
  - Viruses infect and live inside the cells of living organisms.. Life is a living
    organism.. viruses infect and live inside the cells of life.
  - Pressure receptors are found mainly in the skin.. Pacinian corpuscles are pressure
    receptors.. Pacinian corpuscles are found mainly in the skin.
model-index:
- name: SentenceTransformer based on bobox/DeBERTa-ST-AllLayers-v3-checkpoints-tmp
  results:
  - task:
      type: semantic-similarity
      name: Semantic Similarity
    dataset:
      name: StS test
      type: StS-test
    metrics:
    - type: pearson_cosine
      value: 0.8821101738384596
      name: Pearson Cosine
    - type: spearman_cosine
      value: 0.8943047751560564
      name: Spearman Cosine
    - type: pearson_manhattan
      value: 0.8704995590187196
      name: Pearson Manhattan
    - type: spearman_manhattan
      value: 0.8737012027236009
      name: Spearman Manhattan
    - type: pearson_euclidean
      value: 0.8697205121607111
      name: Pearson Euclidean
    - type: spearman_euclidean
      value: 0.871583089708652
      name: Spearman Euclidean
    - type: pearson_dot
      value: 0.8032893366124795
      name: Pearson Dot
    - type: spearman_dot
      value: 0.8087424893555902
      name: Spearman Dot
    - type: pearson_max
      value: 0.8821101738384596
      name: Pearson Max
    - type: spearman_max
      value: 0.8943047751560564
      name: Spearman Max
  - task:
      type: binary-classification
      name: Binary Classification
    dataset:
      name: mrpc test
      type: mrpc-test
    metrics:
    - type: cosine_accuracy
      value: 0.7473684210526316
      name: Cosine Accuracy
    - type: cosine_accuracy_threshold
      value: 0.7145693302154541
      name: Cosine Accuracy Threshold
    - type: cosine_f1
      value: 0.8327645051194539
      name: Cosine F1
    - type: cosine_f1_threshold
      value: 0.6522408723831177
      name: Cosine F1 Threshold
    - type: cosine_precision
      value: 0.7218934911242604
      name: Cosine Precision
    - type: cosine_recall
      value: 0.9838709677419355
      name: Cosine Recall
    - type: cosine_ap
      value: 0.8563235829800693
      name: Cosine Ap
    - type: dot_accuracy
      value: 0.7026315789473684
      name: Dot Accuracy
    - type: dot_accuracy_threshold
      value: 14.454626083374023
      name: Dot Accuracy Threshold
    - type: dot_f1
      value: 0.8054607508532423
      name: Dot F1
    - type: dot_f1_threshold
      value: 13.752894401550293
      name: Dot F1 Threshold
    - type: dot_precision
      value: 0.6982248520710059
      name: Dot Precision
    - type: dot_recall
      value: 0.9516129032258065
      name: Dot Recall
    - type: dot_ap
      value: 0.796363256728503
      name: Dot Ap
    - type: manhattan_accuracy
      value: 0.7289473684210527
      name: Manhattan Accuracy
    - type: manhattan_accuracy_threshold
      value: 77.57926177978516
      name: Manhattan Accuracy Threshold
    - type: manhattan_f1
      value: 0.815742397137746
      name: Manhattan F1
    - type: manhattan_f1_threshold
      value: 79.14703369140625
      name: Manhattan F1 Threshold
    - type: manhattan_precision
      value: 0.7331189710610932
      name: Manhattan Precision
    - type: manhattan_recall
      value: 0.9193548387096774
      name: Manhattan Recall
    - type: manhattan_ap
      value: 0.8208816982117964
      name: Manhattan Ap
    - type: euclidean_accuracy
      value: 0.7315789473684211
      name: Euclidean Accuracy
    - type: euclidean_accuracy_threshold
      value: 3.890326499938965
      name: Euclidean Accuracy Threshold
    - type: euclidean_f1
      value: 0.8165467625899281
      name: Euclidean F1
    - type: euclidean_f1_threshold
      value: 3.890326499938965
      name: Euclidean F1 Threshold
    - type: euclidean_precision
      value: 0.737012987012987
      name: Euclidean Precision
    - type: euclidean_recall
      value: 0.9153225806451613
      name: Euclidean Recall
    - type: euclidean_ap
      value: 0.8252367395643119
      name: Euclidean Ap
    - type: max_accuracy
      value: 0.7473684210526316
      name: Max Accuracy
    - type: max_accuracy_threshold
      value: 77.57926177978516
      name: Max Accuracy Threshold
    - type: max_f1
      value: 0.8327645051194539
      name: Max F1
    - type: max_f1_threshold
      value: 79.14703369140625
      name: Max F1 Threshold
    - type: max_precision
      value: 0.737012987012987
      name: Max Precision
    - type: max_recall
      value: 0.9838709677419355
      name: Max Recall
    - type: max_ap
      value: 0.8563235829800693
      name: Max Ap
  - task:
      type: binary-classification
      name: Binary Classification
    dataset:
      name: Vitaminc test
      type: Vitaminc-test
    metrics:
    - type: cosine_accuracy
      value: 0.5684210526315789
      name: Cosine Accuracy
    - type: cosine_accuracy_threshold
      value: 0.7028586268424988
      name: Cosine Accuracy Threshold
    - type: cosine_f1
      value: 0.6755218216318786
      name: Cosine F1
    - type: cosine_f1_threshold
      value: 0.5077509880065918
      name: Cosine F1 Threshold
    - type: cosine_precision
      value: 0.52046783625731
      name: Cosine Precision
    - type: cosine_recall
      value: 0.9621621621621622
      name: Cosine Recall
    - type: cosine_ap
      value: 0.5651043866206488
      name: Cosine Ap
    - type: dot_accuracy
      value: 0.5684210526315789
      name: Dot Accuracy
    - type: dot_accuracy_threshold
      value: 19.693286895751953
      name: Dot Accuracy Threshold
    - type: dot_f1
      value: 0.6691449814126395
      name: Dot F1
    - type: dot_f1_threshold
      value: 13.839346885681152
      name: Dot F1 Threshold
    - type: dot_precision
      value: 0.509915014164306
      name: Dot Precision
    - type: dot_recall
      value: 0.972972972972973
      name: Dot Recall
    - type: dot_ap
      value: 0.5463931769790206
      name: Dot Ap
    - type: manhattan_accuracy
      value: 0.5815789473684211
      name: Manhattan Accuracy
    - type: manhattan_accuracy_threshold
      value: 87.21337890625
      name: Manhattan Accuracy Threshold
    - type: manhattan_f1
      value: 0.6666666666666667
      name: Manhattan F1
    - type: manhattan_f1_threshold
      value: 141.26380920410156
      name: Manhattan F1 Threshold
    - type: manhattan_precision
      value: 0.505586592178771
      name: Manhattan Precision
    - type: manhattan_recall
      value: 0.9783783783783784
      name: Manhattan Recall
    - type: manhattan_ap
      value: 0.5572154085134091
      name: Manhattan Ap
    - type: euclidean_accuracy
      value: 0.5894736842105263
      name: Euclidean Accuracy
    - type: euclidean_accuracy_threshold
      value: 4.252468585968018
      name: Euclidean Accuracy Threshold
    - type: euclidean_f1
      value: 0.6666666666666666
      name: Euclidean F1
    - type: euclidean_f1_threshold
      value: 6.922356128692627
      name: Euclidean F1 Threshold
    - type: euclidean_precision
      value: 0.5041551246537396
      name: Euclidean Precision
    - type: euclidean_recall
      value: 0.9837837837837838
      name: Euclidean Recall
    - type: euclidean_ap
      value: 0.5569049511912931
      name: Euclidean Ap
    - type: max_accuracy
      value: 0.5894736842105263
      name: Max Accuracy
    - type: max_accuracy_threshold
      value: 87.21337890625
      name: Max Accuracy Threshold
    - type: max_f1
      value: 0.6755218216318786
      name: Max F1
    - type: max_f1_threshold
      value: 141.26380920410156
      name: Max F1 Threshold
    - type: max_precision
      value: 0.52046783625731
      name: Max Precision
    - type: max_recall
      value: 0.9837837837837838
      name: Max Recall
    - type: max_ap
      value: 0.5651043866206488
      name: Max Ap
---

# SentenceTransformer based on bobox/DeBERTa-ST-AllLayers-v3-checkpoints-tmp

This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [bobox/DeBERTa-ST-AllLayers-v3-checkpoints-tmp](https://huggingface.co/bobox/DeBERTa-ST-AllLayers-v3-checkpoints-tmp) on the [nli-pairs](https://huggingface.co/datasets/sentence-transformers/all-nli), [vitaminc-pairs](https://huggingface.co/datasets/tals/vitaminc), [qnli-contrastive](https://huggingface.co/datasets/nyu-mll/glue), [scitail-pairs-qa](https://huggingface.co/datasets/allenai/scitail), [scitail-pairs-pos](https://huggingface.co/datasets/allenai/scitail), [xsum-pairs](https://huggingface.co/datasets/sentence-transformers/xsum), [compression-pairs](https://huggingface.co/datasets/sentence-transformers/sentence-compression), [compression-pairs2](https://huggingface.co/datasets/sentence-transformers/sentence-compression), [compression-pairs3](https://huggingface.co/datasets/sentence-transformers/sentence-compression), [sciq_pairs](https://huggingface.co/datasets/allenai/sciq), [qasc_pairs](https://huggingface.co/datasets/allenai/qasc), [qasc_facts_sym](https://huggingface.co/datasets/allenai/qasc), openbookqa_pairs, [msmarco_pairs](https://huggingface.co/datasets/sentence-transformers/msmarco-msmarco-distilbert-base-v3), [msmarco_pairs2](https://huggingface.co/datasets/sentence-transformers/msmarco-msmarco-distilbert-base-v3), [nq_pairs](https://huggingface.co/datasets/sentence-transformers/natural-questions), [nq_pairs2](https://huggingface.co/datasets/sentence-transformers/natural-questions), [trivia_pairs](https://huggingface.co/datasets/sentence-transformers/trivia-qa), [quora_pairs](https://huggingface.co/datasets/sentence-transformers/quora-duplicates), [gooaq_pairs](https://huggingface.co/datasets/sentence-transformers/gooaq), [gooaq_pairs2](https://huggingface.co/datasets/sentence-transformers/gooaq) and [mrpc_pairs](https://huggingface.co/datasets/nyu-mll/glue) datasets. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

## Model Details

### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [bobox/DeBERTa-ST-AllLayers-v3-checkpoints-tmp](https://huggingface.co/bobox/DeBERTa-ST-AllLayers-v3-checkpoints-tmp) <!-- at revision 4859fef4e21d101c2d445bcd33db1e3308f35dc4 -->
- **Maximum Sequence Length:** 512 tokens
- **Output Dimensionality:** 768 dimensions
- **Similarity Function:** Cosine Similarity
- **Training Datasets:**
    - [nli-pairs](https://huggingface.co/datasets/sentence-transformers/all-nli)
    - [vitaminc-pairs](https://huggingface.co/datasets/tals/vitaminc)
    - [qnli-contrastive](https://huggingface.co/datasets/nyu-mll/glue)
    - [scitail-pairs-qa](https://huggingface.co/datasets/allenai/scitail)
    - [scitail-pairs-pos](https://huggingface.co/datasets/allenai/scitail)
    - [xsum-pairs](https://huggingface.co/datasets/sentence-transformers/xsum)
    - [compression-pairs](https://huggingface.co/datasets/sentence-transformers/sentence-compression)
    - [compression-pairs2](https://huggingface.co/datasets/sentence-transformers/sentence-compression)
    - [compression-pairs3](https://huggingface.co/datasets/sentence-transformers/sentence-compression)
    - [sciq_pairs](https://huggingface.co/datasets/allenai/sciq)
    - [qasc_pairs](https://huggingface.co/datasets/allenai/qasc)
    - [qasc_facts_sym](https://huggingface.co/datasets/allenai/qasc)
    - openbookqa_pairs
    - [msmarco_pairs](https://huggingface.co/datasets/sentence-transformers/msmarco-msmarco-distilbert-base-v3)
    - [msmarco_pairs2](https://huggingface.co/datasets/sentence-transformers/msmarco-msmarco-distilbert-base-v3)
    - [nq_pairs](https://huggingface.co/datasets/sentence-transformers/natural-questions)
    - [nq_pairs2](https://huggingface.co/datasets/sentence-transformers/natural-questions)
    - [trivia_pairs](https://huggingface.co/datasets/sentence-transformers/trivia-qa)
    - [quora_pairs](https://huggingface.co/datasets/sentence-transformers/quora-duplicates)
    - [gooaq_pairs](https://huggingface.co/datasets/sentence-transformers/gooaq)
    - [gooaq_pairs2](https://huggingface.co/datasets/sentence-transformers/gooaq)
    - [mrpc_pairs](https://huggingface.co/datasets/nyu-mll/glue)
- **Language:** en
<!-- - **License:** Unknown -->

### Model Sources

- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)

### Full Model Architecture

```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: DebertaV2Model 
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
```
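
The `Pooling` module above produces the sentence embedding by mean pooling: the token embeddings from the DeBERTa encoder are averaged, with padding positions masked out. As a rough illustration (not the library's exact internals), the following sketch reproduces that step with the plain `transformers` API; the model ID matches this repository, everything else is for demonstration only.

```python
import torch
from transformers import AutoTokenizer, AutoModel

model_id = "bobox/DeBERTa-ST-AllLayers-v3.1bis-checkpoints-tmp"
tokenizer = AutoTokenizer.from_pretrained(model_id)
encoder = AutoModel.from_pretrained(model_id)

sentences = ["what are light bulbs powered by?", "light bulbs are powered by electricity"]
encoded = tokenizer(sentences, padding=True, truncation=True, max_length=512, return_tensors="pt")

with torch.no_grad():
    token_embeddings = encoder(**encoded).last_hidden_state  # (batch, seq_len, 768)

# Mean pooling: average the token embeddings, ignoring padding positions.
mask = encoded["attention_mask"].unsqueeze(-1).float()       # (batch, seq_len, 1)
sentence_embeddings = (token_embeddings * mask).sum(dim=1) / mask.sum(dim=1).clamp(min=1e-9)
print(sentence_embeddings.shape)  # torch.Size([2, 768])
```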

## Usage

### Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

```bash
pip install -U sentence-transformers
```

Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("bobox/DeBERTa-ST-AllLayers-v3.1bis-checkpoints-tmp")
# Run inference
sentences = [
    'what are light bulbs powered by?',
    'an incandescent light bulb converts electricity into light by sending electricity through a filament. Incandescence is light from heat energy.. light bulbs are powered by heat',
    'Viruses infect and live inside the cells of living organisms.. Life is a living organism.. viruses infect and live inside the cells of life.',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# (3, 768)

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# torch.Size([3, 3])
```
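
Since the model targets semantic search among other tasks, here is a small follow-up sketch using `sentence_transformers.util.semantic_search` to rank a corpus against a query. The corpus and query strings are invented for illustration.

```python
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("bobox/DeBERTa-ST-AllLayers-v3.1bis-checkpoints-tmp")

# Illustrative corpus and query (not taken from the training data)
corpus = [
    "An incandescent bulb produces light by heating a filament with electric current.",
    "Pacinian corpuscles are pressure receptors found mainly in the skin.",
    "The First Amendment was adopted on December 15, 1791.",
]
query = "how does a light bulb work?"

corpus_embeddings = model.encode(corpus, convert_to_tensor=True)
query_embedding = model.encode(query, convert_to_tensor=True)

# Rank the corpus by cosine similarity to the query
hits = util.semantic_search(query_embedding, corpus_embeddings, top_k=2)[0]
for hit in hits:
    print(f"{hit['score']:.3f}  {corpus[hit['corpus_id']]}")
```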

<!--
### Direct Usage (Transformers)

<details><summary>Click to see the direct usage in Transformers</summary>

</details>
-->

<!--
### Downstream Usage (Sentence Transformers)

You can finetune this model on your own dataset.

<details><summary>Click to expand</summary>

</details>
-->

<!--
### Out-of-Scope Use

*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->

## Evaluation

### Metrics

#### Semantic Similarity
* Dataset: `StS-test`
* Evaluated with [<code>EmbeddingSimilarityEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.EmbeddingSimilarityEvaluator)

| Metric              | Value      |
|:--------------------|:-----------|
| pearson_cosine      | 0.8821     |
| **spearman_cosine** | **0.8943** |
| pearson_manhattan   | 0.8705     |
| spearman_manhattan  | 0.8737     |
| pearson_euclidean   | 0.8697     |
| spearman_euclidean  | 0.8716     |
| pearson_dot         | 0.8033     |
| spearman_dot        | 0.8087     |
| pearson_max         | 0.8821     |
| spearman_max        | 0.8943     |
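
Numbers in this style can be reproduced by running the evaluator directly. The sketch below uses a toy pair list with gold scores in [0, 1] as a stand-in, since the exact StS test split is not spelled out in this card.

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import EmbeddingSimilarityEvaluator

model = SentenceTransformer("bobox/DeBERTa-ST-AllLayers-v3.1bis-checkpoints-tmp")

# Toy sentence pairs with gold similarity scores in [0, 1]; the actual
# StS-test data behind the table above is not reproduced here.
evaluator = EmbeddingSimilarityEvaluator(
    sentences1=["A man is playing a guitar.", "A dog runs in the park."],
    sentences2=["Someone plays a guitar.", "A cat sleeps on a couch."],
    scores=[0.9, 0.1],
    name="StS-test",
)
results = evaluator(model)
print(results)  # recent sentence-transformers versions return a dict of Pearson/Spearman metrics
```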

#### Binary Classification
* Dataset: `mrpc-test`
* Evaluated with [<code>BinaryClassificationEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.BinaryClassificationEvaluator)

| Metric                       | Value      |
|:-----------------------------|:-----------|
| cosine_accuracy              | 0.7474     |
| cosine_accuracy_threshold    | 0.7146     |
| cosine_f1                    | 0.8328     |
| cosine_f1_threshold          | 0.6522     |
| cosine_precision             | 0.7219     |
| cosine_recall                | 0.9839     |
| cosine_ap                    | 0.8563     |
| dot_accuracy                 | 0.7026     |
| dot_accuracy_threshold       | 14.4546    |
| dot_f1                       | 0.8055     |
| dot_f1_threshold             | 13.7529    |
| dot_precision                | 0.6982     |
| dot_recall                   | 0.9516     |
| dot_ap                       | 0.7964     |
| manhattan_accuracy           | 0.7289     |
| manhattan_accuracy_threshold | 77.5793    |
| manhattan_f1                 | 0.8157     |
| manhattan_f1_threshold       | 79.147     |
| manhattan_precision          | 0.7331     |
| manhattan_recall             | 0.9194     |
| manhattan_ap                 | 0.8209     |
| euclidean_accuracy           | 0.7316     |
| euclidean_accuracy_threshold | 3.8903     |
| euclidean_f1                 | 0.8165     |
| euclidean_f1_threshold       | 3.8903     |
| euclidean_precision          | 0.737      |
| euclidean_recall             | 0.9153     |
| euclidean_ap                 | 0.8252     |
| max_accuracy                 | 0.7474     |
| max_accuracy_threshold       | 77.5793    |
| max_f1                       | 0.8328     |
| max_f1_threshold             | 79.147     |
| max_precision                | 0.737      |
| max_recall                   | 0.9839     |
| **max_ap**                   | **0.8563** |
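
These binary-classification metrics come from sweeping a decision threshold over each similarity function (cosine, dot, Manhattan, Euclidean) and reporting the best accuracy, F1, and average precision. A minimal sketch with invented paraphrase pairs (label 1 = paraphrase, 0 = not):

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import BinaryClassificationEvaluator

model = SentenceTransformer("bobox/DeBERTa-ST-AllLayers-v3.1bis-checkpoints-tmp")

# Invented pairs; the actual mrpc-test split behind the table above
# is not reproduced here.
evaluator = BinaryClassificationEvaluator(
    sentences1=["The company reported record profits.", "He bought a new car."],
    sentences2=["Record profits were reported by the company.", "She adopted a kitten."],
    labels=[1, 0],
    name="mrpc-test",
)
results = evaluator(model)
print(results)  # accuracy / F1 / AP per similarity function, with the best thresholds
```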

#### Binary Classification
* Dataset: `Vitaminc-test`
* Evaluated with [<code>BinaryClassificationEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.BinaryClassificationEvaluator)

| Metric                       | Value      |
|:-----------------------------|:-----------|
| cosine_accuracy              | 0.5684     |
| cosine_accuracy_threshold    | 0.7029     |
| cosine_f1                    | 0.6755     |
| cosine_f1_threshold          | 0.5078     |
| cosine_precision             | 0.5205     |
| cosine_recall                | 0.9622     |
| cosine_ap                    | 0.5651     |
| dot_accuracy                 | 0.5684     |
| dot_accuracy_threshold       | 19.6933    |
| dot_f1                       | 0.6691     |
| dot_f1_threshold             | 13.8393    |
| dot_precision                | 0.5099     |
| dot_recall                   | 0.973      |
| dot_ap                       | 0.5464     |
| manhattan_accuracy           | 0.5816     |
| manhattan_accuracy_threshold | 87.2134    |
| manhattan_f1                 | 0.6667     |
| manhattan_f1_threshold       | 141.2638   |
| manhattan_precision          | 0.5056     |
| manhattan_recall             | 0.9784     |
| manhattan_ap                 | 0.5572     |
| euclidean_accuracy           | 0.5895     |
| euclidean_accuracy_threshold | 4.2525     |
| euclidean_f1                 | 0.6667     |
| euclidean_f1_threshold       | 6.9224     |
| euclidean_precision          | 0.5042     |
| euclidean_recall             | 0.9838     |
| euclidean_ap                 | 0.5569     |
| max_accuracy                 | 0.5895     |
| max_accuracy_threshold       | 87.2134    |
| max_f1                       | 0.6755     |
| max_f1_threshold             | 141.2638   |
| max_precision                | 0.5205     |
| max_recall                   | 0.9838     |
| **max_ap**                   | **0.5651** |

<!--
## Bias, Risks and Limitations

*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->

<!--
### Recommendations

*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->

## Training Details

### Training Datasets

#### nli-pairs

* Dataset: [nli-pairs](https://huggingface.co/datasets/sentence-transformers/all-nli) at [d482672](https://huggingface.co/datasets/sentence-transformers/all-nli/tree/d482672c8e74ce18da116f430137434ba2e52fab)
* Size: 10,000 training samples
* Columns: <code>sentence1</code> and <code>sentence2</code>
* Approximate statistics based on the first 1000 samples:
  |         | sentence1                                                                         | sentence2                                                                        |
  |:--------|:----------------------------------------------------------------------------------|:---------------------------------------------------------------------------------|
  | type    | string                                                                            | string                                                                           |
  | details | <ul><li>min: 5 tokens</li><li>mean: 16.62 tokens</li><li>max: 62 tokens</li></ul> | <ul><li>min: 4 tokens</li><li>mean: 9.46 tokens</li><li>max: 29 tokens</li></ul> |
* Samples:
  | sentence1                                                                  | sentence2                                        |
  |:---------------------------------------------------------------------------|:-------------------------------------------------|
  | <code>A person on a horse jumps over a broken down airplane.</code>        | <code>A person is outdoors, on a horse.</code>   |
  | <code>Children smiling and waving at camera</code>                         | <code>There are children present</code>          |
  | <code>A boy is jumping on skateboard in the middle of a red bridge.</code> | <code>The boy does a skateboarding trick.</code> |
* Loss: [<code>AdaptiveLayerLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
  ```json
  {
      "loss": "GISTEmbedLoss",
      "n_layers_per_step": -1,
      "last_layer_weight": 1.75,
      "prior_layers_weight": 0.5,
      "kl_div_weight": 1.25,
      "kl_temperature": 0.9
  }
  ```
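
In sentence-transformers code, a configuration like the one above corresponds roughly to wrapping `GISTEmbedLoss` in `AdaptiveLayerLoss`; the later dataset sections follow the same pattern with different inner losses. The sketch below maps the parameters onto the constructors. The guide model used during training is not stated in this card, so the one shown is a placeholder assumption.

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.losses import AdaptiveLayerLoss, GISTEmbedLoss

model = SentenceTransformer("bobox/DeBERTa-ST-AllLayers-v3-checkpoints-tmp")
# Placeholder: the guide model actually used for GISTEmbedLoss is not named here.
guide = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")

inner_loss = GISTEmbedLoss(model=model, guide=guide)
loss = AdaptiveLayerLoss(
    model=model,
    loss=inner_loss,
    n_layers_per_step=-1,     # -1 trains on all layers at every step
    last_layer_weight=1.75,
    prior_layers_weight=0.5,
    kl_div_weight=1.25,
    kl_temperature=0.9,
)
```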

#### vitaminc-pairs

* Dataset: [vitaminc-pairs](https://huggingface.co/datasets/tals/vitaminc) at [be6febb](https://huggingface.co/datasets/tals/vitaminc/tree/be6febb761b0b2807687e61e0b5282e459df2fa0)
* Size: 11,500 training samples
* Columns: <code>claim</code> and <code>evidence</code>
* Approximate statistics based on the first 1000 samples:
  |         | claim                                                                            | evidence                                                                           |
  |:--------|:---------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------|
  | type    | string                                                                           | string                                                                             |
  | details | <ul><li>min: 8 tokens</li><li>mean: 17.5 tokens</li><li>max: 56 tokens</li></ul> | <ul><li>min: 8 tokens</li><li>mean: 38.51 tokens</li><li>max: 381 tokens</li></ul> |
* Samples:
  | claim                                                                                                                     | evidence                                                                                                                                                                                 |
  |:--------------------------------------------------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
  | <code>Danny Trevathan made more than 100 tackles in the 2013 season .</code>                                              | <code>In the 2013 season , Trevathan led the team with 125 tackles .</code>                                                                                                              |
  | <code>Katy Perry : Part of Me has grossed over $ 6.7 million internationally and more than $ 32 million globally .</code> | <code>Katy Perry : Part of Me has grossed $ 25,326,071 domestically and $ 6,882,884 internationally , totaling $ 32,208,955 worldwide .</code>                                           |
  | <code>Bark at the Moon became Platinum for selling more than 700,000 copies in the United States .</code>                 | <code>The album peaked at number 19 on the Billboard album chart and within several weeks of release was certified Platinum for over a million sales in the United States alone .</code> |
* Loss: [<code>AdaptiveLayerLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
  ```json
  {
      "loss": "GISTEmbedLoss",
      "n_layers_per_step": -1,
      "last_layer_weight": 1.75,
      "prior_layers_weight": 0.5,
      "kl_div_weight": 1.25,
      "kl_temperature": 0.9
  }
  ```

#### qnli-contrastive

* Dataset: [qnli-contrastive](https://huggingface.co/datasets/nyu-mll/glue) at [bcdcba7](https://huggingface.co/datasets/nyu-mll/glue/tree/bcdcba79d07bc864c1c254ccfcedcce55bcc9a8c)
* Size: 13,300 training samples
* Columns: <code>sentence1</code>, <code>sentence2</code>, and <code>label</code>
* Approximate statistics based on the first 1000 samples:
  |         | sentence1                                                                         | sentence2                                                                          | label                        |
  |:--------|:----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------|:-----------------------------|
  | type    | string                                                                            | string                                                                             | int                          |
  | details | <ul><li>min: 6 tokens</li><li>mean: 13.78 tokens</li><li>max: 34 tokens</li></ul> | <ul><li>min: 7 tokens</li><li>mean: 35.59 tokens</li><li>max: 178 tokens</li></ul> | <ul><li>0: 100.00%</li></ul> |
* Samples:
  | sentence1                                                                                         | sentence2                                                                                                                                                        | label          |
  |:--------------------------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------|:---------------|
  | <code>How does the Center for Measuring University performance rank Washington University?</code> | <code>According to the Center for Measuring University Performance, it is considered to be one of the top 10 private research universities in the nation.</code> | <code>0</code> |
  | <code>Besides tax shifting, what is another need?</code>                                          | <code>Just as there is a need for tax shifting, there is also a need for subsidy shifting.</code>                                                                | <code>0</code> |
  | <code>How large is the Irish diaspora that was caused by the Great Irish Famine?</code>           | <code>However, since the Great Irish Famine, the population of Ireland has fallen to less than one tenth of the population of the British Isles.</code>          | <code>0</code> |
* Loss: [<code>AdaptiveLayerLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
  ```json
  {
      "loss": "OnlineContrastiveLoss",
      "n_layers_per_step": -1,
      "last_layer_weight": 0.75,
      "prior_layers_weight": 1.25,
      "kl_div_weight": 0.8,
      "kl_temperature": 0.75
  }
  ```

#### scitail-pairs-qa

* Dataset: [scitail-pairs-qa](https://huggingface.co/datasets/allenai/scitail) at [0cc4353](https://huggingface.co/datasets/allenai/scitail/tree/0cc4353235b289165dfde1c7c5d1be983f99ce44)
* Size: 11,155 training samples
* Columns: <code>sentence2</code> and <code>sentence1</code>
* Approximate statistics based on the first 1000 samples:
  |         | sentence2                                                                         | sentence1                                                                         |
  |:--------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|
  | type    | string                                                                            | string                                                                            |
  | details | <ul><li>min: 7 tokens</li><li>mean: 15.95 tokens</li><li>max: 41 tokens</li></ul> | <ul><li>min: 7 tokens</li><li>mean: 15.04 tokens</li><li>max: 41 tokens</li></ul> |
* Samples:
  | sentence2                                                                                     | sentence1                                                                                 |
  |:----------------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------------|
  | <code>The first living cells may have evolved around 4 billion years ago.</code>              | <code>The first living cells may have evolved around how long ago?</code>                 |
  | <code>By 8 weeks, all major organs start developing.</code>                                   | <code>By how many weeks do all major organs start developing?</code>                      |
  | <code>Intrinsic muscles allow your fingers to also make precise movements for actions.</code> | <code>Which muscles allow your fingers to also make precise movements for actions?</code> |
* Loss: [<code>AdaptiveLayerLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
  ```json
  {
      "loss": "GISTEmbedLoss",
      "n_layers_per_step": -1,
      "last_layer_weight": 1.75,
      "prior_layers_weight": 0.5,
      "kl_div_weight": 1.25,
      "kl_temperature": 0.9
  }
  ```

#### scitail-pairs-pos

* Dataset: [scitail-pairs-pos](https://huggingface.co/datasets/allenai/scitail) at [0cc4353](https://huggingface.co/datasets/allenai/scitail/tree/0cc4353235b289165dfde1c7c5d1be983f99ce44)
* Size: 8,600 training samples
* Columns: <code>sentence1</code> and <code>sentence2</code>
* Approximate statistics based on the first 1000 samples:
  |         | sentence1                                                                         | sentence2                                                                         |
  |:--------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|
  | type    | string                                                                            | string                                                                            |
  | details | <ul><li>min: 7 tokens</li><li>mean: 23.56 tokens</li><li>max: 62 tokens</li></ul> | <ul><li>min: 7 tokens</li><li>mean: 15.39 tokens</li><li>max: 35 tokens</li></ul> |
* Samples:
  | sentence1                                                                                                           | sentence2                                                                                            |
  |:--------------------------------------------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------------------------|
  | <code>Retina The retina uses nerve cells known as rods and cones (photoreceptors) to detect light and color.</code> | <code>The light-sensing cells in the retina are called rods and cones.</code>                        |
  | <code>Breakwaters are  structures that protect the coast like barrier islands.</code>                               | <code>The name of artificial barriers that people build to protect shorelines is breakwaters.</code> |
  | <code>The speed of the wave is equal to the wavelength times the frequency.</code>                                  | <code>The product of a wave's wavelength and its frequency is its speed.</code>                      |
* Loss: [<code>AdaptiveLayerLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
  ```json
  {
      "loss": "GISTEmbedLoss",
      "n_layers_per_step": -1,
      "last_layer_weight": 1.75,
      "prior_layers_weight": 0.5,
      "kl_div_weight": 1.25,
      "kl_temperature": 0.9
  }
  ```

#### xsum-pairs

* Dataset: [xsum-pairs](https://huggingface.co/datasets/sentence-transformers/xsum) at [788ddaf](https://huggingface.co/datasets/sentence-transformers/xsum/tree/788ddafe04e539956d56b567bc32a036ee7b9206)
* Size: 7,000 training samples
* Columns: <code>sentence1</code> and <code>sentence2</code>
* Approximate statistics based on the first 1000 samples:
  |         | sentence1                                                                           | sentence2                                                                         |
  |:--------|:------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|
  | type    | string                                                                              | string                                                                            |
  | details | <ul><li>min: 7 tokens</li><li>mean: 190.44 tokens</li><li>max: 512 tokens</li></ul> | <ul><li>min: 8 tokens</li><li>mean: 25.62 tokens</li><li>max: 75 tokens</li></ul> |
* Samples:
  | sentence1                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                     | sentence2                                                                                                                                                   |
  |:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------|
  | <code>The Surrey force said a 51-year-old woman from East Molesey voluntarily spoke to officers.<br>The incident at Hampton Court was filmed by cycling instructor David Williams, 47, who posted it on YouTube.<br>Officers said its traffic and process unit would consider whether to take further action.<br>The woman was given a referral notice of "driving whilst not in proper control of motor vehicle".<br>The police force said it would consider whether the woman should face penalties, including being required to attend a driver improvement course or receive a fixed penalty notice.<br>The alleged incident happened at the junction of Creek Road and the A309 Hampton Court Way in Molesey.</code>                                                                                                     | <code>A woman who was filmed allegedly eating a bowl of cereal while driving at a busy junction in south-west London has been interviewed by police.</code> |
  | <code>Paul Hegarty was sacked last month, with the Gable Endies picking up one point from three games since his departure.<br>Petrie, 46, leaves his post as as assistant manager at Junior Super League side Broughty Athletic.<br>Born in Dundee, he played for Forfar, Dunfermline and Ross County and had spells in Australia and Singapore.<br>Petrie has previously served as assistant manager at Forfar and Arbroath.<br>"He is highly motivated and enthusiastic, comes with a wealth of playing and coaching experience and most importantly understands the lower leagues of Scottish football," said chairman John Crawford.<br>"I would like to thank John Holt for his support in stepping in and looking after the team since the departure of Paul Hegarty and wish him every success for the future."</code> | <code>Stewart Petrie is the new manager at struggling Montrose, taking over with the club at the foot of League Two.</code>                                 |
  | <code>The 28-year-old former Bournemouth, Portsmouth and Leeds United centre-back has signed a three-year contract with the Addicks.<br>Pearce made 33 appearances last season as the Latics won the League One title and becomes Charlton's seventh signing of the summer transfer window.<br>"He's an absolute warrior," boss Russell Slade told the club website.<br>Find all the latest football transfers on our dedicated page.</code>                                                                                                                                                                                                                                                                                                                                                                                  | <code>League One club Charlton Athletic have signed defender Jason Pearce from Wigan for an undisclosed fee.</code>                                         |
* Loss: [<code>AdaptiveLayerLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
  ```json
  {
      "loss": "MultipleNegativesSymmetricRankingLoss",
      "n_layers_per_step": -1,
      "last_layer_weight": 0.75,
      "prior_layers_weight": 1.25,
      "kl_div_weight": 0.8,
      "kl_temperature": 0.75
  }
  ```
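
For reference, the JSON above maps one-to-one onto the [`AdaptiveLayerLoss`](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) constructor. Below is a minimal sketch (not the original training script; the base checkpoint name is a placeholder) of how this configuration could be instantiated:

```python
from sentence_transformers import SentenceTransformer, losses

# Placeholder checkpoint; the actual base model for this run may differ.
model = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")

inner_loss = losses.MultipleNegativesSymmetricRankingLoss(model=model)
loss = losses.AdaptiveLayerLoss(
    model=model,
    loss=inner_loss,
    n_layers_per_step=-1,      # -1 trains on every prior layer at every step
    last_layer_weight=0.75,    # weight of the final-layer loss term
    prior_layers_weight=1.25,  # weight of the averaged prior-layer loss terms
    kl_div_weight=0.8,         # weight of the KL term distilling prior layers toward the last
    kl_temperature=0.75,       # softmax temperature for that KL term
)
```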

#### compression-pairs

* Dataset: [compression-pairs](https://huggingface.co/datasets/sentence-transformers/sentence-compression) at [605bc91](https://huggingface.co/datasets/sentence-transformers/sentence-compression/tree/605bc91d95631895ba25b6eda51a3cb596976c90)
* Size: 4,014 training samples
* Columns: <code>sentence1</code> and <code>sentence2</code>
* Approximate statistics based on the first 1000 samples:
  |         | sentence1                                                                          | sentence2                                                                         |
  |:--------|:-----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|
  | type    | string                                                                             | string                                                                            |
  | details | <ul><li>min: 11 tokens</li><li>mean: 31.5 tokens</li><li>max: 125 tokens</li></ul> | <ul><li>min: 5 tokens</li><li>mean: 10.17 tokens</li><li>max: 28 tokens</li></ul> |
* Samples:
  | sentence1                                                                                                                                                                      | sentence2                                                               |
  |:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:------------------------------------------------------------------------|
  | <code>one who survived one of the city's worst mass shootings but left behind a trail of death and violence in British Columbia and Alberta going back almost a decade.</code> | <code>Victim had survived mass shooting</code>                          |
  | <code>``It's an awesome feeling,'' Vick said of getting the bankruptcy plan approved, which includes how his creditors will be repaid more than $20 million.</code>            | <code>Michael Vick's bankruptcy plan approved</code>                    |
  | <code>Kevin Youkilis put on pinstripes and played at Yankee Stadium for the first time as a member of the New York Yankees on Monday afternoon.</code>                         | <code>Youkilis downplays putting on pinstripes at Yankee Stadium</code> |
* Loss: [<code>AdaptiveLayerLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
  ```json
  {
      "loss": "MultipleNegativesSymmetricRankingLoss",
      "n_layers_per_step": -1,
      "last_layer_weight": 0.75,
      "prior_layers_weight": 1.25,
      "kl_div_weight": 0.8,
      "kl_temperature": 0.75
  }
  ```

#### compression-pairs2

* Dataset: [compression-pairs2](https://huggingface.co/datasets/sentence-transformers/sentence-compression) at [605bc91](https://huggingface.co/datasets/sentence-transformers/sentence-compression/tree/605bc91d95631895ba25b6eda51a3cb596976c90)
* Size: 7,960 training samples
* Columns: <code>sentence1</code> and <code>sentence2</code>
* Approximate statistics based on the first 1000 samples:
  |         | sentence1                                                                          | sentence2                                                                         |
  |:--------|:-----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|
  | type    | string                                                                             | string                                                                            |
  | details | <ul><li>min: 11 tokens</li><li>mean: 31.9 tokens</li><li>max: 329 tokens</li></ul> | <ul><li>min: 5 tokens</li><li>mean: 10.05 tokens</li><li>max: 22 tokens</li></ul> |
* Samples:
  | sentence1                                                                                                                                                                                                                         | sentence2                                                                       |
  |:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:--------------------------------------------------------------------------------|
  | <code>Industrial Parks in Asia are seeing very strong demand as Foreign Companies take advantage of the many investment incentives on offer.</code>                                                                               | <code>Industrial Parks in Asia see strong demand</code>                         |
  | <code>Western Australia has the highest loan delinquency rates in the country with two percent of its home loan payments delayed by more than a month compared to the national average of 1.54 percent says Fitch Ratings.</code> | <code>Western Australia has highest delinquency rate in the country</code>      |
  | <code>Governor Chet Culver today issued a disaster emergency proclamation for Des Moines County in response to severe thunderstorms that hit the area on May 12.</code>                                                           | <code>Governor Culver issues disaster proclamation for Des Moines County</code> |
* Loss: [<code>AdaptiveLayerLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
  ```json
  {
      "loss": "GISTEmbedLoss",
      "n_layers_per_step": 3,
      "last_layer_weight": 0.15,
      "prior_layers_weight": 2.5,
      "kl_div_weight": 0.75,
      "kl_temperature": 0.5
  }
  ```
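
`GISTEmbedLoss` additionally requires a *guide* embedding model, used to filter out false in-batch negatives; the guide used for this run is not recorded in the card. A hedged sketch with placeholder checkpoints:

```python
from sentence_transformers import SentenceTransformer, losses

model = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")   # placeholder
guide = SentenceTransformer("sentence-transformers/all-mpnet-base-v2")  # placeholder guide

loss = losses.AdaptiveLayerLoss(
    model=model,
    loss=losses.GISTEmbedLoss(model=model, guide=guide),
    n_layers_per_step=3,      # sample 3 of the prior layers per training step
    last_layer_weight=0.15,   # the final layer contributes little directly here
    prior_layers_weight=2.5,  # most of the weight sits on the earlier layers
    kl_div_weight=0.75,
    kl_temperature=0.5,
)
```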

#### compression-pairs3

* Dataset: [compression-pairs3](https://huggingface.co/datasets/sentence-transformers/sentence-compression) at [605bc91](https://huggingface.co/datasets/sentence-transformers/sentence-compression/tree/605bc91d95631895ba25b6eda51a3cb596976c90)
* Size: 7,960 training samples
* Columns: <code>sentence1</code> and <code>sentence2</code>
* Approximate statistics based on the first 1000 samples:
  |         | sentence1                                                                           | sentence2                                                                         |
  |:--------|:------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|
  | type    | string                                                                              | string                                                                            |
  | details | <ul><li>min: 10 tokens</li><li>mean: 32.24 tokens</li><li>max: 166 tokens</li></ul> | <ul><li>min: 5 tokens</li><li>mean: 10.08 tokens</li><li>max: 22 tokens</li></ul> |
* Samples:
  | sentence1                                                                                                                                                                                                                       | sentence2                                                                       |
  |:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:--------------------------------------------------------------------------------|
  | <code>WHILE London 2012 might seem a bit farfetched for a 17-year-old, water polo player Tuesday Birmingham is hoping her undefeated streak makes selectors sit up and take notice.</code>                                      | <code>Birmingham hoping selectors sit up and take notice</code>                 |
  | <code>Police are searching for a man accused of trying to kidnap a girl from her North Charlotte bus stop.</code>                                                                                                               | <code>Police search for man accused of trying to kidnap girl at bus stop</code> |
  | <code>NASCAR has ``called a mandatory meeting for all Sprint Cup drivers and team owners for Tuesday morning'' at its Research and Development Center in Concord, North Carolina, according to David Newton of ESPN.com.</code> | <code>NASCAR calls mandatory meeting for Sprint Cup drivers, owners</code>      |
* Loss: [<code>AdaptiveLayerLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
  ```json
  {
      "loss": "MultipleNegativesRankingLoss",
      "n_layers_per_step": 3,
      "last_layer_weight": 0.05,
      "prior_layers_weight": 10,
      "kl_div_weight": 5,
      "kl_temperature": 0.2
  }
  ```
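
This variant all but removes the direct final-layer term (`last_layer_weight=0.05`) and leans on the earlier layers plus a strong KL distillation term instead. The same hedged construction, again with a placeholder checkpoint:

```python
from sentence_transformers import SentenceTransformer, losses

model = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")  # placeholder

loss = losses.AdaptiveLayerLoss(
    model=model,
    loss=losses.MultipleNegativesRankingLoss(model=model),
    n_layers_per_step=3,
    last_layer_weight=0.05,  # final layer contributes almost nothing directly
    prior_layers_weight=10,  # earlier layers dominate the ranking objective
    kl_div_weight=5,         # heavy distillation of prior layers toward the last layer
    kl_temperature=0.2,
)
```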

#### sciq_pairs

* Dataset: [sciq_pairs](https://huggingface.co/datasets/allenai/sciq) at [2c94ad3](https://huggingface.co/datasets/allenai/sciq/tree/2c94ad3e1aafab77146f384e23536f97a4849815)
* Size: 10,750 training samples
* Columns: <code>sentence1</code> and <code>sentence2</code>
* Approximate statistics based on the first 1000 samples:
  |         | sentence1                                                                         | sentence2                                                                          |
  |:--------|:----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------|
  | type    | string                                                                            | string                                                                             |
  | details | <ul><li>min: 6 tokens</li><li>mean: 16.74 tokens</li><li>max: 43 tokens</li></ul> | <ul><li>min: 2 tokens</li><li>mean: 79.87 tokens</li><li>max: 512 tokens</li></ul> |
* Samples:
  | sentence1                                                                                                               | sentence2                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                 |
  |:------------------------------------------------------------------------------------------------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
  | <code>What do secondary spermatocytes form when completing meiosis?</code>                                              | <code>Spermatogonia lining the seminiferous tubule undergo mitosis to form primary spermatocytes , which are also diploid. The primary spermatocytes undergo the first meiotic division to form secondary spermatocytes , which are haploid. Spermatocytes make up the next layer of cells inside the seminiferous tubule. Finally, the secondary spermatocytes complete meiosis to form spermatids . Spermatids make up a third layer of cells in the tubule.</code>                                                                                                                                                                                                                                                                                                                                                                                                     |
  | <code>Unlike ammonia, oxygen cannot be liquefied at room temperature because its what is below room temperature?</code> | <code>Check Your Learning Ammonia can be liquefied by compression at room temperature; oxygen cannot be liquefied under these conditions. Why do the two gases exhibit different behavior? Answer: The critical temperature of ammonia is 405.5 K, which is higher than room temperature. The critical temperature of oxygen is below room temperature; thus oxygen cannot be liquefied at room temperature.</code>                                                                                                                                                                                                                                                                                                                                                                                                                                                       |
  | <code>The distance between two consecutive z discs or z lines is called what?</code>                                    | <code>When a sarcomere shortens, some regions shorten whereas others stay the same length. A sarcomere is defined as the distance between two consecutive Z discs or Z lines; when a muscle contracts, the distance between the Z discs is reduced. The H zone—the central region of the A zone—contains only thick filaments and is shortened during contraction. The I band contains only thin filaments and also shortens. The A band does not shorten—it remains the same length—but A bands of different sarcomeres move closer together during contraction, eventually disappearing. Thin filaments are pulled by the thick filaments toward the center of the sarcomere until the Z discs approach the thick filaments. The zone of overlap, in which thin filaments and thick filaments occupy the same area, increases as the thin filaments move inward.</code> |
* Loss: [<code>AdaptiveLayerLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
  ```json
  {
      "loss": "GISTEmbedLoss",
      "n_layers_per_step": -1,
      "last_layer_weight": 1.75,
      "prior_layers_weight": 0.5,
      "kl_div_weight": 1.25,
      "kl_temperature": 0.9
  }
  ```
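
This configuration flips the emphasis of the `GISTEmbedLoss` variant above: here the final layer carries most of the weight (1.75 against 0.5 for the prior layers). The same hedged construction applies, with only the keyword arguments changing:

```python
from sentence_transformers import SentenceTransformer, losses

model = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")   # placeholder
guide = SentenceTransformer("sentence-transformers/all-mpnet-base-v2")  # placeholder guide

loss = losses.AdaptiveLayerLoss(
    model=model,
    loss=losses.GISTEmbedLoss(model=model, guide=guide),
    n_layers_per_step=-1,    # use all prior layers every step
    last_layer_weight=1.75,  # the final layer dominates in this variant
    prior_layers_weight=0.5,
    kl_div_weight=1.25,
    kl_temperature=0.9,
)
```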

#### qasc_pairs

* Dataset: [qasc_pairs](https://huggingface.co/datasets/allenai/qasc) at [a34ba20](https://huggingface.co/datasets/allenai/qasc/tree/a34ba204eb9a33b919c10cc08f4f1c8dae5ec070)
* Size: 7,889 training samples
* Columns: <code>sentence1</code> and <code>sentence2</code>
* Approximate statistics based on the first 1000 samples:
  |         | sentence1                                                                         | sentence2                                                                          |
  |:--------|:----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------|
  | type    | string                                                                            | string                                                                             |
  | details | <ul><li>min: 4 tokens</li><li>mean: 11.41 tokens</li><li>max: 29 tokens</li></ul> | <ul><li>min: 16 tokens</li><li>mean: 34.81 tokens</li><li>max: 75 tokens</li></ul> |
* Samples:
  | sentence1                                                                                      | sentence2                                                                                                                                                                                                             |
  |:-----------------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
  | <code>What is a type of the basic units of the structure and function of living things?</code> | <code>Cells are the basic units of the structure and function of living things.. Osteogenic cell types are indicated.. Osteogenic is a type of the basic units of the structure and function of living things.</code> |
  | <code>bacteria can cause people to need what?</code>                                           | <code>bacteria can cause people to become ill. Serious illness needing hospitalization is uncommon.. bacteria can cause people to need hospitalization.</code>                                                        |
  | <code>What type of energy generated by the bulbs is wasted?</code>                             | <code>some light bulbs convert electricity into light and heat energy. Since the purpose of a light bulb is to generate light, the heat is wasted energy.. The heat energy generated by the bulbs gets wasted.</code> |
* Loss: [<code>AdaptiveLayerLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
  ```json
  {
      "loss": "GISTEmbedLoss",
      "n_layers_per_step": -1,
      "last_layer_weight": 1.75,
      "prior_layers_weight": 0.5,
      "kl_div_weight": 1.25,
      "kl_temperature": 0.9
  }
  ```

#### qasc_facts_sym

* Dataset: [qasc_facts_sym](https://huggingface.co/datasets/allenai/qasc) at [a34ba20](https://huggingface.co/datasets/allenai/qasc/tree/a34ba204eb9a33b919c10cc08f4f1c8dae5ec070)
* Size: 7,889 training samples
* Columns: <code>combinedfact</code> and <code>facts</code>
* Approximate statistics based on the first 1000 samples:
  |         | combinedfact                                                                      | facts                                                                              |
  |:--------|:----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------|
  | type    | string                                                                            | string                                                                             |
  | details | <ul><li>min: 5 tokens</li><li>mean: 11.82 tokens</li><li>max: 26 tokens</li></ul> | <ul><li>min: 12 tokens</li><li>mean: 25.18 tokens</li><li>max: 45 tokens</li></ul> |
* Samples:
  | combinedfact                                                                     | facts                                                                                                                                                             |
  |:---------------------------------------------------------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------|
  | <code>weathering means that rocks are eroded down again.</code>                  | <code>weathering means breaking down rocks from larger whole into smaller pieces by weather. Erosion breaks the rock down again, and the cycle continues..</code> |
  | <code>cutting down trees has a negative impact on the health of humankind</code> | <code>cutting down trees has a negative impact on an ecosystem. When ecosystems are healthy, humankind is healthy..</code>                                        |
  | <code>Fossils are formed with sand and mud cover the remains over time</code>    | <code>fossils are formed when layers of sediment cover the remains of organisms over time. Sand and mud are examples of sediments..</code>                        |
* Loss: [<code>AdaptiveLayerLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
  ```json
  {
      "loss": "MultipleNegativesSymmetricRankingLoss",
      "n_layers_per_step": -1,
      "last_layer_weight": 0.75,
      "prior_layers_weight": 1.25,
      "kl_div_weight": 0.8,
      "kl_temperature": 0.75
  }
  ```

#### openbookqa_pairs

* Dataset: openbookqa_pairs
* Size: 4,505 training samples
* Columns: <code>question</code> and <code>fact</code>
* Approximate statistics based on the first 1000 samples:
  |         | question                                                                          | fact                                                                              |
  |:--------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|
  | type    | string                                                                            | string                                                                            |
  | details | <ul><li>min: 3 tokens</li><li>mean: 13.81 tokens</li><li>max: 78 tokens</li></ul> | <ul><li>min: 4 tokens</li><li>mean: 11.49 tokens</li><li>max: 30 tokens</li></ul> |
* Samples:
  | question                                                                     | fact                                                                                  |
  |:-----------------------------------------------------------------------------|:--------------------------------------------------------------------------------------|
  | <code>What is animal competition?</code>                                     | <code>if two animals eat the same prey then those animals compete for that pey</code> |
  | <code>If you wanted to make a metal bed frame, where would you start?</code> | <code>alloys are made of two or more metals</code>                                    |
  | <code>Places lacking warmth have few what</code>                             | <code>cold environments contain few organisms</code>                                  |
* Loss: [<code>AdaptiveLayerLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
  ```json
  {
      "loss": "GISTEmbedLoss",
      "n_layers_per_step": -1,
      "last_layer_weight": 1.75,
      "prior_layers_weight": 0.5,
      "kl_div_weight": 1.25,
      "kl_temperature": 0.9
  }
  ```

#### msmarco_pairs

* Dataset: [msmarco_pairs](https://huggingface.co/datasets/sentence-transformers/msmarco-msmarco-distilbert-base-v3) at [28ff31e](https://huggingface.co/datasets/sentence-transformers/msmarco-msmarco-distilbert-base-v3/tree/28ff31e4c97cddd53d298497f766e653f1e666f9)
* Size: 6,875 training samples
* Columns: <code>sentence1</code> and <code>sentence2</code>
* Approximate statistics based on the first 1000 samples:
  |         | sentence1                                                                        | sentence2                                                                          |
  |:--------|:---------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------|
  | type    | string                                                                           | string                                                                             |
  | details | <ul><li>min: 4 tokens</li><li>mean: 8.68 tokens</li><li>max: 32 tokens</li></ul> | <ul><li>min: 17 tokens</li><li>mean: 77.4 tokens</li><li>max: 232 tokens</li></ul> |
* Samples:
  | sentence1                                            | sentence2                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                    |
  |:-----------------------------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
  | <code>benefits of reishi mushroom</code>             | <code>Reishi mushroom has been used to help enhance the immune system, reduce stress, improve sleep, and lessen fatigue. People also take reishi mushroom for health conditions such as: High blood pressure</code>                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                          |
  | <code>shoulder replacement how long driving</code>   | <code>Shoulder Replacement Surger. After undergoing shoulder replacement surgery, it is important to have realistic expectations about the types of activities you may do. Minimal stretching of your new shoulder is recommended throughout your lifetime to maintain your full range of motion.You may give you permission to drive within four weeks after the surgery. If your surgery was on the right side, driving permission may not be. given until a month or six weeks following the surgery.inimal stretching of your new shoulder is recommended throughout your lifetime to maintain your full range of motion. You may give you permission to drive within four weeks after the surgery. If your surgery was on the right side, driving permission may not be. given until a month or six weeks following the surgery.</code> |
  | <code>cabinet has how many people in it trump</code> | <code>Should all of President-elect Donald Trump’s Cabinet nominees eventually be confirmed, he will start his administration with one of the most heavily business-oriented Cabinets in U.S. history. Five of the 15 people Trump has nominated to be Cabinet secretaries have spent all or nearly all their careers in the business world, with no significant public office or senior military service on their résumés.</code>                                                                                                                                                                                                                                                                                                                                                                                                       |
* Loss: [<code>AdaptiveLayerLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
  ```json
  {
      "loss": "GISTEmbedLoss",
      "n_layers_per_step": -1,
      "last_layer_weight": 1.75,
      "prior_layers_weight": 0.5,
      "kl_div_weight": 1.25,
      "kl_temperature": 0.9
  }
  ```

#### msmarco_pairs2

* Dataset: [msmarco_pairs2](https://huggingface.co/datasets/sentence-transformers/msmarco-msmarco-distilbert-base-v3) at [28ff31e](https://huggingface.co/datasets/sentence-transformers/msmarco-msmarco-distilbert-base-v3/tree/28ff31e4c97cddd53d298497f766e653f1e666f9)
* Size: 4,500 training samples
* Columns: <code>sentence1</code> and <code>sentence2</code>
* Approximate statistics based on the first 1000 samples:
  |         | sentence1                                                                        | sentence2                                                                          |
  |:--------|:---------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------|
  | type    | string                                                                           | string                                                                             |
  | details | <ul><li>min: 4 tokens</li><li>mean: 8.75 tokens</li><li>max: 24 tokens</li></ul> | <ul><li>min: 19 tokens</li><li>mean: 76.9 tokens</li><li>max: 229 tokens</li></ul> |
* Samples:
  | sentence1                                          | sentence2                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                    |
  |:---------------------------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
  | <code>when is grant of probate not required</code> | <code>We are frequently asked whether a grant of probate is always required after somebody’s death. Very small estates under £15,000 do not require a grant while those assets held in joint names (such as bank accounts or property) pass to the survivor on the Deceased persons death, meaning that no grant is required.he grant is a legal document which states the Deceased’s full name, as well as the Executors entrusted with the administration of the estate. The grant can then be presented to various financial institutions to collect the Deceased’s assets.</code> |
  | <code>what does speculation mean</code>            | <code>Wiktionary(3.00 / 1 vote)Rate this definition: 1  speculation(Noun) The process of thinking or meditating on a subject. 2  speculation(Noun) A judgment or conclusion reached by speculating. 3  speculation(Noun) An investment involving higher than normal risk in order to obtain a higher than normal return.</code>                                                                                                                                                                                                                                                              |
  | <code>what county is mint hill nc</code>           | <code>8. 21 Acres Mint Hill, Mecklenburg County, North Carolina $2,310,000. 21 Acres Mint Hill, Mecklenburg County, North Carolina $2,310,000. Great piece of property just waiting for the right buyer. Flat and open and conveniently located at 485 and Idlewild Rd....a rare find indeed!</code>                                                                                                                                                                                                                                                                                         |
* Loss: [<code>AdaptiveLayerLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
  ```json
  {
      "loss": "GISTEmbedLoss",
      "n_layers_per_step": 3,
      "last_layer_weight": 0.15,
      "prior_layers_weight": 2.5,
      "kl_div_weight": 0.75,
      "kl_temperature": 0.5
  }
  ```

#### nq_pairs

* Dataset: [nq_pairs](https://huggingface.co/datasets/sentence-transformers/natural-questions) at [f9e894e](https://huggingface.co/datasets/sentence-transformers/natural-questions/tree/f9e894e1081e206e577b4eaa9ee6de2b06ae6f17)
* Size: 3,667 training samples
* Columns: <code>sentence1</code> and <code>sentence2</code>
* Approximate statistics based on the first 1000 samples:
  |         | sentence1                                                                          | sentence2                                                                            |
  |:--------|:-----------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|
  | type    | string                                                                             | string                                                                               |
  | details | <ul><li>min: 10 tokens</li><li>mean: 11.85 tokens</li><li>max: 22 tokens</li></ul> | <ul><li>min: 19 tokens</li><li>mean: 132.66 tokens</li><li>max: 512 tokens</li></ul> |
* Samples:
  | sentence1                                                                                            | sentence2                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                      |
  |:-----------------------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
  | <code>where are the powers of congress listed in the constitution</code>                             | <code>Powers of the United States Congress Article I of the Constitution sets forth most of the powers of Congress, which include numerous explicit powers enumerated in Section 8. Constitutional amendments have granted Congress additional powers. Congress also has implied powers derived from the Necessary and Proper Clause of the Constitution.</code>                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                               |
  | <code>who is the man in we are never ever getting back together</code>                               | <code>We Are Never Ever Getting Back Together After writing Speak Now (2010) entirely solo, Swift opted to collaborate with different songwriters and producers for Red. Thus, she called Max Martin and Shellback, two songwriters and producers whose work she admired, to discuss a possible collaboration. The trio conceived the concept for "We Are Never Ever Getting Back Together" shortly after a friend of Swift's ex-boyfriend walked into the recording studio and spoke of rumors he heard that Swift and her former flame were reuniting. After the friend left, Martin and Shellback asked Swift to elaborate on the details of the relationship, which she described as "break up, get back together, break up, get back together, just, ugh, the worst". When Martin suggested that they write about the incident. Swift began playing the guitar and singing, "We are never ever......", and the song flowed rapidly afterwards. She described the process as one of the most humorous experiences she had while recording, and said the musical partners matched her expectations. An audio clip of her sarcastically speaking about breakups can be heard before the final chorus.[2] The song is reportedly about Swift's ex, Jake Gyllenhaal, as the two had broken up in January 2011 but had been seen on a date a few days later.[3] After the release of the music video, more clues linking the song to Gyllenhaal emerged,[3] with the actor looking like Gyllenhaal,[4] the actor in the video giving her a scarf as Gyllenhaal had reportedly done for Swift and a bracelet Swift wears in the video that is speculated to look similar to that of which Gyllenhaal was rumored to have given Swift for her birthday.[3]</code> |
  | <code>which agreement settled a boundary dispute between british canada and the united states</code> | <code>Alaska boundary dispute Finally, in 1903, the Hay-Herbert Treaty between the United States and Britain entrusted the decision to an arbitration by a mixed tribunal of six members: three Americans (Elihu Root, Secretary of War; Henry Cabot Lodge, senator from Massachusetts; and George Turner, ex-senator from Washington), two Canadians (Sir Louis A. Jette, Lieutenant Governor of Quebec; and Allen B. Aylesworth, K.C., from Toronto), and one Briton (Baron Alverstone). All sides respected Root, but he was a member of the U.S. Cabinet. Canadians ridiculed the choice of the obscure ex-Senator Turner and, especially, Lodge, a leading historian and diplomatic specialist whom they saw as an unobjective.[8]</code>                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                 |
* Loss: [<code>AdaptiveLayerLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
  ```json
  {
      "loss": "GISTEmbedLoss",
      "n_layers_per_step": -1,
      "last_layer_weight": 1.75,
      "prior_layers_weight": 0.5,
      "kl_div_weight": 1.25,
      "kl_temperature": 0.9
  }
  ```

#### nq_pairs2

* Dataset: [nq_pairs2](https://huggingface.co/datasets/sentence-transformers/natural-questions) at [f9e894e](https://huggingface.co/datasets/sentence-transformers/natural-questions/tree/f9e894e1081e206e577b4eaa9ee6de2b06ae6f17)
* Size: 3,001 training samples
* Columns: <code>sentence1</code> and <code>sentence2</code>
* Approximate statistics based on the first 1000 samples:
  |         | sentence1                                                                         | sentence2                                                                           |
  |:--------|:----------------------------------------------------------------------------------|:------------------------------------------------------------------------------------|
  | type    | string                                                                            | string                                                                              |
  | details | <ul><li>min: 10 tokens</li><li>mean: 11.8 tokens</li><li>max: 23 tokens</li></ul> | <ul><li>min: 6 tokens</li><li>mean: 132.85 tokens</li><li>max: 512 tokens</li></ul> |
* Samples:
  | sentence1                                                     | sentence2                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                          |
  |:--------------------------------------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
  | <code>what do i need to carry a gun in indiana</code>         | <code>Gun laws in Indiana Indiana requires a license for carrying a handgun generally, although there are several exceptions. Manner of carry, whether open or concealed, is not explicitly specified in the code, thus the license is required for concealed, open, or transport within vehicle, unless covered by one of the exceptions. A license is not required if carrying on property "owned, rented, leased, or otherwise legally controlled" by the carrier, or carried on another person's legally controlled property if permission has been obtained. A license is not required for carrying at a "gun show, firearm expo, gun owner's club or convention, hunting club, shooting club, or training course", or places where carrier is receiving "firearm related services." A license is also not required at a "shooting range", at a location where one is participating in a "firearms instructional course", or during "legal hunting activity." One may transport a handgun in a vehicle without a license if the handgun is "unloaded", "not readily accessible", and "secured in a case." A violation of is a class A misdemeanor. [5]</code> |
  | <code>who plays kristen and susan on days of our lives</code> | <code>Susan Banks Susan Banks is a fictional character on NBC's daytime drama Days of Our Lives. She was played by Eileen Davidson from November 4, 1996 to April 8, 1998, and again in 2014 and 2017. Susan is the eccentric mother of Elvis "EJ" DiMera, and once acted as Kristen Blake's doppelganger. In November 2011, it was announced that Brynn Thayer would take over the role of Susan, since Davidson was committed to The Young and the Restless. Thayer made her brief one-off appearance as Susan on December 7, 2011.[1]</code>                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                    |
  | <code>what is the revelation in the hero journey</code>       | <code>Hero's journey This is the point of realization in which a greater understanding is achieved. Armed with this new knowledge and perception, the hero is resolved and ready for the more difficult part of the adventure.</code>                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                              |
* Loss: [<code>AdaptiveLayerLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
  ```json
  {
      "loss": "GISTEmbedLoss",
      "n_layers_per_step": 3,
      "last_layer_weight": 0.15,
      "prior_layers_weight": 2.5,
      "kl_div_weight": 0.75,
      "kl_temperature": 0.5
  }
  ```
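
Because each dataset above is paired with its own loss, the run presumably used the multi-dataset interface of `SentenceTransformerTrainer`, where `train_dataset` and `loss` are dictionaries sharing the same keys. A toy-sized, hedged sketch (tiny in-memory stand-ins for the real datasets, a placeholder checkpoint, and two datasets that genuinely share the symmetric-ranking configuration above):

```python
from datasets import Dataset
from sentence_transformers import SentenceTransformer, SentenceTransformerTrainer, losses

model = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")  # placeholder

# Tiny stand-ins; each must expose the column names listed for its dataset above.
compression_pairs = Dataset.from_dict({
    "sentence1": [
        "Kevin Youkilis put on pinstripes and played at Yankee Stadium for the first time.",
        "``It's an awesome feeling,'' Vick said of getting the bankruptcy plan approved.",
    ],
    "sentence2": [
        "Youkilis downplays putting on pinstripes at Yankee Stadium",
        "Michael Vick's bankruptcy plan approved",
    ],
})
qasc_facts_sym = Dataset.from_dict({
    "combinedfact": ["weathering means that rocks are eroded down again."],
    "facts": ["weathering means breaking down rocks into smaller pieces by weather."],
})

# One AdaptiveLayerLoss shared by both datasets; parameters per their JSON blocks above.
mnsrl = losses.AdaptiveLayerLoss(
    model=model,
    loss=losses.MultipleNegativesSymmetricRankingLoss(model=model),
    n_layers_per_step=-1,
    last_layer_weight=0.75,
    prior_layers_weight=1.25,
    kl_div_weight=0.8,
    kl_temperature=0.75,
)

trainer = SentenceTransformerTrainer(
    model=model,
    train_dataset={"compression-pairs": compression_pairs, "qasc_facts_sym": qasc_facts_sym},
    loss={"compression-pairs": mnsrl, "qasc_facts_sym": mnsrl},  # keys match train_dataset
)
trainer.train()
```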

#### trivia_pairs

* Dataset: [trivia_pairs](https://huggingface.co/datasets/sentence-transformers/trivia-qa) at [a7c36e3](https://huggingface.co/datasets/sentence-transformers/trivia-qa/tree/a7c36e3c8c8c01526bc094d79bf80d4c848b0ad0)
* Size: 9,700 training samples
* Columns: <code>sentence1</code> and <code>sentence2</code>
* Approximate statistics based on the first 1000 samples:
  |         | sentence1                                                                         | sentence2                                                                            |
  |:--------|:----------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|
  | type    | string                                                                            | string                                                                               |
  | details | <ul><li>min: 8 tokens</li><li>mean: 16.55 tokens</li><li>max: 43 tokens</li></ul> | <ul><li>min: 27 tokens</li><li>mean: 446.79 tokens</li><li>max: 512 tokens</li></ul> |
* Samples:
  | sentence1                                                           | sentence2 |
  |:---------------------------------------------------------------------|:-----------|
-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
  | <code>Most of the Three Tenors come from which country?</code>      | <code>The Three Tenors - Pavarotti, Domingo, and Carreras The Three Tenors The Three Tenors Jose Carreras, Placido Domingo, and Luciano Pavarotti The Three Tenors - Luciano Pavarotti, Placido Domingo, and Jose Carreras.  Photo By: Yoshikatsu Tsuno / Getty Images By Aaron Green Who Are the Three Tenors? The Three Tenors are made up of three of the world's most famous and beloved operatic tenors which include Jose Carreras, Placido Domingo, and Luciano Pavarotti. Jose Carreras (1946-) Born in Barcelona, Spain, Jose Carreras has been performing since he was 11 years old.  Professionally, Carreras began his operatic career on December 19, 1970, when he sang the principal role of Gennaro alongside Montserrat Caballe in Donizetti's Lucrezia Borgia .  Aside from performing, Carreras serves as president for the José Carreras International Leukaemia Foundation, which he founded after successfully overcoming his own battle with cancer. Placido Domingo (1941-) With over 100 operas and 147 roles under his belt, Placido Domingo is a seasoned operatic star.  Born in Madrid, Spain, the celebrated tenor made his operatic debut as “Alfredo” in La Triviata at Monterrey, Mexico in 1961.  Just as Carreras and Pavarotti, Domingo has performed in opera houses around the world.  Now in his mid-70s and understanding the changes in his voice, Domingo sings baritone roles instead.  In 1993, Domingo founded a young singer competition called Operalia.  The competition is open to 18-32 year olds, and is hosted in a different city every year.  Out of nearly 1,000 entrants, only the top 40 are selected for the competition. continue reading below our video Great Singers Gone too Soon Luciano Pavarotti (1935-2007) Born in Modena, Italy, Pavarotti had dreams of becoming a soccer goalkeeper, and it turned out he was quite good.  However, his interest in music edged its way ahead after he won first place in the Llangollen International Singing Competition in Wales.  Pavarotti went on to become one of the first opera stars to have nearly his entire performing career recorded musically and visually.  He easily sold out shows and performed for millions of people in single performances. The Origin of the Three Tenors The idea for the Three Tenors came from Mario Dradi, an Italian manager and producer.  Dradi's idea was to create a group of tenors for a concert and donate a portion of the proceeds to Jose Carreras's foundation after his successful treatment of leukemia.  Jose Carreras, along with his two friends,  Placido Domingo and Luciano Pavarotti , agreed to perform as the Three Tenors .  Dradi's idea came to fruition on July 7, 1990, the day before the FIFA World Cup in Rome.  The concert was watched by over 800 million viewers, and was so well received that when a recording of the concert was released, it became the biggest selling classical album in history.  The album, Carreras - Domingo - Pavarotti: the Three Tenors in Concert , set a Guinness World Record.  Because of the trio's instant success, they performed at the following three FIFA World Cups: Los Angeles in 1994, Paris in 1998, and Yokohama in 2002.  The tremendous reception of the Three Tenors was due largely in part to their incredible voices, down-to-earth, likable personalities, and song selections. The trio would regularly perform classic and well-known operatic arias, as well as popular Broadway show tunes that even the most novice classical music listener could love and appreciate.  
Given the trio's enormous popularity, imitations of the Three Tenors quickly arose all over the world, including the Three Canadian Tenors, the Chinese Tenors, as well as the Three Mo' Tenors. The Three Tenors: Recommended YouTube Videos The Three Tenors sing "O Sole Mio" (1994) ( Watch on YouTube ) The Three Tenors sing "La donna e mobile" (1994) ( Watch on YouTube ) The Three Tenors sing "Singin' in the Rain" (1994) ( Watch on YouTube ) The Three Tenors sing "New York, New York" (1996) ( Watch on YouTube ) The Three Tenors sing "Ti Voglio Tanto Bene" (1998) ( Watch on YouTube ) The Three Tenors sing "Nessun Dorm</code> |
  | <code>What are the two inferior planets in our solar system?</code> | <code>An Overview of the Solar System, it's alignment and pictures The inner solar system contains the Sun , Mercury , Venus , Earth and Mars : The main asteroid belt (not shown) lies between the orbits of Mars and Jupiter. The planets of the outer solar system are Jupiter , Saturn , Uranus , and Neptune ( Pluto is now classified as a dwarf planet): The first thing to notice is that the solar system is mostly empty space. The planets are very small compared to the space between them. Even the dots on the diagrams above are too big to be in proper scale with respect to the sizes of the orbits. The orbits of the planets are ellipses with the Sun at one focus, though all except Mercury are very nearly circular. The orbits of the planets are all more or less in the same plane (called the ecliptic and defined by the plane of the Earth's orbit) . The ecliptic is inclined only 7 degrees from the plane of the Sun's equator. The above diagrams show the relative sizes of the orbits of the eight planets (plus Pluto) from a perspective somewhat above the ecliptic (hence their non-circular appearance). They all orbit in the same direction (counter-clockwise looking down from above the Sun's north pole); all but Venus, Uranus and Pluto also rotate in that same sense. (The above diagrams show correct positions for October 1996 as generated by the excellent planetarium program Starry Night ; there are also many other similar programs available, some free. You can also use Emerald Chronometer on your iPhone or Emerald Observatory on your iPad to find the current positions.) Sizes The above composite shows the eight planets and Pluto with approximately correct relative sizes (see another similar composite and a comparison of the terrestrial planets or Appendix 2 for more). One way to help visualize the relative sizes in the solar system is to imagine a model in which everything is reduced in size by a factor of a billion. Then the model Earth would be about 1.3 cm in diameter (the size of a grape). The Moon would be about 30 cm (about a foot) from the Earth. The Sun would be 1.5 meters in diameter (about the height of a man) and 150 meters (about a city block) from the Earth. Jupiter would be 15 cm in diameter (the size of a large grapefruit) and 5 blocks away from the Sun. Saturn (the size of an orange) would be 10 blocks away; Uranus and Neptune (lemons) 20 and 30 blocks away. A human on this scale would be the size of an atom but the nearest star would be over 40000 km away. Not shown in the above illustrations are the numerous smaller bodies that inhabit the solar system: the satellites of the planets; the large number of asteroids (small rocky bodies) orbiting the Sun, mostly between Mars and Jupiter but also elsewhere; the comets (small icy bodies) which come and go from the inner parts of the solar system in highly elongated orbits and at random orientations to the ecliptic; and the many small icy bodies beyond Neptune in the Kuiper Belt . With a few exceptions, the planetary satellites orbit in the same sense as the planets and approximately in the plane of the ecliptic but this is not generally true for comets and asteroids. The classification of these objects is a matter of minor controversy. Traditionally, the solar system has been divided into planets (the big bodies orbiting the Sun), their satellites (a.k.a. 
moons, variously sized objects orbiting the planets), asteroids (small dense objects orbiting the Sun) and comets (small icy objects with highly eccentric orbits). Unfortunately, the solar system has been found to be more complicated than this would suggest: there are several moons larger than Pluto and two larger than Mercury; there are many small moons that are probably started out as asteroids and were only later captured by a planet; comets sometimes fizzle out and become indistinguishable from asteroids; the Kuiper Belt objects (including Pluto) and others like    Chiron don't fit this scheme well The Earth/Moon and Pluto/Charon systems    are sometimes considered "double planets". Other classification</code>                     |
  | <code>What is the word for ‘Friend’ in Swahili?</code>              | <code>Swahili Plural Swahili Plural Swahili Lessons Swahili Plural If you're trying to learn Swahili Plural which is also called Kiswahili, check our courses about Plural and Singular... to help you with your Swahili grammar. Try to concentrate on the lesson and notice the pattern that occurs each time the word changes its place. Also don't forget to check the rest of our other lessons listed on Learn Swahili . Enjoy the rest of the lesson! Swahili Plural Learning the Swahili Plural displayed below is vital to the language. Swahili Plurals are grammatical numbers, typically referring to more than one of the referent in the real world. In the English language, singular and plural are the only grammatical numbers. Grammar Tips:  mtu amekuja( a person hascome), watu wamekuja( persons have come) msichana ameingia( a girl hasentered), wasichana wameingia ( girls have entered) daktari ametoka( the doctor hasgone ot) Madaktari wametoka( doctors have gone out)   This class is called A-WA class as you cansee from sentence construction. * In this class many nouns have prefix m- inthe singular and wa- in the prulal. However there are very many irregular nounsthat don’t follow this rule. E.g Rafiki( friend) Daktari( doctor) Kiwete(lameperson), Rafiki(a friend) becomes: marafiki ( friends) Mwanamke(one woman) becomes: Wanawake (manywomen) Hilii ni gari langu lekundu (this is my redcar) becomes: Haya ni magari yangu mekundu (these are my red cars) Here are some examples:</code>                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                         
                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                     |
* Loss: [<code>AdaptiveLayerLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
  ```json
  {
      "loss": "GISTEmbedLoss",
      "n_layers_per_step": -1,
      "last_layer_weight": 1.75,
      "prior_layers_weight": 0.5,
      "kl_div_weight": 1.25,
      "kl_temperature": 0.9
  }
  ```
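
The JSON above maps directly onto the sentence-transformers API. A minimal sketch of how this configuration could be instantiated (the base and guide model names below are placeholders, not the checkpoints used for this card):

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.losses import AdaptiveLayerLoss, GISTEmbedLoss

# Placeholders only -- substitute the model being trained and its guide model.
model = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")
guide = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")

# GISTEmbedLoss uses the guide model's similarities to filter out in-batch
# negatives that are likely false negatives.
inner_loss = GISTEmbedLoss(model, guide)

# AdaptiveLayerLoss applies the inner loss at every transformer layer
# (n_layers_per_step=-1), weighting the final layer at 1.75 and prior layers
# at 0.5, plus a KL-divergence term (weight 1.25, temperature 0.9) that pulls
# each layer's similarity distribution toward the last layer's.
loss = AdaptiveLayerLoss(
    model=model,
    loss=inner_loss,
    n_layers_per_step=-1,
    last_layer_weight=1.75,
    prior_layers_weight=0.5,
    kl_div_weight=1.25,
    kl_temperature=0.9,
)
```

Several datasets below (e.g. quora_pairs and gooaq_pairs2) use a second variant of the same wrapper: `n_layers_per_step=3` samples three layers per step instead of training on all of them, and the weighting is inverted (`last_layer_weight=0.15`, `prior_layers_weight=2.5`) to emphasize earlier layers.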

#### quora_pairs

* Dataset: [quora_pairs](https://huggingface.co/datasets/sentence-transformers/quora-duplicates) at [451a485](https://huggingface.co/datasets/sentence-transformers/quora-duplicates/tree/451a4850bd141edb44ade1b5828c259abd762cdb)
* Size: 9,822 training samples
* Columns: <code>sentence1</code> and <code>sentence2</code>
* Approximate statistics based on the first 1000 samples:
  |         | sentence1                                                                        | sentence2                                                                         |
  |:--------|:---------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|
  | type    | string                                                                           | string                                                                            |
  | details | <ul><li>min: 6 tokens</li><li>mean: 13.4 tokens</li><li>max: 37 tokens</li></ul> | <ul><li>min: 6 tokens</li><li>mean: 13.45 tokens</li><li>max: 44 tokens</li></ul> |
* Samples:
  | sentence1                                                                                  | sentence2                                                                                     |
  |:-------------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------------|
  | <code>If the world war 3 breaks out, which country would be safest to live and why?</code> | <code>If a global war, or WW3 broke out, which country would be the safest to live in?</code> |
  | <code>From where and how to learn math?</code>                                             | <code>How can I learn math?</code>                                                            |
  | <code>Does any one exist in this universe who never broke down any rules?</code>           | <code>Does any one exist in this universe who never broke down the rules?</code>              |
* Loss: [<code>AdaptiveLayerLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
  ```json
  {
      "loss": "GISTEmbedLoss",
      "n_layers_per_step": 3,
      "last_layer_weight": 0.15,
      "prior_layers_weight": 2.5,
      "kl_div_weight": 0.75,
      "kl_temperature": 0.5
  }
  ```

#### gooaq_pairs

* Dataset: [gooaq_pairs](https://huggingface.co/datasets/sentence-transformers/gooaq) at [b089f72](https://huggingface.co/datasets/sentence-transformers/gooaq/tree/b089f728748a068b7bc5234e5bcf5b25e3c8279c)
* Size: 6,875 training samples
* Columns: <code>sentence1</code> and <code>sentence2</code>
* Approximate statistics based on the first 1000 samples:
  |         | sentence1                                                                         | sentence2                                                                           |
  |:--------|:----------------------------------------------------------------------------------|:------------------------------------------------------------------------------------|
  | type    | string                                                                            | string                                                                              |
  | details | <ul><li>min: 8 tokens</li><li>mean: 11.39 tokens</li><li>max: 20 tokens</li></ul> | <ul><li>min: 12 tokens</li><li>mean: 57.58 tokens</li><li>max: 150 tokens</li></ul> |
* Samples:
  | sentence1                                          | sentence2                                                                                                                                                                                                                                                                                                                       |
  |:---------------------------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
  | <code>are sandwiches bad for cholesterol?</code>   | <code>However, if you have started watching your cholesterol and triglycerides, adding those plentiful layers of deli meat could sabotage an otherwise heart-healthy meal. Animal meats contain varying amounts of saturated fat — which may increase lipid levels in your blood.</code>                                        |
  | <code>what are the vocal music of mindanao?</code> | <code>Mindanao folk music includes the ancient Muslim folk song and dance called estijaro, and a Mindanao folk song called uruyan. These are usually accompanied by drums, gongs, or other percussion instruments like the subing, a gong.</code>                                                                               |
  | <code>what are immediate cause of death?</code>    | <code>Immediate cause of death:The final disease or injury causing the death. Intermediate cause of death: A disease or condition that preceded and caused the immediate cause of death. Underlying cause of death: A disease or condition present before, and leading to, the intermediate or immediate cause of death.</code> |
* Loss: [<code>AdaptiveLayerLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
  ```json
  {
      "loss": "GISTEmbedLoss",
      "n_layers_per_step": -1,
      "last_layer_weight": 1.75,
      "prior_layers_weight": 0.5,
      "kl_div_weight": 1.25,
      "kl_temperature": 0.9
  }
  ```

#### gooaq_pairs2

* Dataset: [gooaq_pairs2](https://huggingface.co/datasets/sentence-transformers/gooaq) at [b089f72](https://huggingface.co/datasets/sentence-transformers/gooaq/tree/b089f728748a068b7bc5234e5bcf5b25e3c8279c)
* Size: 5,625 training samples
* Columns: <code>sentence1</code> and <code>sentence2</code>
* Approximate statistics based on the first 1000 samples:
  |         | sentence1                                                                         | sentence2                                                                           |
  |:--------|:----------------------------------------------------------------------------------|:------------------------------------------------------------------------------------|
  | type    | string                                                                            | string                                                                              |
  | details | <ul><li>min: 8 tokens</li><li>mean: 11.38 tokens</li><li>max: 20 tokens</li></ul> | <ul><li>min: 14 tokens</li><li>mean: 57.49 tokens</li><li>max: 156 tokens</li></ul> |
* Samples:
  | sentence1                                      | sentence2                                                                                                                                                                                                                     |
  |:-----------------------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
  | <code>is mhrb legal in canada?</code>          | <code>Canada: The Mimosa Hostilis plant is legal in Canada.</code>                                                                                                                                                            |
  | <code>are 4k blu ray discs region free?</code> | <code>When it comes to regional restrictions, the good news is 4K Blu-ray discs have just one region code: worldwide. Essentially, they're region-free. They can be played on any 4K player, in any part of the world.</code> |
  | <code>are rsd and crps the same?</code>        | <code>What is Reflex Sympathetic Dystrophy (RSD) Syndrome? RSD is an older term used to describe one form of Complex Regional Pain Syndrome (CRPS).</code>                                                                    |
* Loss: [<code>AdaptiveLayerLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
  ```json
  {
      "loss": "GISTEmbedLoss",
      "n_layers_per_step": 3,
      "last_layer_weight": 0.15,
      "prior_layers_weight": 2.5,
      "kl_div_weight": 0.75,
      "kl_temperature": 0.5
  }
  ```

#### mrpc_pairs

* Dataset: [mrpc_pairs](https://huggingface.co/datasets/nyu-mll/glue) at [bcdcba7](https://huggingface.co/datasets/nyu-mll/glue/tree/bcdcba79d07bc864c1c254ccfcedcce55bcc9a8c)
* Size: 2,474 training samples
* Columns: <code>sentence1</code> and <code>sentence2</code>
* Approximate statistics based on the first 1000 samples:
  |         | sentence1                                                                         | sentence2                                                                          |
  |:--------|:----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------|
  | type    | string                                                                            | string                                                                             |
  | details | <ul><li>min: 9 tokens</li><li>mean: 26.45 tokens</li><li>max: 48 tokens</li></ul> | <ul><li>min: 10 tokens</li><li>mean: 26.45 tokens</li><li>max: 44 tokens</li></ul> |
* Samples:
  | sentence1                                                                                                                                  | sentence2                                                                                                                             |
  |:-------------------------------------------------------------------------------------------------------------------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------|
  | <code>Prosecutors filed a motion informing Lee they intend to seek the death penalty .</code>                                              | <code>He added that prosecutors will seek the death penalty .</code>                                                                  |
  | <code>Although Prempro pills were used in the study , the researchers said they had no evidence that other brands were safer .</code>      | <code>Athough Prempro pills were used in the study , the researchers say they have no evidence that other brands are safer .</code>   |
  | <code>D 'Cunha said , from a science standpoint , Toronto 's last case was April 19 , so the all-clear day was actually yesterday .</code> | <code>He said , from a science standpoint , the city 's last case was April 19 , so the all clear day was actually yesterday .</code> |
* Loss: [<code>AdaptiveLayerLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
  ```json
  {
      "loss": "MultipleNegativesSymmetricRankingLoss",
      "n_layers_per_step": -1,
      "last_layer_weight": 0.75,
      "prior_layers_weight": 1.25,
      "kl_div_weight": 0.8,
      "kl_temperature": 0.75
  }
  ```
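
Each training dataset above is paired with its own loss configuration. One plausible way to wire this up (a sketch, not necessarily the exact training script) is the trainer's dict-based API, where both `train_dataset` and `loss` are keyed by dataset name. The example below builds the mrpc_pairs loss from its JSON parameters and uses two rows from the sample table above as a stand-in dataset:

```python
from datasets import Dataset
from sentence_transformers import SentenceTransformer, SentenceTransformerTrainer
from sentence_transformers.losses import (
    AdaptiveLayerLoss,
    MultipleNegativesSymmetricRankingLoss,
)

model = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")  # placeholder

# Tiny stand-in for the prepared mrpc_pairs split (rows taken from the table above).
mrpc_train = Dataset.from_dict({
    "sentence1": [
        "Prosecutors filed a motion informing Lee they intend to seek the death penalty .",
        "Although Prempro pills were used in the study , the researchers said they had no evidence that other brands were safer .",
    ],
    "sentence2": [
        "He added that prosecutors will seek the death penalty .",
        "Athough Prempro pills were used in the study , the researchers say they have no evidence that other brands are safer .",
    ],
})

# MultipleNegativesSymmetricRankingLoss scores in-batch negatives in both
# directions (sentence1 -> sentence2 and sentence2 -> sentence1).
mrpc_loss = AdaptiveLayerLoss(
    model=model,
    loss=MultipleNegativesSymmetricRankingLoss(model),
    n_layers_per_step=-1,
    last_layer_weight=0.75,
    prior_layers_weight=1.25,
    kl_div_weight=0.8,
    kl_temperature=0.75,
)

# The trainer accepts dicts keyed by dataset name, so each dataset gets its
# own loss; in practice there would be one entry per training dataset above.
trainer = SentenceTransformerTrainer(
    model=model,
    train_dataset={"mrpc_pairs": mrpc_train},
    loss={"mrpc_pairs": mrpc_loss},
)
trainer.train()
```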

### Evaluation Datasets

#### nli-pairs

* Dataset: [nli-pairs](https://huggingface.co/datasets/sentence-transformers/all-nli) at [d482672](https://huggingface.co/datasets/sentence-transformers/all-nli/tree/d482672c8e74ce18da116f430137434ba2e52fab)
* Size: 160 evaluation samples
* Columns: <code>anchor</code> and <code>positive</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor                                                                            | positive                                                                         |
  |:--------|:----------------------------------------------------------------------------------|:---------------------------------------------------------------------------------|
  | type    | string                                                                            | string                                                                           |
  | details | <ul><li>min: 5 tokens</li><li>mean: 17.27 tokens</li><li>max: 36 tokens</li></ul> | <ul><li>min: 5 tokens</li><li>mean: 9.57 tokens</li><li>max: 21 tokens</li></ul> |
* Samples:
  | anchor                                                                                                                                                                         | positive                                                    |
  |:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:------------------------------------------------------------|
  | <code>Two women are embracing while holding to go packages.</code>                                                                                                             | <code>Two woman are holding packages.</code>                |
  | <code>Two young children in blue jerseys, one with the number 9 and one with the number 2 are standing on wooden steps in a bathroom and washing their hands in a sink.</code> | <code>Two kids in numbered jerseys wash their hands.</code> |
  | <code>A man selling donuts to a customer during a world exhibition event held in the city of Angeles</code>                                                                    | <code>A man selling donuts to a customer.</code>            |
* Loss: [<code>AdaptiveLayerLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
  ```json
  {
      "loss": "GISTEmbedLoss",
      "n_layers_per_step": -1,
      "last_layer_weight": 1.75,
      "prior_layers_weight": 0.5,
      "kl_div_weight": 1.25,
      "kl_temperature": 0.9
  }
  ```

#### vitaminc-pairs

* Dataset: [vitaminc-pairs](https://huggingface.co/datasets/tals/vitaminc) at [be6febb](https://huggingface.co/datasets/tals/vitaminc/tree/be6febb761b0b2807687e61e0b5282e459df2fa0)
* Size: 133 evaluation samples
* Columns: <code>claim</code> and <code>evidence</code>
* Approximate statistics based on the first 1000 samples:
  |         | claim                                                                             | evidence                                                                           |
  |:--------|:----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------|
  | type    | string                                                                            | string                                                                             |
  | details | <ul><li>min: 9 tokens</li><li>mean: 21.43 tokens</li><li>max: 41 tokens</li></ul> | <ul><li>min: 11 tokens</li><li>mean: 35.38 tokens</li><li>max: 79 tokens</li></ul> |
* Samples:
  | claim                                                                               | evidence                                                                                                                                                                                                                                                                                                                                               |
  |:------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
  | <code>Dragon Con had over 5000 guests .</code>                                      | <code>Among the more than 6000 guests and musical performers at the 2009 convention were such notables as Patrick Stewart , William Shatner , Leonard Nimoy , Terry Gilliam , Bruce Boxleitner , James Marsters , and Mary McDonnell .</code>                                                                                                          |
  | <code>COVID-19 has reached more than 185 countries .</code>                         | <code>As of , more than cases of COVID-19 have been reported in more than 190 countries and 200 territories , resulting in more than deaths .</code>                                                                                                                                                                                                   |
  | <code>In March , Italy had 3.6x times more cases of coronavirus than China .</code> | <code>As of 12 March , among nations with at least one million citizens , Italy has the world 's highest per capita rate of positive coronavirus cases at 206.1 cases per million people ( 3.6x times the rate of China ) and is the country with the second-highest number of positive cases as well as of deaths in the world , after China .</code> |
* Loss: [<code>AdaptiveLayerLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
  ```json
  {
      "loss": "GISTEmbedLoss",
      "n_layers_per_step": -1,
      "last_layer_weight": 1.75,
      "prior_layers_weight": 0.5,
      "kl_div_weight": 1.25,
      "kl_temperature": 0.9
  }
  ```

#### qnli-contrastive

* Dataset: [qnli-contrastive](https://huggingface.co/datasets/nyu-mll/glue) at [bcdcba7](https://huggingface.co/datasets/nyu-mll/glue/tree/bcdcba79d07bc864c1c254ccfcedcce55bcc9a8c)
* Size: 160 evaluation samples
* Columns: <code>sentence1</code>, <code>sentence2</code>, and <code>label</code>
* Approximate statistics based on the first 1000 samples:
  |         | sentence1                                                                         | sentence2                                                                          | label                        |
  |:--------|:----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------|:-----------------------------|
  | type    | string                                                                            | string                                                                             | int                          |
  | details | <ul><li>min: 7 tokens</li><li>mean: 14.57 tokens</li><li>max: 29 tokens</li></ul> | <ul><li>min: 4 tokens</li><li>mean: 37.21 tokens</li><li>max: 115 tokens</li></ul> | <ul><li>0: 100.00%</li></ul> |
* Samples:
  | sentence1                                                                 | sentence2                                                                                                                                        | label          |
  |:--------------------------------------------------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------|:---------------|
  | <code>What came into force after the new constitution was herald?</code>  | <code>As of that day, the new constitution heralding the Second Republic came into force.</code>                                                 | <code>0</code> |
  | <code>What is the first major city in the stream of the Rhine?</code>     | <code>The most important tributaries in this area are the Ill below of Strasbourg, the Neckar in Mannheim and the Main across from Mainz.</code> | <code>0</code> |
  | <code>What is the minimum required if you want to teach in Canada?</code> | <code>In most provinces a second Bachelor's Degree such as a Bachelor of Education is required to become a qualified teacher.</code>             | <code>0</code> |
* Loss: [<code>AdaptiveLayerLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
  ```json
  {
      "loss": "OnlineContrastiveLoss",
      "n_layers_per_step": -1,
      "last_layer_weight": 0.75,
      "prior_layers_weight": 1.25,
      "kl_div_weight": 0.8,
      "kl_temperature": 0.75
  }
  ```
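
qnli-contrastive is the one dataset listed here that carries an integer label column, which `OnlineContrastiveLoss` consumes: it computes the contrastive loss only over hard cases within a batch (negative pairs closer than the farthest positive pair, and positive pairs farther than the closest negative pair). A minimal sketch of this configuration, again with a placeholder model:

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.losses import AdaptiveLayerLoss, OnlineContrastiveLoss

model = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")  # placeholder

# OnlineContrastiveLoss expects (sentence1, sentence2, label) rows with binary
# labels and mines hard positives/negatives; here it is wrapped in the same
# AdaptiveLayerLoss configuration shown in the JSON above.
qnli_loss = AdaptiveLayerLoss(
    model=model,
    loss=OnlineContrastiveLoss(model),
    n_layers_per_step=-1,
    last_layer_weight=0.75,
    prior_layers_weight=1.25,
    kl_div_weight=0.8,
    kl_temperature=0.75,
)
```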

#### scitail-pairs-qa

* Dataset: [scitail-pairs-qa](https://huggingface.co/datasets/allenai/scitail) at [0cc4353](https://huggingface.co/datasets/allenai/scitail/tree/0cc4353235b289165dfde1c7c5d1be983f99ce44)
* Size: 160 evaluation samples
* Columns: <code>sentence2</code> and <code>sentence1</code>
* Approximate statistics based on the first 1000 samples:
  |         | sentence2                                                                         | sentence1                                                                         |
  |:--------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|
  | type    | string                                                                            | string                                                                            |
  | details | <ul><li>min: 7 tokens</li><li>mean: 15.09 tokens</li><li>max: 31 tokens</li></ul> | <ul><li>min: 7 tokens</li><li>mean: 14.38 tokens</li><li>max: 30 tokens</li></ul> |
* Samples:
  | sentence2                                                                                                                             | sentence1                                                                                                                                  |
  |:--------------------------------------------------------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------|
  | <code>Modern members of the reptiles group live in many different habitats and are found on every continent except antarctica.</code> | <code>Modern members of what broad animal group live in many different habitats and are found on every continent except antarctica?</code> |
  | <code>Plants not only contribute food but oxygen for organisms.</code>                                                                | <code>Plants not only contribute food but what else for organisms?</code>                                                                  |
  | <code>The water cycle involves movement of water between air and land.</code>                                                         | <code>The water cycle involves movement of water between air and what?</code>                                                              |
* Loss: [<code>AdaptiveLayerLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
  ```json
  {
      "loss": "GISTEmbedLoss",
      "n_layers_per_step": -1,
      "last_layer_weight": 1.75,
      "prior_layers_weight": 0.5,
      "kl_div_weight": 1.25,
      "kl_temperature": 0.9
  }
  ```

#### scitail-pairs-pos

* Dataset: [scitail-pairs-pos](https://huggingface.co/datasets/allenai/scitail) at [0cc4353](https://huggingface.co/datasets/allenai/scitail/tree/0cc4353235b289165dfde1c7c5d1be983f99ce44)
* Size: 160 evaluation samples
* Columns: <code>sentence1</code> and <code>sentence2</code>
* Approximate statistics based on the first 1000 samples:
  |         | sentence1                                                                         | sentence2                                                                         |
  |:--------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|
  | type    | string                                                                            | string                                                                            |
  | details | <ul><li>min: 7 tokens</li><li>mean: 23.01 tokens</li><li>max: 61 tokens</li></ul> | <ul><li>min: 8 tokens</li><li>mean: 15.46 tokens</li><li>max: 36 tokens</li></ul> |
* Samples:
  | sentence1                                                                                                                         | sentence2                                                                                          |
  |:----------------------------------------------------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------------------|
  | <code>An introduction to atoms and elements, compounds, atomic structure and bonding, the molecule and chemical reactions.</code> | <code>Replace another in a molecule happens to atoms during a substitution reaction.</code>        |
  | <code>Wavelength The distance between two consecutive points on a sinusoidal wave that are in phase;</code>                       | <code>Wavelength is the distance between two corresponding points of adjacent waves called.</code> |
  | <code>humans normally have 23 pairs of chromosomes.</code>                                                                        | <code>Humans typically have 23 pairs pairs of chromosomes.</code>                                  |
* Loss: [<code>AdaptiveLayerLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
  ```json
  {
      "loss": "GISTEmbedLoss",
      "n_layers_per_step": -1,
      "last_layer_weight": 1.75,
      "prior_layers_weight": 0.5,
      "kl_div_weight": 1.25,
      "kl_temperature": 0.9
  }
  ```

#### xsum-pairs

* Dataset: [xsum-pairs](https://huggingface.co/datasets/sentence-transformers/xsum) at [788ddaf](https://huggingface.co/datasets/sentence-transformers/xsum/tree/788ddafe04e539956d56b567bc32a036ee7b9206)
* Size: 160 evaluation samples
* Columns: <code>sentence1</code> and <code>sentence2</code>
* Approximate statistics based on the first 1000 samples:
  |         | sentence1                                                                            | sentence2                                                                          |
  |:--------|:-------------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------|
  | type    | string                                                                               | string                                                                             |
  | details | <ul><li>min: 29 tokens</li><li>mean: 178.81 tokens</li><li>max: 350 tokens</li></ul> | <ul><li>min: 12 tokens</li><li>mean: 25.62 tokens</li><li>max: 50 tokens</li></ul> |
* Samples:
  | sentence1                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                   | sentence2                                                                                                                                  |
  |:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------|
  | <code>Lily the kid became unwell when she was three weeks old and was taken inside to be looked after.<br>Her owner Rebecca Mineards hopes Lily, who has to wear nappies in the house, will soon be able to return to her mother.<br>Meanwhile, she is learning to play with the household pups and even wrestling their toys away from them.</code>                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                        | <code>A sickly baby goat being nursed back to health in Northamptonshire has started behaving like the dogs she shares a home with.</code> |
  | <code>It was spotted at Antrim Grammar School on Tuesday morning.<br>"Several attempts were made to safely and humanely manage the animal," a PSNI spokesperson said.<br>"Regrettably, after consultation with the vet working with officers at the scene, we were required to shoot the stag as it had become very agitated and posed a risk to the public."<br>The office of the Police Ombudsman has been informed.<br>Greg Kayne, chairman of the British Deer Society in Northern Ireland, told the BBC: "The professionals on the ground would have had to make a risk assessment and that risk assessment would have been focused on public safety.<br>"Unpalatable though the outcome was for the deer, it sounds as though they had few if any options other than to do what they actually had to do."<br>School principal Hilary Woods said the incident had upset some pupils.<br>"We had to basically keep the school in lockdown until the situation was resolved, and there were a number of pupils who were obviously very distressed when they heard about the final outcome," she said.<br>"It actually ran past some of the pupils when they were outside.<br>"It could have caused damage and it would have been far worse for me as a principal to deal with, if a child or a member of the public had been injured."<br>Pupil Jordan McKelvey, who is 17, said: "I just saw the deer trapped and a lot of people and hearing the gunshots and it was quite distracting in class.<br>"It was quite sad and distressing to see that and hear it."</code> | <code>A police armed response unit has shot dead a large wild stag on the grounds of a school in County Antrim.</code>                     |
  | <code>Rubble fell down the 30m-high (100ft) cliff in East Cliff on 24 April and partially engulfed the Edwardian funicular carriages.<br>Bournemouth Borough Council said the carriages had been removed but work to stabilise the slope was continuing.<br>Environment boss Larry Austin said "all options" were being looked at for reinstating the lift.<br>He said: "We remain committed to the future of a cliff lift. However, it will be some time before these assessments are made and any necessary stabilisation work commences. The cliff lift will not reopen this summer.<br>"Our next step after the removal of the final debris is a thorough assessment of the site to consider the next course of action."<br>An abseiling team was used to secure the carriages of the East Cliff Lift to the running rails ahead of their removal.<br>A temporary road closure remains in place along East Overcliff Drive while the work is carried out.</code>                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                        | <code>A cliff railway which was damaged by a landslide in Bournemouth will not reopen this summer, a council has said.</code>              |
* Loss: [<code>AdaptiveLayerLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
  ```json
  {
      "loss": "MultipleNegativesSymmetricRankingLoss",
      "n_layers_per_step": -1,
      "last_layer_weight": 0.75,
      "prior_layers_weight": 1.25,
      "kl_div_weight": 0.8,
      "kl_temperature": 0.75
  }
  ```

#### compression-pairs

* Dataset: [compression-pairs](https://huggingface.co/datasets/sentence-transformers/sentence-compression) at [605bc91](https://huggingface.co/datasets/sentence-transformers/sentence-compression/tree/605bc91d95631895ba25b6eda51a3cb596976c90)
* Size: 160 evaluation samples
* Columns: <code>sentence1</code> and <code>sentence2</code>
* Approximate statistics based on the first 1000 samples:
  |         | sentence1                                                                          | sentence2                                                                        |
  |:--------|:-----------------------------------------------------------------------------------|:---------------------------------------------------------------------------------|
  | type    | string                                                                             | string                                                                           |
  | details | <ul><li>min: 15 tokens</li><li>mean: 32.36 tokens</li><li>max: 77 tokens</li></ul> | <ul><li>min: 6 tokens</li><li>mean: 10.5 tokens</li><li>max: 33 tokens</li></ul> |
* Samples:
  | sentence1                                                                                                                                                                                                                                                                                                                                                                                                 | sentence2                                                                       |
  |:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:--------------------------------------------------------------------------------|
  | <code>Harris Corporation, an international communications and information technology company, has acquired key infrastructure assets of the government business of Core180, Inc. Based in Fairfax, Virginia, Core180 is a telecom network integrator that provides wide area networks and management services to government agencies, large enterprises, telecom carriers and systems integrators.</code> | <code>Harris Corporation acquires infrastructure assets from Core180</code>     |
  | <code>Road rage is believed to have led to the deadly shooting of a Shreveport man this afternoon.</code>                                                                                                                                                                                                                                                                                                 | <code>Road rage could be behind deadly shooting</code>                          |
  | <code>Vice President Hamid Ansari will be undertaking a six-day bilateral visit to Turkey starting from Monday that will further strengthen trade and commerce ties between the two countries.</code>                                                                                                                                                                                                     | <code>Hamid Ansari to visit Turkey to strengthen trade and commerce ties</code> |
* Loss: [<code>AdaptiveLayerLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
  ```json
  {
      "loss": "MultipleNegativesSymmetricRankingLoss",
      "n_layers_per_step": -1,
      "last_layer_weight": 0.75,
      "prior_layers_weight": 1.25,
      "kl_div_weight": 0.8,
      "kl_temperature": 0.75
  }
  ```

#### sciq_pairs

* Dataset: [sciq_pairs](https://huggingface.co/datasets/allenai/sciq) at [2c94ad3](https://huggingface.co/datasets/allenai/sciq/tree/2c94ad3e1aafab77146f384e23536f97a4849815)
* Size: 160 evaluation samples
* Columns: <code>sentence1</code> and <code>sentence2</code>
* Approximate statistics based on the first 1000 samples:
  |         | sentence1                                                                         | sentence2                                                                          |
  |:--------|:----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------|
  | type    | string                                                                            | string                                                                             |
  | details | <ul><li>min: 9 tokens</li><li>mean: 16.81 tokens</li><li>max: 37 tokens</li></ul> | <ul><li>min: 2 tokens</li><li>mean: 76.27 tokens</li><li>max: 424 tokens</li></ul> |
* Samples:
  | sentence1                                                                                                            | sentence2                                                                                                                                                                                                                                                                                                                                                                                                                                                                     |
  |:---------------------------------------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
  | <code>Transport epithelia that function in maintaining water balance also often function in disposal of what?</code> | <code></code>                                                                                                                                                                                                                                                                                                                                                                                                                                                                 |
  | <code>What is the 'stuff' that all things are made of?</code>                                                        | <code>Living things are made of matter. In fact, matter is the “stuff” of which all things are made. Anything that occupies space and has mass is known as matter. Matter, in turn, consists of chemical substances. A chemical substance is a material that has a definite chemical composition. It is also homogeneous, so the same chemical composition is found uniformly throughout the substance. A chemical substance may be an element or a chemical compound.</code> |
  | <code>What is transmitted that makes up the electromagnetic spectrum?</code>                                         | <code>Electromagnetic radiation is energy transmitted as waves with different wavelengths. This makes up the electromagnetic spectrum.</code>                                                                                                                                                                                                                                                                                                                                 |
* Loss: [<code>AdaptiveLayerLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
  ```json
  {
      "loss": "GISTEmbedLoss",
      "n_layers_per_step": -1,
      "last_layer_weight": 1.75,
      "prior_layers_weight": 0.5,
      "kl_div_weight": 1.25,
      "kl_temperature": 0.9
  }
  ```
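
Here the inner loss is `GISTEmbedLoss`, which additionally needs a guide model to filter out in-batch negatives that the guide scores as too similar to the positive. A minimal sketch, with both checkpoint names as placeholders:

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.losses import AdaptiveLayerLoss, GISTEmbedLoss

model = SentenceTransformer("microsoft/deberta-v3-small")              # placeholder
guide = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")  # placeholder

loss = AdaptiveLayerLoss(
    model=model,
    loss=GISTEmbedLoss(model, guide),
    n_layers_per_step=-1,
    last_layer_weight=1.75,     # heavier final-layer weight than the MNSRL splits
    prior_layers_weight=0.5,
    kl_div_weight=1.25,
    kl_temperature=0.9,
)
```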

#### qasc_pairs

* Dataset: [qasc_pairs](https://huggingface.co/datasets/allenai/qasc) at [a34ba20](https://huggingface.co/datasets/allenai/qasc/tree/a34ba204eb9a33b919c10cc08f4f1c8dae5ec070)
* Size: 160 evaluation samples
* Columns: <code>sentence1</code> and <code>sentence2</code>
* Approximate statistics based on the first 1000 samples:
  |         | sentence1                                                                         | sentence2                                                                         |
  |:--------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|
  | type    | string                                                                            | string                                                                            |
  | details | <ul><li>min: 5 tokens</li><li>mean: 11.19 tokens</li><li>max: 22 tokens</li></ul> | <ul><li>min: 20 tokens</li><li>mean: 34.3 tokens</li><li>max: 59 tokens</li></ul> |
* Samples:
  | sentence1                                                                  | sentence2                                                                                                                                                                                                                        |
  |:---------------------------------------------------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
  | <code>what uses unsaturated fatty acids to store energy?</code>            | <code>Plants use unsaturated fatty acids to store energy.. All plants are of bush type.. bushes use unsaturated fatty acids to store energy</code>                                                                               |
  | <code>What happens when faults move?</code>                                | <code>faulting of rock in Earth 's crust causes earthquakes. Faults, cracks in the Earth's crust, produce earthquakes when they move or slip.. Earthquakes happen when faults move. </code>                                      |
  | <code>ats and proteins can be used for energy by the cells of what?</code> | <code>Glucose is used for energy by the cells of most organisms.. After hours of no glucose ingestion, fats and proteins can be used for energy.. fats and proteins can be used for energy by the cells of most organisms</code> |
* Loss: [<code>AdaptiveLayerLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
  ```json
  {
      "loss": "GISTEmbedLoss",
      "n_layers_per_step": -1,
      "last_layer_weight": 1.75,
      "prior_layers_weight": 0.5,
      "kl_div_weight": 1.25,
      "kl_temperature": 0.9
  }
  ```

#### qasc_facts_sym

* Dataset: [qasc_facts_sym](https://huggingface.co/datasets/allenai/qasc) at [a34ba20](https://huggingface.co/datasets/allenai/qasc/tree/a34ba204eb9a33b919c10cc08f4f1c8dae5ec070)
* Size: 160 evaluation samples
* Columns: <code>combinedfact</code> and <code>facts</code>
* Approximate statistics based on the first 1000 samples:
  |         | combinedfact                                                                      | facts                                                                              |
  |:--------|:----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------|
  | type    | string                                                                            | string                                                                             |
  | details | <ul><li>min: 5 tokens</li><li>mean: 11.56 tokens</li><li>max: 21 tokens</li></ul> | <ul><li>min: 14 tokens</li><li>mean: 24.51 tokens</li><li>max: 44 tokens</li></ul> |
* Samples:
  | combinedfact                                                                           | facts                                                                                                                          |
  |:---------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------------------------------------------------|
  | <code>plasma ionizes metal</code>                                                      | <code>plasma is formed by electrons separating from atoms in stars. Metal atoms are ionized in an intense plasma..</code>      |
  | <code>Sharks use gills to breathe</code>                                               | <code>breathing is when a gill converts from oxygen in water into oxygen in blood. Sharks breathe under water..</code>         |
  | <code>Gases released during the use of fossil fuels threaten the entire planet.</code> | <code>gases released during the use of fossil fuels causes global warming. Global warming threatens the entire planet..</code> |
* Loss: [<code>AdaptiveLayerLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
  ```json
  {
      "loss": "MultipleNegativesSymmetricRankingLoss",
      "n_layers_per_step": -1,
      "last_layer_weight": 0.75,
      "prior_layers_weight": 1.25,
      "kl_div_weight": 0.8,
      "kl_temperature": 0.75
  }
  ```

#### openbookqa_pairs

* Dataset: openbookqa_pairs
* Size: 160 evaluation samples
* Columns: <code>question</code> and <code>fact</code>
* Approximate statistics based on the first 1000 samples:
  |         | question                                                                          | fact                                                                              |
  |:--------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|
  | type    | string                                                                            | string                                                                            |
  | details | <ul><li>min: 3 tokens</li><li>mean: 13.64 tokens</li><li>max: 45 tokens</li></ul> | <ul><li>min: 4 tokens</li><li>mean: 11.46 tokens</li><li>max: 28 tokens</li></ul> |
* Samples:
  | question                                                               | fact                                                                         |
  |:-----------------------------------------------------------------------|:-----------------------------------------------------------------------------|
  | <code>The thermal production of a stove is generically used for</code> | <code>a stove generates heat for cooking usually</code>                      |
  | <code>What creates a valley?</code>                                    | <code>a valley is formed by a river flowing</code>                           |
  | <code>when it turns day and night on a planet, what cause this?</code> | <code>a planet rotating causes cycles of day and night on that planet</code> |
* Loss: [<code>AdaptiveLayerLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
  ```json
  {
      "loss": "GISTEmbedLoss",
      "n_layers_per_step": -1,
      "last_layer_weight": 1.75,
      "prior_layers_weight": 0.5,
      "kl_div_weight": 1.25,
      "kl_temperature": 0.9
  }
  ```

#### msmarco_pairs

* Dataset: [msmarco_pairs](https://huggingface.co/datasets/sentence-transformers/msmarco-msmarco-distilbert-base-v3) at [28ff31e](https://huggingface.co/datasets/sentence-transformers/msmarco-msmarco-distilbert-base-v3/tree/28ff31e4c97cddd53d298497f766e653f1e666f9)
* Size: 160 evaluation samples
* Columns: <code>sentence1</code> and <code>sentence2</code>
* Approximate statistics based on the first 1000 samples:
  |         | sentence1                                                                        | sentence2                                                                           |
  |:--------|:---------------------------------------------------------------------------------|:------------------------------------------------------------------------------------|
  | type    | string                                                                           | string                                                                              |
  | details | <ul><li>min: 4 tokens</li><li>mean: 8.42 tokens</li><li>max: 17 tokens</li></ul> | <ul><li>min: 16 tokens</li><li>mean: 74.97 tokens</li><li>max: 183 tokens</li></ul> |
* Samples:
  | sentence1                              | sentence2                                                                                                                                                                                                                                        |
  |:---------------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
  | <code>who is jesse james decker</code> | <code>Jessie James Decker. Singer/songwriter/designer Kittenish/fave4 For inquiries beautybloggerjjd@gmail.com www.kittenish.com smarturl.it/LDL.</code>                                                                                         |
  | <code>what sick day means</code>       | <code>sick day noun [C]. › a day for which an ​employee ​receives ​pay while ​absent from ​work because of ​illness. (Definition of sick day from the Cambridge Academic Content Dictionary © Cambridge University Press).</code> |
  | <code>what is pressors</code>          | <code>Antihypotensive agent. An antihypotensive agent, also known as a vasopressor agent or pressor, is any medication that tends to raise reduced blood pressure.</code>                                                                        |
* Loss: [<code>AdaptiveLayerLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
  ```json
  {
      "loss": "GISTEmbedLoss",
      "n_layers_per_step": -1,
      "last_layer_weight": 1.75,
      "prior_layers_weight": 0.5,
      "kl_div_weight": 1.25,
      "kl_temperature": 0.9
  }
  ```

#### nq_pairs

* Dataset: [nq_pairs](https://huggingface.co/datasets/sentence-transformers/natural-questions) at [f9e894e](https://huggingface.co/datasets/sentence-transformers/natural-questions/tree/f9e894e1081e206e577b4eaa9ee6de2b06ae6f17)
* Size: 160 evaluation samples
* Columns: <code>sentence1</code> and <code>sentence2</code>
* Approximate statistics based on the first 1000 samples:
  |         | sentence1                                                                          | sentence2                                                                            |
  |:--------|:-----------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|
  | type    | string                                                                             | string                                                                               |
  | details | <ul><li>min: 10 tokens</li><li>mean: 12.07 tokens</li><li>max: 20 tokens</li></ul> | <ul><li>min: 19 tokens</li><li>mean: 126.39 tokens</li><li>max: 512 tokens</li></ul> |
* Samples:
  | sentence1                                                                   | sentence2                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                     |
  |:----------------------------------------------------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
  | <code>what tsar was overthrown in the russian revolution</code>             | <code>Russian Revolution The Russian Revolution was a pair of revolutions in Russia in 1917 which dismantled the Tsarist autocracy and led to the rise of the Soviet Union. The Russian Empire collapsed with the abdication of Emperor Nicholas II and the old regime was replaced by a provisional government during the first revolution of February 1917 (March in the Gregorian calendar; the older Julian calendar was in use in Russia at the time). Alongside it arose grassroots community assemblies (called 'soviets') which contended for authority. In the second revolution that October, the Provisional Government was toppled and all power was given to the soviets.</code> |
  | <code>who was the longest reigning monarch in the history of england</code> | <code>List of monarchs in Britain by length of reign Queen Elizabeth II became the longest-reigning British monarch on 9 September 2015 when she surpassed the reign of her great-great-grandmother Victoria.[1][2] On 6 February 2017 she became the first British monarch to celebrate a sapphire jubilee, commemorating 65 years on the throne.</code>                                                                                                                                                                                                                                                                                                                                     |
  | <code>where does isle of man tt take place</code>                           | <code>Isle of Man TT The International Isle of Man TT (Tourist Trophy) Race is an annual motorcycle sport event run on the Isle of Man in May or June of most years since its inaugural race in 1907.[3]</code>                                                                                                                                                                                                                                                                                                                                                                                                                                                                               |
* Loss: [<code>AdaptiveLayerLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
  ```json
  {
      "loss": "GISTEmbedLoss",
      "n_layers_per_step": -1,
      "last_layer_weight": 1.75,
      "prior_layers_weight": 0.5,
      "kl_div_weight": 1.25,
      "kl_temperature": 0.9
  }
  ```

#### trivia_pairs

* Dataset: [trivia_pairs](https://huggingface.co/datasets/sentence-transformers/trivia-qa) at [a7c36e3](https://huggingface.co/datasets/sentence-transformers/trivia-qa/tree/a7c36e3c8c8c01526bc094d79bf80d4c848b0ad0)
* Size: 160 evaluation samples
* Columns: <code>sentence1</code> and <code>sentence2</code>
* Approximate statistics based on the first 1000 samples:
  |         | sentence1                                                                         | sentence2                                                                             |
  |:--------|:----------------------------------------------------------------------------------|:--------------------------------------------------------------------------------------|
  | type    | string                                                                            | string                                                                                |
  | details | <ul><li>min: 9 tokens</li><li>mean: 16.94 tokens</li><li>max: 35 tokens</li></ul> | <ul><li>min: 140 tokens</li><li>mean: 463.59 tokens</li><li>max: 512 tokens</li></ul> |
* Samples:
  | sentence1                                                                                          | sentence2                                                                                                                                                                                                                        |
  |:----------------------------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
  | <code>Mr Kitty is the pet cat of which character in the US animated tv series ‘South Park’?</code> | <code>South Park Other Recurring Characters / Characters - TV Tropes Supporting Leader : During the "Imaginationland" trilogy. Took a Level in Badass : He was originally quite meek. Then, starting in "Red Sleigh Down", Jesus kicks all sorts of ass.     Mayor MacDaniels  "My geologist? Now? Tell him the infection is fine and I don't need another check-up." Voiced by: Mary Kay Bergman (1997-1999), Eliza Schneider (1999-2003), April Stewart (2004-present) Debut: "Weight Gain 4000" Corrupt and idiotic mayor of South Park. Bi the Way : She's implied to be in a relationship with Officer Barbrady and slept with Chef, but she's also slept with Mrs. Cartman and was shown looking at a porn magazine of busty women in one episode. Characterization Marches On : Originally she was a lot stupider than most of the other South Park adults, but as they got dumber she's generally become smarter; modern episodes tend to vary on whether she's just as dumb as them or the Only Sane Adult in town. Jerkass : She considers everyone in town a bunch of hillbillies and isn't above trying to convince Barbrady to shoot kids. Took a Level in Kindness : From "Red Hot Catholic Love" and on, he's a Good Shepherd instead of a Fundamentalist .     Officer Barbrady  "Okay, people, move along. Nothing to see here." Voiced by: Trey Parker Debut: "Cartman Gets an Anal Probe" The town cop. Incredibly stupid and unhelpful, at least in the earlier seasons. His squad car says "To Patronise And Annoy". Coinciding with decreased prominence, it seems that he has gotten more competent over the years. A Day in the Limelight : "Naughty Ninjas" gave him more focus and development than he'd received in years. Anti-Hero : However his heart is in the right place. Bad Cop/Incompetent Cop : Incompetent; the mayor can make him believe the people he left to die in jail never came despite the bodies being in front of him. He apparently killed the chicken lover just to show Cartman how you hit someone with a nightstick. Man Child : When he returns to class so he can learn to read he reverts quickly to a child. The Only Believer : The only member of the South Park Police Department who actually cares about justice. Out of Focus : In later seasons when South Park gets an actual police force. Police Are Useless : Sometimes is unable to deal with the problems in the town. Token Good Teammate : He might be an incompetent moron, but Barbrady is often portrayed as the only member of South Park's police force who is neither corrupt nor racist. "Naughty Ninjas" seems to imply he's the only member of the force who joined to protect the town rather than to beat up minorities. Took a Level in Badass : Grew more competent in later seasons, helping to arrest a pharmacist for selling cough syrup to minors in "Quest for Ratings", warned people about the giant hamsters attacking in "Pandemic", attempted to stop the drunken party in "About Last Night" (here he fails because the partiers tip his car over), and in "Medicinal Fried Chicken", helps to take down the illegal KFC cartel, surviving the shootout between the cops and Colonel Sanders' enforcers. He also saves Jimmy's life from a psychotic GEICO representative in "Sponsored Content". Trauma Conga Line : The entirety of "Naughty Ninjas", with him losing his job, being unable to provide for his sick and elderly dog, and ending up on the street and homeless. Vetinari Job Security : He had this in the first few seasons, when he was the only police officer in South Park.     Sergeant Harrison Yates  " JESUS CHRIST MONKEY BALLS!!! We could have made an innocent man go to jail who wasn't black!" Voiced by: Trey Parker Debut: "Christian Rock Hard" An incompetent and corrupt police officer who hates rich black people. Works under the Park County Police Department. Anti-Hero : Beating minorities is part of the policework and why the whole precinct save Barbrady joined in. But he tries getting the police job done. Bad Cop/Incompetent Cop : Abuses minorities, arrests someone for calling for a different type of cock magic and takes hours to figure out an obvious</code> |
  | <code>Which Roman God is one of the symbols of St Valentine's Day?</code>                          | <code>The Truth Behind St. Valentine’s Day   Home   Library   Articles     Man’s Holidays   The Truth Behind St. Valentine’s Day St. Valentine’s Day is the world’s “holiday of love.” Since the Bible states that God is love ( I John 4:8 , 16 ), does He approve of the celebration of this day? Does He want His people—true Christians—partaking of the candy and cards, or any customs associated with this day? Save Subscribe When God says He wants you to live life abundantly ( John 10:10 ), does that include celebrating a festive, seemingly harmless holiday like Valentine’s Day? The God who gives us everything—life, food, drink, the ability to think for ourselves, etc.—surely approves of St. Valentine’s Day, the holiday for lovers to exchange gifts—right? Do not be so certain. Do not assume anything. Do not even take this article’s word for it. Go to history books and encyclopedias. Go to the Bible. Then you will know the real truth behind St. Valentine’s Day. And you will know what God expects you to do about it! Valentine’s Past Like Christmas, Easter, Halloween, New Year’s and other holidays of this world, St. Valentine’s Day is another attempt to “whitewash” perverted customs and observances of pagan gods and idols by “Christianizing” them. As innocent and harmless as St. Valentine’s Day may appear, its traditions and customs originate from two of the most sexually perverted pagan festivals of ancient history: Lupercalia and the feast day of Juno Februata. Celebrated on February 15, Lupercalia (known as the “festival of sexual license”) was held by the ancient Romans in honor of Lupercus, god of fertility and husbandry, protector of herds and crops, and a mighty hunter—especially of wolves. The Romans believed that Lupercus would protect Rome from roving bands of wolves, which devoured livestock and people. Assisted by Vestal Virgins, the Luperci (male priests) conducted purification rites by sacrificing goats and a dog in the Lupercal cave on Palatine Hill, where the Romans believed the twins Romulus and Remus had been sheltered and nursed by a she-wolf before they eventually founded Rome. Clothed in loincloths made from sacrificed goats and smeared in their blood, the Luperci would run about Rome, striking women with februa, thongs made from skins of the sacrificed goats. The Luperci believed that the floggings purified women and guaranteed their fertility and ease of childbirth. February derives from februa or “means of purification.” To the Romans, February was also sacred to Juno Februata, the goddess of febris (“fever”) of love, and of women and marriage. On February 14, billets (small pieces of paper, each of which had the name of a teen-aged girl written on it) were put into a container. Teen-aged boys would then choose one billet at random. The boy and the girl whose name was drawn would become a “couple,” joining in erotic games at feasts and parties celebrated throughout Rome. After the festival, they would remain sexual partners for the rest of the year. This custom was observed in the Roman Empire for centuries. Whitewashing Perversion In A.D. 494, Pope Gelasius renamed the festival of Juno Februata as the “Feast of the Purification of the Virgin Mary.” The date of its observance was later changed from February 14 to February 2, then changed back to the 14. It is also known as Candlemas, the Presentation of the Lord, the Purification of the Blessed Virgin and the Feast of the Presentation of Christ in the Temple. After Constantine had made the Roman church’s brand of Christianity the official religion of the Roman Empire (A.D. 325), church leaders wanted to do away with the pagan festivals of the people. Lupercalia was high on their list. But the Roman citizens thought otherwise. It was not until A.D. 496 that the church at Rome was able to do anything about Lupercalia. Powerless to get rid of it, Pope Gelasius instead changed it from February 15 to the 14th and called it St. Valentine’s Day. It was named after one of that church’s saints, who, in A.D. 270, was executed by the emperor for his beliefs. According to th</code> |
  | <code>46664 was the prison number of which famous political figure?</code>                         | <code>Nelson Mandela Fast Facts - CNN.com Nelson Mandela Fast Facts CNN Library Updated 8:13 PM ET, Tue September 22, 2015 Chat with us in Facebook Messenger. Find out what's happening in the world as it unfolds. Photos: The evolution of Nelson Mandela The evolution of Nelson Mandela – Nelson Mandela, the prisoner-turned-president who reconciled South Africa after the end of apartheid, died on December 5, 2013. He was 95. Hide Caption 1 of 31 Photos: The evolution of Nelson Mandela The evolution of Nelson Mandela – Mandela became president of the African National Congress Youth League in 1951. Hide Caption Photos: The evolution of Nelson Mandela The evolution of Nelson Mandela – Mandela poses for a photo, circa 1950. Hide Caption Photos: The evolution of Nelson Mandela The evolution of Nelson Mandela – Mandela poses in boxing gloves in 1952. Hide Caption 4 of 31 Photos: The evolution of Nelson Mandela The evolution of Nelson Mandela – Mandela in the office of Mandela & Tambo, a law practice set up in Johannesburg by Mandela and Oliver Tambo to provide free or affordable legal representation to black South Africans. Hide Caption 5 of 31 Photos: The evolution of Nelson Mandela The evolution of Nelson Mandela – From left: Patrick Molaoa, Robert Resha and Mandela walk to the courtroom for their treason trial in Johannesburg. Hide Caption 6 of 31 Photos: The evolution of Nelson Mandela The evolution of Nelson Mandela – Mandela married his second wife, social worker Winnie Madikizela, in 1958. At the time, he was an active member of the African National Congress and had begun his lifelong commitment to ending segregation in South Africa. Hide Caption 7 of 31 Photos: The evolution of Nelson Mandela The evolution of Nelson Mandela – Nelson and Winnie Mandela raise their fists to salute a cheering crowd upon his 1990 release from Victor Verster Prison. He was still as upright and proud, he would say, as the day he walked into prison 27 years before. Hide Caption 8 of 31 Photos: The evolution of Nelson Mandela The evolution of Nelson Mandela – A jubilant South African holds up a newspaper announcing Mandela's release from prison at an ANC rally in Soweto on February 11, 1990. Two days later, more than 100,000 people attended a rally celebrating his release from jail. Hide Caption 9 of 31 Photos: The evolution of Nelson Mandela The evolution of Nelson Mandela – Mandela and Zambian President Kenneth Kaunda arrive at an ANC rally on March 3, 1990, in Lusaka, Zambia. Mandela was elected president of the ANC the next year. Hide Caption 10 of 31 Photos: The evolution of Nelson Mandela The evolution of Nelson Mandela – After his release in 1990, Mandela embarked on a world tour, meeting U.S. President George H.W. Bush at the White House in June. Hide Caption 11 of 31 Photos: The evolution of Nelson Mandela The evolution of Nelson Mandela – At his Soweto home on July 18, 1990, Mandela blows out the candles on his 72nd birthday cake. It was the first birthday he celebrated as a free man since the 1960s. Hide Caption 12 of 31 Photos: The evolution of Nelson Mandela The evolution of Nelson Mandela – Mandela and his wife react to supporters during a visit to Brazil at the governor's palace in Rio De Janeiro, on August 1, 1991. Hide Caption 13 of 31 Photos: The evolution of Nelson Mandela The evolution of Nelson Mandela – South African President Frederik de Klerk, right, and Mandela shared a Nobel Peace Prize in 1993 for their work to secure a peaceful transition from apartheid rule. Hide Caption 14 of 31 Photos: The evolution of Nelson Mandela The evolution of Nelson Mandela – Mandela votes for the first time in his life on March 26, 1994. Hide Caption 15 of 31 Photos: The evolution of Nelson Mandela The evolution of Nelson Mandela – On April 27, 1994, a long line of people snake toward a polling station in the black township of Soweto outside of Johannesburg in the nation's first all-race elections. Hide Caption 16 of 31 Photos: The evolution of Nelson Mandela The evolution of Nelson Mandela – Mandela in Mmabatho for an election rally on March 15,</code> |
* Loss: [<code>AdaptiveLayerLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
  ```json
  {
      "loss": "GISTEmbedLoss",
      "n_layers_per_step": -1,
      "last_layer_weight": 1.75,
      "prior_layers_weight": 0.5,
      "kl_div_weight": 1.25,
      "kl_temperature": 0.9
  }
  ```

#### quora_pairs

* Dataset: [quora_pairs](https://huggingface.co/datasets/sentence-transformers/quora-duplicates) at [451a485](https://huggingface.co/datasets/sentence-transformers/quora-duplicates/tree/451a4850bd141edb44ade1b5828c259abd762cdb)
* Size: 675 evaluation samples
* Columns: <code>sentence1</code> and <code>sentence2</code>
* Approximate statistics based on the first 1000 samples:
  |         | sentence1                                                                         | sentence2                                                                         |
  |:--------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|
  | type    | string                                                                            | string                                                                            |
  | details | <ul><li>min: 6 tokens</li><li>mean: 13.57 tokens</li><li>max: 44 tokens</li></ul> | <ul><li>min: 6 tokens</li><li>mean: 13.43 tokens</li><li>max: 44 tokens</li></ul> |
* Samples:
  | sentence1                                                           | sentence2                                                     |
  |:--------------------------------------------------------------------|:--------------------------------------------------------------|
  | <code>Does a transition matrix have to be square?</code>            | <code>Does a transition matrix have to be square? Why?</code> |
  | <code>What is Triple Talaaq?</code>                                 | <code>How does triple talaq work?</code>                      |
  | <code>What does it mean when your period is three days late?</code> | <code>My period is 5 days late, what do I do?</code>          |
* Loss: [<code>AdaptiveLayerLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
  ```json
  {
      "loss": "GISTEmbedLoss",
      "n_layers_per_step": 3,
      "last_layer_weight": 0.15,
      "prior_layers_weight": 2.5,
      "kl_div_weight": 0.75,
      "kl_temperature": 0.5
  }
  ```
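
quora_pairs is the one split with a different `AdaptiveLayerLoss` schedule: instead of touching every layer each step it samples 3 layers per step, and it shifts most of the weight from the final layer onto earlier ones. A sketch of that variant, again with placeholder checkpoint names:

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.losses import AdaptiveLayerLoss, GISTEmbedLoss

model = SentenceTransformer("microsoft/deberta-v3-small")              # placeholder
guide = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")  # placeholder

quora_loss = AdaptiveLayerLoss(
    model=model,
    loss=GISTEmbedLoss(model, guide),
    n_layers_per_step=3,        # sample 3 layers per step rather than all of them
    last_layer_weight=0.15,     # final layer contributes comparatively little here
    prior_layers_weight=2.5,
    kl_div_weight=0.75,
    kl_temperature=0.5,
)
```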

#### gooaq_pairs

* Dataset: [gooaq_pairs](https://huggingface.co/datasets/sentence-transformers/gooaq) at [b089f72](https://huggingface.co/datasets/sentence-transformers/gooaq/tree/b089f728748a068b7bc5234e5bcf5b25e3c8279c)
* Size: 160 evaluation samples
* Columns: <code>sentence1</code> and <code>sentence2</code>
* Approximate statistics based on the first 1000 samples:
  |         | sentence1                                                                         | sentence2                                                                           |
  |:--------|:----------------------------------------------------------------------------------|:------------------------------------------------------------------------------------|
  | type    | string                                                                            | string                                                                              |
  | details | <ul><li>min: 8 tokens</li><li>mean: 11.18 tokens</li><li>max: 21 tokens</li></ul> | <ul><li>min: 14 tokens</li><li>mean: 58.74 tokens</li><li>max: 124 tokens</li></ul> |
* Samples:
  | sentence1                                                           | sentence2                                                                                                                                                                                                                                                                                                                                 |
  |:--------------------------------------------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
  | <code>how to connect your airpods to your playstation 4?</code>     | <code>['Turn on your PS4.', 'Plug the bluetooth adapter into the USB port at the front of your PS4.', 'Put the adapter into pairing mode — how you do this will depend on what type of adapter you have. ... ', 'Press and hold the pairing button on the back of your AirPod charging case to put it into pairing mode as well.']</code> |
  | <code>what is the difference between quickbooks iif and qbo?</code> | <code>QuickBooks accounting software is offered as QuickBooks Desktop (QBD) or QuickBooks Online. ... QBO and IIF format are different: QBO (Web Connect) is to import bank transactions, and IIF is more 'low level' import allowing to create various transactions between QuickBooks accounts.</code>                                  |
  | <code>can you see who viewed your whatsapp status?</code>           | <code>A view counter is placed at the bottom of your screen, showing you how many people have watched or looked at your status. You can swipe up on the screen to view a list of contact names who have viewed your Status.</code>                                                                                                        |
* Loss: [<code>AdaptiveLayerLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
  ```json
  {
      "loss": "GISTEmbedLoss",
      "n_layers_per_step": -1,
      "last_layer_weight": 1.75,
      "prior_layers_weight": 0.5,
      "kl_div_weight": 1.25,
      "kl_temperature": 0.9
  }
  ```

#### mrpc_pairs

* Dataset: [mrpc_pairs](https://huggingface.co/datasets/nyu-mll/glue) at [bcdcba7](https://huggingface.co/datasets/nyu-mll/glue/tree/bcdcba79d07bc864c1c254ccfcedcce55bcc9a8c)
* Size: 160 evaluation samples
* Columns: <code>sentence1</code> and <code>sentence2</code>
* Approximate statistics based on the first 1000 samples:
  |         | sentence1                                                                          | sentence2                                                                          |
  |:--------|:-----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------|
  | type    | string                                                                             | string                                                                             |
  | details | <ul><li>min: 11 tokens</li><li>mean: 26.75 tokens</li><li>max: 42 tokens</li></ul> | <ul><li>min: 12 tokens</li><li>mean: 26.34 tokens</li><li>max: 41 tokens</li></ul> |
* Samples:
  | sentence1                                                                                                                                                                                 | sentence2                                                                                                                                            |
  |:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------|
  | <code>Mr Annan also warned the US should not use the war on terror as an excuse to suppress " long-cherished freedoms " .</code>                                                          | <code>Annan warned that the dangers of extremism after September 11 should not be used as an excuse to suppress " long-cherished " freedoms .</code> |
  | <code>Citigroup Inc . C.N , the world 's largest financial services company , on Wednesday promoted Marjorie Magner to chairman and chief executive of its global consumer group .</code> | <code>Citigroup ( C ) on Wednesday named Marjorie Magner chairman and chief executive of its colossal global consumer business .</code>              |
  | <code>They were among about 40 people attending the traditional Jewish ceremony colored by some non-traditional touches .</code>                                                          | <code>He said about 40 people attended the traditional Jewish ceremony colored by some nontraditional touches .</code>                               |
* Loss: [<code>AdaptiveLayerLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
  ```json
  {
      "loss": "MultipleNegativesSymmetricRankingLoss",
      "n_layers_per_step": -1,
      "last_layer_weight": 0.75,
      "prior_layers_weight": 1.25,
      "kl_div_weight": 0.8,
      "kl_temperature": 0.75
  }
  ```

### Training Hyperparameters
#### Non-Default Hyperparameters

- `eval_strategy`: steps
- `per_device_train_batch_size`: 32
- `per_device_eval_batch_size`: 32
- `learning_rate`: 3e-05
- `weight_decay`: 0.0001
- `num_train_epochs`: 5
- `lr_scheduler_type`: cosine_with_restarts
- `lr_scheduler_kwargs`: {'num_cycles': 3}
- `warmup_ratio`: 0.2
- `save_safetensors`: False
- `fp16`: True
- `push_to_hub`: True
- `hub_model_id`: bobox/DeBERTa-ST-AllLayers-v3.1bis-checkpoints-tmp
- `hub_strategy`: all_checkpoints
- `batch_sampler`: no_duplicates
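
These values map one-to-one onto `SentenceTransformerTrainingArguments`. The sketch below reproduces them; `output_dir` is an assumption, since the card does not record it:

```python
from sentence_transformers.training_args import (
    BatchSamplers,
    SentenceTransformerTrainingArguments,
)

args = SentenceTransformerTrainingArguments(
    output_dir="output",        # assumption: not recorded in this card
    eval_strategy="steps",
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    learning_rate=3e-5,
    weight_decay=1e-4,
    num_train_epochs=5,
    lr_scheduler_type="cosine_with_restarts",
    lr_scheduler_kwargs={"num_cycles": 3},
    warmup_ratio=0.2,
    save_safetensors=False,
    fp16=True,
    push_to_hub=True,
    hub_model_id="bobox/DeBERTa-ST-AllLayers-v3.1bis-checkpoints-tmp",
    hub_strategy="all_checkpoints",
    batch_sampler=BatchSamplers.NO_DUPLICATES,
)
```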

#### All Hyperparameters
<details><summary>Click to expand</summary>

- `overwrite_output_dir`: False
- `do_predict`: False
- `eval_strategy`: steps
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 32
- `per_device_eval_batch_size`: 32
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 1
- `eval_accumulation_steps`: None
- `learning_rate`: 3e-05
- `weight_decay`: 0.0001
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 1.0
- `num_train_epochs`: 5
- `max_steps`: -1
- `lr_scheduler_type`: cosine_with_restarts
- `lr_scheduler_kwargs`: {'num_cycles': 3}
- `warmup_ratio`: 0.2
- `warmup_steps`: 0
- `log_level`: passive
- `log_level_replica`: warning
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `save_safetensors`: False
- `save_on_each_node`: False
- `save_only_model`: False
- `restore_callback_states_from_checkpoint`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 42
- `data_seed`: None
- `jit_mode_eval`: False
- `use_ipex`: False
- `bf16`: False
- `fp16`: True
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: None
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: False
- `dataloader_num_workers`: 0
- `dataloader_prefetch_factor`: None
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: False
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `ddp_find_unused_parameters`: None
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: True
- `resume_from_checkpoint`: None
- `hub_model_id`: bobox/DeBERTa-ST-AllLayers-v3.1bis-checkpoints-tmp
- `hub_strategy`: all_checkpoints
- `hub_private_repo`: False
- `hub_always_push`: False
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`: 
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `dispatch_batches`: None
- `split_batches`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `eval_on_start`: False
- `batch_sampler`: no_duplicates
- `multi_dataset_batch_sampler`: proportional

</details>
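
With per-dataset losses like those listed above, the usual wiring in sentence-transformers is a dict of datasets and a matching dict of losses handed to `SentenceTransformerTrainer`; `multi_dataset_batch_sampler: proportional` then draws batches from each dataset in proportion to its size. A minimal two-dataset sketch under those assumptions (checkpoint names are placeholders, and any column renaming the raw datasets may need is omitted):

```python
from datasets import load_dataset
from sentence_transformers import SentenceTransformer, SentenceTransformerTrainer
from sentence_transformers.losses import (
    AdaptiveLayerLoss,
    GISTEmbedLoss,
    MultipleNegativesSymmetricRankingLoss,
)

model = SentenceTransformer("microsoft/deberta-v3-small")              # placeholder
guide = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")  # placeholder

# Keys of the dataset dict and the loss dict must match.
train_dataset = {
    "gooaq_pairs": load_dataset("sentence-transformers/gooaq", split="train"),
    "compression-pairs": load_dataset("sentence-transformers/sentence-compression", split="train"),
}
loss = {
    "gooaq_pairs": AdaptiveLayerLoss(
        model, GISTEmbedLoss(model, guide),
        n_layers_per_step=-1, last_layer_weight=1.75,
        prior_layers_weight=0.5, kl_div_weight=1.25, kl_temperature=0.9,
    ),
    "compression-pairs": AdaptiveLayerLoss(
        model, MultipleNegativesSymmetricRankingLoss(model),
        n_layers_per_step=-1, last_layer_weight=0.75,
        prior_layers_weight=1.25, kl_div_weight=0.8, kl_temperature=0.75,
    ),
}

trainer = SentenceTransformerTrainer(model=model, train_dataset=train_dataset, loss=loss)
trainer.train()
```

In the full run, the `SentenceTransformerTrainingArguments` shown earlier would be passed via `args=` so that the batch samplers and learning-rate schedule apply.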

### Training Logs
| Epoch  | Step | Training Loss | vitaminc-pairs loss | quora pairs loss | qnli-contrastive loss | qasc facts sym loss | xsum-pairs loss | nli-pairs loss | mrpc pairs loss | qasc pairs loss | sciq pairs loss | openbookqa pairs loss | gooaq pairs loss | nq pairs loss | scitail-pairs-qa loss | scitail-pairs-pos loss | msmarco pairs loss | trivia pairs loss | compression-pairs loss | StS-test_spearman_cosine | Vitaminc-test_max_ap | mrpc-test_max_ap |
|:------:|:----:|:-------------:|:-------------------:|:----------------:|:---------------------:|:-------------------:|:---------------:|:--------------:|:---------------:|:---------------:|:---------------:|:---------------------:|:----------------:|:-------------:|:---------------------:|:----------------------:|:------------------:|:-----------------:|:----------------------:|:------------------------:|:--------------------:|:----------------:|
| 0      | 0    | -             | -                   | -                | -                     | -                   | -               | -              | -               | -               | -               | -                     | -                | -             | -                     | -                      | -                  | -                 | -                      | 0.8972                   | 0.5594               | 0.8571           |
| 0.0126 | 65   | 0.4577        | -                   | -                | -                     | -                   | -               | -              | -               | -               | -               | -                     | -                | -             | -                     | -                      | -                  | -                 | -                      | -                        | -                    | -                |
| 0.0252 | 130  | 0.4707        | -                   | -                | -                     | -                   | -               | -              | -               | -               | -               | -                     | -                | -             | -                     | -                      | -                  | -                 | -                      | -                        | -                    | -                |
| 0.0377 | 195  | 0.5259        | -                   | -                | -                     | -                   | -               | -              | -               | -               | -               | -                     | -                | -             | -                     | -                      | -                  | -                 | -                      | -                        | -                    | -                |
| 0.0503 | 260  | 0.5501        | -                   | -                | -                     | -                   | -               | -              | -               | -               | -               | -                     | -                | -             | -                     | -                      | -                  | -                 | -                      | -                        | -                    | -                |
| 0.0629 | 325  | 0.5089        | -                   | -                | -                     | -                   | -               | -              | -               | -               | -               | -                     | -                | -             | -                     | -                      | -                  | -                 | -                      | -                        | -                    | -                |
| 0.0755 | 390  | 0.4816        | -                   | -                | -                     | -                   | -               | -              | -               | -               | -               | -                     | -                | -             | -                     | -                      | -                  | -                 | -                      | -                        | -                    | -                |
| 0.0881 | 455  | 0.5822        | -                   | -                | -                     | -                   | -               | -              | -               | -               | -               | -                     | -                | -             | -                     | -                      | -                  | -                 | -                      | -                        | -                    | -                |
| 0.1007 | 520  | 0.5686        | -                   | -                | -                     | -                   | -               | -              | -               | -               | -               | -                     | -                | -             | -                     | -                      | -                  | -                 | -                      | -                        | -                    | -                |
| 0.1132 | 585  | 0.5686        | -                   | -                | -                     | -                   | -               | -              | -               | -               | -               | -                     | -                | -             | -                     | -                      | -                  | -                 | -                      | -                        | -                    | -                |
| 0.1258 | 650  | 0.5170        | -                   | -                | -                     | -                   | -               | -              | -               | -               | -               | -                     | -                | -             | -                     | -                      | -                  | -                 | -                      | -                        | -                    | -                |
| 0.1384 | 715  | 0.3615        | -                   | -                | -                     | -                   | -               | -              | -               | -               | -               | -                     | -                | -             | -                     | -                      | -                  | -                 | -                      | -                        | -                    | -                |
| 0.1510 | 780  | 0.5978        | -                   | -                | -                     | -                   | -               | -              | -               | -               | -               | -                     | -                | -             | -                     | -                      | -                  | -                 | -                      | -                        | -                    | -                |
| 0.1636 | 845  | 0.5153        | -                   | -                | -                     | -                   | -               | -              | -               | -               | -               | -                     | -                | -             | -                     | -                      | -                  | -                 | -                      | -                        | -                    | -                |
| 0.1762 | 910  | 0.5059        | -                   | -                | -                     | -                   | -               | -              | -               | -               | -               | -                     | -                | -             | -                     | -                      | -                  | -                 | -                      | -                        | -                    | -                |
| 0.1887 | 975  | 0.5624        | -                   | -                | -                     | -                   | -               | -              | -               | -               | -               | -                     | -                | -             | -                     | -                      | -                  | -                 | -                      | -                        | -                    | -                |
| 0.2013 | 1040 | 0.5201        | -                   | -                | -                     | -                   | -               | -              | -               | -               | -               | -                     | -                | -             | -                     | -                      | -                  | -                 | -                      | -                        | -                    | -                |
| 0.2139 | 1105 | 0.6127        | -                   | -                | -                     | -                   | -               | -              | -               | -               | -               | -                     | -                | -             | -                     | -                      | -                  | -                 | -                      | -                        | -                    | -                |
| 0.2265 | 1170 | 0.5333        | -                   | -                | -                     | -                   | -               | -              | -               | -               | -               | -                     | -                | -             | -                     | -                      | -                  | -                 | -                      | -                        | -                    | -                |
| 0.2391 | 1235 | 0.4940        | -                   | -                | -                     | -                   | -               | -              | -               | -               | -               | -                     | -                | -             | -                     | -                      | -                  | -                 | -                      | -                        | -                    | -                |
| 0.2501 | 1292 | -             | 5.7698              | 0.2283           | 0.1211                | 0.1645              | 0.3134          | 0.8093         | 0.0461          | 0.1987          | 0.2693          | 1.7182                | 0.4996           | 0.4005        | 0.0755                | 0.3979                 | 0.4961             | 0.6545            | 0.0832                 | 0.8943                   | 0.5651               | 0.8563           |
| 0.2516 | 1300 | 0.6236        | -                   | -                | -                     | -                   | -               | -              | -               | -               | -               | -                     | -                | -             | -                     | -                      | -                  | -                 | -                      | -                        | -                    | -                |
| 0.2642 | 1365 | 0.4947        | -                   | -                | -                     | -                   | -               | -              | -               | -               | -               | -                     | -                | -             | -                     | -                      | -                  | -                 | -                      | -                        | -                    | -                |
| 0.2768 | 1430 | 0.5595        | -                   | -                | -                     | -                   | -               | -              | -               | -               | -               | -                     | -                | -             | -                     | -                      | -                  | -                 | -                      | -                        | -                    | -                |
| 0.2894 | 1495 | 0.6410        | -                   | -                | -                     | -                   | -               | -              | -               | -               | -               | -                     | -                | -             | -                     | -                      | -                  | -                 | -                      | -                        | -                    | -                |
| 0.3020 | 1560 | 0.5188        | -                   | -                | -                     | -                   | -               | -              | -               | -               | -               | -                     | -                | -             | -                     | -                      | -                  | -                 | -                      | -                        | -                    | -                |
| 0.3146 | 1625 | 0.4927        | -                   | -                | -                     | -                   | -               | -              | -               | -               | -               | -                     | -                | -             | -                     | -                      | -                  | -                 | -                      | -                        | -                    | -                |
| 0.3271 | 1690 | 0.6570        | -                   | -                | -                     | -                   | -               | -              | -               | -               | -               | -                     | -                | -             | -                     | -                      | -                  | -                 | -                      | -                        | -                    | -                |
| 0.3397 | 1755 | 0.4665        | -                   | -                | -                     | -                   | -               | -              | -               | -               | -               | -                     | -                | -             | -                     | -                      | -                  | -                 | -                      | -                        | -                    | -                |
| 0.3523 | 1820 | 0.4645        | -                   | -                | -                     | -                   | -               | -              | -               | -               | -               | -                     | -                | -             | -                     | -                      | -                  | -                 | -                      | -                        | -                    | -                |
| 0.3649 | 1885 | 0.5887        | -                   | -                | -                     | -                   | -               | -              | -               | -               | -               | -                     | -                | -             | -                     | -                      | -                  | -                 | -                      | -                        | -                    | -                |
| 0.3775 | 1950 | 0.5308        | -                   | -                | -                     | -                   | -               | -              | -               | -               | -               | -                     | -                | -             | -                     | -                      | -                  | -                 | -                      | -                        | -                    | -                |
| 0.3901 | 2015 | 0.536         | -                   | -                | -                     | -                   | -               | -              | -               | -               | -               | -                     | -                | -             | -                     | -                      | -                  | -                 | -                      | -                        | -                    | -                |
| 0.4026 | 2080 | 0.4841        | -                   | -                | -                     | -                   | -               | -              | -               | -               | -               | -                     | -                | -             | -                     | -                      | -                  | -                 | -                      | -                        | -                    | -                |
| 0.4152 | 2145 | 0.6499        | -                   | -                | -                     | -                   | -               | -              | -               | -               | -               | -                     | -                | -             | -                     | -                      | -                  | -                 | -                      | -                        | -                    | -                |
| 0.4278 | 2210 | 0.5982        | -                   | -                | -                     | -                   | -               | -              | -               | -               | -               | -                     | -                | -             | -                     | -                      | -                  | -                 | -                      | -                        | -                    | -                |
| 0.4404 | 2275 | 0.5281        | -                   | -                | -                     | -                   | -               | -              | -               | -               | -               | -                     | -                | -             | -                     | -                      | -                  | -                 | -                      | -                        | -                    | -                |
| 0.4530 | 2340 | 0.6657        | -                   | -                | -                     | -                   | -               | -              | -               | -               | -               | -                     | -                | -             | -                     | -                      | -                  | -                 | -                      | -                        | -                    | -                |
| 0.4655 | 2405 | 0.5746        | -                   | -                | -                     | -                   | -               | -              | -               | -               | -               | -                     | -                | -             | -                     | -                      | -                  | -                 | -                      | -                        | -                    | -                |
| 0.4781 | 2470 | 0.5853        | -                   | -                | -                     | -                   | -               | -              | -               | -               | -               | -                     | -                | -             | -                     | -                      | -                  | -                 | -                      | -                        | -                    | -                |
| 0.4907 | 2535 | 0.5828        | -                   | -                | -                     | -                   | -               | -              | -               | -               | -               | -                     | -                | -             | -                     | -                      | -                  | -                 | -                      | -                        | -                    | -                |


### Framework Versions
- Python: 3.10.13
- Sentence Transformers: 3.0.1
- Transformers: 4.42.3
- PyTorch: 2.1.2
- Accelerate: 0.32.1
- Datasets: 2.20.0
- Tokenizers: 0.19.1
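
A small sketch for checking that a local environment matches these versions before loading or fine-tuning the model; the keys are the corresponding PyPI distribution names:

```python
# Compare installed package versions against those reported above.
import importlib.metadata as md

expected = {
    "sentence-transformers": "3.0.1",
    "transformers": "4.42.3",
    "torch": "2.1.2",  # PyTorch's distribution name on PyPI
    "accelerate": "0.32.1",
    "datasets": "2.20.0",
    "tokenizers": "0.19.1",
}

for package, version in expected.items():
    try:
        installed = md.version(package)
    except md.PackageNotFoundError:
        installed = "not installed"
    status = "OK" if installed == version else f"expected {version}"
    print(f"{package}: {installed} ({status})")
```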

## Citation

### BibTeX

#### Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}
```

#### AdaptiveLayerLoss
```bibtex
@misc{li20242d,
    title={2D Matryoshka Sentence Embeddings}, 
    author={Xianming Li and Zongxi Li and Jing Li and Haoran Xie and Qing Li},
    year={2024},
    eprint={2402.14776},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}
```

#### GISTEmbedLoss
```bibtex
@misc{solatorio2024gistembed,
    title={GISTEmbed: Guided In-sample Selection of Training Negatives for Text Embedding Fine-tuning}, 
    author={Aivin V. Solatorio},
    year={2024},
    eprint={2402.16829},
    archivePrefix={arXiv},
    primaryClass={cs.LG}
}
```

#### MultipleNegativesRankingLoss
```bibtex
@misc{henderson2017efficient,
    title={Efficient Natural Language Response Suggestion for Smart Reply}, 
    author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
    year={2017},
    eprint={1705.00652},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}
```
