# train_stsb_1745333586
This model is a fine-tuned version of google/gemma-3-1b-it on the stsb dataset. It achieves the following results on the evaluation set:
- Loss: 0.2857
- Num Input Tokens Seen: 61089232
## Model description
More information needed
## Intended uses & limitations
More information needed
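
Since the framework versions below list PEFT, this checkpoint is presumably a PEFT adapter (e.g. LoRA) on top of google/gemma-3-1b-it rather than a full set of model weights. A minimal loading sketch under that assumption; the adapter repository id and the prompt format are placeholders for illustration, not documented details of this run:

```python
# Hedged sketch: assumes this repo hosts a PEFT adapter for google/gemma-3-1b-it.
from peft import AutoPeftModelForCausalLM
from transformers import AutoTokenizer

adapter_id = "your-username/train_stsb_1745333586"  # placeholder, replace with the actual repo id

# Loads the base model declared in the adapter config and applies the adapter weights.
model = AutoPeftModelForCausalLM.from_pretrained(adapter_id, device_map="auto")
tokenizer = AutoTokenizer.from_pretrained("google/gemma-3-1b-it")

# STS-B pairs two sentences and asks for a similarity score; the exact prompt
# format used during fine-tuning is not documented here, so this is illustrative only.
prompt = (
    "sentence1: A man is playing a guitar.\n"
    "sentence2: A person plays an instrument.\n"
    "similarity:"
)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=8)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```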
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a hedged configuration sketch follows the list):
- learning_rate: 5e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 123
- gradient_accumulation_steps: 4
- total_train_batch_size: 16
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- training_steps: 40000
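
A hedged sketch of how these values map onto `transformers.TrainingArguments`; the output directory, evaluation cadence, and single-device assumption are illustrative and not taken from the original run:

```python
# Sketch only: reproduces the hyperparameters listed above; paths and eval cadence are assumptions.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="train_stsb_1745333586",  # placeholder output path
    learning_rate=5e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    gradient_accumulation_steps=4,       # 4 x 4 accumulation = effective batch size 16 (single device assumed)
    seed=123,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine",
    max_steps=40_000,
    eval_strategy="steps",               # the results table logs validation loss every 200 steps
    eval_steps=200,
    logging_steps=200,
)
```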
### Training results
Training Loss | Epoch | Step | Validation Loss | Input Tokens Seen |
---|---|---|---|---|
2.6663 | 0.6182 | 200 | 2.6001 | 305312 |
1.5964 | 1.2349 | 400 | 1.5902 | 610048 |
1.3194 | 1.8532 | 600 | 1.2956 | 917664 |
1.0831 | 2.4699 | 800 | 1.1056 | 1223104 |
0.8933 | 3.0866 | 1000 | 0.9622 | 1528432 |
0.8848 | 3.7048 | 1200 | 0.8386 | 1837520 |
0.694 | 4.3215 | 1400 | 0.7405 | 2143216 |
0.6061 | 4.9397 | 1600 | 0.6707 | 2448176 |
0.5286 | 5.5564 | 1800 | 0.6208 | 2752768 |
0.5691 | 6.1731 | 2000 | 0.5814 | 3059504 |
0.511 | 6.7913 | 2200 | 0.5471 | 3364688 |
0.5119 | 7.4080 | 2400 | 0.5170 | 3672432 |
0.3985 | 8.0247 | 2600 | 0.4941 | 3978272 |
0.4411 | 8.6430 | 2800 | 0.4751 | 4285856 |
0.4162 | 9.2597 | 3000 | 0.4618 | 4588608 |
0.3888 | 9.8779 | 3200 | 0.4467 | 4894432 |
0.3758 | 10.4946 | 3400 | 0.4360 | 5200528 |
0.3395 | 11.1113 | 3600 | 0.4265 | 5504960 |
0.3473 | 11.7295 | 3800 | 0.4188 | 5808800 |
0.3039 | 12.3462 | 4000 | 0.4106 | 6114608 |
0.3892 | 12.9645 | 4200 | 0.4043 | 6419376 |
0.3539 | 13.5811 | 4400 | 0.3987 | 6725664 |
0.3529 | 14.1978 | 4600 | 0.3898 | 7030208 |
0.4065 | 14.8161 | 4800 | 0.3843 | 7335712 |
0.2762 | 15.4328 | 5000 | 0.3787 | 7641232 |
0.2823 | 16.0495 | 5200 | 0.3738 | 7945360 |
0.2642 | 16.6677 | 5400 | 0.3714 | 8252048 |
0.3033 | 17.2844 | 5600 | 0.3662 | 8557024 |
0.3188 | 17.9026 | 5800 | 0.3605 | 8862080 |
0.2898 | 18.5193 | 6000 | 0.3590 | 9167248 |
0.2698 | 19.1360 | 6200 | 0.3529 | 9472816 |
0.3197 | 19.7543 | 6400 | 0.3506 | 9779344 |
0.2401 | 20.3709 | 6600 | 0.3468 | 10085888 |
0.2583 | 20.9892 | 6800 | 0.3446 | 10391904 |
0.2791 | 21.6059 | 7000 | 0.3424 | 10697664 |
0.3009 | 22.2226 | 7200 | 0.3394 | 11000832 |
0.2429 | 22.8408 | 7400 | 0.3374 | 11308384 |
0.2698 | 23.4575 | 7600 | 0.3348 | 11614048 |
0.2817 | 24.0742 | 7800 | 0.3335 | 11917328 |
0.2645 | 24.6924 | 8000 | 0.3295 | 12224848 |
0.2448 | 25.3091 | 8200 | 0.3284 | 12530128 |
0.2616 | 25.9274 | 8400 | 0.3268 | 12838032 |
0.3067 | 26.5440 | 8600 | 0.3257 | 13142096 |
0.2747 | 27.1607 | 8800 | 0.3241 | 13447712 |
0.3315 | 27.7790 | 9000 | 0.3227 | 13751968 |
0.2463 | 28.3957 | 9200 | 0.3216 | 14060176 |
0.28 | 29.0124 | 9400 | 0.3208 | 14362928 |
0.2407 | 29.6306 | 9600 | 0.3195 | 14669168 |
0.3 | 30.2473 | 9800 | 0.3150 | 14973568 |
0.279 | 30.8655 | 10000 | 0.3168 | 15279840 |
0.2804 | 31.4822 | 10200 | 0.3153 | 15586352 |
0.2615 | 32.0989 | 10400 | 0.3144 | 15891232 |
0.2743 | 32.7172 | 10600 | 0.3129 | 16197472 |
0.2027 | 33.3338 | 10800 | 0.3123 | 16500992 |
0.2204 | 33.9521 | 11000 | 0.3112 | 16807808 |
0.2145 | 34.5688 | 11200 | 0.3101 | 17112928 |
0.2237 | 35.1855 | 11400 | 0.3104 | 17420016 |
0.2885 | 35.8037 | 11600 | 0.3090 | 17726608 |
0.2753 | 36.4204 | 11800 | 0.3089 | 18030288 |
0.2601 | 37.0371 | 12000 | 0.3076 | 18337584 |
0.2834 | 37.6553 | 12200 | 0.3079 | 18640720 |
0.2627 | 38.2720 | 12400 | 0.3066 | 18946400 |
0.2868 | 38.8903 | 12600 | 0.3058 | 19254240 |
0.2572 | 39.5070 | 12800 | 0.3037 | 19558592 |
0.218 | 40.1236 | 13000 | 0.3038 | 19861168 |
0.2732 | 40.7419 | 13200 | 0.3041 | 20169712 |
0.2422 | 41.3586 | 13400 | 0.3043 | 20475008 |
0.2555 | 41.9768 | 13600 | 0.3035 | 20782016 |
0.2463 | 42.5935 | 13800 | 0.3028 | 21085440 |
0.2763 | 43.2102 | 14000 | 0.3013 | 21391616 |
0.2846 | 43.8284 | 14200 | 0.3006 | 21696768 |
0.2261 | 44.4451 | 14400 | 0.3007 | 22001488 |
0.2482 | 45.0618 | 14600 | 0.3010 | 22307216 |
0.2298 | 45.6801 | 14800 | 0.3002 | 22612016 |
0.2185 | 46.2968 | 15000 | 0.2987 | 22917744 |
0.2236 | 46.9150 | 15200 | 0.2983 | 23224720 |
0.24 | 47.5317 | 15400 | 0.2991 | 23531040 |
0.2229 | 48.1484 | 15600 | 0.2973 | 23836048 |
0.279 | 48.7666 | 15800 | 0.2975 | 24140240 |
0.2798 | 49.3833 | 16000 | 0.2978 | 24445104 |
0.221 | 50.0 | 16200 | 0.2961 | 24750256 |
0.2458 | 50.6182 | 16400 | 0.2964 | 25055056 |
0.2611 | 51.2349 | 16600 | 0.2959 | 25360976 |
0.2294 | 51.8532 | 16800 | 0.2961 | 25669136 |
0.2512 | 52.4699 | 17000 | 0.2954 | 25972400 |
0.2519 | 53.0866 | 17200 | 0.2968 | 26280272 |
0.2512 | 53.7048 | 17400 | 0.2952 | 26583184 |
0.1989 | 54.3215 | 17600 | 0.2957 | 26891152 |
0.2058 | 54.9397 | 17800 | 0.2957 | 27197008 |
0.2957 | 55.5564 | 18000 | 0.2942 | 27500160 |
0.2871 | 56.1731 | 18200 | 0.2940 | 27805616 |
0.2948 | 56.7913 | 18400 | 0.2945 | 28112336 |
0.2173 | 57.4080 | 18600 | 0.2939 | 28419888 |
0.2471 | 58.0247 | 18800 | 0.2929 | 28724096 |
0.2238 | 58.6430 | 19000 | 0.2935 | 29031328 |
0.2005 | 59.2597 | 19200 | 0.2930 | 29336560 |
0.2914 | 59.8779 | 19400 | 0.2931 | 29642224 |
0.2229 | 60.4946 | 19600 | 0.2929 | 29947456 |
0.3407 | 61.1113 | 19800 | 0.2924 | 30252288 |
0.2678 | 61.7295 | 20000 | 0.2920 | 30557408 |
0.2238 | 62.3462 | 20200 | 0.2923 | 30862656 |
0.2312 | 62.9645 | 20400 | 0.2917 | 31169472 |
0.2101 | 63.5811 | 20600 | 0.2914 | 31474928 |
0.2466 | 64.1978 | 20800 | 0.2924 | 31778496 |
0.2311 | 64.8161 | 21000 | 0.2917 | 32086304 |
0.3184 | 65.4328 | 21200 | 0.2907 | 32389328 |
0.291 | 66.0495 | 21400 | 0.2886 | 32696656 |
0.236 | 66.6677 | 21600 | 0.2914 | 33001008 |
0.2248 | 67.2844 | 21800 | 0.2910 | 33306288 |
0.3447 | 67.9026 | 22000 | 0.2895 | 33612592 |
0.221 | 68.5193 | 22200 | 0.2899 | 33914992 |
0.2936 | 69.1360 | 22400 | 0.2891 | 34219808 |
0.2011 | 69.7543 | 22600 | 0.2899 | 34525536 |
0.2105 | 70.3709 | 22800 | 0.2899 | 34829856 |
0.1985 | 70.9892 | 23000 | 0.2899 | 35134560 |
0.2103 | 71.6059 | 23200 | 0.2911 | 35439168 |
0.3389 | 72.2226 | 23400 | 0.2896 | 35744608 |
0.2062 | 72.8408 | 23600 | 0.2894 | 36050688 |
0.2302 | 73.4575 | 23800 | 0.2884 | 36353808 |
0.2637 | 74.0742 | 24000 | 0.2892 | 36660560 |
0.2606 | 74.6924 | 24200 | 0.2893 | 36968464 |
0.2697 | 75.3091 | 24400 | 0.2875 | 37273264 |
0.1824 | 75.9274 | 24600 | 0.2886 | 37578896 |
0.2278 | 76.5440 | 24800 | 0.2904 | 37882832 |
0.1994 | 77.1607 | 25000 | 0.2881 | 38187312 |
0.2387 | 77.7790 | 25200 | 0.2892 | 38492720 |
0.3011 | 78.3957 | 25400 | 0.2871 | 38796864 |
0.2614 | 79.0124 | 25600 | 0.2906 | 39103824 |
0.2389 | 79.6306 | 25800 | 0.2891 | 39410448 |
0.301 | 80.2473 | 26000 | 0.2897 | 39715280 |
0.2125 | 80.8655 | 26200 | 0.2894 | 40021520 |
0.267 | 81.4822 | 26400 | 0.2884 | 40325376 |
0.2492 | 82.0989 | 26600 | 0.2878 | 40631296 |
0.2823 | 82.7172 | 26800 | 0.2881 | 40937312 |
0.2149 | 83.3338 | 27000 | 0.2880 | 41240464 |
0.2289 | 83.9521 | 27200 | 0.2882 | 41550128 |
0.2296 | 84.5688 | 27400 | 0.2884 | 41855152 |
0.2279 | 85.1855 | 27600 | 0.2891 | 42158912 |
0.2293 | 85.8037 | 27800 | 0.2882 | 42461856 |
0.2445 | 86.4204 | 28000 | 0.2878 | 42769760 |
0.2542 | 87.0371 | 28200 | 0.2872 | 43074800 |
0.2287 | 87.6553 | 28400 | 0.2881 | 43378640 |
0.2691 | 88.2720 | 28600 | 0.2876 | 43683840 |
0.2301 | 88.8903 | 28800 | 0.2875 | 43988256 |
0.2044 | 89.5070 | 29000 | 0.2884 | 44294256 |
0.2024 | 90.1236 | 29200 | 0.2868 | 44598464 |
0.2007 | 90.7419 | 29400 | 0.2878 | 44904928 |
0.1841 | 91.3586 | 29600 | 0.2872 | 45208784 |
0.2503 | 91.9768 | 29800 | 0.2878 | 45516336 |
0.2356 | 92.5935 | 30000 | 0.2864 | 45820432 |
0.2046 | 93.2102 | 30200 | 0.2866 | 46127408 |
0.256 | 93.8284 | 30400 | 0.2864 | 46431888 |
0.2289 | 94.4451 | 30600 | 0.2874 | 46736368 |
0.2062 | 95.0618 | 30800 | 0.2868 | 47043472 |
0.2091 | 95.6801 | 31000 | 0.2875 | 47348976 |
0.2645 | 96.2968 | 31200 | 0.2867 | 47652864 |
0.1946 | 96.9150 | 31400 | 0.2862 | 47959872 |
0.2462 | 97.5317 | 31600 | 0.2871 | 48265392 |
0.2221 | 98.1484 | 31800 | 0.2881 | 48569984 |
0.2252 | 98.7666 | 32000 | 0.2870 | 48874016 |
0.24 | 99.3833 | 32200 | 0.2866 | 49181056 |
0.2445 | 100.0 | 32400 | 0.2865 | 49485120 |
0.2089 | 100.6182 | 32600 | 0.2876 | 49790304 |
0.26 | 101.2349 | 32800 | 0.2866 | 50097008 |
0.2674 | 101.8532 | 33000 | 0.2865 | 50403088 |
0.207 | 102.4699 | 33200 | 0.2861 | 50707088 |
0.2795 | 103.0866 | 33400 | 0.2870 | 51010144 |
0.2436 | 103.7048 | 33600 | 0.2876 | 51318976 |
0.2728 | 104.3215 | 33800 | 0.2863 | 51623136 |
0.2344 | 104.9397 | 34000 | 0.2877 | 51930240 |
0.2448 | 105.5564 | 34200 | 0.2869 | 52233888 |
0.1904 | 106.1731 | 34400 | 0.2869 | 52541008 |
0.2313 | 106.7913 | 34600 | 0.2875 | 52845904 |
0.1858 | 107.4080 | 34800 | 0.2873 | 53150720 |
0.2712 | 108.0247 | 35000 | 0.2860 | 53456816 |
0.1837 | 108.6430 | 35200 | 0.2866 | 53760816 |
0.2304 | 109.2597 | 35400 | 0.2865 | 54066160 |
0.2346 | 109.8779 | 35600 | 0.2865 | 54371888 |
0.2149 | 110.4946 | 35800 | 0.2868 | 54676672 |
0.2485 | 111.1113 | 36000 | 0.2859 | 54983008 |
0.2231 | 111.7295 | 36200 | 0.2863 | 55289472 |
0.1988 | 112.3462 | 36400 | 0.2869 | 55591632 |
0.2679 | 112.9645 | 36600 | 0.2877 | 55898640 |
0.2445 | 113.5811 | 36800 | 0.2861 | 56202288 |
0.2035 | 114.1978 | 37000 | 0.2861 | 56510384 |
0.235 | 114.8161 | 37200 | 0.2863 | 56816880 |
0.1888 | 115.4328 | 37400 | 0.2865 | 57119232 |
0.2795 | 116.0495 | 37600 | 0.2876 | 57424224 |
0.2023 | 116.6677 | 37800 | 0.2864 | 57729856 |
0.1991 | 117.2844 | 38000 | 0.2862 | 58034352 |
0.2265 | 117.9026 | 38200 | 0.2857 | 58342576 |
0.2438 | 118.5193 | 38400 | 0.2868 | 58648384 |
0.2393 | 119.1360 | 38600 | 0.2861 | 58953568 |
0.2284 | 119.7543 | 38800 | 0.2866 | 59257088 |
0.2043 | 120.3709 | 39000 | 0.2863 | 59562208 |
0.2401 | 120.9892 | 39200 | 0.2865 | 59867712 |
0.2845 | 121.6059 | 39400 | 0.2868 | 60173616 |
0.213 | 122.2226 | 39600 | 0.2868 | 60476592 |
0.216 | 122.8408 | 39800 | 0.2868 | 60782960 |
0.2205 | 123.4575 | 40000 | 0.2868 | 61089232 |
### Framework versions
- PEFT 0.15.1
- Transformers 4.51.3
- Pytorch 2.6.0+cu124
- Datasets 3.5.0
- Tokenizers 0.21.1
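
For reproducibility, a small sketch (assuming the standard PyPI distribution names) that compares the local environment against the versions listed above:

```python
# Hedged sketch: checks that the local environment roughly matches the training
# environment; exact pinning is an assumption, not a stated requirement.
import importlib.metadata as md

expected = {
    "peft": "0.15.1",
    "transformers": "4.51.3",
    "torch": "2.6.0",
    "datasets": "3.5.0",
    "tokenizers": "0.21.1",
}
for pkg, version in expected.items():
    installed = md.version(pkg)
    marker = "OK" if installed.startswith(version) else "differs"
    print(f"{pkg}: expected ~{version}, installed {installed} ({marker})")
```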