rtdetr-v2-r50-cppe5-finetune-2

This model is a fine-tuned version of PekingU/rtdetr_v2_r50vd on the CPPE-5 dataset (medical personal protective equipment: coveralls, face shields, gloves, goggles, and masks). It achieves the following results on the evaluation set:

  • Loss: 9.6769
  • mAP: 0.5368
  • mAP@50: 0.8312
  • mAP@75: 0.5962
  • mAP (small): 0.5364
  • mAP (medium): 0.441
  • mAP (large): 0.7689
  • mAR@1: 0.3954
  • mAR@10: 0.6567
  • mAR@100: 0.6967
  • mAR (small): 0.6067
  • mAR (medium): 0.6153
  • mAR (large): 0.8557
  • mAP (Coverall): 0.5756
  • mAR@100 (Coverall): 0.7821
  • mAP (Face Shield): 0.6521
  • mAR@100 (Face Shield): 0.8059
  • mAP (Gloves): 0.4261
  • mAR@100 (Gloves): 0.5627
  • mAP (Goggles): 0.4722
  • mAR@100 (Goggles): 0.6897
  • mAP (Mask): 0.5578
  • mAR@100 (Mask): 0.6431
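
The card does not include a usage snippet, so here is a minimal inference sketch using the Transformers Auto classes. It is an illustration only: the image path and the 0.3 score threshold are placeholders, not values taken from this card.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

checkpoint = "godminhkhoa/rtdetr-v2-r50-cppe5-finetune-2"
processor = AutoImageProcessor.from_pretrained(checkpoint)
model = AutoModelForObjectDetection.from_pretrained(checkpoint)

# Placeholder image path; any RGB image containing PPE objects will do.
image = Image.open("example.jpg").convert("RGB")
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Convert raw logits and boxes into (score, label, box) triples in pixel coordinates.
results = processor.post_process_object_detection(
    outputs, target_sizes=[(image.height, image.width)], threshold=0.3
)[0]

for score, label, box in zip(results["scores"], results["labels"], results["boxes"]):
    print(f"{model.config.id2label[label.item()]}: {score:.2f} at {box.tolist()}")
```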

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 300
  • num_epochs: 40
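
For reference, a minimal sketch of how the settings above map onto the Transformers `TrainingArguments` API is shown below; the output directory name is a placeholder, and the surrounding `Trainer` setup (model, datasets, data collator) is omitted.

```python
from transformers import TrainingArguments

# Sketch only: mirrors the hyperparameters listed above. These arguments would be
# passed to a transformers.Trainer together with the model, datasets, and data collator.
training_args = TrainingArguments(
    output_dir="rtdetr-v2-r50-cppe5-finetune-2",  # placeholder output directory
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",
    lr_scheduler_type="linear",
    warmup_steps=300,
    num_train_epochs=40,
)
```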

Training results

| Training Loss | Epoch | Step | Validation Loss | mAP | mAP@50 | mAP@75 | mAP (small) | mAP (medium) | mAP (large) | mAR@1 | mAR@10 | mAR@100 | mAR (small) | mAR (medium) | mAR (large) | mAP (Coverall) | mAR@100 (Coverall) | mAP (Face Shield) | mAR@100 (Face Shield) | mAP (Gloves) | mAR@100 (Gloves) | mAP (Goggles) | mAR@100 (Goggles) | mAP (Mask) | mAR@100 (Mask) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| No log | 1.0 | 107 | 25.4682 | 0.0465 | 0.0795 | 0.0437 | 0.0028 | 0.0102 | 0.0671 | 0.0688 | 0.1912 | 0.2765 | 0.1031 | 0.1968 | 0.5097 | 0.2085 | 0.5748 | 0.002 | 0.243 | 0.0029 | 0.1746 | 0.001 | 0.1492 | 0.0184 | 0.2409 |
| No log | 2.0 | 214 | 15.4462 | 0.1823 | 0.3465 | 0.1617 | 0.0681 | 0.1115 | 0.2591 | 0.2151 | 0.4239 | 0.4884 | 0.2985 | 0.4232 | 0.7013 | 0.4591 | 0.6716 | 0.0879 | 0.5013 | 0.0658 | 0.3906 | 0.0495 | 0.4308 | 0.2489 | 0.4476 |
| No log | 3.0 | 321 | 12.7644 | 0.2555 | 0.4657 | 0.2476 | 0.0847 | 0.191 | 0.4466 | 0.2627 | 0.4528 | 0.5148 | 0.2473 | 0.4805 | 0.7435 | 0.5487 | 0.7185 | 0.1397 | 0.5468 | 0.1609 | 0.4183 | 0.1243 | 0.4138 | 0.3041 | 0.4764 |
| No log | 4.0 | 428 | 12.1356 | 0.2919 | 0.5558 | 0.2588 | 0.1466 | 0.2271 | 0.5099 | 0.285 | 0.4706 | 0.5293 | 0.3241 | 0.4886 | 0.729 | 0.5238 | 0.6986 | 0.2541 | 0.6101 | 0.1794 | 0.4281 | 0.1888 | 0.4508 | 0.3136 | 0.4587 |
| 36.0648 | 5.0 | 535 | 11.8591 | 0.3218 | 0.5856 | 0.2997 | 0.1436 | 0.2566 | 0.5379 | 0.3028 | 0.4821 | 0.5359 | 0.3055 | 0.5098 | 0.7267 | 0.5498 | 0.7104 | 0.2962 | 0.6139 | 0.1811 | 0.4147 | 0.2364 | 0.46 | 0.3456 | 0.4804 |
| 36.0648 | 6.0 | 642 | 11.7190 | 0.3157 | 0.5685 | 0.2997 | 0.1426 | 0.2557 | 0.5212 | 0.2823 | 0.4795 | 0.5398 | 0.3338 | 0.4924 | 0.7223 | 0.5437 | 0.7099 | 0.2111 | 0.5886 | 0.2288 | 0.4482 | 0.255 | 0.4677 | 0.34 | 0.4844 |
| 36.0648 | 7.0 | 749 | 11.9062 | 0.3212 | 0.5979 | 0.2974 | 0.1367 | 0.259 | 0.5387 | 0.2942 | 0.4765 | 0.5428 | 0.3495 | 0.5085 | 0.7153 | 0.5386 | 0.6905 | 0.303 | 0.6139 | 0.2067 | 0.4522 | 0.209 | 0.4754 | 0.3487 | 0.4818 |
| 36.0648 | 8.0 | 856 | 11.6933 | 0.3183 | 0.5969 | 0.2901 | 0.1414 | 0.2541 | 0.54 | 0.2977 | 0.483 | 0.5404 | 0.3387 | 0.4954 | 0.7399 | 0.5594 | 0.705 | 0.2812 | 0.6 | 0.2174 | 0.4429 | 0.208 | 0.4785 | 0.3253 | 0.4756 |
| 36.0648 | 9.0 | 963 | 11.6233 | 0.3202 | 0.5826 | 0.3226 | 0.1359 | 0.2583 | 0.5543 | 0.3035 | 0.4884 | 0.5462 | 0.3383 | 0.5126 | 0.7321 | 0.5472 | 0.6937 | 0.2746 | 0.619 | 0.2086 | 0.4545 | 0.2237 | 0.4754 | 0.347 | 0.4884 |
| 15.3215 | 10.0 | 1070 | 11.4090 | 0.3421 | 0.6207 | 0.3279 | 0.1389 | 0.2796 | 0.5757 | 0.316 | 0.4935 | 0.5477 | 0.3142 | 0.5067 | 0.745 | 0.5714 | 0.6991 | 0.3224 | 0.6278 | 0.2331 | 0.4371 | 0.2325 | 0.4754 | 0.3511 | 0.4991 |
| 15.3215 | 11.0 | 1177 | 11.5408 | 0.3436 | 0.6394 | 0.3212 | 0.1544 | 0.2759 | 0.5751 | 0.3175 | 0.5003 | 0.5554 | 0.3501 | 0.5119 | 0.7464 | 0.5444 | 0.7059 | 0.329 | 0.5949 | 0.2259 | 0.4571 | 0.2647 | 0.5108 | 0.354 | 0.5084 |
| 15.3215 | 12.0 | 1284 | 11.7707 | 0.3296 | 0.6169 | 0.3078 | 0.1525 | 0.2669 | 0.5336 | 0.3037 | 0.4793 | 0.5399 | 0.3164 | 0.4945 | 0.7323 | 0.5477 | 0.6892 | 0.32 | 0.6101 | 0.2295 | 0.429 | 0.21 | 0.4862 | 0.3408 | 0.4849 |
| 15.3215 | 13.0 | 1391 | 11.6683 | 0.3415 | 0.6231 | 0.3243 | 0.1447 | 0.2795 | 0.5746 | 0.3181 | 0.4946 | 0.5536 | 0.3625 | 0.5028 | 0.7387 | 0.5571 | 0.6901 | 0.3245 | 0.6114 | 0.2424 | 0.4585 | 0.2356 | 0.5092 | 0.3479 | 0.4987 |
| 15.3215 | 14.0 | 1498 | 11.7344 | 0.3305 | 0.6156 | 0.3133 | 0.1478 | 0.2877 | 0.5388 | 0.3194 | 0.4876 | 0.5468 | 0.3387 | 0.5232 | 0.7116 | 0.5236 | 0.7113 | 0.327 | 0.6 | 0.2255 | 0.4563 | 0.2373 | 0.4738 | 0.3391 | 0.4924 |
| 13.4858 | 15.0 | 1605 | 11.6264 | 0.3307 | 0.6056 | 0.3161 | 0.1213 | 0.2799 | 0.5711 | 0.3242 | 0.4933 | 0.5506 | 0.3454 | 0.5066 | 0.7434 | 0.5581 | 0.7144 | 0.3034 | 0.5962 | 0.2163 | 0.4701 | 0.2598 | 0.4862 | 0.3159 | 0.4862 |
| 13.4858 | 16.0 | 1712 | 11.5521 | 0.3287 | 0.6044 | 0.3125 | 0.1686 | 0.2751 | 0.5519 | 0.3171 | 0.4922 | 0.5484 | 0.3757 | 0.4978 | 0.7246 | 0.5635 | 0.7162 | 0.3018 | 0.5861 | 0.234 | 0.4714 | 0.236 | 0.48 | 0.3084 | 0.4884 |
| 13.4858 | 17.0 | 1819 | 11.7578 | 0.3382 | 0.6292 | 0.3237 | 0.164 | 0.281 | 0.5516 | 0.3215 | 0.4924 | 0.548 | 0.3353 | 0.5037 | 0.7302 | 0.5505 | 0.709 | 0.3225 | 0.6076 | 0.2187 | 0.4402 | 0.2704 | 0.5031 | 0.3286 | 0.48 |
| 13.4858 | 18.0 | 1926 | 11.5963 | 0.3454 | 0.6381 | 0.3218 | 0.1607 | 0.2921 | 0.5647 | 0.3294 | 0.4951 | 0.5507 | 0.3516 | 0.489 | 0.7565 | 0.5498 | 0.705 | 0.348 | 0.6051 | 0.2177 | 0.45 | 0.2613 | 0.5138 | 0.3504 | 0.4796 |
| 12.4151 | 19.0 | 2033 | 11.5293 | 0.3469 | 0.6347 | 0.3237 | 0.1415 | 0.2967 | 0.5694 | 0.323 | 0.4923 | 0.5415 | 0.3126 | 0.4894 | 0.733 | 0.5663 | 0.7032 | 0.345 | 0.5975 | 0.2389 | 0.4469 | 0.2667 | 0.4846 | 0.3175 | 0.4756 |
| 12.4151 | 20.0 | 2140 | 11.5551 | 0.3414 | 0.6306 | 0.3143 | 0.1716 | 0.2916 | 0.5768 | 0.3312 | 0.4969 | 0.5521 | 0.3257 | 0.513 | 0.7292 | 0.5629 | 0.7243 | 0.3179 | 0.5633 | 0.2498 | 0.4696 | 0.267 | 0.52 | 0.3095 | 0.4831 |
| 12.4151 | 21.0 | 2247 | 11.9833 | 0.3286 | 0.6184 | 0.2991 | 0.1597 | 0.277 | 0.533 | 0.3224 | 0.4898 | 0.5452 | 0.3502 | 0.5003 | 0.7228 | 0.5478 | 0.6955 | 0.2979 | 0.5899 | 0.2414 | 0.4638 | 0.2361 | 0.4923 | 0.3197 | 0.4844 |
| 12.4151 | 22.0 | 2354 | 11.9215 | 0.3408 | 0.6259 | 0.3184 | 0.142 | 0.2864 | 0.5548 | 0.3264 | 0.4893 | 0.5399 | 0.3216 | 0.4872 | 0.744 | 0.5429 | 0.6923 | 0.3578 | 0.619 | 0.2483 | 0.4585 | 0.2269 | 0.4569 | 0.3282 | 0.4729 |
| 12.4151 | 23.0 | 2461 | 12.0853 | 0.3304 | 0.6162 | 0.3031 | 0.1564 | 0.2852 | 0.5542 | 0.3198 | 0.4856 | 0.5275 | 0.309 | 0.4927 | 0.7118 | 0.5404 | 0.7041 | 0.3271 | 0.5886 | 0.242 | 0.4237 | 0.2325 | 0.4492 | 0.3097 | 0.472 |
| 11.6364 | 24.0 | 2568 | 11.8409 | 0.3344 | 0.622 | 0.3186 | 0.1689 | 0.2871 | 0.5485 | 0.3217 | 0.4938 | 0.5446 | 0.3457 | 0.4936 | 0.7208 | 0.5455 | 0.7126 | 0.2952 | 0.5899 | 0.2615 | 0.4638 | 0.2534 | 0.4862 | 0.3164 | 0.4707 |
| 11.6364 | 25.0 | 2675 | 12.1816 | 0.3201 | 0.5981 | 0.3021 | 0.1342 | 0.2717 | 0.5455 | 0.3151 | 0.4803 | 0.5303 | 0.3205 | 0.4817 | 0.7252 | 0.5315 | 0.7023 | 0.2957 | 0.5911 | 0.2163 | 0.4415 | 0.2379 | 0.4462 | 0.3188 | 0.4707 |
| 11.6364 | 26.0 | 2782 | 11.9448 | 0.3291 | 0.6113 | 0.2964 | 0.1687 | 0.2751 | 0.5635 | 0.3163 | 0.4875 | 0.5385 | 0.3434 | 0.4928 | 0.7221 | 0.5433 | 0.6919 | 0.2817 | 0.5797 | 0.2582 | 0.4746 | 0.2367 | 0.4708 | 0.3254 | 0.4756 |
| 11.6364 | 27.0 | 2889 | 11.9042 | 0.322 | 0.6094 | 0.2899 | 0.1286 | 0.2739 | 0.5564 | 0.3211 | 0.4919 | 0.5404 | 0.3371 | 0.4771 | 0.7404 | 0.5306 | 0.7005 | 0.3051 | 0.5975 | 0.2411 | 0.4509 | 0.228 | 0.4769 | 0.3052 | 0.4764 |
| 11.6364 | 28.0 | 2996 | 12.1391 | 0.3242 | 0.6003 | 0.3057 | 0.1342 | 0.2714 | 0.5507 | 0.3144 | 0.483 | 0.5308 | 0.3222 | 0.4825 | 0.7136 | 0.5356 | 0.6986 | 0.298 | 0.5848 | 0.2454 | 0.4487 | 0.2544 | 0.4538 | 0.2875 | 0.468 |
| 11.0017 | 29.0 | 3103 | 12.0627 | 0.3371 | 0.6166 | 0.3215 | 0.1445 | 0.2846 | 0.5479 | 0.3212 | 0.4874 | 0.5441 | 0.3267 | 0.5051 | 0.7284 | 0.5411 | 0.7077 | 0.3292 | 0.6038 | 0.2528 | 0.4451 | 0.2572 | 0.4877 | 0.3052 | 0.4764 |
| 11.0017 | 30.0 | 3210 | 12.3028 | 0.3353 | 0.6079 | 0.3192 | 0.158 | 0.2837 | 0.5625 | 0.315 | 0.4875 | 0.5332 | 0.2965 | 0.4917 | 0.7294 | 0.5347 | 0.6905 | 0.3534 | 0.6 | 0.246 | 0.4464 | 0.242 | 0.4662 | 0.3001 | 0.4631 |
| 11.0017 | 31.0 | 3317 | 11.9750 | 0.339 | 0.6148 | 0.325 | 0.1401 | 0.2827 | 0.5603 | 0.3195 | 0.4821 | 0.5328 | 0.2975 | 0.4866 | 0.7262 | 0.5451 | 0.7063 | 0.3469 | 0.5848 | 0.2502 | 0.4522 | 0.2412 | 0.4585 | 0.3115 | 0.4622 |
| 11.0017 | 32.0 | 3424 | 12.0644 | 0.3361 | 0.6158 | 0.3151 | 0.1374 | 0.2836 | 0.5539 | 0.3197 | 0.4886 | 0.5368 | 0.2821 | 0.5061 | 0.7212 | 0.5472 | 0.6982 | 0.3281 | 0.5886 | 0.2436 | 0.4612 | 0.2432 | 0.4615 | 0.3186 | 0.4747 |
| 10.4746 | 33.0 | 3531 | 11.9360 | 0.3323 | 0.615 | 0.3027 | 0.1597 | 0.2821 | 0.5495 | 0.3162 | 0.4863 | 0.5366 | 0.294 | 0.4971 | 0.7228 | 0.531 | 0.7032 | 0.3396 | 0.5949 | 0.2532 | 0.4705 | 0.2255 | 0.4477 | 0.3121 | 0.4667 |
| 10.4746 | 34.0 | 3638 | 11.7375 | 0.3393 | 0.6215 | 0.3145 | 0.1483 | 0.2853 | 0.5579 | 0.326 | 0.4915 | 0.5427 | 0.3199 | 0.5005 | 0.7224 | 0.5444 | 0.6968 | 0.3343 | 0.6076 | 0.2515 | 0.4638 | 0.2479 | 0.4615 | 0.3183 | 0.4836 |
| 10.4746 | 35.0 | 3745 | 11.9828 | 0.3282 | 0.605 | 0.3017 | 0.1392 | 0.2717 | 0.5518 | 0.3145 | 0.4801 | 0.5305 | 0.2918 | 0.4795 | 0.7182 | 0.5313 | 0.6973 | 0.3315 | 0.5835 | 0.2456 | 0.4616 | 0.2217 | 0.4446 | 0.3108 | 0.4653 |
| 10.4746 | 36.0 | 3852 | 11.8752 | 0.3302 | 0.6169 | 0.31 | 0.1597 | 0.2683 | 0.5583 | 0.3127 | 0.4814 | 0.5278 | 0.297 | 0.4793 | 0.709 | 0.5367 | 0.6968 | 0.3315 | 0.5848 | 0.2429 | 0.4402 | 0.2198 | 0.4477 | 0.32 | 0.4693 |
| 10.4746 | 37.0 | 3959 | 11.9312 | 0.3304 | 0.6097 | 0.3073 | 0.1444 | 0.2765 | 0.5464 | 0.3185 | 0.4809 | 0.536 | 0.3159 | 0.4909 | 0.7086 | 0.5382 | 0.6923 | 0.3265 | 0.6013 | 0.2545 | 0.454 | 0.213 | 0.4538 | 0.3198 | 0.4787 |
| 9.9527 | 38.0 | 4066 | 11.9053 | 0.3355 | 0.6135 | 0.3116 | 0.1443 | 0.283 | 0.5541 | 0.3188 | 0.4856 | 0.5333 | 0.3009 | 0.4804 | 0.7224 | 0.5382 | 0.6919 | 0.3493 | 0.5962 | 0.2481 | 0.4527 | 0.2271 | 0.4523 | 0.3151 | 0.4733 |
| 9.9527 | 39.0 | 4173 | 11.9321 | 0.331 | 0.6118 | 0.3094 | 0.1417 | 0.2754 | 0.5537 | 0.313 | 0.4817 | 0.5325 | 0.3038 | 0.4847 | 0.7139 | 0.538 | 0.6995 | 0.3312 | 0.5949 | 0.2491 | 0.4549 | 0.2289 | 0.4415 | 0.3079 | 0.4716 |
| 9.9527 | 40.0 | 4280 | 11.8940 | 0.3317 | 0.6135 | 0.3061 | 0.1432 | 0.276 | 0.5536 | 0.3136 | 0.4796 | 0.5283 | 0.2983 | 0.477 | 0.715 | 0.5378 | 0.7009 | 0.3298 | 0.5823 | 0.2506 | 0.4496 | 0.2308 | 0.4446 | 0.3095 | 0.464 |

Framework versions

  • Transformers 4.52.3
  • Pytorch 2.7.0+cu126
  • Datasets 3.6.0
  • Tokenizers 0.21.1