license-plate-detr-dinov3

This model is a fine-tuned version of an unspecified base model on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 2.4069

Model description

More information needed

Intended uses & limitations

More information needed
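
Since no usage details are provided, here is a minimal inference sketch, assuming this checkpoint follows the standard Transformers object-detection API; the repo id and the image path are placeholders, not confirmed by the card:

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

# Placeholder repo id; replace with the actual hub path of this checkpoint.
repo_id = "license-plate-detr-dinov3"
processor = AutoImageProcessor.from_pretrained(repo_id)
model = AutoModelForObjectDetection.from_pretrained(repo_id)

image = Image.open("car.jpg")  # placeholder input image
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Convert raw model outputs into thresholded boxes in (x0, y0, x1, y1) format.
# PIL's image.size is (width, height), so it is reversed for target_sizes.
results = processor.post_process_object_detection(
    outputs, threshold=0.5, target_sizes=[image.size[::-1]]
)[0]
for score, label, box in zip(results["scores"], results["labels"], results["boxes"]):
    print(model.config.id2label[label.item()], round(score.item(), 3), box.tolist())
```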

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

  • learning_rate: 1e-05
  • train_batch_size: 4
  • eval_batch_size: 4
  • seed: 42
  • optimizer: ADAMW_TORCH_FUSED with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 500
  • num_epochs: 8
  • mixed_precision_training: Native AMP
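
As a rough reconstruction, these values map onto a Transformers `TrainingArguments` configuration as sketched below; `output_dir` and any setting not listed above are assumptions:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="license-plate-detr-dinov3",  # placeholder, not stated in the card
    learning_rate=1e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    seed=42,
    optim="adamw_torch_fused",               # OptimizerNames.ADAMW_TORCH_FUSED
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=8,
    fp16=True,                               # Native AMP mixed-precision training
)
```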

Training results

| Training Loss | Epoch  | Step  | Validation Loss |
|:-------------:|:------:|:-----:|:---------------:|
| 5.9639        | 0.3408 | 500   | 5.7109          |
| 3.7311        | 0.6817 | 1000  | 5.8289          |
| 2.9953        | 1.0225 | 1500  | 5.5977          |
| 2.4684        | 1.3633 | 2000  | 4.9900          |
| 2.1458        | 1.7042 | 2500  | 3.0389          |
| 1.8966        | 2.0450 | 3000  | 2.9854          |
| 1.711         | 2.3858 | 3500  | 2.8960          |
| 1.7188        | 2.7267 | 4000  | 2.8657          |
| 1.6345        | 3.0675 | 4500  | 2.9198          |
| 1.6162        | 3.4083 | 5000  | 3.0553          |
| 1.5216        | 3.7491 | 5500  | 3.0822          |
| 1.4746        | 4.0900 | 6000  | 3.2468          |
| 1.4821        | 4.4308 | 6500  | 3.0554          |
| 1.4824        | 4.7716 | 7000  | 2.6149          |
| 1.4697        | 5.1125 | 7500  | 2.5948          |
| 1.3859        | 5.4533 | 8000  | 2.5043          |
| 1.4207        | 5.7941 | 8500  | 2.7286          |
| 1.3432        | 6.1350 | 9000  | 2.6156          |
| 1.3261        | 6.4758 | 9500  | 2.4771          |
| 1.3504        | 6.8166 | 10000 | 2.7247          |
| 1.2941        | 7.1575 | 10500 | 2.4605          |
| 1.3013        | 7.4983 | 11000 | 2.4045          |
| 1.3526        | 7.8391 | 11500 | 2.4069          |

Framework versions

  • Transformers 4.57.0.dev0
  • PyTorch 2.8.0+cu126
  • Datasets 4.0.0
  • Tokenizers 0.22.1