---
license: mit
base_model: indolem/indobert-base-uncased
tags:
  - generated_from_keras_callback
model-index:
  - name: Labira/LabiraEdu-v1.0x
    results: []
---

# Labira/LabiraEdu-v1.0x

This model is a fine-tuned version of [indolem/indobert-base-uncased](https://huggingface.co/indolem/indobert-base-uncased) on an unknown dataset. It achieves the following results at the final training epoch:

- Train Loss: 0.0652
- Validation Loss: 4.2064
- Epoch: 37
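
A minimal loading and inference sketch, not part of the original card: the checkpoint is TensorFlow-based (see the framework versions below), and the task head is not documented here, so `TFAutoModelForQuestionAnswering` is an assumption; substitute the auto class that matches the checkpoint's actual head.

```python
import tensorflow as tf
from transformers import AutoTokenizer, TFAutoModelForQuestionAnswering

# Assumption: an extractive question-answering head. Swap the auto class
# if this checkpoint was trained for a different task.
tokenizer = AutoTokenizer.from_pretrained("Labira/LabiraEdu-v1.0x")
model = TFAutoModelForQuestionAnswering.from_pretrained("Labira/LabiraEdu-v1.0x")

# Hypothetical example inputs (Indonesian, matching the IndoBERT base model).
question = "Apa itu LabiraEdu?"
context = "LabiraEdu adalah model tanya jawab untuk domain pendidikan."
inputs = tokenizer(question, context, return_tensors="tf")
outputs = model(**inputs)

# Decode the highest-scoring answer span.
start = int(tf.argmax(outputs.start_logits, axis=-1)[0])
end = int(tf.argmax(outputs.end_logits, axis=-1)[0])
answer = tokenizer.decode(inputs["input_ids"][0, start : end + 1])
print(answer)
```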

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- optimizer: Adam
  - learning_rate: PolynomialDecay (initial_learning_rate: 2e-05, decay_steps: 1100, end_learning_rate: 0.0, power: 1.0, cycle: False)
  - beta_1: 0.9, beta_2: 0.999, epsilon: 1e-08, amsgrad: False
  - weight_decay: None; clipnorm, global_clipnorm, clipvalue: None
  - use_ema: False (ema_momentum: 0.99); jit_compile: True; is_legacy_optimizer: False
- training_precision: float32
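
For concreteness, a sketch that reconstructs this optimizer in Keras (TF 2.15) directly from the serialized config above; the variable names are illustrative only.

```python
import tensorflow as tf

# Linear (power=1.0) decay from 2e-05 to 0.0 over 1100 steps, no cycling,
# exactly as serialized in the optimizer config above.
lr_schedule = tf.keras.optimizers.schedules.PolynomialDecay(
    initial_learning_rate=2e-05,
    decay_steps=1100,
    end_learning_rate=0.0,
    power=1.0,
    cycle=False,
)

optimizer = tf.keras.optimizers.Adam(
    learning_rate=lr_schedule,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-08,
    amsgrad=False,
    jit_compile=True,
)
```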

### Training results

| Train Loss | Validation Loss | Epoch |
|:----------:|:---------------:|:-----:|
| 5.0565     | 3.9761          | 0     |
| 3.6621     | 3.2932          | 1     |
| 3.0961     | 3.2587          | 2     |
| 2.7357     | 3.2031          | 3     |
| 2.3059     | 3.2519          | 4     |
| 1.8933     | 3.4772          | 5     |
| 1.9076     | 3.1664          | 6     |
| 1.5492     | 3.4201          | 7     |
| 1.2578     | 3.5190          | 8     |
| 1.0478     | 3.4076          | 9     |
| 1.0130     | 3.5961          | 10    |
| 0.9073     | 3.4919          | 11    |
| 0.7071     | 3.5013          | 12    |
| 0.5616     | 4.0259          | 13    |
| 0.4798     | 3.9766          | 14    |
| 0.5938     | 3.8146          | 15    |
| 0.6476     | 3.7065          | 16    |
| 0.4264     | 4.1631          | 17    |
| 0.5290     | 3.7455          | 18    |
| 0.4637     | 3.6362          | 19    |
| 0.3826     | 3.8389          | 20    |
| 0.2876     | 3.7611          | 21    |
| 0.2221     | 4.0540          | 22    |
| 0.1752     | 4.0683          | 23    |
| 0.1544     | 4.0452          | 24    |
| 0.1600     | 4.0417          | 25    |
| 0.1390     | 4.0668          | 26    |
| 0.1134     | 4.0659          | 27    |
| 0.0965     | 4.0700          | 28    |
| 0.0820     | 4.2026          | 29    |
| 0.0810     | 4.3008          | 30    |
| 0.1166     | 4.0835          | 31    |
| 0.0776     | 4.0886          | 32    |
| 0.1033     | 4.1303          | 33    |
| 0.0512     | 4.1014          | 34    |
| 0.0484     | 4.1462          | 35    |
| 0.0565     | 4.2404          | 36    |
| 0.0652     | 4.2064          | 37    |
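
A note on these results: validation loss reaches its minimum at epoch 6 (3.1664) and trends upward thereafter while train loss continues to fall, a pattern consistent with overfitting; an earlier checkpoint may generalize better than the final one.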

### Framework versions

- Transformers 4.41.2
- TensorFlow 2.15.0
- Datasets 2.19.2
- Tokenizers 0.19.1