---
license: cc-by-sa-4.0
base_model: klue/bert-base
tags:
  - generated_from_trainer
metrics:
  - accuracy
  - f1
model-index:
  - name: degree-bert-finetuning-2
    results: []
---

degree-bert-finetuning-2

This model is a fine-tuned version of klue/bert-base on an unspecified dataset. It achieves the following results on the evaluation set (a minimal usage sketch follows the results):

  • Loss: 0.5876
  • Accuracy: 0.7
  • F1: 0.7002
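
The snippet below is a minimal usage sketch, not part of the original card: it assumes the checkpoint is published on the Hub as `eunyounglee/degree-bert-finetuning-2` (inferred from this card's name) and that the label mapping is stored in the model's `config.id2label`; the example sentence is purely illustrative.

```python
# Minimal inference sketch; the Hub id below is inferred from the card name
# and may need to be replaced with a local checkpoint path.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "eunyounglee/degree-bert-finetuning-2"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

text = "예시 문장입니다."  # illustrative Korean sentence (klue/bert-base is a Korean encoder)
inputs = tokenizer(text, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
pred_id = int(logits.argmax(dim=-1))
print(pred_id, model.config.id2label.get(pred_id, str(pred_id)))
```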

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a hedged `Trainer` sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 20
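
As a rough reconstruction only, the sketch below shows how these hyperparameters map onto the Hugging Face `TrainingArguments`/`Trainer` API. The toy dataset, the `num_labels=3` choice, and the weighted F1 averaging are assumptions, since the card does not document the training data or the metric configuration.

```python
# Hedged training sketch: the dataset, num_labels and F1 averaging mode are
# assumptions -- the card does not specify them. Hyperparameters match the list above.
import numpy as np
from datasets import Dataset
from sklearn.metrics import accuracy_score, f1_score
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("klue/bert-base")
model = AutoModelForSequenceClassification.from_pretrained("klue/bert-base", num_labels=3)

# Toy stand-in data; replace with the real (undocumented) train/eval splits.
raw = Dataset.from_dict({
    "text": ["예시 문장 하나", "예시 문장 둘", "예시 문장 셋"],
    "label": [0, 1, 2],
})
tokenized = raw.map(
    lambda batch: tokenizer(batch["text"], truncation=True, padding="max_length", max_length=128),
    batched=True,
)

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    return {
        "accuracy": accuracy_score(labels, preds),
        "f1": f1_score(labels, preds, average="weighted"),  # averaging mode is an assumption
    }

args = TrainingArguments(
    output_dir="degree-bert-finetuning-2",
    learning_rate=2e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    num_train_epochs=20,
    lr_scheduler_type="linear",
    seed=42,
    evaluation_strategy="epoch",  # Adam betas/epsilon from the list above are the Trainer defaults
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=tokenized,
    eval_dataset=tokenized,
    tokenizer=tokenizer,
    compute_metrics=compute_metrics,
)
trainer.train()
```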

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1     |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|
| 0.9822        | 1.0   | 104  | 0.7872          | 0.638    | 0.6374 |
| 0.8238        | 2.0   | 208  | 0.7232          | 0.654    | 0.6532 |
| 0.787         | 3.0   | 312  | 0.7135          | 0.66     | 0.6543 |
| 0.7555        | 4.0   | 416  | 0.6902          | 0.682    | 0.6801 |
| 0.7369        | 5.0   | 520  | 0.6555          | 0.7      | 0.7005 |
| 0.7163        | 6.0   | 624  | 0.6495          | 0.7      | 0.6994 |
| 0.7028        | 7.0   | 728  | 0.6438          | 0.708    | 0.7080 |
| 0.6914        | 8.0   | 832  | 0.6087          | 0.698    | 0.6972 |
| 0.6761        | 9.0   | 936  | 0.6039          | 0.7      | 0.6995 |
| 0.6676        | 10.0  | 1040 | 0.6042          | 0.692    | 0.6911 |
| 0.6572        | 11.0  | 1144 | 0.6019          | 0.704    | 0.7033 |
| 0.6527        | 12.0  | 1248 | 0.5927          | 0.712    | 0.7126 |
| 0.6364        | 13.0  | 1352 | 0.5951          | 0.708    | 0.7086 |
| 0.6387        | 14.0  | 1456 | 0.5917          | 0.688    | 0.6864 |
| 0.6326        | 15.0  | 1560 | 0.5870          | 0.71     | 0.7105 |
| 0.6199        | 16.0  | 1664 | 0.5944          | 0.696    | 0.6942 |
| 0.6107        | 17.0  | 1768 | 0.5850          | 0.714    | 0.7145 |
| 0.6118        | 18.0  | 1872 | 0.5853          | 0.716    | 0.7168 |
| 0.6083        | 19.0  | 1976 | 0.5895          | 0.704    | 0.7037 |
| 0.5946        | 20.0  | 2080 | 0.5876          | 0.7      | 0.7002 |

Framework versions

  • Transformers 4.38.1
  • PyTorch 2.2.0
  • Datasets 2.17.1
  • Tokenizers 0.15.2