---
license: cc-by-sa-4.0
base_model: klue/bert-base
tags:
  - generated_from_trainer
metrics:
  - accuracy
  - f1
model-index:
  - name: degree-bert-finetuning-2
    results: []
---

# degree-bert-finetuning-2

This model is a fine-tuned version of [klue/bert-base](https://huggingface.co/klue/bert-base) on an unspecified dataset (see the usage sketch after the results). It achieves the following results on the evaluation set:

- Loss: 0.6113
- Accuracy: 0.698
- F1: 0.6968
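
A minimal usage sketch follows. It assumes the checkpoint is published as `eunyounglee/degree-bert-finetuning-2` and that this is a standard sequence-classification fine-tune; the label set is not documented in this card.

```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer, pipeline

# Assumed Hub repo id and task type; the labels depend on the (undocumented) training data.
model_id = "eunyounglee/degree-bert-finetuning-2"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

classifier = pipeline("text-classification", model=model, tokenizer=tokenizer)
print(classifier("이 모델은 KLUE BERT를 미세 조정한 것입니다."))  # example Korean input
```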

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the `TrainingArguments` sketch after the list):

- learning_rate: 1e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 20
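
As a sketch, these settings correspond roughly to the following `TrainingArguments`; the output directory and evaluation strategy are assumptions, and the listed Adam settings match the `Trainer`'s default AdamW optimizer.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="degree-bert-finetuning-2",  # assumed output directory
    learning_rate=1e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=20,
    evaluation_strategy="epoch",  # assumed; the results table shows one evaluation per epoch
)
```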

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1     |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|
| 1.06          | 1.0   | 104  | 0.8187          | 0.61     | 0.6100 |
| 0.849         | 2.0   | 208  | 0.7525          | 0.642    | 0.6415 |
| 0.8117        | 3.0   | 312  | 0.7479          | 0.616    | 0.6044 |
| 0.7757        | 4.0   | 416  | 0.7266          | 0.652    | 0.6489 |
| 0.7638        | 5.0   | 520  | 0.6960          | 0.674    | 0.6742 |
| 0.7412        | 6.0   | 624  | 0.6845          | 0.676    | 0.6739 |
| 0.7338        | 7.0   | 728  | 0.6655          | 0.692    | 0.6922 |
| 0.723         | 8.0   | 832  | 0.6500          | 0.676    | 0.6733 |
| 0.7047        | 9.0   | 936  | 0.6415          | 0.672    | 0.6681 |
| 0.6979        | 10.0  | 1040 | 0.6333          | 0.686    | 0.6852 |
| 0.6911        | 11.0  | 1144 | 0.6360          | 0.684    | 0.6825 |
| 0.6877        | 12.0  | 1248 | 0.6239          | 0.704    | 0.7044 |
| 0.6718        | 13.0  | 1352 | 0.6238          | 0.698    | 0.6978 |
| 0.6732        | 14.0  | 1456 | 0.6257          | 0.678    | 0.6736 |
| 0.6699        | 15.0  | 1560 | 0.6129          | 0.704    | 0.7042 |
| 0.6592        | 16.0  | 1664 | 0.6201          | 0.688    | 0.6853 |
| 0.653         | 17.0  | 1768 | 0.6075          | 0.706    | 0.7062 |
| 0.6528        | 18.0  | 1872 | 0.6099          | 0.704    | 0.7040 |
| 0.6512        | 19.0  | 1976 | 0.6129          | 0.7      | 0.6985 |
| 0.6405        | 20.0  | 2080 | 0.6113          | 0.698    | 0.6968 |
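
The Accuracy and F1 columns could be produced by a `compute_metrics` function along these lines; the weighted averaging for F1 is an assumption, since the card does not state how F1 was computed.

```python
import numpy as np
import evaluate

accuracy = evaluate.load("accuracy")
f1 = evaluate.load("f1")

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    return {
        "accuracy": accuracy.compute(predictions=preds, references=labels)["accuracy"],
        # average="weighted" is an assumption; the card does not specify the averaging mode
        "f1": f1.compute(predictions=preds, references=labels, average="weighted")["f1"],
    }
```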

### Framework versions

- Transformers 4.38.1
- Pytorch 2.2.0
- Datasets 2.17.1
- Tokenizers 0.15.2