---
base_model: google/pegasus-large
tags:
  - generated_from_trainer
metrics:
  - rouge
  - precision
  - recall
  - f1
model-index:
  - name: LLM_Teached_Pegasus_FS
    results: []
---

# LLM_Teached_Pegasus_FS

This model is a fine-tuned version of google/pegasus-large on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 1.6167
- Rouge1: 0.4649
- Rouge2: 0.2096
- Rougel: 0.3686
- Rougelsum: 0.3688
- Gen Len: 30.6191
- Precision: 0.9102
- Recall: 0.9083
- F1: 0.9091
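
The card does not name the task, but `google/pegasus-large` is a summarization model, so this fine-tuned checkpoint can presumably be used the same way. A minimal inference sketch, assuming the repo id `GlycerinLOL/LLM_Teached_Pegasus_FS` (inferred from the author and model name above):

```python
from transformers import pipeline

# Repo id is an assumption inferred from the model name; adjust if the
# checkpoint lives under a different path.
summarizer = pipeline("summarization", model="GlycerinLOL/LLM_Teached_Pegasus_FS")

article = "Your long input document goes here."
result = summarizer(article, max_length=64, min_length=8, do_sample=False)
print(result[0]["summary_text"])
```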

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 2e-05
- train_batch_size: 24
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 96
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 16
- mixed_precision_training: Native AMP
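
For reference, a hedged sketch of how the list above could map onto `transformers.Seq2SeqTrainingArguments`. The `output_dir`, `evaluation_strategy`, and `predict_with_generate` values are assumptions; everything else mirrors the list (the stated Adam betas and epsilon are the library defaults, so they are not set explicitly):

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="LLM_Teached_Pegasus_FS",  # assumed
    learning_rate=2e-5,
    per_device_train_batch_size=24,
    per_device_eval_batch_size=16,
    seed=42,
    gradient_accumulation_steps=4,    # 24 * 4 = 96 effective train batch size
    lr_scheduler_type="linear",
    num_train_epochs=16,
    fp16=True,                        # "Native AMP" mixed precision
    evaluation_strategy="epoch",      # assumed: the results table evaluates per epoch
    predict_with_generate=True,       # assumed: needed for ROUGE/Gen Len during eval
)
```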

### Training results

| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum | Gen Len | Precision | Recall | F1     |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:-------:|:---------:|:------:|:------:|
| No log        | 1.0   | 208  | 1.8075          | 0.411  | 0.1689 | 0.3152 | 0.3155    | 29.9091 | 0.901     | 0.897  | 0.8988 |
| No log        | 2.0   | 417  | 1.7312          | 0.4379 | 0.1893 | 0.3442 | 0.3446    | 29.9073 | 0.9059    | 0.9024 | 0.904  |
| 2.0112        | 3.0   | 625  | 1.6987          | 0.4475 | 0.1978 | 0.352  | 0.3525    | 30.0173 | 0.9075    | 0.9039 | 0.9055 |
| 2.0112        | 4.0   | 834  | 1.6768          | 0.4514 | 0.1981 | 0.357  | 0.3573    | 30.0618 | 0.9082    | 0.9047 | 0.9063 |
| 1.7647        | 5.0   | 1042 | 1.6617          | 0.4537 | 0.2003 | 0.3592 | 0.3595    | 30.3264 | 0.9084    | 0.9055 | 0.9068 |
| 1.7647        | 6.0   | 1251 | 1.6502          | 0.4554 | 0.2021 | 0.3607 | 0.361     | 30.0827 | 0.9089    | 0.9057 | 0.9072 |
| 1.7647        | 7.0   | 1459 | 1.6416          | 0.4592 | 0.2052 | 0.3639 | 0.3641    | 30.0218 | 0.9099    | 0.9064 | 0.908  |
| 1.6948        | 8.0   | 1668 | 1.6360          | 0.4612 | 0.2054 | 0.3649 | 0.365     | 30.7827 | 0.909     | 0.9074 | 0.9081 |
| 1.6948        | 9.0   | 1876 | 1.6302          | 0.4621 | 0.2062 | 0.3645 | 0.3647    | 30.6291 | 0.9095    | 0.9074 | 0.9083 |
| 1.6501        | 10.0  | 2085 | 1.6265          | 0.4606 | 0.2051 | 0.3651 | 0.3655    | 30.4818 | 0.9095    | 0.9073 | 0.9083 |
| 1.6501        | 11.0  | 2293 | 1.6230          | 0.4625 | 0.2073 | 0.3658 | 0.366     | 30.8064 | 0.9097    | 0.908  | 0.9087 |
| 1.6222        | 12.0  | 2502 | 1.6205          | 0.4644 | 0.2082 | 0.3674 | 0.3679    | 30.5527 | 0.9103    | 0.9081 | 0.909  |
| 1.6222        | 13.0  | 2710 | 1.6188          | 0.4648 | 0.2087 | 0.3681 | 0.3683    | 30.8055 | 0.9101    | 0.9083 | 0.909  |
| 1.6222        | 14.0  | 2919 | 1.6172          | 0.4654 | 0.2097 | 0.3685 | 0.3689    | 30.6709 | 0.9104    | 0.9084 | 0.9093 |
| 1.6048        | 15.0  | 3127 | 1.6169          | 0.465  | 0.21   | 0.3693 | 0.3697    | 30.6309 | 0.9104    | 0.9084 | 0.9093 |
| 1.6048        | 15.96 | 3328 | 1.6167          | 0.4649 | 0.2096 | 0.3686 | 0.3688    | 30.6191 | 0.9102    | 0.9083 | 0.9091 |
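
The card does not say which metric produced the Precision/Recall/F1 columns; values near 0.91 reported alongside ROUGE are typical of BERTScore, but that is an assumption. A sketch of computing both metric families with the `evaluate` library:

```python
import evaluate

# Both metrics compare model output against reference summaries.
rouge = evaluate.load("rouge")
bertscore = evaluate.load("bertscore")  # assumed source of Precision/Recall/F1

predictions = ["a generated summary"]
references = ["the reference summary"]

print(rouge.compute(predictions=predictions, references=references))
print(bertscore.compute(predictions=predictions, references=references, lang="en"))
```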

### Framework versions

- Transformers 4.36.0
- Pytorch 2.0.1+cu117
- Datasets 2.14.5
- Tokenizers 0.15.0