---
base_model: google/pegasus-xsum
tags:
  - generated_from_trainer
metrics:
  - rouge
  - precision
  - recall
  - f1
model-index:
  - name: LLM_Teached_Pegasus_50k
    results: []
---

LLM_Teached_Pegasus_50k

This model is a fine-tuned version of google/pegasus-xsum on an unknown dataset. It achieves the following results on the evaluation set (see the metric sketch after the list):

  • Loss: 1.6541
  • Rouge1: 0.4665
  • Rouge2: 0.2182
  • Rougel: 0.3824
  • Rougelsum: 0.3824
  • Gen Len: 26.5458
  • Precision: 0.9101
  • Recall: 0.9085
  • F1: 0.9092
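
The precision, recall, and F1 figures above are assumed to be BERTScore values reported alongside ROUGE; the card does not state which implementation was used. Under that assumption, a minimal sketch of how these metrics could be computed with the `evaluate` library is shown below; the example texts are placeholders, not taken from the actual evaluation data.

```python
# Hedged sketch: compute ROUGE and (assumed) BERTScore with the `evaluate` library.
import evaluate

rouge = evaluate.load("rouge")
bertscore = evaluate.load("bertscore")

predictions = ["the model generated summary"]             # placeholder model outputs
references = ["the reference summary from the dataset"]   # placeholder targets

rouge_scores = rouge.compute(predictions=predictions, references=references)
bs = bertscore.compute(predictions=predictions, references=references, lang="en")

print({k: round(v, 4) for k, v in rouge_scores.items()})
print("precision:", sum(bs["precision"]) / len(bs["precision"]))
print("recall:", sum(bs["recall"]) / len(bs["recall"]))
print("f1:", sum(bs["f1"]) / len(bs["f1"]))
```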

Model description

More information needed

Intended uses & limitations

More information needed
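
The card does not document intended uses, so the following is only a minimal inference sketch for abstractive summarization. It assumes the checkpoint is published on the Hub as `GlycerinLOL/LLM_Teached_Pegasus_50k` (inferred from the uploader and model name, not confirmed by the card); substitute a local path or the actual repo id if it differs.

```python
# Minimal usage sketch; the repo id below is an assumption based on this card.
from transformers import pipeline

summarizer = pipeline(
    "summarization",
    model="GlycerinLOL/LLM_Teached_Pegasus_50k",  # assumed Hub id
)

article = (
    "The quick brown fox jumped over the lazy dog several times before "
    "settling down in the shade of an old oak tree for the afternoon."
)

summary = summarizer(article, max_length=64, min_length=8, do_sample=False)
print(summary[0]["summary_text"])
```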

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the configuration sketch after this list):

  • learning_rate: 2e-05
  • train_batch_size: 32
  • eval_batch_size: 16
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 128
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 12
  • mixed_precision_training: Native AMP
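
The original training script is not included in this card, but the hyperparameters above map naturally onto `Seq2SeqTrainingArguments`. The sketch below is an approximation under that assumption, not the exact configuration used; the output directory name is hypothetical.

```python
# Approximate Seq2SeqTrainingArguments mirroring the hyperparameters listed above.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="LLM_Teached_Pegasus_50k",   # assumed output directory name
    learning_rate=2e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=4,          # effective train batch size of 128
    num_train_epochs=12,
    lr_scheduler_type="linear",
    seed=42,
    fp16=True,                              # mixed precision (Native AMP)
    predict_with_generate=True,
    evaluation_strategy="epoch",            # assumption: matches the per-epoch rows in the results table
    logging_strategy="epoch",
)
```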

Training results

| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum | Gen Len | Precision | Recall | F1     |
|---------------|-------|------|-----------------|--------|--------|--------|-----------|---------|-----------|--------|--------|
| No log        | 1.0   | 390  | 1.8258          | 0.4338 | 0.1906 | 0.3496 | 0.3498    | 26.2967 | 0.9049    | 0.9023 | 0.9034 |
| 2.1621        | 2.0   | 781  | 1.7537          | 0.4449 | 0.2005 | 0.3633 | 0.3633    | 26.2727 | 0.9068    | 0.9044 | 0.9054 |
| 1.8794        | 3.0   | 1172 | 1.7268          | 0.4518 | 0.2061 | 0.3696 | 0.3695    | 26.4345 | 0.9078    | 0.9058 | 0.9066 |
| 1.8271        | 4.0   | 1560 | 1.7157          | 0.4539 | 0.2075 | 0.3716 | 0.3714    | 26.3971 | 0.9082    | 0.906  | 0.9069 |
| 1.8271        | 5.0   | 1951 | 1.7033          | 0.4561 | 0.2098 | 0.3735 | 0.3734    | 26.3015 | 0.9087    | 0.9065 | 0.9074 |
| 1.8067        | 6.0   | 2340 | 1.6897          | 0.4592 | 0.2114 | 0.3762 | 0.3759    | 26.4389 | 0.9089    | 0.9069 | 0.9077 |
| 1.7833        | 7.0   | 2731 | 1.6819          | 0.4598 | 0.2115 | 0.3764 | 0.376     | 26.3745 | 0.9092    | 0.9071 | 0.9079 |
| 1.7683        | 8.0   | 3120 | 1.6763          | 0.4621 | 0.2133 | 0.3791 | 0.3789    | 26.6204 | 0.9094    | 0.9076 | 0.9083 |
| 1.7559        | 9.0   | 3511 | 1.6662          | 0.4632 | 0.215  | 0.38   | 0.3799    | 26.424  | 0.9098    | 0.9078 | 0.9086 |
| 1.7559        | 10.0  | 3902 | 1.6594          | 0.4651 | 0.2168 | 0.3812 | 0.3812    | 26.5425 | 0.9099    | 0.9082 | 0.9089 |
| 1.7357        | 11.0  | 4293 | 1.6555          | 0.4663 | 0.2178 | 0.3824 | 0.3823    | 26.6051 | 0.91      | 0.9086 | 0.9091 |
| 1.7297        | 11.99 | 4680 | 1.6541          | 0.4665 | 0.2182 | 0.3824 | 0.3824    | 26.5458 | 0.9101    | 0.9085 | 0.9092 |

Framework versions

  • Transformers 4.36.0
  • Pytorch 2.0.1+cu117
  • Datasets 2.14.5
  • Tokenizers 0.15.0