---
license: apache-2.0
base_model: google-t5/t5-base
tags:
  - generated_from_trainer
metrics:
  - rouge
model-index:
  - name: t5-base-finetuned-stocknews_1
    results: []
---

# t5-base-finetuned-stocknews_1

This model is a fine-tuned version of [google-t5/t5-base](https://huggingface.co/google-t5/t5-base) on a stock-news dataset that is not otherwise documented in this card. It achieves the following results on the evaluation set (a usage sketch follows the list):

- Loss: 1.4299
- Rouge1: 31.2675
- Rouge2: 18.3987
- RougeL: 27.1272
- RougeLsum: 28.0372
- Gen Len: 19.0
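
The ROUGE metrics and the fixed generation length (19 tokens) indicate a summarization fine-tune. Below is a minimal usage sketch, assuming the Hub repo id `sujayC66/t5-base-finetuned-stocknews_1` (inferred from the model name; adjust it to the actual repository path):

```python
from transformers import pipeline

# Hypothetical Hub id inferred from the card's model name.
summarizer = pipeline(
    "summarization",
    model="sujayC66/t5-base-finetuned-stocknews_1",
)

# Illustrative input; the card does not include sample data.
article = (
    "Shares of ExampleCorp jumped 8% on Tuesday after the company "
    "raised its full-year revenue guidance and announced a buyback."
)

# max_length=19 mirrors the fixed generation length reported above.
print(summarizer(article, max_length=19)[0]["summary_text"])
```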

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
- mixed_precision_training: Native AMP
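
A minimal `Seq2SeqTrainingArguments` sketch matching these values; the `output_dir`, evaluation strategy, and `predict_with_generate` flag are assumptions, since the card does not show the actual training script:

```python
from transformers import Seq2SeqTrainingArguments

# Sketch of training arguments consistent with the hyperparameters above.
# Adam betas (0.9, 0.999) and epsilon 1e-08 are the defaults, so they are
# not set explicitly here.
training_args = Seq2SeqTrainingArguments(
    output_dir="t5-base-finetuned-stocknews_1",  # hypothetical
    learning_rate=2e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=50,
    fp16=True,                    # "Native AMP" mixed-precision training
    evaluation_strategy="epoch",  # assumption: the results table has one eval per epoch
    predict_with_generate=True,   # assumption: needed to compute ROUGE on generated text
)
```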

### Training results

| Training Loss | Epoch | Step | Validation Loss | Rouge1  | Rouge2  | RougeL  | RougeLsum | Gen Len |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:-------:|:-------:|:---------:|:-------:|
| No log        | 1.0   | 99   | 1.1909          | 26.7564 | 14.0847 | 23.0574 | 24.0225   | 19.0    |
| No log        | 2.0   | 198  | 1.1513          | 26.8525 | 14.3487 | 23.0252 | 24.0357   | 19.0    |
| No log        | 3.0   | 297  | 1.1358          | 27.9251 | 15.4858 | 24.1529 | 25.0564   | 19.0    |
| No log        | 4.0   | 396  | 1.1249          | 28.9647 | 16.322  | 25.1393 | 25.9351   | 19.0    |
| No log        | 5.0   | 495  | 1.1230          | 29.3277 | 16.643  | 25.3965 | 26.3924   | 19.0    |
| 1.1304        | 6.0   | 594  | 1.1257          | 29.3298 | 16.6756 | 25.2931 | 26.3113   | 19.0    |
| 1.1304        | 7.0   | 693  | 1.1274          | 29.8143 | 17.0961 | 25.8392 | 26.7922   | 19.0    |
| 1.1304        | 8.0   | 792  | 1.1349          | 29.7039 | 16.8019 | 25.7436 | 26.7177   | 19.0    |
| 1.1304        | 9.0   | 891  | 1.1398          | 29.7954 | 17.0393 | 25.9506 | 26.6055   | 19.0    |
| 1.1304        | 10.0  | 990  | 1.1436          | 30.2308 | 17.5247 | 26.6431 | 27.2773   | 19.0    |
| 0.8223        | 11.0  | 1089 | 1.1646          | 30.1807 | 17.4666 | 26.4978 | 27.1534   | 19.0    |
| 0.8223        | 12.0  | 1188 | 1.1700          | 30.1808 | 17.7926 | 26.5241 | 27.2625   | 19.0    |
| 0.8223        | 13.0  | 1287 | 1.1811          | 30.5494 | 18.0376 | 26.7185 | 27.5291   | 19.0    |
| 0.8223        | 14.0  | 1386 | 1.1847          | 30.4785 | 18.0418 | 26.8702 | 27.5021   | 19.0    |
| 0.8223        | 15.0  | 1485 | 1.2043          | 30.5933 | 18.3907 | 27.1218 | 27.8091   | 19.0    |
| 0.6312        | 16.0  | 1584 | 1.2219          | 30.5586 | 18.5247 | 26.8513 | 27.6566   | 19.0    |
| 0.6312        | 17.0  | 1683 | 1.2214          | 30.5018 | 18.1947 | 26.9409 | 27.7452   | 19.0    |
| 0.6312        | 18.0  | 1782 | 1.2322          | 30.6322 | 18.1167 | 26.6699 | 27.509    | 19.0    |
| 0.6312        | 19.0  | 1881 | 1.2421          | 31.0753 | 18.5194 | 27.0614 | 27.912    | 19.0    |
| 0.6312        | 20.0  | 1980 | 1.2566          | 30.8549 | 18.3715 | 27.0343 | 27.8685   | 19.0    |
| 0.513         | 21.0  | 2079 | 1.2740          | 30.7621 | 18.5321 | 26.9539 | 27.7937   | 19.0    |
| 0.513         | 22.0  | 2178 | 1.2798          | 31.6185 | 18.7955 | 27.4786 | 28.2485   | 19.0    |
| 0.513         | 23.0  | 2277 | 1.2859          | 31.0127 | 18.438  | 27.0895 | 27.833    | 19.0    |
| 0.513         | 24.0  | 2376 | 1.3103          | 31.4955 | 18.4432 | 27.3754 | 28.1693   | 19.0    |
| 0.513         | 25.0  | 2475 | 1.3260          | 31.6346 | 18.3461 | 27.2447 | 28.1406   | 19.0    |
| 0.4278        | 26.0  | 2574 | 1.3191          | 31.6779 | 18.5516 | 27.5072 | 28.3363   | 19.0    |
| 0.4278        | 27.0  | 2673 | 1.3293          | 31.2316 | 18.2088 | 27.0875 | 27.9376   | 19.0    |
| 0.4278        | 28.0  | 2772 | 1.3313          | 31.2469 | 18.3832 | 27.2194 | 27.9704   | 19.0    |
| 0.4278        | 29.0  | 2871 | 1.3440          | 31.6021 | 18.5638 | 27.328  | 28.2197   | 19.0    |
| 0.4278        | 30.0  | 2970 | 1.3473          | 31.7773 | 18.5585 | 27.5498 | 28.3816   | 19.0    |
| 0.3693        | 31.0  | 3069 | 1.3598          | 31.2278 | 18.5905 | 27.0409 | 27.8962   | 19.0    |
| 0.3693        | 32.0  | 3168 | 1.3686          | 31.0198 | 18.4271 | 26.8683 | 27.9364   | 19.0    |
| 0.3693        | 33.0  | 3267 | 1.3798          | 30.8732 | 18.5114 | 26.9202 | 27.8493   | 19.0    |
| 0.3693        | 34.0  | 3366 | 1.3805          | 31.2322 | 18.7093 | 27.3125 | 28.1878   | 19.0    |
| 0.3693        | 35.0  | 3465 | 1.3870          | 31.0199 | 18.5469 | 27.1357 | 27.9645   | 19.0    |
| 0.3289        | 36.0  | 3564 | 1.3916          | 31.3317 | 18.7421 | 27.3709 | 28.2084   | 19.0    |
| 0.3289        | 37.0  | 3663 | 1.3961          | 31.2699 | 18.7424 | 27.3036 | 28.1781   | 19.0    |
| 0.3289        | 38.0  | 3762 | 1.4041          | 31.0176 | 18.4756 | 27.1868 | 27.9935   | 19.0    |
| 0.3289        | 39.0  | 3861 | 1.4104          | 31.1198 | 18.3739 | 27.1332 | 27.979    | 19.0    |
| 0.3289        | 40.0  | 3960 | 1.4142          | 30.9397 | 18.4267 | 27.1613 | 27.952    | 19.0    |
| 0.2963        | 41.0  | 4059 | 1.4191          | 31.2112 | 18.5405 | 27.2365 | 28.0131   | 19.0    |
| 0.2963        | 42.0  | 4158 | 1.4159          | 31.4348 | 18.6802 | 27.2705 | 28.1629   | 19.0    |
| 0.2963        | 43.0  | 4257 | 1.4217          | 31.3161 | 18.4061 | 27.1797 | 27.9911   | 19.0    |
| 0.2963        | 44.0  | 4356 | 1.4221          | 31.2979 | 18.6064 | 27.2486 | 28.1006   | 19.0    |
| 0.2963        | 45.0  | 4455 | 1.4231          | 31.24   | 18.4439 | 27.1825 | 28.0577   | 19.0    |
| 0.2796        | 46.0  | 4554 | 1.4251          | 31.24   | 18.4439 | 27.1825 | 28.0577   | 19.0    |
| 0.2796        | 47.0  | 4653 | 1.4278          | 31.3015 | 18.4439 | 27.213  | 28.1327   | 19.0    |
| 0.2796        | 48.0  | 4752 | 1.4292          | 31.2708 | 18.3724 | 27.1466 | 28.0132   | 19.0    |
| 0.2796        | 49.0  | 4851 | 1.4297          | 31.2675 | 18.3987 | 27.1272 | 28.0372   | 19.0    |
| 0.2796        | 50.0  | 4950 | 1.4299          | 31.2675 | 18.3987 | 27.1272 | 28.0372   | 19.0    |
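
The ROUGE columns are F-measures scaled to 0-100, as produced by the Hugging Face `evaluate` library. A minimal sketch of how such scores are computed (the texts below are illustrative, not from the training data):

```python
import evaluate

rouge = evaluate.load("rouge")

# Toy prediction/reference pair, purely illustrative.
predictions = ["stocks rally as tech earnings beat expectations"]
references = ["tech stocks rally after earnings beat expectations"]

scores = rouge.compute(predictions=predictions, references=references)
# evaluate returns fractions in [0, 1]; the table reports them scaled by 100.
print({k: round(v * 100, 4) for k, v in scores.items()})
```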

### Framework versions

- Transformers 4.38.1
- Pytorch 2.1.0+cu121
- Datasets 2.18.0
- Tokenizers 0.15.2