
pakawadeep/mt5-small-finetuned-ctfl-augmented_05

This model is a fine-tuned version of google/mt5-small on an unknown dataset. It achieves the following results after the final training epoch:

  • Train Loss: 0.8100
  • Validation Loss: 0.9101
  • Train Rouge1: 7.9562
  • Train Rouge2: 1.3861
  • Train Rougel: 7.9562
  • Train Rougelsum: 7.9562
  • Train Gen Len: 11.9851
  • Epoch: 29
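
The checkpoint can be loaded with the TensorFlow classes from Transformers. The snippet below is a minimal inference sketch, not part of the original card; the input string is a placeholder rather than an example from the training data.

```python
# Minimal inference sketch (assumes the TF weights published in this repo).
from transformers import AutoTokenizer, TFAutoModelForSeq2SeqLM

model_id = "pakawadeep/mt5-small-finetuned-ctfl-augmented_05"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = TFAutoModelForSeq2SeqLM.from_pretrained(model_id)

inputs = tokenizer("your input text here", return_tensors="tf")  # placeholder input
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```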

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 2e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
  • training_precision: float32
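
Restated as code, the optimizer settings above map onto the AdamWeightDecay class that ships with Transformers; the following is a minimal sketch under that assumption, not the original training script.

```python
# Sketch of the reported optimizer settings using transformers.AdamWeightDecay
# (assumed from the optimizer name above; not the original training script).
from transformers import AdamWeightDecay

optimizer = AdamWeightDecay(
    learning_rate=2e-05,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-07,
    amsgrad=False,
    weight_decay_rate=0.01,
)
# The model would then be compiled Keras-style, e.g. model.compile(optimizer=optimizer)
```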

Training results

| Train Loss | Validation Loss | Train Rouge1 | Train Rouge2 | Train Rougel | Train Rougelsum | Train Gen Len | Epoch |
|:----------:|:---------------:|:------------:|:------------:|:------------:|:---------------:|:-------------:|:-----:|
| 9.2952 | 2.6353 | 1.5618 | 0.0 | 1.5117 | 1.5288 | 16.5347 | 0 |
| 4.7507 | 1.8159 | 5.5776 | 0.2888 | 5.5611 | 5.5281 | 12.2475 | 1 |
| 3.4617 | 1.8004 | 4.7218 | 0.2888 | 4.7218 | 4.6723 | 11.2723 | 2 |
| 2.8272 | 1.7197 | 6.1410 | 0.8251 | 6.1410 | 6.0113 | 11.1634 | 3 |
| 2.4003 | 1.6328 | 7.7086 | 2.1782 | 7.7086 | 7.7086 | 11.7277 | 4 |
| 2.0952 | 1.5374 | 8.2037 | 2.1782 | 8.2037 | 8.2037 | 11.8713 | 5 |
| 1.8634 | 1.4405 | 8.2037 | 2.1782 | 8.2037 | 8.2037 | 11.9406 | 6 |
| 1.6782 | 1.3615 | 8.2037 | 2.1782 | 8.2037 | 8.2037 | 11.9307 | 7 |
| 1.5333 | 1.3046 | 8.6987 | 2.1782 | 8.6987 | 8.6987 | 11.9356 | 8 |
| 1.4151 | 1.2718 | 8.6987 | 2.1782 | 8.6987 | 8.6987 | 11.9455 | 9 |
| 1.3320 | 1.2373 | 8.6987 | 2.1782 | 8.6987 | 8.6987 | 11.9406 | 10 |
| 1.2664 | 1.2064 | 8.9816 | 2.3762 | 8.9109 | 8.9816 | 11.9158 | 11 |
| 1.2220 | 1.1775 | 8.9816 | 2.3762 | 8.9109 | 8.9816 | 11.9356 | 12 |
| 1.1767 | 1.1413 | 8.9816 | 2.3762 | 8.9109 | 8.9816 | 11.9554 | 13 |
| 1.1303 | 1.1072 | 8.9816 | 2.3762 | 8.9109 | 8.9816 | 11.9505 | 14 |
| 1.0972 | 1.0782 | 8.6987 | 1.7822 | 8.6987 | 8.6987 | 11.9703 | 15 |
| 1.0664 | 1.0578 | 8.4866 | 1.3861 | 8.4512 | 8.4866 | 11.9554 | 16 |
| 1.0392 | 1.0504 | 8.4866 | 1.3861 | 8.4512 | 8.4866 | 11.9752 | 17 |
| 1.0152 | 1.0242 | 8.4866 | 1.3861 | 8.4512 | 8.4866 | 11.9653 | 18 |
| 0.9892 | 1.0193 | 8.4866 | 1.3861 | 8.4512 | 8.4866 | 11.9703 | 19 |
| 0.9627 | 0.9990 | 8.4866 | 1.3861 | 8.4512 | 8.4866 | 11.9802 | 20 |
| 0.9460 | 0.9869 | 8.4866 | 1.3861 | 8.4512 | 8.4866 | 11.9752 | 21 |
| 0.9262 | 0.9735 | 7.9562 | 1.3861 | 7.9562 | 7.9562 | 11.9703 | 22 |
| 0.9083 | 0.9637 | 8.4866 | 1.3861 | 8.4512 | 8.4866 | 11.9554 | 23 |
| 0.8924 | 0.9525 | 8.4866 | 1.3861 | 8.4512 | 8.4866 | 11.9653 | 24 |
| 0.8749 | 0.9622 | 8.4866 | 1.3861 | 8.4512 | 8.4866 | 11.9703 | 25 |
| 0.8588 | 0.9417 | 8.4866 | 1.3861 | 8.4512 | 8.4866 | 11.9554 | 26 |
| 0.8421 | 0.9427 | 8.4866 | 1.3861 | 8.4512 | 8.4866 | 11.9653 | 27 |
| 0.8249 | 0.9339 | 7.9562 | 1.3861 | 7.9562 | 7.9562 | 11.9703 | 28 |
| 0.8100 | 0.9101 | 7.9562 | 1.3861 | 7.9562 | 7.9562 | 11.9851 | 29 |
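
The ROUGE columns are reported on a 0-100 scale. The exact evaluation code is not documented in this card; a common way to compute comparable scores is the evaluate library, sketched below under that assumption.

```python
# Sketch of ROUGE scoring with the `evaluate` library (assumed; not the
# original evaluation code for this model).
import evaluate

rouge = evaluate.load("rouge")
predictions = ["generated output text"]   # model outputs (placeholders)
references = ["reference target text"]    # gold targets (placeholders)

scores = rouge.compute(predictions=predictions, references=references)
# evaluate returns fractions in [0, 1]; multiply by 100 to match the table scale.
print({k: round(v * 100, 4) for k, v in scores.items()})
```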

Framework versions

  • Transformers 4.41.2
  • TensorFlow 2.15.0
  • Datasets 2.20.0
  • Tokenizers 0.19.1
