
t5-small-cyclonePamKP

This model is a fine-tuned version of t5-small on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 1.1208
  • Rouge1: 49.4993
  • Rouge2: 41.292
  • RougeL: 49.4867
  • RougeLsum: 49.4893
  • Gen Len: 8.6479
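
As a quick reference, here is a minimal inference sketch using the transformers library. The model id comes from this repository; the input text, generation length, and use of the Auto* classes are illustrative assumptions, not part of the original card.

```python
# Minimal inference sketch (assumed usage): load the fine-tuned checkpoint
# and generate a short output. Input text and max_new_tokens are illustrative.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_id = "rizvi-rahil786/t5-small-cyclonePamKP"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

text = "Cyclone Pam caused widespread damage across Vanuatu in March 2015."
inputs = tokenizer(text, return_tensors="pt", truncation=True)

# The evaluation Gen Len averages ~8.6 tokens, so short outputs are expected.
output_ids = model.generate(**inputs, max_new_tokens=16)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```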

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 4
  • eval_batch_size: 4
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 8
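
For orientation, a minimal sketch of how these settings might be expressed as transformers Seq2SeqTrainingArguments; output_dir, the per-epoch evaluation strategy, and predict_with_generate are assumptions inferred from the per-epoch results below, not values recorded in the card.

```python
# Sketch of the reported hyperparameters as Seq2SeqTrainingArguments.
# output_dir, evaluation_strategy, and predict_with_generate are assumptions.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="t5-small-cyclonePamKP",  # assumed
    learning_rate=5e-05,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    seed=42,
    adam_beta1=0.9,               # Adam betas/epsilon as reported in the card
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="linear",
    num_train_epochs=8,
    evaluation_strategy="epoch",  # assumed: results are reported per epoch
    predict_with_generate=True,   # assumed: needed to compute ROUGE / Gen Len
)
```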

Training results

| Training Loss | Epoch | Step  | Validation Loss | Rouge1  | Rouge2  | RougeL  | RougeLsum | Gen Len |
|:-------------:|:-----:|:-----:|:---------------:|:-------:|:-------:|:-------:|:---------:|:-------:|
| 1.197         | 1.0   | 6199  | 1.2715          | 48.7826 | 40.9132 | 48.7107 | 48.673    | 8.8249  |
| 1.0007        | 2.0   | 12398 | 1.1895          | 49.2848 | 41.2383 | 49.2157 | 49.2427   | 8.8057  |
| 0.9704        | 3.0   | 18597 | 1.1555          | 48.6181 | 40.4911 | 48.6226 | 48.5723   | 8.3833  |
| 0.8767        | 4.0   | 24796 | 1.1741          | 50.095  | 41.7648 | 50.0118 | 50.0238   | 8.6996  |
| 0.8609        | 5.0   | 30995 | 1.1208          | 49.4993 | 41.292  | 49.4867 | 49.4893   | 8.6479  |
| 0.8116        | 6.0   | 37194 | 1.1342          | 48.6285 | 40.1426 | 48.6128 | 48.588    | 8.5086  |
| 0.7775        | 7.0   | 43393 | 1.1344          | 48.8308 | 40.1481 | 48.7914 | 48.7812   | 8.4947  |
| 0.7684        | 8.0   | 49592 | 1.1341          | 49.1873 | 40.4318 | 49.1699 | 49.1436   | 8.5676  |
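
The ROUGE columns are standard generation metrics; a minimal sketch of how such scores are typically computed with the evaluate library is shown below (the predictions and references are placeholders, not data from this model).

```python
# Sketch of computing Rouge1/Rouge2/RougeL/RougeLsum as in the table above.
# Predictions and references here are placeholders.
import evaluate

rouge = evaluate.load("rouge")
predictions = ["cyclone pam vanuatu"]             # decoded model outputs (placeholder)
references = ["cyclone pam hit vanuatu in 2015"]  # gold targets (placeholder)

scores = rouge.compute(predictions=predictions, references=references, use_stemmer=True)
print({k: round(v * 100, 4) for k, v in scores.items()})  # card reports scores on a 0-100 scale

# Gen Len is typically the mean token length of the generated sequences.
```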

Framework versions

  • Transformers 4.39.3
  • Pytorch 2.2.1+cu121
  • Datasets 2.18.0
  • Tokenizers 0.15.2

Model tree for rizvi-rahil786/t5-small-cyclonePamKP

  • Base model: google-t5/t5-small