
t5-base-finetuned-DEPlain

This model is a fine-tuned version of t5-base. The training data is not documented here, but the model name suggests the DEplain text-simplification dataset. It achieves the following results on the evaluation set:

  • Loss: 1.1801
  • Rouge1: 56.7543
  • Rouge2: 34.5465
  • Rougel: 50.3496
  • Rougelsum: 51.2324
  • Gen Len: 16.8188
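The ROUGE scores above measure n-gram overlap between generated outputs and references (the card does not say which library computed them; the HF `evaluate`/`rouge_score` stack is the usual choice). As an illustration only, here is a minimal pure-Python sketch of ROUGE-1 F1, the unigram variant of the first metric:

```python
from collections import Counter

def rouge1_f1(candidate: str, reference: str) -> float:
    """Unigram-overlap F1 between a candidate and a reference string."""
    cand = Counter(candidate.lower().split())
    ref = Counter(reference.lower().split())
    overlap = sum((cand & ref).values())  # clipped unigram matches
    if overlap == 0:
        return 0.0
    precision = overlap / sum(cand.values())
    recall = overlap / sum(ref.values())
    return 2 * precision * recall / (precision + recall)
```

For example, `rouge1_f1("the cat sat", "the cat sat on the mat")` gives precision 1.0 and recall 0.5, hence F1 ≈ 0.667. Rouge2 and RougeL work analogously on bigrams and longest common subsequences.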

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 20
  • mixed_precision_training: Native AMP
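With a linear scheduler and no listed warmup (assumed 0 here), the learning rate decays from 2e-05 to 0 over the run's 13,340 optimizer steps (667 steps per epoch × 20 epochs, matching the results table below). A minimal sketch of that schedule:

```python
def linear_lr(step: int, base_lr: float = 2e-05, total_steps: int = 13340) -> float:
    """Linear decay from base_lr at step 0 down to 0 at total_steps."""
    return base_lr * max(0.0, (total_steps - step) / total_steps)

# Full rate at the start, half the rate at the midpoint, 0 at the end.
```

In practice this is what `lr_scheduler_type: linear` in the Transformers `Trainer` produces via `get_linear_schedule_with_warmup` when warmup is 0.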

Training results

| Training Loss | Epoch | Step  | Validation Loss | Rouge1  | Rouge2  | Rougel  | Rougelsum | Gen Len |
|---------------|-------|-------|-----------------|---------|---------|---------|-----------|---------|
| 1.554         | 1.0   | 667   | 1.3293          | 56.1265 | 33.8636 | 49.5938 | 50.4918   | 16.84   |
| 1.4507        | 2.0   | 1334  | 1.2801          | 56.3814 | 34.0926 | 49.8536 | 50.7672   | 16.9245 |
| 1.3372        | 3.0   | 2001  | 1.2520          | 56.3402 | 33.9261 | 49.7162 | 50.6615   | 16.892  |
| 1.2872        | 4.0   | 2668  | 1.2327          | 56.3268 | 33.963  | 49.7334 | 50.6969   | 16.9155 |
| 1.2636        | 5.0   | 3335  | 1.2176          | 56.4841 | 33.8839 | 49.7693 | 50.7099   | 16.8708 |
| 1.2075        | 6.0   | 4002  | 1.2100          | 56.5246 | 34.1424 | 49.8971 | 50.8385   | 16.8457 |
| 1.1809        | 7.0   | 4669  | 1.2013          | 56.5925 | 34.0925 | 49.9624 | 50.9224   | 16.8091 |
| 1.1611        | 8.0   | 5336  | 1.1959          | 56.7085 | 34.2704 | 50.1433 | 51.0436   | 16.8067 |
| 1.1331        | 9.0   | 6003  | 1.1922          | 56.7095 | 34.0918 | 50.0821 | 51.0102   | 16.8107 |
| 1.1047        | 10.0  | 6670  | 1.1864          | 56.7457 | 34.2806 | 50.19   | 51.1058   | 16.814  |
| 1.1056        | 11.0  | 7337  | 1.1852          | 56.673  | 34.3557 | 50.2595 | 51.1949   | 16.8424 |
| 1.0808        | 12.0  | 8004  | 1.1847          | 56.7362 | 34.4604 | 50.316  | 51.2366   | 16.801  |
| 1.0549        | 13.0  | 8671  | 1.1812          | 56.6744 | 34.4499 | 50.2533 | 51.1119   | 16.8123 |
| 1.0677        | 14.0  | 9338  | 1.1825          | 56.7276 | 34.4141 | 50.235  | 51.1764   | 16.8058 |
| 1.0481        | 15.0  | 10005 | 1.1797          | 56.869  | 34.6091 | 50.4321 | 51.3106   | 16.8058 |
| 1.0368        | 16.0  | 10672 | 1.1807          | 56.7085 | 34.4924 | 50.3168 | 51.2012   | 16.8262 |
| 1.035         | 17.0  | 11339 | 1.1809          | 56.6515 | 34.4276 | 50.2845 | 51.1447   | 16.7904 |
| 1.0272        | 18.0  | 12006 | 1.1802          | 56.6906 | 34.5219 | 50.3562 | 51.2191   | 16.8172 |
| 1.0201        | 19.0  | 12673 | 1.1799          | 56.6978 | 34.4779 | 50.2927 | 51.1814   | 16.8188 |
| 1.0122        | 20.0  | 13340 | 1.1801          | 56.7543 | 34.5465 | 50.3496 | 51.2324   | 16.8188 |
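Validation loss bottoms out at epoch 15 (1.1797) and is essentially flat afterwards, so the final checkpoint is not quite the best one by that metric. A quick sketch of picking the best epoch from a few (epoch, validation-loss) pairs copied from the tail of the table above:

```python
# (epoch, validation_loss) pairs copied from the last rows of the table.
logs = [(14, 1.1825), (15, 1.1797), (16, 1.1807), (17, 1.1809),
        (18, 1.1802), (19, 1.1799), (20, 1.1801)]

# Pick the epoch with the lowest validation loss.
best_epoch, best_loss = min(logs, key=lambda pair: pair[1])
print(best_epoch, best_loss)  # epoch 15, loss 1.1797
```

With the `Trainer`, `load_best_model_at_end=True` plus `metric_for_best_model` automates this selection during training.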

Framework versions

  • Transformers 4.35.2
  • Pytorch 2.1.0+cu121
  • Datasets 2.17.0
  • Tokenizers 0.15.1
Model size: 223M params (F32, Safetensors)

Model tree for jonathandechert/t5-base-finetuned-DEPlain

  • Base model: google-t5/t5-base