ingredient_prune

This model is a fine-tuned version of google-t5/t5-base (~223M parameters) on an unspecified dataset. It achieves the following results on the evaluation set (a sketch of how such scores are typically computed follows the list):

  • Loss: 0.0255
  • Rouge1: 88.3061
  • Rouge2: 76.6099
  • RougeL: 88.3242
  • RougeLsum: 88.2429
  • Gen Len: 10.5872
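
The card does not include the evaluation code. As a point of reference, metric names of this form (rouge1/rouge2/rougeL/rougeLsum) match the output of the Hugging Face evaluate library; a minimal sketch, with hypothetical inputs:

```python
import evaluate

# Hypothetical predictions/references: the actual evaluation data is not
# documented in this card.
predictions = ["fresh basil, chopped"]
references = ["fresh basil"]

rouge = evaluate.load("rouge")
scores = rouge.compute(predictions=predictions, references=references)

# `evaluate` returns scores in [0, 1]; the card appears to report them
# scaled by 100.
print(scores)  # keys: rouge1, rouge2, rougeL, rougeLsum
```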

Model description

More information needed

Intended uses & limitations

More information needed
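
Although no usage guidance is provided, the checkpoint can be loaded with the standard transformers text2text pipeline. A minimal sketch; the input string is hypothetical, since the expected input format is not documented:

```python
from transformers import pipeline

# Load the fine-tuned checkpoint from the Hub.
pruner = pipeline("text2text-generation", model="AkshayPM/ingredient_prune")

# Hypothetical input: the model name suggests pruning a raw ingredient
# phrase down to its core ingredient, but the format is not documented.
result = pruner("2 cups finely chopped fresh basil leaves", max_new_tokens=32)
print(result[0]["generated_text"])
```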

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a code reconstruction follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 4
  • eval_batch_size: 4
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 5
  • mixed_precision_training: Native AMP
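
For reference, these settings correspond roughly to the Seq2SeqTrainingArguments sketch below. This is a reconstruction from the reported values, not the actual training script; output_dir and the evaluation cadence (every 10 steps, per the results table) are inferred:

```python
from transformers import Seq2SeqTrainingArguments

# Reconstruction from the reported hyperparameters; all other settings
# (output_dir, logging, saving) are assumptions. The Adam betas/epsilon
# listed above are the transformers defaults, so they need no explicit args.
args = Seq2SeqTrainingArguments(
    output_dir="ingredient_prune",
    learning_rate=2e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=5,
    fp16=True,                    # "Native AMP" mixed precision
    evaluation_strategy="steps",  # the results table evaluates every 10 steps
    eval_steps=10,
    predict_with_generate=True,   # needed to report ROUGE / Gen Len during eval
)
```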

Training results

| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | RougeL | RougeLsum | Gen Len |
|---|---|---|---|---|---|---|---|---|
| 2.9499 | 0.09 | 10 | 1.3100 | 33.1645 | 23.9561 | 32.6647 | 32.7137 | 14.7431 |
| 1.9454 | 0.18 | 20 | 0.6787 | 30.1119 | 21.203 | 29.5079 | 29.6061 | 13.8349 |
| 1.309 | 0.28 | 30 | 0.5147 | 25.3399 | 17.694 | 24.4102 | 24.4425 | 11.6514 |
| 1.0307 | 0.37 | 40 | 0.4398 | 17.4522 | 11.66 | 16.2846 | 16.3817 | 8.5413 |
| 0.9574 | 0.46 | 50 | 0.4302 | 16.6745 | 10.6799 | 15.8568 | 16.4301 | 8.0092 |
| 0.7183 | 0.55 | 60 | 0.3818 | 14.4343 | 9.4646 | 13.9825 | 14.1979 | 6.9725 |
| 0.5636 | 0.64 | 70 | 0.3096 | 9.4156 | 5.2844 | 9.0143 | 9.239 | 5.5596 |
| 0.4603 | 0.73 | 80 | 0.2664 | 8.6106 | 4.7574 | 7.9285 | 8.4429 | 5.0917 |
| 0.4607 | 0.83 | 90 | 0.2319 | 6.7868 | 3.9309 | 6.1844 | 6.7007 | 3.8349 |
| 0.352 | 0.92 | 100 | 0.1991 | 6.2965 | 3.5572 | 5.3616 | 5.9941 | 3.2661 |
| 0.3426 | 1.01 | 110 | 0.1735 | 6.1795 | 3.1174 | 5.3783 | 5.9261 | 3.3119 |
| 0.2901 | 1.1 | 120 | 0.1553 | 5.5031 | 2.739 | 4.9926 | 5.5079 | 3.1376 |
| 0.3619 | 1.19 | 130 | 0.1452 | 4.1403 | 1.8462 | 4.0877 | 4.1877 | 3.0092 |
| 0.2509 | 1.28 | 140 | 0.1338 | 4.1399 | 1.8019 | 3.9836 | 4.1506 | 2.9541 |
| 0.1938 | 1.38 | 150 | 0.1187 | 2.9515 | 1.2174 | 2.7845 | 3.0192 | 2.2569 |
| 0.1987 | 1.47 | 160 | 0.1068 | 4.8991 | 3.4459 | 4.7552 | 4.9489 | 2.1284 |
| 0.1702 | 1.56 | 170 | 0.0983 | 8.7082 | 5.5788 | 8.5531 | 8.8267 | 3.4587 |
| 0.1535 | 1.65 | 180 | 0.0871 | 11.5572 | 7.6669 | 11.4688 | 11.5381 | 4.6972 |
| 0.1629 | 1.74 | 190 | 0.0771 | 16.33 | 11.587 | 16.0842 | 16.1965 | 6.6055 |
| 0.1618 | 1.83 | 200 | 0.0690 | 21.4186 | 14.9296 | 21.2789 | 21.2002 | 8.367 |
| 0.1617 | 1.93 | 210 | 0.0628 | 27.6198 | 19.8907 | 27.4479 | 27.4515 | 10.3394 |
| 0.1136 | 2.02 | 220 | 0.0572 | 36.7416 | 28.2903 | 36.7181 | 36.719 | 12.3578 |
| 0.1278 | 2.11 | 230 | 0.0526 | 46.9007 | 36.6481 | 47.1002 | 46.8623 | 13.7064 |
| 0.0915 | 2.2 | 240 | 0.0486 | 56.1238 | 45.5624 | 56.3372 | 56.0369 | 14.1101 |
| 0.0736 | 2.29 | 250 | 0.0448 | 63.3857 | 51.8889 | 63.6163 | 63.2685 | 13.4771 |
| 0.0855 | 2.39 | 260 | 0.0420 | 72.669 | 59.9359 | 72.7393 | 72.6055 | 12.3486 |
| 0.0921 | 2.48 | 270 | 0.0388 | 78.2224 | 65.2581 | 78.2789 | 77.9532 | 11.3578 |
| 0.077 | 2.57 | 280 | 0.0364 | 82.3881 | 68.397 | 82.4999 | 82.3175 | 10.5872 |
| 0.0848 | 2.66 | 290 | 0.0347 | 85.4014 | 72.793 | 85.495 | 85.3917 | 10.633 |
| 0.0978 | 2.75 | 300 | 0.0332 | 86.0947 | 72.9678 | 86.1325 | 86.0028 | 10.5138 |
| 0.0635 | 2.84 | 310 | 0.0323 | 86.158 | 73.833 | 86.2727 | 86.1471 | 10.5596 |
| 0.0555 | 2.94 | 320 | 0.0314 | 86.0306 | 73.8297 | 86.0421 | 85.9571 | 10.5688 |
| 0.0792 | 3.03 | 330 | 0.0305 | 87.5066 | 75.3885 | 87.6496 | 87.3874 | 10.3761 |
| 0.0536 | 3.12 | 340 | 0.0297 | 88.0844 | 75.8754 | 88.1956 | 87.9164 | 10.4954 |
| 0.063 | 3.21 | 350 | 0.0290 | 88.0844 | 75.8754 | 88.1956 | 87.9164 | 10.4954 |
| 0.0563 | 3.3 | 360 | 0.0283 | 88.0783 | 75.989 | 88.2233 | 87.9578 | 10.5138 |
| 0.0547 | 3.39 | 370 | 0.0279 | 88.1265 | 76.3196 | 88.3078 | 88.0765 | 10.6147 |
| 0.0635 | 3.49 | 380 | 0.0275 | 86.9846 | 74.8237 | 87.0556 | 86.9021 | 10.5872 |
| 0.0835 | 3.58 | 390 | 0.0271 | 86.933 | 75.3277 | 87.0357 | 86.931 | 10.6147 |
| 0.0628 | 3.67 | 400 | 0.0269 | 87.5981 | 75.5811 | 87.6905 | 87.4594 | 10.6789 |
| 0.0554 | 3.76 | 410 | 0.0267 | 88.0124 | 76.5633 | 88.174 | 87.9292 | 10.578 |
| 0.0342 | 3.85 | 420 | 0.0266 | 88.0124 | 76.5633 | 88.174 | 87.9292 | 10.578 |
| 0.0396 | 3.94 | 430 | 0.0263 | 88.0064 | 76.6947 | 88.1712 | 87.9434 | 10.5872 |
| 0.045 | 4.04 | 440 | 0.0262 | 87.7466 | 76.3605 | 87.8932 | 87.6273 | 10.5505 |
| 0.0566 | 4.13 | 450 | 0.0262 | 87.8577 | 76.5633 | 88.0399 | 87.7835 | 10.6055 |
| 0.0582 | 4.22 | 460 | 0.0261 | 87.8103 | 76.1351 | 87.9277 | 87.7032 | 10.6697 |
| 0.051 | 4.31 | 470 | 0.0260 | 87.8103 | 76.1351 | 87.9277 | 87.7032 | 10.6697 |
| 0.0398 | 4.4 | 480 | 0.0258 | 88.1974 | 76.4006 | 88.2158 | 88.0622 | 10.6789 |
| 0.0364 | 4.5 | 490 | 0.0257 | 88.3353 | 76.5513 | 88.3291 | 88.2557 | 10.633 |
| 0.0498 | 4.59 | 500 | 0.0257 | 88.4083 | 76.5513 | 88.4132 | 88.35 | 10.6147 |
| 0.0406 | 4.68 | 510 | 0.0256 | 88.3061 | 76.6099 | 88.3242 | 88.2429 | 10.5872 |
| 0.0403 | 4.77 | 520 | 0.0256 | 88.3061 | 76.6099 | 88.3242 | 88.2429 | 10.5872 |
| 0.0421 | 4.86 | 530 | 0.0255 | 88.3061 | 76.6099 | 88.3242 | 88.2429 | 10.5872 |
| 0.0271 | 4.95 | 540 | 0.0255 | 88.3061 | 76.6099 | 88.3242 | 88.2429 | 10.5872 |

Framework versions

  • Transformers 4.38.2
  • Pytorch 2.1.2
  • Datasets 2.1.0
  • Tokenizers 0.15.2
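
To match the environment above, the same versions can be pinned (e.g. `pip install transformers==4.38.2 torch==2.1.2 datasets==2.1.0 tokenizers==0.15.2`) and verified at runtime; a small sketch:

```python
from importlib.metadata import version

# Versions reported in this card.
expected = {
    "transformers": "4.38.2",
    "torch": "2.1.2",
    "datasets": "2.1.0",
    "tokenizers": "0.15.2",
}
for pkg, want in expected.items():
    print(f"{pkg}: installed {version(pkg)}, card reports {want}")
```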
