AkshayPM committed
Commit 24ef7ed
Parent: aa5eadc

End of training

README.md CHANGED
@@ -1,8 +1,10 @@
 ---
 license: apache-2.0
-base_model: google/flan-t5-base
+base_model: t5-base
 tags:
 - generated_from_trainer
+metrics:
+- rouge
 model-index:
 - name: ingredient_prune
   results: []
@@ -13,9 +15,14 @@ should probably proofread and complete it, then remove this comment. -->
 
 # ingredient_prune
 
-This model is a fine-tuned version of [google/flan-t5-base](https://huggingface.co/google/flan-t5-base) on the None dataset.
+This model is a fine-tuned version of [t5-base](https://huggingface.co/t5-base) on the None dataset.
 It achieves the following results on the evaluation set:
-- Loss: 0.0140
+- Loss: 0.0255
+- Rouge1: 88.3061
+- Rouge2: 76.6099
+- Rougel: 88.3242
+- Rougelsum: 88.2429
+- Gen Len: 10.5872
 
 ## Model description
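The evaluation block above adds ROUGE scores and a mean generation length alongside the loss; these are the standard `evaluate` ROUGE metrics reported on a 0-100 scale. The repo's metric code is not part of this diff, so the following `compute_metrics` is only a sketch in the style of the stock Hugging Face summarization scripts, and every name in it is an assumption:

```python
# Hypothetical compute_metrics in the style of HF's run_summarization.py;
# this repo's actual metric code is not shown in the diff.
import numpy as np
import evaluate
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("t5-base")  # stand-in tokenizer
rouge = evaluate.load("rouge")

def compute_metrics(eval_preds):
    preds, labels = eval_preds
    # Trainer pads labels (and sometimes preds) with -100; restore pad ids
    # before decoding.
    preds = np.where(preds != -100, preds, tokenizer.pad_token_id)
    labels = np.where(labels != -100, labels, tokenizer.pad_token_id)
    decoded_preds = tokenizer.batch_decode(preds, skip_special_tokens=True)
    decoded_labels = tokenizer.batch_decode(labels, skip_special_tokens=True)
    result = rouge.compute(predictions=decoded_preds,
                           references=decoded_labels, use_stemmer=True)
    # The card reports ROUGE on a 0-100 scale.
    result = {k: round(v * 100, 4) for k, v in result.items()}
    # "Gen Len" is the mean generated sequence length in tokens.
    result["gen_len"] = np.mean(
        [np.count_nonzero(p != tokenizer.pad_token_id) for p in preds])
    return result
```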
 
@@ -35,42 +42,72 @@ More information needed
 
 The following hyperparameters were used during training:
 - learning_rate: 2e-05
-- train_batch_size: 8
-- eval_batch_size: 8
+- train_batch_size: 4
+- eval_batch_size: 4
 - seed: 42
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: linear
-- num_epochs: 20
+- num_epochs: 5
 - mixed_precision_training: Native AMP
 
 ### Training results
 
-| Training Loss | Epoch | Step | Validation Loss |
-|:-------------:|:-----:|:----:|:---------------:|
-| 17.212 | 0.82 | 100 | 4.0184 |
-| 2.4003 | 1.64 | 200 | 0.2977 |
-| 0.3173 | 2.46 | 300 | 0.0432 |
-| 0.068 | 3.28 | 400 | 0.0203 |
-| 0.0356 | 4.1 | 500 | 0.0167 |
-| 0.026 | 4.92 | 600 | 0.0145 |
-| 0.0194 | 5.74 | 700 | 0.0138 |
-| 0.0179 | 6.56 | 800 | 0.0132 |
-| 0.0154 | 7.38 | 900 | 0.0134 |
-| 0.0133 | 8.2 | 1000 | 0.0134 |
-| 0.0125 | 9.02 | 1100 | 0.0132 |
-| 0.013 | 9.84 | 1200 | 0.0131 |
-| 0.011 | 10.66 | 1300 | 0.0136 |
-| 0.0106 | 11.48 | 1400 | 0.0133 |
-| 0.0098 | 12.3 | 1500 | 0.0133 |
-| 0.0093 | 13.11 | 1600 | 0.0134 |
-| 0.009 | 13.93 | 1700 | 0.0138 |
-| 0.0085 | 14.75 | 1800 | 0.0141 |
-| 0.0083 | 15.57 | 1900 | 0.0136 |
-| 0.0083 | 16.39 | 2000 | 0.0139 |
-| 0.0082 | 17.21 | 2100 | 0.0139 |
-| 0.0083 | 18.03 | 2200 | 0.0139 |
-| 0.0076 | 18.85 | 2300 | 0.0140 |
-| 0.0085 | 19.67 | 2400 | 0.0140 |
+| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum | Gen Len |
+|:-------------:|:-----:|:----:|:---------------:|:-------:|:-------:|:-------:|:---------:|:-------:|
+| 2.9499 | 0.09 | 10 | 1.3100 | 33.1645 | 23.9561 | 32.6647 | 32.7137 | 14.7431 |
+| 1.9454 | 0.18 | 20 | 0.6787 | 30.1119 | 21.203 | 29.5079 | 29.6061 | 13.8349 |
+| 1.309 | 0.28 | 30 | 0.5147 | 25.3399 | 17.694 | 24.4102 | 24.4425 | 11.6514 |
+| 1.0307 | 0.37 | 40 | 0.4398 | 17.4522 | 11.66 | 16.2846 | 16.3817 | 8.5413 |
+| 0.9574 | 0.46 | 50 | 0.4302 | 16.6745 | 10.6799 | 15.8568 | 16.4301 | 8.0092 |
+| 0.7183 | 0.55 | 60 | 0.3818 | 14.4343 | 9.4646 | 13.9825 | 14.1979 | 6.9725 |
+| 0.5636 | 0.64 | 70 | 0.3096 | 9.4156 | 5.2844 | 9.0143 | 9.239 | 5.5596 |
+| 0.4603 | 0.73 | 80 | 0.2664 | 8.6106 | 4.7574 | 7.9285 | 8.4429 | 5.0917 |
+| 0.4607 | 0.83 | 90 | 0.2319 | 6.7868 | 3.9309 | 6.1844 | 6.7007 | 3.8349 |
+| 0.352 | 0.92 | 100 | 0.1991 | 6.2965 | 3.5572 | 5.3616 | 5.9941 | 3.2661 |
+| 0.3426 | 1.01 | 110 | 0.1735 | 6.1795 | 3.1174 | 5.3783 | 5.9261 | 3.3119 |
+| 0.2901 | 1.1 | 120 | 0.1553 | 5.5031 | 2.739 | 4.9926 | 5.5079 | 3.1376 |
+| 0.3619 | 1.19 | 130 | 0.1452 | 4.1403 | 1.8462 | 4.0877 | 4.1877 | 3.0092 |
+| 0.2509 | 1.28 | 140 | 0.1338 | 4.1399 | 1.8019 | 3.9836 | 4.1506 | 2.9541 |
+| 0.1938 | 1.38 | 150 | 0.1187 | 2.9515 | 1.2174 | 2.7845 | 3.0192 | 2.2569 |
+| 0.1987 | 1.47 | 160 | 0.1068 | 4.8991 | 3.4459 | 4.7552 | 4.9489 | 2.1284 |
+| 0.1702 | 1.56 | 170 | 0.0983 | 8.7082 | 5.5788 | 8.5531 | 8.8267 | 3.4587 |
+| 0.1535 | 1.65 | 180 | 0.0871 | 11.5572 | 7.6669 | 11.4688 | 11.5381 | 4.6972 |
+| 0.1629 | 1.74 | 190 | 0.0771 | 16.33 | 11.587 | 16.0842 | 16.1965 | 6.6055 |
+| 0.1618 | 1.83 | 200 | 0.0690 | 21.4186 | 14.9296 | 21.2789 | 21.2002 | 8.367 |
+| 0.1617 | 1.93 | 210 | 0.0628 | 27.6198 | 19.8907 | 27.4479 | 27.4515 | 10.3394 |
+| 0.1136 | 2.02 | 220 | 0.0572 | 36.7416 | 28.2903 | 36.7181 | 36.719 | 12.3578 |
+| 0.1278 | 2.11 | 230 | 0.0526 | 46.9007 | 36.6481 | 47.1002 | 46.8623 | 13.7064 |
+| 0.0915 | 2.2 | 240 | 0.0486 | 56.1238 | 45.5624 | 56.3372 | 56.0369 | 14.1101 |
+| 0.0736 | 2.29 | 250 | 0.0448 | 63.3857 | 51.8889 | 63.6163 | 63.2685 | 13.4771 |
+| 0.0855 | 2.39 | 260 | 0.0420 | 72.669 | 59.9359 | 72.7393 | 72.6055 | 12.3486 |
+| 0.0921 | 2.48 | 270 | 0.0388 | 78.2224 | 65.2581 | 78.2789 | 77.9532 | 11.3578 |
+| 0.077 | 2.57 | 280 | 0.0364 | 82.3881 | 68.397 | 82.4999 | 82.3175 | 10.5872 |
+| 0.0848 | 2.66 | 290 | 0.0347 | 85.4014 | 72.793 | 85.495 | 85.3917 | 10.633 |
+| 0.0978 | 2.75 | 300 | 0.0332 | 86.0947 | 72.9678 | 86.1325 | 86.0028 | 10.5138 |
+| 0.0635 | 2.84 | 310 | 0.0323 | 86.158 | 73.833 | 86.2727 | 86.1471 | 10.5596 |
+| 0.0555 | 2.94 | 320 | 0.0314 | 86.0306 | 73.8297 | 86.0421 | 85.9571 | 10.5688 |
+| 0.0792 | 3.03 | 330 | 0.0305 | 87.5066 | 75.3885 | 87.6496 | 87.3874 | 10.3761 |
+| 0.0536 | 3.12 | 340 | 0.0297 | 88.0844 | 75.8754 | 88.1956 | 87.9164 | 10.4954 |
+| 0.063 | 3.21 | 350 | 0.0290 | 88.0844 | 75.8754 | 88.1956 | 87.9164 | 10.4954 |
+| 0.0563 | 3.3 | 360 | 0.0283 | 88.0783 | 75.989 | 88.2233 | 87.9578 | 10.5138 |
+| 0.0547 | 3.39 | 370 | 0.0279 | 88.1265 | 76.3196 | 88.3078 | 88.0765 | 10.6147 |
+| 0.0635 | 3.49 | 380 | 0.0275 | 86.9846 | 74.8237 | 87.0556 | 86.9021 | 10.5872 |
+| 0.0835 | 3.58 | 390 | 0.0271 | 86.933 | 75.3277 | 87.0357 | 86.931 | 10.6147 |
+| 0.0628 | 3.67 | 400 | 0.0269 | 87.5981 | 75.5811 | 87.6905 | 87.4594 | 10.6789 |
+| 0.0554 | 3.76 | 410 | 0.0267 | 88.0124 | 76.5633 | 88.174 | 87.9292 | 10.578 |
+| 0.0342 | 3.85 | 420 | 0.0266 | 88.0124 | 76.5633 | 88.174 | 87.9292 | 10.578 |
+| 0.0396 | 3.94 | 430 | 0.0263 | 88.0064 | 76.6947 | 88.1712 | 87.9434 | 10.5872 |
+| 0.045 | 4.04 | 440 | 0.0262 | 87.7466 | 76.3605 | 87.8932 | 87.6273 | 10.5505 |
+| 0.0566 | 4.13 | 450 | 0.0262 | 87.8577 | 76.5633 | 88.0399 | 87.7835 | 10.6055 |
+| 0.0582 | 4.22 | 460 | 0.0261 | 87.8103 | 76.1351 | 87.9277 | 87.7032 | 10.6697 |
+| 0.051 | 4.31 | 470 | 0.0260 | 87.8103 | 76.1351 | 87.9277 | 87.7032 | 10.6697 |
+| 0.0398 | 4.4 | 480 | 0.0258 | 88.1974 | 76.4006 | 88.2158 | 88.0622 | 10.6789 |
+| 0.0364 | 4.5 | 490 | 0.0257 | 88.3353 | 76.5513 | 88.3291 | 88.2557 | 10.633 |
+| 0.0498 | 4.59 | 500 | 0.0257 | 88.4083 | 76.5513 | 88.4132 | 88.35 | 10.6147 |
+| 0.0406 | 4.68 | 510 | 0.0256 | 88.3061 | 76.6099 | 88.3242 | 88.2429 | 10.5872 |
+| 0.0403 | 4.77 | 520 | 0.0256 | 88.3061 | 76.6099 | 88.3242 | 88.2429 | 10.5872 |
+| 0.0421 | 4.86 | 530 | 0.0255 | 88.3061 | 76.6099 | 88.3242 | 88.2429 | 10.5872 |
+| 0.0271 | 4.95 | 540 | 0.0255 | 88.3061 | 76.6099 | 88.3242 | 88.2429 | 10.5872 |
 
 
 ### Framework versions
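The hyperparameters above map directly onto `Seq2SeqTrainingArguments`. A hedged reconstruction follows: only the values printed in the card are grounded, while `output_dir`, the 10-step eval cadence (inferred from the results table), and `predict_with_generate` are assumptions.

```python
from transformers import Seq2SeqTrainingArguments

# Sketch of the run's configuration; values not listed in the card are guesses.
training_args = Seq2SeqTrainingArguments(
    output_dir="ingredient_prune",     # assumed
    learning_rate=2e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    seed=42,
    num_train_epochs=5,
    lr_scheduler_type="linear",
    fp16=True,                         # "Native AMP" in the card
    evaluation_strategy="steps",       # assumed from the 10-step table rows
    eval_steps=10,
    logging_steps=10,
    predict_with_generate=True,        # required for ROUGE / Gen Len
)
```

The Adam betas and epsilon listed in the card are the `transformers` defaults, so they need no explicit arguments.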
config.json CHANGED
@@ -1,20 +1,20 @@
 {
-  "_name_or_path": "google/flan-t5-base",
+  "_name_or_path": "t5-base",
   "architectures": [
     "T5ForConditionalGeneration"
   ],
   "classifier_dropout": 0.0,
-  "d_ff": 2048,
+  "d_ff": 3072,
   "d_kv": 64,
   "d_model": 768,
   "decoder_start_token_id": 0,
-  "dense_act_fn": "gelu_new",
+  "dense_act_fn": "relu",
   "dropout_rate": 0.1,
   "eos_token_id": 1,
-  "feed_forward_proj": "gated-gelu",
+  "feed_forward_proj": "relu",
   "initializer_factor": 1.0,
   "is_encoder_decoder": true,
-  "is_gated_act": true,
+  "is_gated_act": false,
   "layer_norm_epsilon": 1e-06,
   "model_type": "t5",
   "n_positions": 512,
@@ -54,7 +54,6 @@
       "prefix": "translate English to Romanian: "
     }
   },
-  "tie_word_embeddings": false,
   "torch_dtype": "float32",
   "transformers_version": "4.38.2",
   "use_cache": true,
generation_config.json CHANGED
@@ -1,5 +1,4 @@
 {
-  "_from_model_config": true,
   "decoder_start_token_id": 0,
   "eos_token_id": 1,
   "pad_token_id": 0,
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:4a6d29efc5671f216a56027ac9f94db20161dd36fca529a1c2334b4f65cf718f
-size 990345064
+oid sha256:748d64e4c64aa8b8f55f19dedfee6ea3ef7c2a3a1c89678ea746ab2e5393f68e
+size 891644712
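The checkpoint shrinks by about 99 MB, which is consistent with the config change above: the gated (3 x 2048) and ungated (2 x 3072) feed-forward blocks hold the same number of parameters per layer, so the whole difference should be flan-t5-base's untied lm_head, one extra 32128 x 768 float32 matrix. A back-of-the-envelope check (an assumption-checking sketch, not repo code):

```python
# Gated FFN (flan-t5-base): wi_0 + wi_1 + wo = 3 matrices of 768 x 2048.
# Ungated FFN (t5-base):    wi + wo        = 2 matrices of 768 x 3072.
assert 3 * 768 * 2048 == 2 * 768 * 3072  # per-layer FFN params are equal

vocab_size, d_model = 32128, 768
extra_lm_head_bytes = vocab_size * d_model * 4  # float32
print(extra_lm_head_bytes)            # 98697216
print(990345064 - 891644712)          # 98700352; remainder is header metadata
```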
runs/Apr23_13-58-39_bcff8d0abbd9/events.out.tfevents.1713880729.bcff8d0abbd9.34.0 ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:3596b012ee2304b921f084307f4627d6bc65797e4617a22c98d7bfbe78ea646a
+size 17125
runs/Apr23_14-05-07_bcff8d0abbd9/events.out.tfevents.1713881108.bcff8d0abbd9.34.1 ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:ae48622f944c770ea51763277f7b1f5f83a5c9aa3e52d7add6df9a43aa2a8ebb
+size 5767
runs/Apr23_14-06-35_bcff8d0abbd9/events.out.tfevents.1713881196.bcff8d0abbd9.34.2 ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:71847a557d89aee0e4efad70c6311d1dc5b4115b23879d1d68bd70a11656be38
+size 52613
special_tokens_map.json CHANGED
@@ -101,25 +101,7 @@
     "<extra_id_98>",
     "<extra_id_99>"
   ],
-  "eos_token": {
-    "content": "</s>",
-    "lstrip": false,
-    "normalized": false,
-    "rstrip": false,
-    "single_word": false
-  },
-  "pad_token": {
-    "content": "<pad>",
-    "lstrip": false,
-    "normalized": false,
-    "rstrip": false,
-    "single_word": false
-  },
-  "unk_token": {
-    "content": "<unk>",
-    "lstrip": false,
-    "normalized": false,
-    "rstrip": false,
-    "single_word": false
-  }
+  "eos_token": "</s>",
+  "pad_token": "<pad>",
+  "unk_token": "<unk>"
 }
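The special-token entries collapse from full `AddedToken` dictionaries to bare strings; since every flag shown (`lstrip`, `normalized`, `rstrip`, `single_word`) was already at its default, both forms should load to the same tokens. A quick check, with the repo id assumed from the committer and model name:

```python
from transformers import AutoTokenizer

# "AkshayPM/ingredient_prune" is a presumed repo id, not confirmed by the diff.
tok = AutoTokenizer.from_pretrained("AkshayPM/ingredient_prune")
print(tok.eos_token, tok.pad_token, tok.unk_token)  # </s> <pad> <unk>
```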
tokenizer.json CHANGED
The diff for this file is too large to render. See raw diff
 
tokenizer_config.json CHANGED
@@ -932,7 +932,6 @@
   "extra_ids": 100,
   "model_max_length": 128,
   "pad_token": "<pad>",
-  "sp_model_kwargs": {},
   "tokenizer_class": "T5Tokenizer",
   "unk_token": "<unk>"
 }
training_args.bin CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:aaedefbc90eabca7075c2c56a07c9c692277ce37e1f1de92e6e0775bcb0e06dd
+oid sha256:ac460abe19f37f41bd8624e0c0f825129628eae8203e58c80e43fa839de25063
 size 5048
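Taken together, the commit leaves a complete t5-base fine-tune on the Hub. A minimal end-to-end usage sketch; the repo id is assumed as above, and the example input is only a guess at the task suggested by the model name (pruning an ingredient string down to its core item):

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

repo_id = "AkshayPM/ingredient_prune"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSeq2SeqLM.from_pretrained(repo_id)

# Hypothetical input; the actual prompt format is not recorded in this diff.
inputs = tokenizer("2 cups fresh organic basil leaves, finely chopped",
                   return_tensors="pt")
out = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```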