Cem13 committed
Commit de3cd74
1 Parent(s): cfa8f5a

cem13/complaint_to_sythoms_mix_8x7b
README.md ADDED
@@ -0,0 +1,235 @@
+ ---
+ base_model: mistralai/Mixtral-8x7B-v0.1
+ datasets:
+ - generator
+ library_name: peft
+ license: apache-2.0
+ tags:
+ - trl
+ - sft
+ - generated_from_trainer
+ model-index:
+ - name: Mixtral_Alpace_v2
+   results: []
+ ---
+
+ <!-- This model card has been generated automatically according to the information the Trainer had access to. You
+ should probably proofread and complete it, then remove this comment. -->
+
+ # Mixtral_Alpace_v2
+
+ This model is a fine-tuned version of [mistralai/Mixtral-8x7B-v0.1](https://huggingface.co/mistralai/Mixtral-8x7B-v0.1) on the generator dataset.
+ It achieves the following results on the evaluation set:
+ - Loss: 0.5881
+
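A minimal loading sketch, assuming the LoRA adapter in this commit is served from the repo path in the header (`cem13/complaint_to_sythoms_mix_8x7b`); the dtype, device placement, and prompt are illustrative, not taken from the training script:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

# Base model this adapter was trained on (from the card metadata above).
base = AutoModelForCausalLM.from_pretrained(
    "mistralai/Mixtral-8x7B-v0.1",
    torch_dtype=torch.bfloat16,  # assumption; 4-bit quantization is a common alternative
    device_map="auto",
)
tokenizer = AutoTokenizer.from_pretrained("mistralai/Mixtral-8x7B-v0.1")

# Attach the LoRA weights (adapter repo path assumed from the commit header).
model = PeftModel.from_pretrained(base, "cem13/complaint_to_sythoms_mix_8x7b")

inputs = tokenizer("A customer complaint:", return_tensors="pt").to(model.device)
out = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```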
+ ## Model description
+
+ More information needed
+
+ ## Intended uses & limitations
+
+ More information needed
+
+ ## Training and evaluation data
+
+ More information needed
+
+ ## Training procedure
+
+ ### Training hyperparameters
+
+ The following hyperparameters were used during training (a hedged reproduction sketch follows the list):
+ - learning_rate: 2.5e-05
+ - train_batch_size: 8
+ - eval_batch_size: 8
+ - seed: 42
+ - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
+ - lr_scheduler_type: linear
+ - lr_scheduler_warmup_steps: 15
+ - num_epochs: 15
+
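A sketch of how the list above might map onto `transformers.TrainingArguments`; the card's `trl`/`sft` tags suggest these were consumed by TRL's `SFTTrainer`, but the `output_dir`, evaluation cadence, and trainer wiring here are assumptions, not the original script:

```python
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="Mixtral_Alpace_v2",      # assumed
    learning_rate=2.5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",                 # Adam with betas=(0.9, 0.999), eps=1e-8 (the defaults)
    lr_scheduler_type="linear",
    warmup_steps=15,
    num_train_epochs=15,
    eval_strategy="steps",
    eval_steps=10,                       # matches the 10-step evaluation cadence in the table below
)
# These arguments would then be passed to a trainer, e.g.
# trl.SFTTrainer(model=..., args=args, train_dataset=..., eval_dataset=...).
```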
+ ### Training results
+
+ | Training Loss | Epoch | Step | Validation Loss |
+ |:-------------:|:-------:|:----:|:---------------:|
+ | 1.5291 | 0.0870 | 10 | 1.6326 |
+ | 1.58 | 0.1739 | 20 | 1.5665 |
+ | 1.4109 | 0.2609 | 30 | 1.4856 |
+ | 1.4493 | 0.3478 | 40 | 1.4159 |
+ | 1.2503 | 0.4348 | 50 | 1.3493 |
+ | 1.2441 | 0.5217 | 60 | 1.2719 |
+ | 1.1923 | 0.6087 | 70 | 1.1930 |
+ | 1.1158 | 0.6957 | 80 | 1.1193 |
+ | 1.0184 | 0.7826 | 90 | 1.0541 |
+ | 1.0231 | 0.8696 | 100 | 1.0056 |
+ | 0.9731 | 0.9565 | 110 | 0.9619 |
+ | 0.892 | 1.0435 | 120 | 0.9170 |
+ | 0.911 | 1.1304 | 130 | 0.8727 |
+ | 0.7789 | 1.2174 | 140 | 0.8338 |
+ | 0.8049 | 1.3043 | 150 | 0.8041 |
+ | 0.7691 | 1.3913 | 160 | 0.7788 |
+ | 0.7869 | 1.4783 | 170 | 0.7589 |
+ | 0.7366 | 1.5652 | 180 | 0.7428 |
+ | 0.7436 | 1.6522 | 190 | 0.7282 |
+ | 0.7271 | 1.7391 | 200 | 0.7157 |
+ | 0.6809 | 1.8261 | 210 | 0.7056 |
+ | 0.7068 | 1.9130 | 220 | 0.6960 |
+ | 0.6446 | 2.0 | 230 | 0.6872 |
+ | 0.6682 | 2.0870 | 240 | 0.6819 |
+ | 0.7003 | 2.1739 | 250 | 0.6745 |
+ | 0.6859 | 2.2609 | 260 | 0.6701 |
+ | 0.6169 | 2.3478 | 270 | 0.6655 |
+ | 0.666 | 2.4348 | 280 | 0.6607 |
+ | 0.6325 | 2.5217 | 290 | 0.6575 |
+ | 0.6408 | 2.6087 | 300 | 0.6536 |
+ | 0.6371 | 2.6957 | 310 | 0.6507 |
+ | 0.5933 | 2.7826 | 320 | 0.6474 |
+ | 0.6313 | 2.8696 | 330 | 0.6450 |
+ | 0.6453 | 2.9565 | 340 | 0.6421 |
+ | 0.6807 | 3.0435 | 350 | 0.6407 |
+ | 0.6217 | 3.1304 | 360 | 0.6390 |
+ | 0.589 | 3.2174 | 370 | 0.6355 |
+ | 0.5591 | 3.3043 | 380 | 0.6337 |
+ | 0.6818 | 3.3913 | 390 | 0.6319 |
+ | 0.6269 | 3.4783 | 400 | 0.6306 |
+ | 0.611 | 3.5652 | 410 | 0.6286 |
+ | 0.5602 | 3.6522 | 420 | 0.6268 |
+ | 0.6735 | 3.7391 | 430 | 0.6251 |
+ | 0.5269 | 3.8261 | 440 | 0.6246 |
+ | 0.6109 | 3.9130 | 450 | 0.6232 |
+ | 0.5745 | 4.0 | 460 | 0.6221 |
+ | 0.6348 | 4.0870 | 470 | 0.6227 |
+ | 0.5398 | 4.1739 | 480 | 0.6203 |
+ | 0.6145 | 4.2609 | 490 | 0.6194 |
+ | 0.621 | 4.3478 | 500 | 0.6178 |
+ | 0.6123 | 4.4348 | 510 | 0.6172 |
+ | 0.6113 | 4.5217 | 520 | 0.6162 |
+ | 0.5991 | 4.6087 | 530 | 0.6154 |
+ | 0.5244 | 4.6957 | 540 | 0.6143 |
+ | 0.5832 | 4.7826 | 550 | 0.6136 |
+ | 0.6284 | 4.8696 | 560 | 0.6120 |
+ | 0.54 | 4.9565 | 570 | 0.6121 |
+ | 0.541 | 5.0435 | 580 | 0.6120 |
+ | 0.5204 | 5.1304 | 590 | 0.6108 |
+ | 0.5961 | 5.2174 | 600 | 0.6101 |
+ | 0.5522 | 5.3043 | 610 | 0.6098 |
+ | 0.5778 | 5.3913 | 620 | 0.6087 |
+ | 0.6059 | 5.4783 | 630 | 0.6090 |
+ | 0.5852 | 5.5652 | 640 | 0.6085 |
+ | 0.5687 | 5.6522 | 650 | 0.6072 |
+ | 0.5685 | 5.7391 | 660 | 0.6061 |
+ | 0.593 | 5.8261 | 670 | 0.6052 |
+ | 0.5975 | 5.9130 | 680 | 0.6055 |
+ | 0.5489 | 6.0 | 690 | 0.6047 |
+ | 0.567 | 6.0870 | 700 | 0.6049 |
+ | 0.5706 | 6.1739 | 710 | 0.6035 |
+ | 0.658 | 6.2609 | 720 | 0.6024 |
+ | 0.559 | 6.3478 | 730 | 0.6023 |
+ | 0.545 | 6.4348 | 740 | 0.6019 |
+ | 0.6096 | 6.5217 | 750 | 0.6021 |
+ | 0.5385 | 6.6087 | 760 | 0.6018 |
+ | 0.5505 | 6.6957 | 770 | 0.6012 |
+ | 0.5058 | 6.7826 | 780 | 0.6003 |
+ | 0.5899 | 6.8696 | 790 | 0.5999 |
+ | 0.5102 | 6.9565 | 800 | 0.5995 |
+ | 0.5185 | 7.0435 | 810 | 0.5995 |
+ | 0.5055 | 7.1304 | 820 | 0.5991 |
+ | 0.5907 | 7.2174 | 830 | 0.5997 |
+ | 0.5636 | 7.3043 | 840 | 0.5991 |
+ | 0.5505 | 7.3913 | 850 | 0.5986 |
+ | 0.5621 | 7.4783 | 860 | 0.5977 |
+ | 0.4968 | 7.5652 | 870 | 0.5976 |
+ | 0.5713 | 7.6522 | 880 | 0.5970 |
+ | 0.5968 | 7.7391 | 890 | 0.5970 |
+ | 0.531 | 7.8261 | 900 | 0.5964 |
+ | 0.538 | 7.9130 | 910 | 0.5959 |
+ | 0.6087 | 8.0 | 920 | 0.5959 |
+ | 0.5845 | 8.0870 | 930 | 0.5963 |
+ | 0.5197 | 8.1739 | 940 | 0.5960 |
+ | 0.5128 | 8.2609 | 950 | 0.5959 |
+ | 0.5613 | 8.3478 | 960 | 0.5956 |
+ | 0.5268 | 8.4348 | 970 | 0.5953 |
+ | 0.5696 | 8.5217 | 980 | 0.5952 |
+ | 0.5755 | 8.6087 | 990 | 0.5941 |
+ | 0.5014 | 8.6957 | 1000 | 0.5945 |
+ | 0.5568 | 8.7826 | 1010 | 0.5936 |
+ | 0.5934 | 8.8696 | 1020 | 0.5944 |
+ | 0.5178 | 8.9565 | 1030 | 0.5941 |
+ | 0.4618 | 9.0435 | 1040 | 0.5936 |
+ | 0.4867 | 9.1304 | 1050 | 0.5934 |
+ | 0.5402 | 9.2174 | 1060 | 0.5937 |
+ | 0.5177 | 9.3043 | 1070 | 0.5936 |
+ | 0.5825 | 9.3913 | 1080 | 0.5926 |
+ | 0.5523 | 9.4783 | 1090 | 0.5929 |
+ | 0.583 | 9.5652 | 1100 | 0.5920 |
+ | 0.5232 | 9.6522 | 1110 | 0.5927 |
+ | 0.5367 | 9.7391 | 1120 | 0.5920 |
+ | 0.5321 | 9.8261 | 1130 | 0.5913 |
+ | 0.5672 | 9.9130 | 1140 | 0.5910 |
+ | 0.5549 | 10.0 | 1150 | 0.5910 |
+ | 0.5191 | 10.0870 | 1160 | 0.5915 |
+ | 0.5463 | 10.1739 | 1170 | 0.5915 |
+ | 0.5275 | 10.2609 | 1180 | 0.5913 |
+ | 0.5484 | 10.3478 | 1190 | 0.5915 |
+ | 0.5293 | 10.4348 | 1200 | 0.5910 |
+ | 0.519 | 10.5217 | 1210 | 0.5903 |
+ | 0.5129 | 10.6087 | 1220 | 0.5898 |
+ | 0.5793 | 10.6957 | 1230 | 0.5900 |
+ | 0.4481 | 10.7826 | 1240 | 0.5901 |
+ | 0.5309 | 10.8696 | 1250 | 0.5903 |
+ | 0.5887 | 10.9565 | 1260 | 0.5898 |
+ | 0.5109 | 11.0435 | 1270 | 0.5907 |
+ | 0.5776 | 11.1304 | 1280 | 0.5902 |
+ | 0.4984 | 11.2174 | 1290 | 0.5898 |
+ | 0.5656 | 11.3043 | 1300 | 0.5898 |
+ | 0.4931 | 11.3913 | 1310 | 0.5902 |
+ | 0.531 | 11.4783 | 1320 | 0.5900 |
+ | 0.5163 | 11.5652 | 1330 | 0.5892 |
+ | 0.5314 | 11.6522 | 1340 | 0.5894 |
+ | 0.4766 | 11.7391 | 1350 | 0.5893 |
+ | 0.5201 | 11.8261 | 1360 | 0.5896 |
+ | 0.6127 | 11.9130 | 1370 | 0.5889 |
+ | 0.5441 | 12.0 | 1380 | 0.5888 |
+ | 0.5258 | 12.0870 | 1390 | 0.5894 |
+ | 0.5722 | 12.1739 | 1400 | 0.5887 |
+ | 0.5228 | 12.2609 | 1410 | 0.5891 |
+ | 0.524 | 12.3478 | 1420 | 0.5884 |
+ | 0.4951 | 12.4348 | 1430 | 0.5894 |
+ | 0.5235 | 12.5217 | 1440 | 0.5893 |
+ | 0.5071 | 12.6087 | 1450 | 0.5889 |
+ | 0.5417 | 12.6957 | 1460 | 0.5886 |
+ | 0.4882 | 12.7826 | 1470 | 0.5889 |
+ | 0.548 | 12.8696 | 1480 | 0.5889 |
+ | 0.529 | 12.9565 | 1490 | 0.5889 |
+ | 0.5646 | 13.0435 | 1500 | 0.5887 |
+ | 0.5142 | 13.1304 | 1510 | 0.5889 |
+ | 0.5161 | 13.2174 | 1520 | 0.5886 |
+ | 0.5008 | 13.3043 | 1530 | 0.5888 |
+ | 0.5187 | 13.3913 | 1540 | 0.5887 |
+ | 0.5334 | 13.4783 | 1550 | 0.5886 |
+ | 0.5099 | 13.5652 | 1560 | 0.5884 |
+ | 0.5644 | 13.6522 | 1570 | 0.5888 |
+ | 0.5242 | 13.7391 | 1580 | 0.5882 |
+ | 0.4912 | 13.8261 | 1590 | 0.5886 |
+ | 0.5459 | 13.9130 | 1600 | 0.5884 |
+ | 0.5204 | 14.0 | 1610 | 0.5881 |
+ | 0.4644 | 14.0870 | 1620 | 0.5884 |
+ | 0.5364 | 14.1739 | 1630 | 0.5885 |
+ | 0.5852 | 14.2609 | 1640 | 0.5887 |
+ | 0.5135 | 14.3478 | 1650 | 0.5884 |
+ | 0.5192 | 14.4348 | 1660 | 0.5885 |
+ | 0.5093 | 14.5217 | 1670 | 0.5880 |
+ | 0.5398 | 14.6087 | 1680 | 0.5884 |
+ | 0.469 | 14.6957 | 1690 | 0.5882 |
+ | 0.5163 | 14.7826 | 1700 | 0.5883 |
+ | 0.5165 | 14.8696 | 1710 | 0.5883 |
+ | 0.5441 | 14.9565 | 1720 | 0.5881 |
+
+
+ ### Framework versions
+
+ - PEFT 0.12.0
+ - Transformers 4.44.0
+ - PyTorch 2.4.0+cu121
+ - Datasets 2.20.0
+ - Tokenizers 0.19.1
adapter_config.json ADDED
@@ -0,0 +1,35 @@
+ {
+   "alpha_pattern": {},
+   "auto_mapping": null,
+   "base_model_name_or_path": "mistralai/Mixtral-8x7B-v0.1",
+   "bias": "none",
+   "fan_in_fan_out": false,
+   "inference_mode": true,
+   "init_lora_weights": true,
+   "layer_replication": null,
+   "layers_pattern": null,
+   "layers_to_transform": null,
+   "loftq_config": {},
+   "lora_alpha": 16,
+   "lora_dropout": 0.1,
+   "megatron_config": null,
+   "megatron_core": "megatron.core",
+   "modules_to_save": null,
+   "peft_type": "LORA",
+   "r": 64,
+   "rank_pattern": {},
+   "revision": null,
+   "target_modules": [
+     "down_proj",
+     "o_proj",
+     "k_proj",
+     "q_proj",
+     "lm_head",
+     "gate_proj",
+     "v_proj",
+     "up_proj"
+   ],
+   "task_type": "CAUSAL_LM",
+   "use_dora": false,
+   "use_rslora": false
+ }
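For reference, a `peft.LoraConfig` that would serialize to roughly the JSON above (a sketch, not the original training code); note that with `r=64` and `lora_alpha=16` the LoRA update is scaled by `alpha/r = 0.25`:

```python
from peft import LoraConfig

lora_config = LoraConfig(
    r=64,
    lora_alpha=16,
    lora_dropout=0.1,
    bias="none",
    task_type="CAUSAL_LM",
    # Same module list as target_modules above.
    target_modules=[
        "down_proj", "o_proj", "k_proj", "q_proj",
        "lm_head", "gate_proj", "v_proj", "up_proj",
    ],
)
```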
adapter_model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:28c25b5955384c7f1cdbd3faa1f80d43eb7119c32b3f1dc5a912f8d14ab7191f
+ size 751667752
runs/Aug12_07-24-44_9bcd5dae9cf3/events.out.tfevents.1723447489.9bcd5dae9cf3.1307.0 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:879769f8f8dbaa626026e478b17ffe8ec362a25009f0495435622817ed2e4cee
+ size 88975
special_tokens_map.json ADDED
@@ -0,0 +1,24 @@
+ {
+   "bos_token": {
+     "content": "<s>",
+     "lstrip": false,
+     "normalized": false,
+     "rstrip": false,
+     "single_word": false
+   },
+   "eos_token": {
+     "content": "</s>",
+     "lstrip": false,
+     "normalized": false,
+     "rstrip": false,
+     "single_word": false
+   },
+   "pad_token": "</s>",
+   "unk_token": {
+     "content": "<unk>",
+     "lstrip": false,
+     "normalized": false,
+     "rstrip": false,
+     "single_word": false
+   }
+ }
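The map above reuses the EOS token `</s>` as `pad_token`, a common choice since the base Mixtral tokenizer ships without a dedicated padding token. A small sketch of making that explicit when loading a tokenizer manually (the repo id is taken from the commit header; loading this repo's saved tokenizer already applies the mapping):

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("cem13/complaint_to_sythoms_mix_8x7b")
tokenizer.pad_token = tokenizer.eos_token  # "</s>", matching special_tokens_map.json
```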
tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
tokenizer.model ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:dadfd56d766715c61d2ef780a525ab43b8e6da4de6865bda3d95fdef5e134055
+ size 493443
tokenizer_config.json ADDED
@@ -0,0 +1,43 @@
+ {
+   "add_bos_token": true,
+   "add_eos_token": false,
+   "add_prefix_space": null,
+   "added_tokens_decoder": {
+     "0": {
+       "content": "<unk>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "1": {
+       "content": "<s>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "2": {
+       "content": "</s>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     }
+   },
+   "additional_special_tokens": [],
+   "bos_token": "<s>",
+   "clean_up_tokenization_spaces": false,
+   "eos_token": "</s>",
+   "legacy": true,
+   "model_max_length": 1000000000000000019884624838656,
+   "pad_token": "</s>",
+   "sp_model_kwargs": {},
+   "spaces_between_special_tokens": false,
+   "tokenizer_class": "LlamaTokenizer",
+   "unk_token": "<unk>",
+   "use_default_system_prompt": false
+ }
training_args.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:c6143df11c770bb500f028645adeb1c6c16aa2d198b7648eb603005a4b0aa840
+ size 5432