fydhfzh committed on
Commit
991aa7a
1 Parent(s): ebd77aa

Training in progress, step 500

Files changed (21)
  1. README.md +126 -0
  2. config.json +248 -0
  3. model.safetensors +3 -0
  4. preprocessor_config.json +9 -0
  5. runs/Jun12_18-13-20_LAPTOP-1GID9RGH/events.out.tfevents.1718208802.LAPTOP-1GID9RGH.22420.0 +3 -0
  6. runs/Jun12_18-13-52_LAPTOP-1GID9RGH/events.out.tfevents.1718208833.LAPTOP-1GID9RGH.22420.1 +3 -0
  7. runs/Jun12_18-15-08_LAPTOP-1GID9RGH/events.out.tfevents.1718208909.LAPTOP-1GID9RGH.22948.0 +3 -0
  8. runs/Jun12_23-04-00_LAPTOP-1GID9RGH/events.out.tfevents.1718208242.LAPTOP-1GID9RGH.10712.0 +3 -0
  9. runs/Jun12_23-21-21_LAPTOP-1GID9RGH/events.out.tfevents.1718209283.LAPTOP-1GID9RGH.18804.0 +3 -0
  10. runs/Jun13_13-23-20_LAPTOP-1GID9RGH/events.out.tfevents.1718259800.LAPTOP-1GID9RGH.9068.0 +3 -0
  11. runs/Jun13_13-23-20_LAPTOP-1GID9RGH/events.out.tfevents.1718260129.LAPTOP-1GID9RGH.9068.1 +3 -0
  12. runs/Jun13_13-41-06_LAPTOP-1GID9RGH/events.out.tfevents.1718260867.LAPTOP-1GID9RGH.9068.2 +3 -0
  13. runs/Jun13_13-41-06_LAPTOP-1GID9RGH/events.out.tfevents.1718261195.LAPTOP-1GID9RGH.9068.3 +3 -0
  14. runs/Jun13_15-01-53_LAPTOP-1GID9RGH/events.out.tfevents.1718265714.LAPTOP-1GID9RGH.23548.0 +3 -0
  15. runs/Jun13_15-01-53_LAPTOP-1GID9RGH/events.out.tfevents.1718267821.LAPTOP-1GID9RGH.23548.1 +3 -0
  16. runs/Jun13_22-17-36_LAPTOP-1GID9RGH/events.out.tfevents.1718291857.LAPTOP-1GID9RGH.21000.0 +3 -0
  17. runs/Jun13_22-20-06_LAPTOP-1GID9RGH/events.out.tfevents.1718292007.LAPTOP-1GID9RGH.20972.0 +3 -0
  18. runs/Jun13_22-20-06_LAPTOP-1GID9RGH/events.out.tfevents.1718292615.LAPTOP-1GID9RGH.20972.1 +3 -0
  19. runs/Jun17_14-19-03_LAPTOP-1GID9RGH/events.out.tfevents.1718608744.LAPTOP-1GID9RGH.13984.0 +3 -0
  20. runs/Jun17_15-21-17_LAPTOP-1GID9RGH/events.out.tfevents.1718612478.LAPTOP-1GID9RGH.11192.0 +3 -0
  21. training_args.bin +3 -0
README.md ADDED
@@ -0,0 +1,126 @@
+ ---
+ license: apache-2.0
+ base_model: facebook/hubert-base-ls960
+ tags:
+ - generated_from_trainer
+ metrics:
+ - accuracy
+ - precision
+ - recall
+ - f1
+ model-index:
+ - name: hubert-classifier
+   results: []
+ ---
+
+ <!-- This model card has been generated automatically according to the information the Trainer had access to. You
+ should probably proofread and complete it, then remove this comment. -->
+
+ # hubert-classifier
+
+ This model is a fine-tuned version of [facebook/hubert-base-ls960](https://huggingface.co/facebook/hubert-base-ls960) on an unknown dataset.
+ It achieves the following results on the evaluation set:
+ - Loss: 1.1058
+ - Accuracy: 0.7748
+ - Precision: 0.8018
+ - Recall: 0.7748
+ - F1: 0.7651
+ - Binary: 0.8455
+
+ ## Model description
+
+ More information needed
+
+ ## Intended uses & limitations
+
+ More information needed
+
+ ## Training and evaluation data
+
+ More information needed
+
+ ## Training procedure
+
+ ### Training hyperparameters
+
+ The following hyperparameters were used during training:
+ - learning_rate: 3e-05
+ - train_batch_size: 32
+ - eval_batch_size: 32
+ - seed: 42
+ - gradient_accumulation_steps: 4
+ - total_train_batch_size: 128
+ - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
+ - lr_scheduler_type: linear
+ - num_epochs: 10
+ - mixed_precision_training: Native AMP
+
+ ### Training results
+
+ | Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | F1 | Binary |
+ |:-------------:|:-----:|:----:|:---------------:|:--------:|:---------:|:------:|:------:|:------:|
+ | No log | 0.17 | 50 | 4.2665 | 0.0412 | 0.0107 | 0.0412 | 0.0127 | 0.2923 |
+ | No log | 0.35 | 100 | 3.9427 | 0.0339 | 0.0016 | 0.0339 | 0.0030 | 0.3172 |
+ | No log | 0.52 | 150 | 3.7412 | 0.0363 | 0.0025 | 0.0363 | 0.0041 | 0.3206 |
+ | No log | 0.69 | 200 | 3.6193 | 0.0654 | 0.0238 | 0.0654 | 0.0259 | 0.3373 |
+ | No log | 0.86 | 250 | 3.4784 | 0.1041 | 0.0460 | 0.1041 | 0.0459 | 0.3663 |
+ | No log | 1.04 | 300 | 3.3705 | 0.1211 | 0.0602 | 0.1211 | 0.0466 | 0.3789 |
+ | No log | 1.21 | 350 | 3.2597 | 0.1768 | 0.0811 | 0.1768 | 0.0894 | 0.4218 |
+ | No log | 1.38 | 400 | 3.1606 | 0.2082 | 0.1867 | 0.2082 | 0.1416 | 0.4424 |
+ | No log | 1.55 | 450 | 3.0720 | 0.1913 | 0.1490 | 0.1913 | 0.1296 | 0.4312 |
+ | 3.6525 | 1.73 | 500 | 2.9557 | 0.2446 | 0.1432 | 0.2446 | 0.1609 | 0.4671 |
+ | 3.6525 | 1.9 | 550 | 2.8287 | 0.2857 | 0.2265 | 0.2857 | 0.2059 | 0.4973 |
+ | 3.6525 | 2.07 | 600 | 2.7005 | 0.3075 | 0.2103 | 0.3075 | 0.2154 | 0.5136 |
+ | 3.6525 | 2.24 | 650 | 2.6183 | 0.3414 | 0.2398 | 0.3414 | 0.2486 | 0.5341 |
+ | 3.6525 | 2.42 | 700 | 2.5133 | 0.3632 | 0.2942 | 0.3632 | 0.2732 | 0.5516 |
+ | 3.6525 | 2.59 | 750 | 2.4277 | 0.3753 | 0.3322 | 0.3753 | 0.2948 | 0.5615 |
+ | 3.6525 | 2.76 | 800 | 2.3329 | 0.4092 | 0.3538 | 0.4092 | 0.3338 | 0.5845 |
+ | 3.6525 | 2.93 | 850 | 2.2465 | 0.4407 | 0.4125 | 0.4407 | 0.3745 | 0.6073 |
+ | 3.6525 | 3.11 | 900 | 2.1792 | 0.4600 | 0.4329 | 0.4600 | 0.3995 | 0.6203 |
+ | 3.6525 | 3.28 | 950 | 2.1004 | 0.5109 | 0.4995 | 0.5109 | 0.4540 | 0.6550 |
+ | 2.6844 | 3.45 | 1000 | 2.0314 | 0.5109 | 0.4799 | 0.5109 | 0.4520 | 0.6557 |
+ | 2.6844 | 3.62 | 1050 | 1.9561 | 0.5400 | 0.5309 | 0.5400 | 0.4859 | 0.6743 |
+ | 2.6844 | 3.8 | 1100 | 1.9362 | 0.5472 | 0.5441 | 0.5472 | 0.5066 | 0.6804 |
+ | 2.6844 | 3.97 | 1150 | 1.8666 | 0.5642 | 0.5647 | 0.5642 | 0.5232 | 0.6930 |
+ | 2.6844 | 4.14 | 1200 | 1.8204 | 0.5811 | 0.5716 | 0.5811 | 0.5416 | 0.7048 |
+ | 2.6844 | 4.31 | 1250 | 1.7494 | 0.5908 | 0.6153 | 0.5908 | 0.5618 | 0.7109 |
+ | 2.6844 | 4.49 | 1300 | 1.6973 | 0.6126 | 0.6062 | 0.6126 | 0.5804 | 0.7291 |
+ | 2.6844 | 4.66 | 1350 | 1.6615 | 0.6053 | 0.5864 | 0.6053 | 0.5707 | 0.7211 |
+ | 2.6844 | 4.83 | 1400 | 1.6120 | 0.6295 | 0.6304 | 0.6295 | 0.6000 | 0.7385 |
+ | 2.6844 | 5.0 | 1450 | 1.5620 | 0.6610 | 0.6605 | 0.6610 | 0.6333 | 0.7615 |
+ | 2.1096 | 5.18 | 1500 | 1.5330 | 0.6538 | 0.6424 | 0.6538 | 0.6223 | 0.7581 |
+ | 2.1096 | 5.35 | 1550 | 1.5112 | 0.6707 | 0.6830 | 0.6707 | 0.6484 | 0.7707 |
+ | 2.1096 | 5.52 | 1600 | 1.4732 | 0.6659 | 0.6793 | 0.6659 | 0.6430 | 0.7685 |
+ | 2.1096 | 5.69 | 1650 | 1.4420 | 0.6755 | 0.6969 | 0.6755 | 0.6538 | 0.7734 |
+ | 2.1096 | 5.87 | 1700 | 1.4011 | 0.7094 | 0.7461 | 0.7094 | 0.6929 | 0.7988 |
+ | 2.1096 | 6.04 | 1750 | 1.3924 | 0.6780 | 0.6835 | 0.6780 | 0.6557 | 0.7760 |
+ | 2.1096 | 6.21 | 1800 | 1.3604 | 0.7022 | 0.7116 | 0.7022 | 0.6838 | 0.7937 |
+ | 2.1096 | 6.38 | 1850 | 1.3271 | 0.7070 | 0.7079 | 0.7070 | 0.6882 | 0.7954 |
+ | 2.1096 | 6.56 | 1900 | 1.3104 | 0.7264 | 0.7338 | 0.7264 | 0.7110 | 0.8099 |
+ | 2.1096 | 6.73 | 1950 | 1.2804 | 0.7312 | 0.7591 | 0.7312 | 0.7159 | 0.8131 |
+ | 1.7648 | 6.9 | 2000 | 1.2722 | 0.7312 | 0.7739 | 0.7312 | 0.7185 | 0.8131 |
+ | 1.7648 | 7.08 | 2050 | 1.2777 | 0.7240 | 0.7581 | 0.7240 | 0.7109 | 0.8099 |
+ | 1.7648 | 7.25 | 2100 | 1.2319 | 0.7288 | 0.7373 | 0.7288 | 0.7114 | 0.8123 |
+ | 1.7648 | 7.42 | 2150 | 1.2074 | 0.7433 | 0.7717 | 0.7433 | 0.7317 | 0.8215 |
+ | 1.7648 | 7.59 | 2200 | 1.2150 | 0.7433 | 0.7850 | 0.7433 | 0.7348 | 0.8235 |
+ | 1.7648 | 7.77 | 2250 | 1.1787 | 0.7603 | 0.7930 | 0.7603 | 0.7462 | 0.8344 |
+ | 1.7648 | 7.94 | 2300 | 1.1815 | 0.7676 | 0.7932 | 0.7676 | 0.7576 | 0.8404 |
+ | 1.7648 | 8.11 | 2350 | 1.1578 | 0.7676 | 0.7972 | 0.7676 | 0.7601 | 0.8404 |
+ | 1.7648 | 8.28 | 2400 | 1.1605 | 0.7651 | 0.7982 | 0.7651 | 0.7560 | 0.8387 |
+ | 1.7648 | 8.46 | 2450 | 1.1563 | 0.7627 | 0.7937 | 0.7627 | 0.7548 | 0.8370 |
+ | 1.5781 | 8.63 | 2500 | 1.1303 | 0.7579 | 0.7847 | 0.7579 | 0.7476 | 0.8337 |
+ | 1.5781 | 8.8 | 2550 | 1.1217 | 0.7797 | 0.8117 | 0.7797 | 0.7702 | 0.8489 |
+ | 1.5781 | 8.97 | 2600 | 1.1278 | 0.7724 | 0.8025 | 0.7724 | 0.7640 | 0.8438 |
+ | 1.5781 | 9.15 | 2650 | 1.1188 | 0.7748 | 0.8022 | 0.7748 | 0.7653 | 0.8455 |
+ | 1.5781 | 9.32 | 2700 | 1.1161 | 0.7676 | 0.7979 | 0.7676 | 0.7588 | 0.8404 |
+ | 1.5781 | 9.49 | 2750 | 1.1078 | 0.7748 | 0.8012 | 0.7748 | 0.7650 | 0.8446 |
+ | 1.5781 | 9.66 | 2800 | 1.1104 | 0.7724 | 0.7973 | 0.7724 | 0.7632 | 0.8429 |
+ | 1.5781 | 9.84 | 2850 | 1.1058 | 0.7748 | 0.8018 | 0.7748 | 0.7651 | 0.8455 |
+
+
+ ### Framework versions
+
+ - Transformers 4.38.2
+ - Pytorch 2.3.0
+ - Datasets 2.19.1
+ - Tokenizers 0.15.1
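
The hyperparameters listed in the model card map directly onto `transformers.TrainingArguments`. Below is a minimal sketch of that mapping; the `output_dir` is an assumption, and the 50-step evaluation cadence is inferred from the results table (the authoritative record is the `training_args.bin` in this commit):

```python
from transformers import TrainingArguments

# Sketch of the model card's listed hyperparameters. Adam betas/epsilon match
# the TrainingArguments defaults, so they need no explicit arguments here.
training_args = TrainingArguments(
    output_dir="hubert-classifier",   # assumed output path
    learning_rate=3e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    gradient_accumulation_steps=4,    # 32 x 4 = total_train_batch_size 128
    num_train_epochs=10,
    lr_scheduler_type="linear",
    seed=42,
    fp16=True,                        # "Native AMP" mixed precision
    evaluation_strategy="steps",      # inferred: the table evaluates every 50 steps
    eval_steps=50,
)
```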
config.json ADDED
@@ -0,0 +1,248 @@
+ {
+   "_name_or_path": "facebook/hubert-base-ls960",
+   "activation_dropout": 0.1,
+   "apply_spec_augment": true,
+   "architectures": [
+     "HubertForSequenceClassification"
+   ],
+   "attention_dropout": 0.1,
+   "bos_token_id": 1,
+   "classifier_proj_size": 256,
+   "conv_bias": false,
+   "conv_dim": [
+     512,
+     512,
+     512,
+     512,
+     512,
+     512,
+     512
+   ],
+   "conv_kernel": [
+     10,
+     3,
+     3,
+     3,
+     3,
+     2,
+     2
+   ],
+   "conv_stride": [
+     5,
+     2,
+     2,
+     2,
+     2,
+     2,
+     2
+   ],
+   "ctc_loss_reduction": "sum",
+   "ctc_zero_infinity": false,
+   "do_stable_layer_norm": false,
+   "eos_token_id": 2,
+   "feat_extract_activation": "gelu",
+   "feat_extract_dropout": 0.0,
+   "feat_extract_norm": "group",
+   "feat_proj_dropout": 0.1,
+   "feat_proj_layer_norm": true,
+   "final_dropout": 0.1,
+   "gradient_checkpointing": false,
+   "hidden_act": "gelu",
+   "hidden_dropout": 0.1,
+   "hidden_dropout_prob": 0.1,
+   "hidden_size": 768,
+   "id2label": {
+     "0": "LABEL_0",
+     "1": "LABEL_1",
+     "2": "LABEL_2",
+     "3": "LABEL_3",
+     "4": "LABEL_4",
+     "5": "LABEL_5",
+     "6": "LABEL_6",
+     "7": "LABEL_7",
+     "8": "LABEL_8",
+     "9": "LABEL_9",
+     "10": "LABEL_10",
+     "11": "LABEL_11",
+     "12": "LABEL_12",
+     "13": "LABEL_13",
+     "14": "LABEL_14",
+     "15": "LABEL_15",
+     "16": "LABEL_16",
+     "17": "LABEL_17",
+     "18": "LABEL_18",
+     "19": "LABEL_19",
+     "20": "LABEL_20",
+     "21": "LABEL_21",
+     "22": "LABEL_22",
+     "23": "LABEL_23",
+     "24": "LABEL_24",
+     "25": "LABEL_25",
+     "26": "LABEL_26",
+     "27": "LABEL_27",
+     "28": "LABEL_28",
+     "29": "LABEL_29",
+     "30": "LABEL_30",
+     "31": "LABEL_31",
+     "32": "LABEL_32",
+     "33": "LABEL_33",
+     "34": "LABEL_34",
+     "35": "LABEL_35",
+     "36": "LABEL_36",
+     "37": "LABEL_37",
+     "38": "LABEL_38",
+     "39": "LABEL_39",
+     "40": "LABEL_40",
+     "41": "LABEL_41",
+     "42": "LABEL_42",
+     "43": "LABEL_43",
+     "44": "LABEL_44",
+     "45": "LABEL_45",
+     "46": "LABEL_46",
+     "47": "LABEL_47",
+     "48": "LABEL_48",
+     "49": "LABEL_49",
+     "50": "LABEL_50",
+     "51": "LABEL_51",
+     "52": "LABEL_52",
+     "53": "LABEL_53",
+     "54": "LABEL_54",
+     "55": "LABEL_55",
+     "56": "LABEL_56",
+     "57": "LABEL_57",
+     "58": "LABEL_58",
+     "59": "LABEL_59",
+     "60": "LABEL_60",
+     "61": "LABEL_61",
+     "62": "LABEL_62",
+     "63": "LABEL_63",
+     "64": "LABEL_64",
+     "65": "LABEL_65",
+     "66": "LABEL_66",
+     "67": "LABEL_67",
+     "68": "LABEL_68",
+     "69": "LABEL_69",
+     "70": "LABEL_70",
+     "71": "LABEL_71",
+     "72": "LABEL_72",
+     "73": "LABEL_73",
+     "74": "LABEL_74",
+     "75": "LABEL_75",
+     "76": "LABEL_76",
+     "77": "LABEL_77",
+     "78": "LABEL_78",
+     "79": "LABEL_79",
+     "80": "LABEL_80",
+     "81": "LABEL_81",
+     "82": "LABEL_82",
+     "83": "LABEL_83"
+   },
+   "initializer_range": 0.02,
+   "intermediate_size": 3072,
+   "label2id": {
+     "LABEL_0": 0,
+     "LABEL_1": 1,
+     "LABEL_10": 10,
+     "LABEL_11": 11,
+     "LABEL_12": 12,
+     "LABEL_13": 13,
+     "LABEL_14": 14,
+     "LABEL_15": 15,
+     "LABEL_16": 16,
+     "LABEL_17": 17,
+     "LABEL_18": 18,
+     "LABEL_19": 19,
+     "LABEL_2": 2,
+     "LABEL_20": 20,
+     "LABEL_21": 21,
+     "LABEL_22": 22,
+     "LABEL_23": 23,
+     "LABEL_24": 24,
+     "LABEL_25": 25,
+     "LABEL_26": 26,
+     "LABEL_27": 27,
+     "LABEL_28": 28,
+     "LABEL_29": 29,
+     "LABEL_3": 3,
+     "LABEL_30": 30,
+     "LABEL_31": 31,
+     "LABEL_32": 32,
+     "LABEL_33": 33,
+     "LABEL_34": 34,
+     "LABEL_35": 35,
+     "LABEL_36": 36,
+     "LABEL_37": 37,
+     "LABEL_38": 38,
+     "LABEL_39": 39,
+     "LABEL_4": 4,
+     "LABEL_40": 40,
+     "LABEL_41": 41,
+     "LABEL_42": 42,
+     "LABEL_43": 43,
+     "LABEL_44": 44,
+     "LABEL_45": 45,
+     "LABEL_46": 46,
+     "LABEL_47": 47,
+     "LABEL_48": 48,
+     "LABEL_49": 49,
+     "LABEL_5": 5,
+     "LABEL_50": 50,
+     "LABEL_51": 51,
+     "LABEL_52": 52,
+     "LABEL_53": 53,
+     "LABEL_54": 54,
+     "LABEL_55": 55,
+     "LABEL_56": 56,
+     "LABEL_57": 57,
+     "LABEL_58": 58,
+     "LABEL_59": 59,
+     "LABEL_6": 6,
+     "LABEL_60": 60,
+     "LABEL_61": 61,
+     "LABEL_62": 62,
+     "LABEL_63": 63,
+     "LABEL_64": 64,
+     "LABEL_65": 65,
+     "LABEL_66": 66,
+     "LABEL_67": 67,
+     "LABEL_68": 68,
+     "LABEL_69": 69,
+     "LABEL_7": 7,
+     "LABEL_70": 70,
+     "LABEL_71": 71,
+     "LABEL_72": 72,
+     "LABEL_73": 73,
+     "LABEL_74": 74,
+     "LABEL_75": 75,
+     "LABEL_76": 76,
+     "LABEL_77": 77,
+     "LABEL_78": 78,
+     "LABEL_79": 79,
+     "LABEL_8": 8,
+     "LABEL_80": 80,
+     "LABEL_81": 81,
+     "LABEL_82": 82,
+     "LABEL_83": 83,
+     "LABEL_9": 9
+   },
+   "layer_norm_eps": 1e-05,
+   "layerdrop": 0.1,
+   "mask_feature_length": 10,
+   "mask_feature_min_masks": 0,
+   "mask_feature_prob": 0.0,
+   "mask_time_length": 10,
+   "mask_time_min_masks": 2,
+   "mask_time_prob": 0.05,
+   "model_type": "hubert",
+   "num_attention_heads": 12,
+   "num_conv_pos_embedding_groups": 16,
+   "num_conv_pos_embeddings": 128,
+   "num_feat_extract_layers": 7,
+   "num_hidden_layers": 12,
+   "pad_token_id": 0,
+   "tokenizer_class": "Wav2Vec2CTCTokenizer",
+   "torch_dtype": "float32",
+   "transformers_version": "4.38.2",
+   "use_weighted_layer_sum": false,
+   "vocab_size": 32
+ }
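
The substantive part of this config is the 84-way classification head (`id2label` runs LABEL_0 through LABEL_83, projected through a 256-dim classifier layer); the CTC-related fields (`ctc_loss_reduction`, `vocab_size`, `tokenizer_class`) appear to be carried over from the base checkpoint and unused for sequence classification. A minimal sketch of inspecting it, assuming `fydhfzh/hubert-classifier` is the Hub repo id for this commit:

```python
from transformers import HubertConfig, HubertForSequenceClassification

REPO = "fydhfzh/hubert-classifier"  # assumed repo id for this commit

config = HubertConfig.from_pretrained(REPO)
print(config.num_labels)            # 84, derived from id2label
print(config.classifier_proj_size)  # 256

# Loads the fine-tuned weights stored in model.safetensors below
model = HubertForSequenceClassification.from_pretrained(REPO)
```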
model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:ddb5c0ab5d631c2b0945f54721a08a37a3a159d5241799e32ff94d972ec3c271
+ size 378386248
preprocessor_config.json ADDED
@@ -0,0 +1,9 @@
+ {
+   "do_normalize": true,
+   "feature_extractor_type": "Wav2Vec2FeatureExtractor",
+   "feature_size": 1,
+   "padding_side": "right",
+   "padding_value": 0,
+   "return_attention_mask": false,
+   "sampling_rate": 16000
+ }
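
This is a standard `Wav2Vec2FeatureExtractor` config expecting mono 16 kHz audio. A minimal end-to-end sketch under the same assumed repo id; the waveform here is dummy data standing in for a real recording:

```python
import torch
from transformers import Wav2Vec2FeatureExtractor, HubertForSequenceClassification

REPO = "fydhfzh/hubert-classifier"  # assumed repo id for this commit

extractor = Wav2Vec2FeatureExtractor.from_pretrained(REPO)
model = HubertForSequenceClassification.from_pretrained(REPO)

waveform = torch.randn(16000)  # 1 s of dummy mono audio at 16 kHz
inputs = extractor(waveform.numpy(), sampling_rate=16000, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits        # shape (1, 84)
label = model.config.id2label[int(logits.argmax(-1))]
print(label)  # e.g. "LABEL_42"; the config ships only generic label names
```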
runs/Jun12_18-13-20_LAPTOP-1GID9RGH/events.out.tfevents.1718208802.LAPTOP-1GID9RGH.22420.0 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:b7f5f14fbe8df781482a5eb28fc3448cf98c461a9efd6dd59bfc73fdb23e726f
+ size 9022
runs/Jun12_18-13-52_LAPTOP-1GID9RGH/events.out.tfevents.1718208833.LAPTOP-1GID9RGH.22420.1 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:eb5b2bf9fe61e5afe0225601ffc7a43a87cc07a8e1baed5f1ce41590e33fc482
+ size 9022
runs/Jun12_18-15-08_LAPTOP-1GID9RGH/events.out.tfevents.1718208909.LAPTOP-1GID9RGH.22948.0 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:56d1018d88c99b3ad6fba63ee5172123a53608d09de74e813cdf5b1c0a121d28
+ size 13178
runs/Jun12_23-04-00_LAPTOP-1GID9RGH/events.out.tfevents.1718208242.LAPTOP-1GID9RGH.10712.0 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:97a217d3d887c742fd893166cd7fbacfecd6f1b5e0ca32a2baafdba73d259c03
+ size 15832
runs/Jun12_23-21-21_LAPTOP-1GID9RGH/events.out.tfevents.1718209283.LAPTOP-1GID9RGH.18804.0 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:a91dc680923c8170a1c06d57185e06855df6cb0dbf6420791ce19122391819fb
+ size 21952
runs/Jun13_13-23-20_LAPTOP-1GID9RGH/events.out.tfevents.1718259800.LAPTOP-1GID9RGH.9068.0 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:67d5746478317fab5050500a9f3ff691eb8de8071ec8b9faf327af102c62ae52
+ size 15309
runs/Jun13_13-23-20_LAPTOP-1GID9RGH/events.out.tfevents.1718260129.LAPTOP-1GID9RGH.9068.1 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:f3189229c17292fc9374bf77d5d716459f77b1ecae086ccbea72fb07205e573f
+ size 610
runs/Jun13_13-41-06_LAPTOP-1GID9RGH/events.out.tfevents.1718260867.LAPTOP-1GID9RGH.9068.2 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:7e8a11bdcc133bb778dd02e6ffaf39b3623055abf0556d59920c0d34ea222980
+ size 15309
runs/Jun13_13-41-06_LAPTOP-1GID9RGH/events.out.tfevents.1718261195.LAPTOP-1GID9RGH.9068.3 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:97de84ee453e4c65327c059182a45e7ca07dc00e4a098178f73993809015e736
+ size 610
runs/Jun13_15-01-53_LAPTOP-1GID9RGH/events.out.tfevents.1718265714.LAPTOP-1GID9RGH.23548.0 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:3aca503abae403229ced35179f1035b2f1a0d90643f0e089448cf982adc84786
+ size 38077
runs/Jun13_15-01-53_LAPTOP-1GID9RGH/events.out.tfevents.1718267821.LAPTOP-1GID9RGH.23548.1 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:2285ca11a79c52c3b3af886dd99a71c0a2366db4c4889b5a46841a38713af295
+ size 610
runs/Jun13_22-17-36_LAPTOP-1GID9RGH/events.out.tfevents.1718291857.LAPTOP-1GID9RGH.21000.0 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:ae949855dfa2d1cca7ac6c41d4880d44fbb51960b0323e71a43fd3df459064c5
+ size 10568
runs/Jun13_22-20-06_LAPTOP-1GID9RGH/events.out.tfevents.1718292007.LAPTOP-1GID9RGH.20972.0 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:1facfeb5d750e6b7dd2fe8fcdf5e9301bebc01ac786529567e735f269f7bead9
+ size 18441
runs/Jun13_22-20-06_LAPTOP-1GID9RGH/events.out.tfevents.1718292615.LAPTOP-1GID9RGH.20972.1 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:824e41826fa631f53b1cf06a0d3a82f608c16fdf20a7b8bc424ef43d391cb5b7
+ size 610
runs/Jun17_14-19-03_LAPTOP-1GID9RGH/events.out.tfevents.1718608744.LAPTOP-1GID9RGH.13984.0 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:059b500dc475705259cdfeaba11338609c08bb96806ca2ddb6e2462431e7e6a6
+ size 40165
runs/Jun17_15-21-17_LAPTOP-1GID9RGH/events.out.tfevents.1718612478.LAPTOP-1GID9RGH.11192.0 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:6c48ff10b6402be188c4125ade5499f1f7e6ded85c5d3a3778371abef5fd2fad
+ size 14433
training_args.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:d36a194b28f8c98989af66488a72a17c56b73547c978053f7ceeae42c221ec5b
+ size 4984