---
license: apache-2.0
base_model: facebook/hubert-base-ls960
tags:
- generated_from_trainer
metrics:
- accuracy
- precision
- recall
- f1
model-index:
- name: hubert-classifier-aug-fold-0
results: []
---
# hubert-classifier-aug-fold-0
This model is a fine-tuned version of [facebook/hubert-base-ls960](https://huggingface.co/facebook/hubert-base-ls960) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6722
- Accuracy: 0.8706
- Precision: 0.8828
- Recall: 0.8706
- F1: 0.8708
- Binary: 0.9097
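
A checkpoint like this can typically be loaded for inference with the `transformers` audio-classification pipeline. The following is a minimal sketch, not verified against this exact checkpoint: the model ID `hubert-classifier-aug-fold-0` and the input file `example.wav` are placeholders, and the checkpoint is assumed to have been saved with its label mapping.

```python
from transformers import pipeline

# Placeholder model ID: point this at wherever the fine-tuned
# weights actually live (local path or Hub repo).
classifier = pipeline(
    "audio-classification",
    model="hubert-classifier-aug-fold-0",
)

# Input may be a file path or a raw waveform; 16 kHz mono matches the
# facebook/hubert-base-ls960 pretraining setup.
predictions = classifier("example.wav")
print(predictions)  # list of {"label": ..., "score": ...} dicts
```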
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 30
- mixed_precision_training: Native AMP
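
The hyperparameters above map onto the standard `transformers` `Trainer` API roughly as follows. This is a sketch under stated assumptions, not the original training script: `num_labels`, `train_dataset`, and `eval_dataset` are placeholders for the undocumented dataset, `fp16=True` stands in for "Native AMP", and a single device is assumed (so 32 × 4 accumulation steps gives the total train batch size of 128).

```python
from transformers import (
    HubertForSequenceClassification,
    Trainer,
    TrainingArguments,
)

# Placeholders: the dataset and label set are not documented in this card.
# train_dataset / eval_dataset are assumed to be preprocessed datasets
# yielding {"input_values", "labels"} at 16 kHz.
num_labels = 10  # placeholder; the real class count is unknown

model = HubertForSequenceClassification.from_pretrained(
    "facebook/hubert-base-ls960",
    num_labels=num_labels,
)

# Mirrors the hyperparameter list above.
training_args = TrainingArguments(
    output_dir="hubert-classifier-aug-fold-0",
    learning_rate=1e-4,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    gradient_accumulation_steps=4,  # 32 * 4 = total batch size 128
    num_train_epochs=30,
    lr_scheduler_type="linear",
    warmup_steps=500,
    seed=42,
    fp16=True,  # "Native AMP" mixed-precision training
    evaluation_strategy="steps",
    eval_steps=50,  # matches the 50-step cadence of the results table
)

trainer = Trainer(
    model=model,
    args=training_args,
    train_dataset=train_dataset,  # placeholder
    eval_dataset=eval_dataset,    # placeholder
)
trainer.train()
```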
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | F1 | Binary |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:---------:|:------:|:------:|:------:|
| No log | 0.24 | 50 | 4.4197 | 0.0210 | 0.0174 | 0.0210 | 0.0057 | 0.1713 |
| No log | 0.48 | 100 | 4.2999 | 0.0479 | 0.0457 | 0.0479 | 0.0233 | 0.3186 |
| No log | 0.72 | 150 | 3.9495 | 0.0517 | 0.0304 | 0.0517 | 0.0221 | 0.3297 |
| No log | 0.96 | 200 | 3.6566 | 0.0779 | 0.0264 | 0.0779 | 0.0319 | 0.3483 |
| 4.2316 | 1.2 | 250 | 3.4461 | 0.0944 | 0.0410 | 0.0944 | 0.0381 | 0.3633 |
| 4.2316 | 1.44 | 300 | 3.2464 | 0.1266 | 0.0635 | 0.1266 | 0.0624 | 0.3869 |
| 4.2316 | 1.68 | 350 | 3.0578 | 0.1476 | 0.0942 | 0.1476 | 0.0816 | 0.3983 |
| 4.2316 | 1.92 | 400 | 2.7652 | 0.2210 | 0.1506 | 0.2210 | 0.1354 | 0.4527 |
| 3.3453 | 2.16 | 450 | 2.4759 | 0.3026 | 0.2342 | 0.3026 | 0.2120 | 0.5108 |
| 3.3453 | 2.4 | 500 | 2.1916 | 0.3925 | 0.3092 | 0.3925 | 0.3043 | 0.5742 |
| 3.3453 | 2.63 | 550 | 1.9549 | 0.4524 | 0.3866 | 0.4524 | 0.3861 | 0.6169 |
| 3.3453 | 2.87 | 600 | 1.7926 | 0.4891 | 0.4796 | 0.4891 | 0.4231 | 0.6419 |
| 2.4259 | 3.11 | 650 | 1.5900 | 0.5700 | 0.5456 | 0.5700 | 0.5217 | 0.6991 |
| 2.4259 | 3.35 | 700 | 1.3724 | 0.6180 | 0.6275 | 0.6180 | 0.5730 | 0.7328 |
| 2.4259 | 3.59 | 750 | 1.2748 | 0.6502 | 0.6406 | 0.6502 | 0.6102 | 0.7560 |
| 2.4259 | 3.83 | 800 | 1.1681 | 0.6704 | 0.6791 | 0.6704 | 0.6384 | 0.7703 |
| 1.7305 | 4.07 | 850 | 1.0720 | 0.7139 | 0.7255 | 0.7139 | 0.6889 | 0.8001 |
| 1.7305 | 4.31 | 900 | 1.0337 | 0.7146 | 0.7298 | 0.7146 | 0.6921 | 0.7993 |
| 1.7305 | 4.55 | 950 | 0.9137 | 0.7423 | 0.7541 | 0.7423 | 0.7231 | 0.8199 |
| 1.7305 | 4.79 | 1000 | 0.8462 | 0.7633 | 0.7716 | 0.7633 | 0.7494 | 0.8345 |
| 1.3376 | 5.03 | 1050 | 0.8048 | 0.7790 | 0.7985 | 0.7790 | 0.7685 | 0.8462 |
| 1.3376 | 5.27 | 1100 | 0.7739 | 0.7850 | 0.7900 | 0.7850 | 0.7706 | 0.8493 |
| 1.3376 | 5.51 | 1150 | 0.7713 | 0.7955 | 0.8096 | 0.7955 | 0.7892 | 0.8569 |
| 1.3376 | 5.75 | 1200 | 0.7841 | 0.7925 | 0.8059 | 0.7925 | 0.7866 | 0.8550 |
| 1.3376 | 5.99 | 1250 | 0.7026 | 0.8007 | 0.8249 | 0.8007 | 0.7966 | 0.8609 |
| 1.0806 | 6.23 | 1300 | 0.6965 | 0.8112 | 0.8240 | 0.8112 | 0.8078 | 0.8685 |
| 1.0806 | 6.47 | 1350 | 0.6891 | 0.8142 | 0.8312 | 0.8142 | 0.8097 | 0.8697 |
| 1.0806 | 6.71 | 1400 | 0.6624 | 0.8262 | 0.8387 | 0.8262 | 0.8214 | 0.8781 |
| 1.0806 | 6.95 | 1450 | 0.6302 | 0.8337 | 0.8441 | 0.8337 | 0.8299 | 0.8834 |
| 0.9458 | 7.19 | 1500 | 0.6213 | 0.8367 | 0.8468 | 0.8367 | 0.8321 | 0.8854 |
| 0.9458 | 7.43 | 1550 | 0.6815 | 0.8195 | 0.8331 | 0.8195 | 0.8155 | 0.8738 |
| 0.9458 | 7.66 | 1600 | 0.6206 | 0.8427 | 0.8538 | 0.8427 | 0.8408 | 0.8902 |
| 0.9458 | 7.9 | 1650 | 0.5314 | 0.8577 | 0.8687 | 0.8577 | 0.8556 | 0.9007 |
| 0.8202 | 8.14 | 1700 | 0.5861 | 0.8390 | 0.8505 | 0.8390 | 0.8369 | 0.8874 |
| 0.8202 | 8.38 | 1750 | 0.5927 | 0.8532 | 0.8661 | 0.8532 | 0.8519 | 0.8975 |
| 0.8202 | 8.62 | 1800 | 0.6158 | 0.8449 | 0.8592 | 0.8449 | 0.8420 | 0.8919 |
| 0.8202 | 8.86 | 1850 | 0.5726 | 0.8457 | 0.8569 | 0.8457 | 0.8416 | 0.8918 |
| 0.7454 | 9.1 | 1900 | 0.6392 | 0.8360 | 0.8528 | 0.8360 | 0.8315 | 0.8858 |
| 0.7454 | 9.34 | 1950 | 0.5566 | 0.8577 | 0.8710 | 0.8577 | 0.8569 | 0.9006 |
| 0.7454 | 9.58 | 2000 | 0.5260 | 0.8592 | 0.8693 | 0.8592 | 0.8561 | 0.9010 |
| 0.7454 | 9.82 | 2050 | 0.5470 | 0.8659 | 0.8760 | 0.8659 | 0.8651 | 0.9058 |
| 0.6472 | 10.06 | 2100 | 0.5692 | 0.8554 | 0.8643 | 0.8554 | 0.8541 | 0.9001 |
| 0.6472 | 10.3 | 2150 | 0.5730 | 0.8599 | 0.8683 | 0.8599 | 0.8574 | 0.9016 |
| 0.6472 | 10.54 | 2200 | 0.5408 | 0.8637 | 0.8715 | 0.8637 | 0.8619 | 0.9048 |
| 0.6472 | 10.78 | 2250 | 0.5869 | 0.8652 | 0.8739 | 0.8652 | 0.8635 | 0.9052 |
| 0.6204 | 11.02 | 2300 | 0.6284 | 0.8539 | 0.8638 | 0.8539 | 0.8511 | 0.8985 |
| 0.6204 | 11.26 | 2350 | 0.5792 | 0.8599 | 0.8674 | 0.8599 | 0.8565 | 0.9024 |
| 0.6204 | 11.5 | 2400 | 0.6085 | 0.8592 | 0.8704 | 0.8592 | 0.8568 | 0.9011 |
| 0.6204 | 11.74 | 2450 | 0.6259 | 0.8517 | 0.8590 | 0.8517 | 0.8493 | 0.8958 |
| 0.6204 | 11.98 | 2500 | 0.6429 | 0.8494 | 0.8634 | 0.8494 | 0.8474 | 0.8945 |
| 0.5797 | 12.22 | 2550 | 0.6478 | 0.8502 | 0.8596 | 0.8502 | 0.8480 | 0.8960 |
| 0.5797 | 12.46 | 2600 | 0.5734 | 0.8652 | 0.8737 | 0.8652 | 0.8619 | 0.9055 |
| 0.5797 | 12.69 | 2650 | 0.6109 | 0.8569 | 0.8667 | 0.8569 | 0.8528 | 0.9003 |
| 0.5797 | 12.93 | 2700 | 0.5982 | 0.8652 | 0.8784 | 0.8652 | 0.8632 | 0.9058 |
| 0.542 | 13.17 | 2750 | 0.6024 | 0.8539 | 0.8655 | 0.8539 | 0.8527 | 0.8975 |
| 0.542 | 13.41 | 2800 | 0.5819 | 0.8629 | 0.8707 | 0.8629 | 0.8609 | 0.9056 |
| 0.542 | 13.65 | 2850 | 0.5870 | 0.8689 | 0.8781 | 0.8689 | 0.8680 | 0.9085 |
| 0.542 | 13.89 | 2900 | 0.5818 | 0.8637 | 0.8710 | 0.8637 | 0.8619 | 0.9042 |
| 0.5116 | 14.13 | 2950 | 0.5965 | 0.8599 | 0.8709 | 0.8599 | 0.8590 | 0.9035 |
| 0.5116 | 14.37 | 3000 | 0.6023 | 0.8607 | 0.8675 | 0.8607 | 0.8581 | 0.9029 |
| 0.5116 | 14.61 | 3050 | 0.6432 | 0.8637 | 0.8745 | 0.8637 | 0.8620 | 0.9040 |
| 0.5116 | 14.85 | 3100 | 0.6255 | 0.8584 | 0.8703 | 0.8584 | 0.8574 | 0.9014 |
| 0.4756 | 15.09 | 3150 | 0.6000 | 0.8629 | 0.8710 | 0.8629 | 0.8615 | 0.9040 |
| 0.4756 | 15.33 | 3200 | 0.6462 | 0.8689 | 0.8793 | 0.8689 | 0.8682 | 0.9082 |
| 0.4756 | 15.57 | 3250 | 0.6419 | 0.8539 | 0.8641 | 0.8539 | 0.8518 | 0.8984 |
| 0.4756 | 15.81 | 3300 | 0.6592 | 0.8569 | 0.8624 | 0.8569 | 0.8538 | 0.9012 |
| 0.4492 | 16.05 | 3350 | 0.6195 | 0.8607 | 0.8687 | 0.8607 | 0.8591 | 0.9034 |
| 0.4492 | 16.29 | 3400 | 0.6042 | 0.8697 | 0.8803 | 0.8697 | 0.8687 | 0.9090 |
| 0.4492 | 16.53 | 3450 | 0.6235 | 0.8562 | 0.8664 | 0.8562 | 0.8544 | 0.8998 |
| 0.4492 | 16.77 | 3500 | 0.6332 | 0.8674 | 0.8756 | 0.8674 | 0.8659 | 0.9069 |
| 0.4383 | 17.01 | 3550 | 0.6278 | 0.8584 | 0.8661 | 0.8584 | 0.8566 | 0.9011 |
| 0.4383 | 17.25 | 3600 | 0.5924 | 0.8719 | 0.8806 | 0.8719 | 0.8709 | 0.9100 |
| 0.4383 | 17.49 | 3650 | 0.6176 | 0.8712 | 0.8817 | 0.8712 | 0.8696 | 0.9105 |
| 0.4383 | 17.72 | 3700 | 0.6186 | 0.8712 | 0.8788 | 0.8712 | 0.8694 | 0.9106 |
| 0.4383 | 17.96 | 3750 | 0.6185 | 0.8749 | 0.8849 | 0.8749 | 0.8736 | 0.9124 |
| 0.4249 | 18.2 | 3800 | 0.6101 | 0.8742 | 0.8820 | 0.8742 | 0.8735 | 0.9116 |
| 0.4249 | 18.44 | 3850 | 0.6121 | 0.8689 | 0.8802 | 0.8689 | 0.8682 | 0.9085 |
| 0.4249 | 18.68 | 3900 | 0.6568 | 0.8614 | 0.8719 | 0.8614 | 0.8599 | 0.9031 |
| 0.4249 | 18.92 | 3950 | 0.6292 | 0.8697 | 0.8797 | 0.8697 | 0.8688 | 0.9091 |
| 0.4073 | 19.16 | 4000 | 0.6200 | 0.8719 | 0.8822 | 0.8719 | 0.8702 | 0.9103 |
| 0.4073 | 19.4 | 4050 | 0.6544 | 0.8644 | 0.8740 | 0.8644 | 0.8635 | 0.9052 |
| 0.4073 | 19.64 | 4100 | 0.6441 | 0.8652 | 0.8731 | 0.8652 | 0.8639 | 0.9061 |
| 0.4073 | 19.88 | 4150 | 0.6056 | 0.8779 | 0.8836 | 0.8779 | 0.8764 | 0.9146 |
| 0.3797 | 20.12 | 4200 | 0.6192 | 0.8742 | 0.8815 | 0.8742 | 0.8728 | 0.9117 |
| 0.3797 | 20.36 | 4250 | 0.5936 | 0.8787 | 0.8864 | 0.8787 | 0.8775 | 0.9156 |
| 0.3797 | 20.6 | 4300 | 0.6288 | 0.8749 | 0.8836 | 0.8749 | 0.8736 | 0.9124 |
| 0.3797 | 20.84 | 4350 | 0.6280 | 0.8734 | 0.8812 | 0.8734 | 0.8717 | 0.9116 |
| 0.3727 | 21.08 | 4400 | 0.6542 | 0.8712 | 0.8782 | 0.8712 | 0.8694 | 0.9097 |
| 0.3727 | 21.32 | 4450 | 0.6506 | 0.8667 | 0.8761 | 0.8667 | 0.8643 | 0.9063 |
| 0.3727 | 21.56 | 4500 | 0.6217 | 0.8727 | 0.8789 | 0.8727 | 0.8707 | 0.9105 |
| 0.3727 | 21.8 | 4550 | 0.6120 | 0.8779 | 0.8836 | 0.8779 | 0.8769 | 0.9142 |
| 0.3495 | 22.04 | 4600 | 0.6275 | 0.8704 | 0.8786 | 0.8704 | 0.8689 | 0.9092 |
| 0.3495 | 22.28 | 4650 | 0.6258 | 0.8794 | 0.8862 | 0.8794 | 0.8777 | 0.9153 |
| 0.3495 | 22.51 | 4700 | 0.6255 | 0.8682 | 0.8770 | 0.8682 | 0.8663 | 0.9079 |
| 0.3495 | 22.75 | 4750 | 0.6442 | 0.8689 | 0.8772 | 0.8689 | 0.8667 | 0.9085 |
| 0.3495 | 22.99 | 4800 | 0.6274 | 0.8727 | 0.8816 | 0.8727 | 0.8716 | 0.9109 |
| 0.3363 | 23.23 | 4850 | 0.6241 | 0.8712 | 0.8783 | 0.8712 | 0.8693 | 0.9103 |
| 0.3363 | 23.47 | 4900 | 0.5921 | 0.8824 | 0.8886 | 0.8824 | 0.8811 | 0.9175 |
| 0.3363 | 23.71 | 4950 | 0.6452 | 0.8749 | 0.8832 | 0.8749 | 0.8732 | 0.9124 |
| 0.3363 | 23.95 | 5000 | 0.6247 | 0.8757 | 0.8851 | 0.8757 | 0.8739 | 0.9129 |
| 0.3218 | 24.19 | 5050 | 0.6176 | 0.8816 | 0.8897 | 0.8816 | 0.8797 | 0.9173 |
| 0.3218 | 24.43 | 5100 | 0.6232 | 0.8772 | 0.8846 | 0.8772 | 0.8753 | 0.9139 |
| 0.3218 | 24.67 | 5150 | 0.6267 | 0.8757 | 0.8833 | 0.8757 | 0.8742 | 0.9131 |
| 0.3218 | 24.91 | 5200 | 0.6109 | 0.8749 | 0.8825 | 0.8749 | 0.8736 | 0.9124 |
| 0.3173 | 25.15 | 5250 | 0.6192 | 0.8801 | 0.8878 | 0.8801 | 0.8786 | 0.9160 |
| 0.3173 | 25.39 | 5300 | 0.6303 | 0.8764 | 0.8853 | 0.8764 | 0.8750 | 0.9134 |
| 0.3173 | 25.63 | 5350 | 0.6552 | 0.8742 | 0.8818 | 0.8742 | 0.8726 | 0.9115 |
| 0.3173 | 25.87 | 5400 | 0.6291 | 0.8712 | 0.8782 | 0.8712 | 0.8697 | 0.9094 |
| 0.316 | 26.11 | 5450 | 0.6041 | 0.8816 | 0.8874 | 0.8816 | 0.8805 | 0.9169 |
| 0.316 | 26.35 | 5500 | 0.6254 | 0.8809 | 0.8887 | 0.8809 | 0.8792 | 0.9166 |
| 0.316 | 26.59 | 5550 | 0.6147 | 0.8801 | 0.8868 | 0.8801 | 0.8789 | 0.9160 |
| 0.316 | 26.83 | 5600 | 0.6255 | 0.8794 | 0.8866 | 0.8794 | 0.8780 | 0.9155 |
| 0.2917 | 27.07 | 5650 | 0.5997 | 0.8824 | 0.8893 | 0.8824 | 0.8811 | 0.9173 |
| 0.2917 | 27.31 | 5700 | 0.5993 | 0.8831 | 0.8906 | 0.8831 | 0.8817 | 0.9181 |
| 0.2917 | 27.54 | 5750 | 0.6007 | 0.8809 | 0.8889 | 0.8809 | 0.8796 | 0.9166 |
| 0.2917 | 27.78 | 5800 | 0.6041 | 0.8787 | 0.8871 | 0.8787 | 0.8772 | 0.9152 |
| 0.2896 | 28.02 | 5850 | 0.5977 | 0.8854 | 0.8921 | 0.8854 | 0.8844 | 0.9196 |
| 0.2896 | 28.26 | 5900 | 0.5875 | 0.8869 | 0.8937 | 0.8869 | 0.8858 | 0.9210 |
| 0.2896 | 28.5 | 5950 | 0.6133 | 0.8764 | 0.8843 | 0.8764 | 0.8750 | 0.9136 |
| 0.2896 | 28.74 | 6000 | 0.6153 | 0.8794 | 0.8874 | 0.8794 | 0.8783 | 0.9157 |
| 0.2896 | 28.98 | 6050 | 0.6031 | 0.8816 | 0.8891 | 0.8816 | 0.8799 | 0.9173 |
| 0.2821 | 29.22 | 6100 | 0.6034 | 0.8839 | 0.8908 | 0.8839 | 0.8823 | 0.9189 |
| 0.2821 | 29.46 | 6150 | 0.6003 | 0.8831 | 0.8895 | 0.8831 | 0.8815 | 0.9184 |
| 0.2821 | 29.7 | 6200 | 0.6013 | 0.8846 | 0.8911 | 0.8846 | 0.8832 | 0.9194 |
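
The accuracy, precision, recall, and F1 columns above are consistent with weighted averaging: weighted recall is mathematically identical to accuracy, and the two columns agree at every step. A plausible `compute_metrics` implementation for reproducing these columns is sketched below; the weighted-averaging mode is an inference, not documented, and the `Binary` column is omitted because its definition is unknown.

```python
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def compute_metrics(eval_pred):
    """Reproduce the accuracy/precision/recall/F1 columns of the table above.

    Weighted averaging is an assumption inferred from recall == accuracy;
    the custom "Binary" metric is not reproduced because its definition
    is not documented in this card.
    """
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="weighted", zero_division=0
    )
    return {
        "accuracy": accuracy_score(labels, preds),
        "precision": precision,
        "recall": recall,
        "f1": f1,
    }
```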
### Framework versions
- Transformers 4.38.2
- Pytorch 2.3.0
- Datasets 2.19.1
- Tokenizers 0.15.1