---
license: apache-2.0
base_model: facebook/hubert-base-ls960
tags:
- generated_from_trainer
metrics:
- accuracy
- precision
- recall
- f1
model-index:
- name: hubert-classifier-aug-fold-0
  results: []
---


<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# hubert-classifier-aug-fold-0

This model is a fine-tuned version of [facebook/hubert-base-ls960](https://huggingface.co/facebook/hubert-base-ls960) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5592
- Accuracy: 0.8464
- Precision: 0.8588
- Recall: 0.8464
- F1: 0.8431
- Binary: 0.8926
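
As a minimal sketch only, inference with an audio-classification checkpoint like this one could look as follows; the hub repository id and the audio file path are placeholders, not confirmed values for this model:

```python
from transformers import pipeline

# Hypothetical repository id; substitute the actual hub path of this checkpoint.
classifier = pipeline("audio-classification", model="hubert-classifier-aug-fold-0")

# HuBERT-base expects 16 kHz mono audio (as in facebook/hubert-base-ls960 pretraining).
predictions = classifier("example.wav", top_k=5)
for p in predictions:
    print(f"{p['label']}: {p['score']:.3f}")
```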

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):
- learning_rate: 0.0001
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 30
- mixed_precision_training: Native AMP
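
A sketch of these settings as `TrainingArguments`, under stated assumptions: `output_dir` is a placeholder, unlisted arguments keep their defaults, and the Adam betas and epsilon above match the library defaults. The effective train batch size of 128 comes from 32 samples per device × 4 accumulation steps.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="hubert-classifier-aug-fold-0",  # placeholder, not the actual path
    learning_rate=1e-4,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    gradient_accumulation_steps=4,  # 32 per device x 4 steps = 128 effective batch size
    lr_scheduler_type="linear",
    num_train_epochs=30,
    fp16=True,  # Native AMP mixed-precision training
    # adam_beta1=0.9, adam_beta2=0.999, adam_epsilon=1e-8 are already the defaults.
)
```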



### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | F1     | Binary |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:---------:|:------:|:------:|:------:|
| No log        | 0.22  | 50   | 4.4295          | 0.0135   | 0.0002    | 0.0135 | 0.0004 | 0.1332 |
| No log        | 0.43  | 100  | 4.4254          | 0.0148   | 0.0002    | 0.0148 | 0.0004 | 0.1274 |
| No log        | 0.65  | 150  | 3.8186          | 0.0364   | 0.0121    | 0.0364 | 0.0050 | 0.3090 |
| No log        | 0.86  | 200  | 3.5321          | 0.0391   | 0.0090    | 0.0391 | 0.0062 | 0.3193 |
| 4.1413        | 1.08  | 250  | 3.3337          | 0.0728   | 0.0256    | 0.0728 | 0.0286 | 0.3453 |
| 4.1413        | 1.29  | 300  | 3.1664          | 0.0970   | 0.0489    | 0.0970 | 0.0400 | 0.3590 |
| 4.1413        | 1.51  | 350  | 2.9961          | 0.1253   | 0.0613    | 0.1253 | 0.0631 | 0.3821 |
| 4.1413        | 1.73  | 400  | 2.8225          | 0.1739   | 0.0798    | 0.1739 | 0.0904 | 0.4181 |
| 4.1413        | 1.94  | 450  | 2.6439          | 0.2116   | 0.1109    | 0.2116 | 0.1236 | 0.4457 |
| 3.2276        | 2.16  | 500  | 2.4578          | 0.2385   | 0.1802    | 0.2385 | 0.1570 | 0.4670 |
| 3.2276        | 2.37  | 550  | 2.2801          | 0.3396   | 0.2831    | 0.3396 | 0.2516 | 0.5358 |
| 3.2276        | 2.59  | 600  | 2.0684          | 0.4003   | 0.3030    | 0.4003 | 0.3068 | 0.5796 |
| 3.2276        | 2.8   | 650  | 1.9308          | 0.4299   | 0.3493    | 0.4299 | 0.3516 | 0.6005 |
| 2.5852        | 3.02  | 700  | 1.8448          | 0.4501   | 0.4000    | 0.4501 | 0.3811 | 0.6146 |
| 2.5852        | 3.24  | 750  | 1.6568          | 0.5283   | 0.4743    | 0.5283 | 0.4552 | 0.6689 |
| 2.5852        | 3.45  | 800  | 1.6974          | 0.4690   | 0.4551    | 0.4690 | 0.4169 | 0.6264 |
| 2.5852        | 3.67  | 850  | 1.4828          | 0.5687   | 0.5769    | 0.5687 | 0.5231 | 0.6978 |
| 2.5852        | 3.88  | 900  | 1.4420          | 0.5580   | 0.5477    | 0.5580 | 0.5126 | 0.6896 |
| 2.1226        | 4.1   | 950  | 1.3306          | 0.6186   | 0.6133    | 0.6186 | 0.5784 | 0.7315 |
| 2.1226        | 4.31  | 1000 | 1.2209          | 0.6456   | 0.6561    | 0.6456 | 0.6076 | 0.7500 |
| 2.1226        | 4.53  | 1050 | 1.1256          | 0.6698   | 0.6865    | 0.6698 | 0.6404 | 0.7664 |
| 2.1226        | 4.75  | 1100 | 1.0700          | 0.6846   | 0.7003    | 0.6846 | 0.6586 | 0.7770 |
| 2.1226        | 4.96  | 1150 | 1.0085          | 0.7156   | 0.7415    | 0.7156 | 0.6942 | 0.7993 |
| 1.8257        | 5.18  | 1200 | 1.0190          | 0.7224   | 0.7397    | 0.7224 | 0.7028 | 0.8046 |
| 1.8257        | 5.39  | 1250 | 0.9742          | 0.7102   | 0.7244    | 0.7102 | 0.6886 | 0.7961 |
| 1.8257        | 5.61  | 1300 | 0.8793          | 0.7561   | 0.7680    | 0.7561 | 0.7384 | 0.8284 |
| 1.8257        | 5.83  | 1350 | 0.8472          | 0.7547   | 0.7763    | 0.7547 | 0.7426 | 0.8280 |
| 1.5842        | 6.04  | 1400 | 0.8424          | 0.7601   | 0.7956    | 0.7601 | 0.7487 | 0.8327 |
| 1.5842        | 6.26  | 1450 | 0.7802          | 0.7642   | 0.7846    | 0.7642 | 0.7513 | 0.8348 |
| 1.5842        | 6.47  | 1500 | 0.7447          | 0.7965   | 0.8096    | 0.7965 | 0.7914 | 0.8574 |
| 1.5842        | 6.69  | 1550 | 0.7081          | 0.7844   | 0.8035    | 0.7844 | 0.7772 | 0.8499 |
| 1.5842        | 6.9   | 1600 | 0.7616          | 0.7722   | 0.7995    | 0.7722 | 0.7681 | 0.8399 |
| 1.4387        | 7.12  | 1650 | 0.7133          | 0.7709   | 0.7904    | 0.7709 | 0.7607 | 0.8403 |
| 1.4387        | 7.34  | 1700 | 0.6570          | 0.8127   | 0.8301    | 0.8127 | 0.8094 | 0.8695 |
| 1.4387        | 7.55  | 1750 | 0.6325          | 0.8221   | 0.8461    | 0.8221 | 0.8212 | 0.8761 |
| 1.4387        | 7.77  | 1800 | 0.6352          | 0.8032   | 0.8251    | 0.8032 | 0.8004 | 0.8625 |
| 1.4387        | 7.98  | 1850 | 0.6313          | 0.8086   | 0.8270    | 0.8086 | 0.8040 | 0.8678 |
| 1.3174        | 8.2   | 1900 | 0.6843          | 0.8154   | 0.8372    | 0.8154 | 0.8100 | 0.8710 |
| 1.3174        | 8.41  | 1950 | 0.6142          | 0.8194   | 0.8360    | 0.8194 | 0.8153 | 0.8739 |
| 1.3174        | 8.63  | 2000 | 0.6324          | 0.8154   | 0.8229    | 0.8154 | 0.8102 | 0.8710 |
| 1.3174        | 8.85  | 2050 | 0.5751          | 0.8383   | 0.8566    | 0.8383 | 0.8351 | 0.8852 |
| 1.2131        | 9.06  | 2100 | 0.5873          | 0.8275   | 0.8439    | 0.8275 | 0.8250 | 0.8805 |
| 1.2131        | 9.28  | 2150 | 0.6016          | 0.8167   | 0.8346    | 0.8167 | 0.8131 | 0.8729 |
| 1.2131        | 9.49  | 2200 | 0.5982          | 0.8410   | 0.8617    | 0.8410 | 0.8387 | 0.8879 |
| 1.2131        | 9.71  | 2250 | 0.5490          | 0.8437   | 0.8564    | 0.8437 | 0.8410 | 0.8912 |
| 1.2131        | 9.92  | 2300 | 0.5587          | 0.8342   | 0.8537    | 0.8342 | 0.8309 | 0.8837 |
| 1.1426        | 10.14 | 2350 | 0.5969          | 0.8261   | 0.8446    | 0.8261 | 0.8214 | 0.8790 |
| 1.1426        | 10.36 | 2400 | 0.5936          | 0.8410   | 0.8575    | 0.8410 | 0.8382 | 0.8889 |
| 1.1426        | 10.57 | 2450 | 0.5656          | 0.8383   | 0.8579    | 0.8383 | 0.8364 | 0.8865 |
| 1.1426        | 10.79 | 2500 | 0.5130          | 0.8625   | 0.8756    | 0.8625 | 0.8593 | 0.9054 |
| 1.0738        | 11.0  | 2550 | 0.5832          | 0.8396   | 0.8618    | 0.8396 | 0.8389 | 0.8880 |
| 1.0738        | 11.22 | 2600 | 0.5554          | 0.8423   | 0.8634    | 0.8423 | 0.8417 | 0.8908 |
| 1.0738        | 11.43 | 2650 | 0.5763          | 0.8275   | 0.8490    | 0.8275 | 0.8238 | 0.8801 |
| 1.0738        | 11.65 | 2700 | 0.5697          | 0.8329   | 0.8452    | 0.8329 | 0.8281 | 0.8857 |
| 1.0738        | 11.87 | 2750 | 0.5413          | 0.8464   | 0.8655    | 0.8464 | 0.8432 | 0.8922 |
| 1.0326        | 12.08 | 2800 | 0.5954          | 0.8235   | 0.8443    | 0.8235 | 0.8176 | 0.8761 |
| 1.0326        | 12.3  | 2850 | 0.5665          | 0.8410   | 0.8611    | 0.8410 | 0.8354 | 0.8908 |

### Framework versions

- Transformers 4.38.2
- PyTorch 2.3.0
- Datasets 2.19.1
- Tokenizers 0.15.1