BTX24 committed
Commit 4d5164b
1 Parent(s): f575a6f

Model save

README.md ADDED
@@ -0,0 +1,84 @@
---
license: apache-2.0
base_model: facebook/convnextv2-tiny-1k-224
tags:
- generated_from_trainer
metrics:
- accuracy
- f1
- precision
- recall
model-index:
- name: convnextv2-tiny-1k-224-finetuned-four-five
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# convnextv2-tiny-1k-224-finetuned-four-five

This model is a fine-tuned version of [facebook/convnextv2-tiny-1k-224](https://huggingface.co/facebook/convnextv2-tiny-1k-224) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6029
- Accuracy: 0.6544
- F1: 0.6540
- Precision: 0.6564
- Recall: 0.6544

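The card stops short of a usage example. As a minimal, hedged sketch, assuming the checkpoint is published under the committing user as `BTX24/convnextv2-tiny-1k-224-finetuned-four-five` (a repo id not confirmed by the card itself), inference could look like:

```python
# Minimal inference sketch; the repo id below is an assumption, not stated in the card.
from transformers import pipeline
from PIL import Image

classifier = pipeline(
    "image-classification",
    model="BTX24/convnextv2-tiny-1k-224-finetuned-four-five",  # assumed repo id
)

image = Image.open("example.jpg")  # placeholder path to any RGB image
print(classifier(image, top_k=2))  # list of {"label": ..., "score": ...} dicts
```
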
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 15
- mixed_precision_training: Native AMP

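For reproducibility, a hedged sketch of `TrainingArguments` consistent with the list above (the output path is a placeholder; dataset loading and `Trainer` wiring are omitted):

```python
# Sketch of TrainingArguments matching the reported hyperparameters.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="convnextv2-tiny-1k-224-finetuned-four-five",  # placeholder
    learning_rate=5e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    gradient_accumulation_steps=4,  # 32 * 4 = total train batch size of 128
    num_train_epochs=15,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    fp16=True,  # "Native AMP" mixed precision
    # adam_beta1=0.9, adam_beta2=0.999, adam_epsilon=1e-8 are the library defaults
)
```
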
### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | Precision | Recall |
|:-------------:|:-------:|:----:|:---------------:|:--------:|:------:|:---------:|:------:|
| 0.7064 | 0.9455 | 13 | 0.7007 | 0.5023 | 0.3813 | 0.4662 | 0.5023 |
| 0.6997 | 1.9636 | 27 | 0.6937 | 0.5276 | 0.4406 | 0.6010 | 0.5276 |
| 0.6895 | 2.9818 | 41 | 0.6864 | 0.5276 | 0.4815 | 0.5549 | 0.5276 |
| 0.6862 | 4.0 | 55 | 0.6769 | 0.5922 | 0.5887 | 0.5983 | 0.5922 |
| 0.6745 | 4.9455 | 68 | 0.6455 | 0.6336 | 0.6336 | 0.6344 | 0.6336 |
| 0.6443 | 5.9636 | 82 | 0.6340 | 0.6406 | 0.6399 | 0.6406 | 0.6406 |
| 0.6243 | 6.9818 | 96 | 0.6220 | 0.6590 | 0.6534 | 0.6661 | 0.6590 |
| 0.6243 | 8.0 | 110 | 0.6151 | 0.6705 | 0.6668 | 0.6754 | 0.6705 |
| 0.6248 | 8.9455 | 123 | 0.6104 | 0.6567 | 0.6565 | 0.6566 | 0.6567 |
| 0.6149 | 9.9636 | 137 | 0.6100 | 0.6751 | 0.6734 | 0.6816 | 0.6751 |
| 0.5968 | 10.9818 | 151 | 0.6026 | 0.6705 | 0.6705 | 0.6713 | 0.6705 |
| 0.5813 | 12.0 | 165 | 0.6028 | 0.6590 | 0.6589 | 0.6602 | 0.6590 |
| 0.5892 | 12.9455 | 178 | 0.6059 | 0.6452 | 0.6420 | 0.6537 | 0.6452 |
| 0.5738 | 13.9636 | 192 | 0.6029 | 0.6544 | 0.6541 | 0.6561 | 0.6544 |
| 0.5738 | 14.1818 | 195 | 0.6029 | 0.6544 | 0.6540 | 0.6564 | 0.6544 |

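The card does not say how the metric columns were computed. A plausible `compute_metrics` sketch, assuming weighted averaging (which would explain recall matching accuracy in every row), is:

```python
# Hypothetical compute_metrics for the Trainer; weighted averaging is an assumption.
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="weighted", zero_division=0
    )
    return {
        "accuracy": accuracy_score(labels, preds),
        "f1": f1,
        "precision": precision,
        "recall": recall,
    }
```
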

### Framework versions

- Transformers 4.42.4
- Pytorch 2.3.1+cu121
- Datasets 2.20.0
- Tokenizers 0.19.1
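
A quick convenience sketch for checking that a local environment matches the versions listed above:

```python
# Print installed versions; the expected values are those reported in the card.
import transformers, torch, datasets, tokenizers

print("Transformers", transformers.__version__)  # expected 4.42.4
print("PyTorch", torch.__version__)              # expected 2.3.1+cu121
print("Datasets", datasets.__version__)          # expected 2.20.0
print("Tokenizers", tokenizers.__version__)      # expected 0.19.1
```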
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:61b813f2aee1356029c260569950d536c5f71e4590500bb3570139ab526d5410
+oid sha256:9fc63a86e48422929b70d2c57a41c5f9fa4e2ff8e668c1e72da0180615e32ec2
 size 111495808
runs/Aug07_19-45-10_b5a50763f646/events.out.tfevents.1723059920.b5a50763f646.813.4 CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:56bd27b6985c25e16e372bdfe60ce34d5a60b617d8fe8bacabb086d1bc2d7ef7
-size 15509
+oid sha256:6d97c3ed59bd2401b3fb3862337852c06155599ae71775ffdb1ae95bda9d514b
+size 16335