---
license: apache-2.0
base_model: thezeivier/Grietas_10k
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: Grietas_10k-Fine-tuning
  results: []
---
# Grietas_10k-Fine-tuning
This model is a fine-tuned version of [thezeivier/Grietas_10k](https://huggingface.co/thezeivier/Grietas_10k) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3864
- Accuracy: 0.8860
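The checkpoint can be loaded with the 🤗 Transformers `pipeline` API. The snippet below is a minimal sketch that assumes this is an image-classification checkpoint published on the Hub; the repo id and image path are hypothetical placeholders, not values taken from this card.

```python
# Minimal sketch, assuming an image-classification checkpoint on the Hub.
# The repo id and image path below are hypothetical placeholders.
from transformers import pipeline

classifier = pipeline(
    "image-classification",
    model="thezeivier/Grietas_10k-Fine-tuning",  # hypothetical repo id
)

predictions = classifier("example.jpg")  # local path or URL to an input image
print(predictions)  # list of {"label": ..., "score": ...} dictionaries
```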
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (an illustrative `TrainingArguments` sketch follows the list):
- learning_rate: 5e-05
- train_batch_size: 80
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 320
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
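A hedged reconstruction of these values as `TrainingArguments` (Transformers 4.33.x) is sketched below; only the numbers mirrored from the list come from this card, while the output directory and evaluation strategy are assumptions.

```python
from transformers import TrainingArguments

# Sketch of the hyperparameters above as TrainingArguments; output_dir and
# evaluation_strategy are assumptions not stated in this card.
training_args = TrainingArguments(
    output_dir="Grietas_10k-Fine-tuning",  # assumed output directory
    learning_rate=5e-5,
    per_device_train_batch_size=80,
    per_device_eval_batch_size=32,
    seed=42,
    gradient_accumulation_steps=4,   # effective train batch size: 80 * 4 = 320
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=100,
    evaluation_strategy="epoch",     # assumption; frequency not stated in the card
    # Adam betas=(0.9, 0.999) and epsilon=1e-08 are the Transformers defaults
    # (adam_beta1, adam_beta2, adam_epsilon), so they are not set explicitly.
)
```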
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
No log | 0.8 | 2 | 1.3737 | 0.3679 |
No log | 2.0 | 5 | 1.0234 | 0.6218 |
No log | 2.8 | 7 | 0.8146 | 0.7254 |
1.0488 | 4.0 | 10 | 0.6621 | 0.7772 |
1.0488 | 4.8 | 12 | 0.6295 | 0.8031 |
1.0488 | 6.0 | 15 | 0.5390 | 0.8083 |
1.0488 | 6.8 | 17 | 0.4902 | 0.8290 |
0.4981 | 8.0 | 20 | 0.4645 | 0.8290 |
0.4981 | 8.8 | 22 | 0.4484 | 0.8497 |
0.4981 | 10.0 | 25 | 0.4543 | 0.8446 |
0.4981 | 10.8 | 27 | 0.4325 | 0.8394 |
0.3669 | 12.0 | 30 | 0.4210 | 0.8497 |
0.3669 | 12.8 | 32 | 0.4303 | 0.8342 |
0.3669 | 14.0 | 35 | 0.4170 | 0.8497 |
0.3669 | 14.8 | 37 | 0.3861 | 0.8601 |
0.2811 | 16.0 | 40 | 0.3629 | 0.8705 |
0.2811 | 16.8 | 42 | 0.3982 | 0.8653 |
0.2811 | 18.0 | 45 | 0.4492 | 0.8290 |
0.2811 | 18.8 | 47 | 0.4216 | 0.8342 |
0.2026 | 20.0 | 50 | 0.4614 | 0.8394 |
0.2026 | 20.8 | 52 | 0.4325 | 0.8446 |
0.2026 | 22.0 | 55 | 0.4755 | 0.8342 |
0.2026 | 22.8 | 57 | 0.4175 | 0.8394 |
0.1709 | 24.0 | 60 | 0.4175 | 0.8497 |
0.1709 | 24.8 | 62 | 0.4105 | 0.8446 |
0.1709 | 26.0 | 65 | 0.4140 | 0.8601 |
0.1709 | 26.8 | 67 | 0.4641 | 0.8394 |
0.1293 | 28.0 | 70 | 0.4214 | 0.8394 |
0.1293 | 28.8 | 72 | 0.3802 | 0.8808 |
0.1293 | 30.0 | 75 | 0.4875 | 0.8290 |
0.1293 | 30.8 | 77 | 0.3972 | 0.8705 |
0.1167 | 32.0 | 80 | 0.4853 | 0.8394 |
0.1167 | 32.8 | 82 | 0.4082 | 0.8549 |
0.1167 | 34.0 | 85 | 0.3917 | 0.8601 |
0.1167 | 34.8 | 87 | 0.3573 | 0.8653 |
0.1034 | 36.0 | 90 | 0.4312 | 0.8497 |
0.1034 | 36.8 | 92 | 0.4035 | 0.8497 |
0.1034 | 38.0 | 95 | 0.4413 | 0.8238 |
0.1034 | 38.8 | 97 | 0.4728 | 0.8446 |
0.0782 | 40.0 | 100 | 0.3977 | 0.8808 |
0.0782 | 40.8 | 102 | 0.3449 | 0.8912 |
0.0782 | 42.0 | 105 | 0.4146 | 0.8808 |
0.0782 | 42.8 | 107 | 0.4380 | 0.8601 |
0.083 | 44.0 | 110 | 0.4579 | 0.8497 |
0.083 | 44.8 | 112 | 0.5234 | 0.8549 |
0.083 | 46.0 | 115 | 0.4053 | 0.8756 |
0.083 | 46.8 | 117 | 0.4724 | 0.8394 |
0.0741 | 48.0 | 120 | 0.4631 | 0.8549 |
0.0741 | 48.8 | 122 | 0.4351 | 0.8653 |
0.0741 | 50.0 | 125 | 0.4191 | 0.8756 |
0.0741 | 50.8 | 127 | 0.3772 | 0.8964 |
0.067 | 52.0 | 130 | 0.3960 | 0.8808 |
0.067 | 52.8 | 132 | 0.3749 | 0.8964 |
0.067 | 54.0 | 135 | 0.4395 | 0.8653 |
0.067 | 54.8 | 137 | 0.5284 | 0.8342 |
0.0632 | 56.0 | 140 | 0.3332 | 0.8808 |
0.0632 | 56.8 | 142 | 0.4342 | 0.8497 |
0.0632 | 58.0 | 145 | 0.3986 | 0.8756 |
0.0632 | 58.8 | 147 | 0.4771 | 0.8549 |
0.063 | 60.0 | 150 | 0.4505 | 0.8497 |
0.063 | 60.8 | 152 | 0.4023 | 0.8653 |
0.063 | 62.0 | 155 | 0.5208 | 0.8290 |
0.063 | 62.8 | 157 | 0.4915 | 0.8601 |
0.0571 | 64.0 | 160 | 0.4412 | 0.8756 |
0.0571 | 64.8 | 162 | 0.4554 | 0.8653 |
0.0571 | 66.0 | 165 | 0.4318 | 0.8653 |
0.0571 | 66.8 | 167 | 0.4317 | 0.8549 |
0.0608 | 68.0 | 170 | 0.4509 | 0.8653 |
0.0608 | 68.8 | 172 | 0.4176 | 0.8705 |
0.0608 | 70.0 | 175 | 0.5203 | 0.8394 |
0.0608 | 70.8 | 177 | 0.4375 | 0.8756 |
0.0478 | 72.0 | 180 | 0.4196 | 0.8601 |
0.0478 | 72.8 | 182 | 0.4744 | 0.8601 |
0.0478 | 74.0 | 185 | 0.4362 | 0.8808 |
0.0478 | 74.8 | 187 | 0.4804 | 0.8653 |
0.0519 | 76.0 | 190 | 0.4861 | 0.8446 |
0.0519 | 76.8 | 192 | 0.4605 | 0.8601 |
0.0519 | 78.0 | 195 | 0.4730 | 0.8394 |
0.0519 | 78.8 | 197 | 0.4650 | 0.8705 |
0.0553 | 80.0 | 200 | 0.3864 | 0.8860 |
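The `Accuracy` column is consistent with a standard argmax-based metric. The sketch below shows one common way such a `compute_metrics` function is written with the `evaluate` library; it is an assumption, since the card does not include the actual metric code.

```python
import numpy as np
import evaluate

# Hypothetical compute_metrics in the style commonly passed to Trainer;
# the card does not show the actual implementation used here.
accuracy = evaluate.load("accuracy")

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    return accuracy.compute(predictions=predictions, references=labels)
```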
### Framework versions
- Transformers 4.33.1
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.13.3
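A quick way to confirm that a local environment matches these versions:

```python
# Prints installed versions for comparison against the list above.
import datasets
import tokenizers
import torch
import transformers

print("transformers:", transformers.__version__)  # expected 4.33.1
print("torch:", torch.__version__)                # expected 2.0.1+cu118
print("datasets:", datasets.__version__)          # expected 2.14.5
print("tokenizers:", tokenizers.__version__)      # expected 0.13.3
```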