---
license: apache-2.0
base_model: google/flan-t5-small
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: flan_t5_small_twitter
  results: []
---
# flan_t5_small_twitter
This model is a fine-tuned version of [google/flan-t5-small](https://huggingface.co/google/flan-t5-small) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4985
- Accuracy: 0.7472
- F1 Macro: 0.6954
- F1 Micro: 0.7472
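
The checkpoint can be loaded with the standard Transformers API. A minimal inference sketch, assuming the model is published under the repo id `flan_t5_small_twitter` and that the fine-tuning task emits short label strings (the label vocabulary is not documented here):

```python
# Hedged sketch; the repo id and the label-string output are assumptions.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "flan_t5_small_twitter"  # hypothetical repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Encode an input tweet and generate the predicted label text.
inputs = tokenizer("example tweet text", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=8)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```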
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0005
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- distributed_type: multi-GPU
- num_devices: 2
- total_train_batch_size: 32
- total_eval_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3.0
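
These settings map directly onto `Seq2SeqTrainingArguments`; a sketch under the assumption that training used the Trainer API (the actual script is not part of this card, and the evaluation interval is inferred from the step column in the table below):

```python
# Hedged reconstruction of the hyperparameters listed above.
from transformers import Seq2SeqTrainingArguments

args = Seq2SeqTrainingArguments(
    output_dir="flan_t5_small_twitter",
    learning_rate=5e-4,
    per_device_train_batch_size=16,  # 2 GPUs -> total train batch size 32
    per_device_eval_batch_size=16,   # 2 GPUs -> total eval batch size 32
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=3.0,
    evaluation_strategy="steps",     # assumed from eval rows every 50 steps
    eval_steps=50,
)
```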
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 Macro | F1 Micro |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:--------:|:--------:|
| 0.5055        | 0.18  | 50   | 0.5210          | 0.7537   | 0.7230   | 0.7537   |
| 0.5045        | 0.37  | 100  | 0.5018          | 0.7445   | 0.6934   | 0.7445   |
| 0.4727        | 0.55  | 150  | 0.5356          | 0.7243   | 0.6002   | 0.7243   |
| 0.4924        | 0.74  | 200  | 0.4985          | 0.7472   | 0.6954   | 0.7472   |
| 0.4847        | 0.92  | 250  | 0.4992          | 0.7528   | 0.7107   | 0.7528   |
| 0.4107        | 1.1   | 300  | 0.5264          | 0.75     | 0.7086   | 0.75     |
| 0.4197        | 1.29  | 350  | 0.5231          | 0.7436   | 0.7126   | 0.7436   |
| 0.4002        | 1.47  | 400  | 0.5312          | 0.7509   | 0.7009   | 0.7509   |
| 0.4381        | 1.65  | 450  | 0.5216          | 0.7482   | 0.7000   | 0.7482   |
| 0.4125        | 1.84  | 500  | 0.5262          | 0.7509   | 0.7161   | 0.7509   |
| 0.3665        | 2.02  | 550  | 0.5205          | 0.7546   | 0.7190   | 0.7546   |
| 0.3855        | 2.21  | 600  | 0.5672          | 0.7537   | 0.7161   | 0.7537   |
| 0.3125        | 2.39  | 650  | 0.5732          | 0.7509   | 0.7066   | 0.7509   |
| 0.2955        | 2.57  | 700  | 0.5928          | 0.7555   | 0.7122   | 0.7555   |
| 0.3556        | 2.76  | 750  | 0.5704          | 0.7537   | 0.7174   | 0.7537   |
| 0.3578        | 2.94  | 800  | 0.5706          | 0.7518   | 0.7069   | 0.7518   |
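
The accuracy and F1 columns suggest a classification task scored through a `compute_metrics` callback. A minimal sketch, assuming generated token ids are decoded back to label strings and compared with scikit-learn (none of this is documented in the card):

```python
# Hedged sketch of a metric function that could produce the columns above.
import numpy as np
from sklearn.metrics import accuracy_score, f1_score

def make_compute_metrics(tokenizer):
    def compute_metrics(eval_pred):
        preds, labels = eval_pred
        # Replace the ignore index (-100) so the labels can be decoded.
        labels = np.where(labels != -100, labels, tokenizer.pad_token_id)
        pred_str = tokenizer.batch_decode(preds, skip_special_tokens=True)
        label_str = tokenizer.batch_decode(labels, skip_special_tokens=True)
        return {
            "accuracy": accuracy_score(label_str, pred_str),
            "f1_macro": f1_score(label_str, pred_str, average="macro"),
            "f1_micro": f1_score(label_str, pred_str, average="micro"),
        }
    return compute_metrics
```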
### Framework versions
- Transformers 4.39.0.dev0
- Pytorch 2.2.1+cu121
- Datasets 2.18.0
- Tokenizers 0.15.2