---
license: other
base_model: yahma/llama-7b-hf
tags:
- generated_from_trainer
model-index:
- name: V0305B2
  results: []
---
# V0305B2
This model is a fine-tuned version of [yahma/llama-7b-hf](https://huggingface.co/yahma/llama-7b-hf) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0894
## Model description
More information needed
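Pending a fuller description, below is a minimal loading sketch using `transformers`. It assumes the checkpoint is published as a full causal-LM model under the placeholder repo id `your-username/V0305B2` (if it is actually a PEFT/LoRA adapter, it would need to be loaded with `peft` instead); the prompt text is illustrative only.

```python
# Minimal loading sketch -- "your-username/V0305B2" is a placeholder repo id,
# and this assumes a full fine-tuned model rather than a PEFT adapter.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# The tokenizer can come from the base model the card names.
tokenizer = AutoTokenizer.from_pretrained("yahma/llama-7b-hf")
model = AutoModelForCausalLM.from_pretrained(
    "your-username/V0305B2",  # placeholder: replace with the actual repo id
    torch_dtype=torch.float16,
)

inputs = tokenizer("Hello, world!", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```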
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (see the approximate `TrainingArguments` sketch after the list):
- learning_rate: 0.0003
- train_batch_size: 4
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 32
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine_with_restarts
- lr_scheduler_warmup_steps: 20
- num_epochs: 3
- mixed_precision_training: Native AMP
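These settings correspond roughly to the `transformers.TrainingArguments` below. This is a hedged reconstruction from the list above, not the original training script: the `output_dir` name is a placeholder, the Trainer's default AdamW (with the listed betas and epsilon) stands in for the "Adam" entry, and `fp16=True` approximates "Native AMP".

```python
# Approximate TrainingArguments matching the hyperparameters listed above.
# This is a reconstruction, not the original training script.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="V0305B2",            # placeholder output directory
    learning_rate=3e-4,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=32,  # 4 x 32 = 128 effective train batch size
    lr_scheduler_type="cosine_with_restarts",
    warmup_steps=20,
    num_train_epochs=3,
    fp16=True,                       # mixed-precision training (Native AMP)
)
```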
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 2.352         | 0.09  | 10   | 2.3256          |
| 2.1754        | 0.17  | 20   | 1.8064          |
| 1.2745        | 0.26  | 30   | 0.6844          |
| 0.3789        | 0.34  | 40   | 0.1687          |
| 0.1587        | 0.43  | 50   | 0.1487          |
| 0.1563        | 0.51  | 60   | 0.1506          |
| 0.1505        | 0.6   | 70   | 0.1502          |
| 0.1525        | 0.68  | 80   | 0.1487          |
| 0.1481        | 0.77  | 90   | 0.1492          |
| 0.1504        | 0.85  | 100  | 0.1441          |
| 0.1501        | 0.94  | 110  | 0.1436          |
| 0.1439        | 1.02  | 120  | 0.1360          |
| 0.1411        | 1.11  | 130  | 0.1276          |
| 0.1349        | 1.19  | 140  | 0.1259          |
| 0.1345        | 1.28  | 150  | 0.1190          |
| 0.1299        | 1.37  | 160  | 0.1114          |
| 0.1275        | 1.45  | 170  | 0.1058          |
| 0.1159        | 1.54  | 180  | 0.1013          |
| 0.1189        | 1.62  | 190  | 0.0997          |
| 0.1203        | 1.71  | 200  | 0.1012          |
| 0.1177        | 1.79  | 210  | 0.0973          |
| 0.1144        | 1.88  | 220  | 0.0932          |
| 0.1128        | 1.96  | 230  | 0.0933          |
| 0.1084        | 2.05  | 240  | 0.0952          |
| 0.1081        | 2.13  | 250  | 0.0930          |
| 0.1037        | 2.22  | 260  | 0.0921          |
| 0.1011        | 2.3   | 270  | 0.0923          |
| 0.1072        | 2.39  | 280  | 0.0912          |
| 0.1058        | 2.47  | 290  | 0.0902          |
| 0.1107        | 2.56  | 300  | 0.0899          |
| 0.1066        | 2.65  | 310  | 0.0897          |
| 0.1091        | 2.73  | 320  | 0.0895          |
| 0.103         | 2.82  | 330  | 0.0893          |
| 0.1021        | 2.9   | 340  | 0.0893          |
| 0.103         | 2.99  | 350  | 0.0894          |
### Framework versions
- Transformers 4.36.0.dev0
- Pytorch 2.1.2+cu121
- Datasets 2.14.6
- Tokenizers 0.14.1