# gpt2-multi
This model is a fine-tuned version of [gpt2](https://huggingface.co/gpt2) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 5.6627
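Assuming the reported loss is the usual mean token-level cross-entropy, it corresponds to a perplexity of roughly 288:

```python
import math

eval_loss = 5.6627                # final validation loss reported above
perplexity = math.exp(eval_loss)  # valid if the loss is mean cross-entropy per token
print(round(perplexity, 1))       # ≈ 287.9
```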
## Model description
More information needed
## Intended uses & limitations
More information needed
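Since the card documents no usage, here is a generic text-generation sketch for a GPT-2 style checkpoint using the `transformers` pipeline. The Hub id below is hypothetical; substitute the model's actual repository path or a local checkpoint directory.

```python
from transformers import pipeline

# "your-username/gpt2-multi" is a hypothetical repo id — replace it with the
# model's actual Hub path or a local directory containing the checkpoint.
generator = pipeline("text-generation", model="your-username/gpt2-multi")
print(generator("Once upon a time", max_new_tokens=30)[0]["generated_text"])
```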
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a minimal training sketch follows the list):
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 12
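The training code is not documented in this card, so the following is only a minimal sketch of a run with these hyperparameters using the Hugging Face `Trainer`. The toy dataset and the tokenization settings (e.g. `max_length=128`) are placeholders, since the actual training data is unknown; the Adam betas and epsilon listed above match the `Trainer` defaults.

```python
from datasets import Dataset
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 ships without a pad token

# Placeholder data: the actual training/evaluation sets are not documented.
texts = ["example training text", "another example"]

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=128)

train_dataset = Dataset.from_dict({"text": texts}).map(
    tokenize, batched=True, remove_columns=["text"]
)
eval_dataset = train_dataset  # placeholder; use a held-out split in practice

args = TrainingArguments(
    output_dir="gpt2-multi",
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=12,
    evaluation_strategy="epoch",
    # Adam betas (0.9, 0.999) and epsilon 1e-8 are the Trainer defaults.
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    eval_dataset=eval_dataset,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```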
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 1.8011        | 1.0   | 6    | 3.8540          |
| 1.6433        | 2.0   | 12   | 4.4045          |
| 1.4762        | 3.0   | 18   | 4.2463          |
| 1.3941        | 4.0   | 24   | 4.9549          |
| 1.3447        | 5.0   | 30   | 5.3510          |
| 1.337         | 6.0   | 36   | 4.9287          |
| 1.36          | 7.0   | 42   | 5.3027          |
| 1.0973        | 8.0   | 48   | 5.2258          |
| 1.0005        | 9.0   | 54   | 5.7433          |
| 0.9298        | 10.0  | 60   | 5.5088          |
| 1.0995        | 11.0  | 66   | 5.5535          |
| 1.0031        | 12.0  | 72   | 5.6627          |
### Framework versions
- Transformers 4.27.2
- Pytorch 2.0.0+cu117
- Datasets 2.10.1
- Tokenizers 0.13.2