# ec-biogpt-masked-pubmed
This model is a fine-tuned version of microsoft/biogpt on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.7418
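If the reported value is the mean cross-entropy in nats, it corresponds to a perplexity of roughly exp(0.7418) ≈ 2.10. The card does not include a usage snippet; the sketch below shows how a checkpoint like this could typically be loaded with the `transformers` library, assuming it is used for causal language modeling like the base microsoft/biogpt model. The Hub id is a placeholder and should be replaced with this model's actual repository path.

```python
# Minimal sketch of loading and sampling from the checkpoint, assuming it is a
# causal language model like the base microsoft/biogpt. The repository id below
# is a placeholder (assumption); substitute the actual Hub path of this model.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ec-biogpt-masked-pubmed"  # placeholder Hub id (assumption)
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = "COVID-19 is"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50, do_sample=True, top_p=0.9)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```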
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (see the configuration sketch after this list):
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 10
- num_epochs: 5
- mixed_precision_training: Native AMP
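The hyperparameters above map onto a standard `transformers.Trainer` setup. The sketch below is one possible reconstruction of the `TrainingArguments`, not the exact training script; the dataset, preprocessing, and output path are unknown, and the Adam betas/epsilon listed above are the library defaults.

```python
# Hedged reconstruction of the training configuration from the hyperparameters
# listed above; the dataset and data pipeline are unknown, so only the arguments
# are reproduced here.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="ec-biogpt-masked-pubmed",  # placeholder output path (assumption)
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=10,
    num_train_epochs=5,
    fp16=True,                  # "Native AMP" mixed-precision training
    evaluation_strategy="steps",
    eval_steps=500,             # matches the 500-step evaluation interval in the results table
    logging_steps=500,
)
```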
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
0.3707 | 0.07 | 500 | 0.8468 |
0.5388 | 0.14 | 1000 | 0.7643 |
0.5857 | 0.21 | 1500 | 0.7669 |
0.5441 | 0.28 | 2000 | 0.7576 |
0.5294 | 0.36 | 2500 | 0.7570 |
0.7544 | 0.43 | 3000 | 0.7227 |
0.7075 | 0.5 | 3500 | 0.7153 |
0.7513 | 0.57 | 4000 | 0.7105 |
0.7101 | 0.64 | 4500 | 0.7059 |
0.7369 | 0.71 | 5000 | 0.7031 |
0.7477 | 0.78 | 5500 | 0.6991 |
0.6831 | 0.85 | 6000 | 0.6978 |
0.6458 | 0.93 | 6500 | 0.6940 |
0.6998 | 1.0 | 7000 | 0.6907 |
0.5901 | 1.07 | 7500 | 0.7036 |
0.633 | 1.14 | 8000 | 0.7016 |
0.6375 | 1.21 | 8500 | 0.7020 |
0.6378 | 1.28 | 9000 | 0.6988 |
0.5952 | 1.35 | 9500 | 0.6965 |
0.5714 | 1.42 | 10000 | 0.6960 |
0.5874 | 1.5 | 10500 | 0.6957 |
0.5828 | 1.57 | 11000 | 0.6917 |
0.5921 | 1.64 | 11500 | 0.6920 |
0.6086 | 1.71 | 12000 | 0.6905 |
0.5872 | 1.78 | 12500 | 0.6878 |
0.5895 | 1.85 | 13000 | 0.6883 |
0.5953 | 1.92 | 13500 | 0.6860 |
0.598 | 1.99 | 14000 | 0.6852 |
0.4805 | 2.07 | 14500 | 0.7077 |
0.4885 | 2.14 | 15000 | 0.7107 |
0.5048 | 2.21 | 15500 | 0.7083 |
0.4665 | 2.28 | 16000 | 0.7098 |
0.5057 | 2.35 | 16500 | 0.7088 |
0.4706 | 2.42 | 17000 | 0.7081 |
0.5056 | 2.49 | 17500 | 0.7076 |
0.4884 | 2.56 | 18000 | 0.7068 |
0.487 | 2.64 | 18500 | 0.7051 |
0.5327 | 2.71 | 19000 | 0.7062 |
0.4902 | 2.78 | 19500 | 0.7042 |
0.5277 | 2.85 | 20000 | 0.7021 |
0.499 | 2.92 | 20500 | 0.7024 |
0.4981 | 2.99 | 21000 | 0.7002 |
0.4174 | 3.06 | 21500 | 0.7237 |
0.4233 | 3.13 | 22000 | 0.7244 |
0.4331 | 3.21 | 22500 | 0.7265 |
0.4203 | 3.28 | 23000 | 0.7275 |
0.4265 | 3.35 | 23500 | 0.7252 |
0.4302 | 3.42 | 24000 | 0.7271 |
0.4343 | 3.49 | 24500 | 0.7244 |
0.4264 | 3.56 | 25000 | 0.7265 |
0.4565 | 3.63 | 25500 | 0.7247 |
0.4258 | 3.7 | 26000 | 0.7245 |
0.4191 | 3.78 | 26500 | 0.7246 |
0.4412 | 3.85 | 27000 | 0.7234 |
0.4604 | 3.92 | 27500 | 0.7249 |
0.4197 | 3.99 | 28000 | 0.7238 |
0.3666 | 4.06 | 28500 | 0.7413 |
0.3772 | 4.13 | 29000 | 0.7414 |
0.3628 | 4.2 | 29500 | 0.7410 |
0.3611 | 4.27 | 30000 | 0.7431 |
0.3736 | 4.35 | 30500 | 0.7414 |
0.3741 | 4.42 | 31000 | 0.7420 |
0.3661 | 4.49 | 31500 | 0.7424 |
0.3966 | 4.56 | 32000 | 0.7423 |
0.4058 | 4.63 | 32500 | 0.7423 |
0.4028 | 4.7 | 33000 | 0.7423 |
0.4028 | 4.77 | 33500 | 0.7420 |
0.3802 | 4.84 | 34000 | 0.7421 |
0.3612 | 4.92 | 34500 | 0.7418 |
0.3804 | 4.99 | 35000 | 0.7418 |
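The validation loss reaches its minimum (0.6852 at step 14000, end of the second epoch) and drifts upward in later epochs, which suggests the final checkpoint is past the best-generalizing point. A quick way to visualize this is sketched below, assuming the step/validation-loss pairs from the table have been copied into a CSV file (filename and column names are assumptions).

```python
# Small sketch for plotting the validation-loss curve from the table above,
# assuming a CSV file "training_log.csv" with columns "step" and "validation_loss".
import pandas as pd
import matplotlib.pyplot as plt

log = pd.read_csv("training_log.csv")
plt.plot(log["step"], log["validation_loss"], marker="o", markersize=2)
plt.xlabel("Step")
plt.ylabel("Validation loss")
plt.title("ec-biogpt-masked-pubmed validation loss")
plt.show()
```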
### Framework versions
- Transformers 4.27.3
- Pytorch 1.13.1+cu116
- Datasets 2.10.1
- Tokenizers 0.13.2