
# esm2_t12_35M-lora-binding-sites_2024-04-25_14-47-08

This model is a LoRA adapter (PEFT) fine-tuned from facebook/esm2_t12_35M_UR50D; the training dataset is not recorded in this card. It achieves the following results on the evaluation set (at the final epoch):

- Loss: 1.4214
- Accuracy: 0.8574

## Model description

More information needed

## Intended uses & limitations

More information needed
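
Pending details from the author, the snippet below is a minimal inference sketch. It assumes the adapter performs per-residue (token-level) binding-site classification with two labels; the model name suggests this but the card does not confirm it, and `num_labels=2` as well as the example sequence are illustrative assumptions.

```python
# Minimal inference sketch (assumptions: token classification, 2 labels).
import torch
from peft import PeftModel
from transformers import AutoModelForTokenClassification, AutoTokenizer

base_id = "facebook/esm2_t12_35M_UR50D"
adapter_id = "wcvz/esm2_t12_35M-lora-binding-sites_2024-04-25_14-47-08"

tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForTokenClassification.from_pretrained(base_id, num_labels=2)
model = PeftModel.from_pretrained(base_model, adapter_id)  # attach the LoRA adapter
model.eval()

sequence = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ"  # arbitrary example protein
inputs = tokenizer(sequence, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
# Per-token class indices (assumed: 0 = non-binding, 1 = binding site)
predictions = logits.argmax(dim=-1)
```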

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (the sketch after this list shows how they map onto code):

- learning_rate: 0.0005701568055793089
- train_batch_size: 64
- eval_batch_size: 64
- seed: 8893
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- num_epochs: 100
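
A minimal sketch, assuming the standard `transformers` `Trainer` setup, of how the values above translate into a `TrainingArguments` object. The LoRA configuration (rank, alpha, target modules) is not recorded in this card, so the `LoraConfig` values below are placeholders only.

```python
# Sketch of the training configuration implied by the hyperparameters above.
# The listed Adam betas=(0.9, 0.999) and epsilon=1e-08 are the Trainer defaults.
from peft import LoraConfig
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="esm2_t12_35M-lora-binding-sites",
    learning_rate=0.0005701568055793089,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=8893,
    lr_scheduler_type="cosine",
    num_train_epochs=100,
    evaluation_strategy="epoch",  # the results table logs one eval per epoch
)

# Placeholder values -- the actual rank/alpha/target modules are unknown.
lora_config = LoraConfig(
    task_type="TOKEN_CLS",
    r=8,
    lora_alpha=16,
    target_modules=["query", "value"],
)
```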

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.6683 | 1.0 | 24 | 0.6799 | 0.5820 |
| 0.6546 | 2.0 | 48 | 0.6737 | 0.5820 |
| 0.665 | 3.0 | 72 | 0.6597 | 0.5820 |
| 0.6569 | 4.0 | 96 | 0.6247 | 0.6426 |
| 0.6524 | 5.0 | 120 | 0.6101 | 0.6582 |
| 0.6161 | 6.0 | 144 | 0.5936 | 0.6699 |
| 0.4919 | 7.0 | 168 | 0.5802 | 0.6680 |
| 0.461 | 8.0 | 192 | 0.6265 | 0.6465 |
| 0.6359 | 9.0 | 216 | 0.5477 | 0.7051 |
| 0.4399 | 10.0 | 240 | 0.5543 | 0.7109 |
| 0.7217 | 11.0 | 264 | 0.6668 | 0.6719 |
| 0.4323 | 12.0 | 288 | 0.4740 | 0.7656 |
| 0.4103 | 13.0 | 312 | 0.4999 | 0.7637 |
| 0.2916 | 14.0 | 336 | 0.3996 | 0.8320 |
| 0.262 | 15.0 | 360 | 0.4088 | 0.8418 |
| 0.4494 | 16.0 | 384 | 0.4432 | 0.8164 |
| 0.3895 | 17.0 | 408 | 0.3702 | 0.8379 |
| 0.3254 | 18.0 | 432 | 0.3501 | 0.8438 |
| 0.2065 | 19.0 | 456 | 0.3646 | 0.8438 |
| 0.167 | 20.0 | 480 | 0.3768 | 0.8320 |
| 0.3051 | 21.0 | 504 | 0.3557 | 0.8457 |
| 0.2773 | 22.0 | 528 | 0.3551 | 0.8730 |
| 0.2969 | 23.0 | 552 | 0.3434 | 0.8555 |
| 0.1427 | 24.0 | 576 | 0.3390 | 0.8594 |
| 0.327 | 25.0 | 600 | 0.4370 | 0.8652 |
| 0.1195 | 26.0 | 624 | 0.3594 | 0.8496 |
| 0.3383 | 27.0 | 648 | 0.4215 | 0.8672 |
| 0.1738 | 28.0 | 672 | 0.3671 | 0.8711 |
| 0.2686 | 29.0 | 696 | 0.3913 | 0.8457 |
| 0.1049 | 30.0 | 720 | 0.3803 | 0.8652 |
| 0.1809 | 31.0 | 744 | 0.4294 | 0.8691 |
| 0.1036 | 32.0 | 768 | 0.4279 | 0.8613 |
| 0.1664 | 33.0 | 792 | 0.4326 | 0.8594 |
| 0.246 | 34.0 | 816 | 0.4770 | 0.8535 |
| 0.0664 | 35.0 | 840 | 0.5014 | 0.8516 |
| 0.1116 | 36.0 | 864 | 0.5981 | 0.8555 |
| 0.0323 | 37.0 | 888 | 0.5228 | 0.8633 |
| 0.0751 | 38.0 | 912 | 0.5393 | 0.8594 |
| 0.0659 | 39.0 | 936 | 0.5420 | 0.8555 |
| 0.0699 | 40.0 | 960 | 0.5920 | 0.8535 |
| 0.0427 | 41.0 | 984 | 0.6336 | 0.8555 |
| 0.0265 | 42.0 | 1008 | 0.6485 | 0.8594 |
| 0.0386 | 43.0 | 1032 | 0.6955 | 0.8516 |
| 0.0759 | 44.0 | 1056 | 0.8761 | 0.8555 |
| 0.164 | 45.0 | 1080 | 0.8223 | 0.8496 |
| 0.0632 | 46.0 | 1104 | 0.8234 | 0.8594 |
| 0.0709 | 47.0 | 1128 | 0.8806 | 0.8535 |
| 0.0042 | 48.0 | 1152 | 0.9198 | 0.8594 |
| 0.0198 | 49.0 | 1176 | 0.8870 | 0.8652 |
| 0.002 | 50.0 | 1200 | 0.9676 | 0.8496 |
| 0.0156 | 51.0 | 1224 | 0.9507 | 0.8613 |
| 0.0551 | 52.0 | 1248 | 0.9955 | 0.8555 |
| 0.018 | 53.0 | 1272 | 1.0277 | 0.8535 |
| 0.0041 | 54.0 | 1296 | 1.0293 | 0.8633 |
| 0.0021 | 55.0 | 1320 | 1.0939 | 0.8652 |
| 0.0851 | 56.0 | 1344 | 1.1512 | 0.8574 |
| 0.0257 | 57.0 | 1368 | 1.0998 | 0.8516 |
| 0.0364 | 58.0 | 1392 | 1.1812 | 0.8496 |
| 0.0019 | 59.0 | 1416 | 1.1941 | 0.8438 |
| 0.0015 | 60.0 | 1440 | 1.2219 | 0.8574 |
| 0.0868 | 61.0 | 1464 | 1.2075 | 0.8555 |
| 0.0002 | 62.0 | 1488 | 1.2761 | 0.8574 |
| 0.0005 | 63.0 | 1512 | 1.2235 | 0.8535 |
| 0.0149 | 64.0 | 1536 | 1.2502 | 0.8613 |
| 0.002 | 65.0 | 1560 | 1.2890 | 0.8477 |
| 0.0001 | 66.0 | 1584 | 1.2766 | 0.8496 |
| 0.0488 | 67.0 | 1608 | 1.2966 | 0.8496 |
| 0.0002 | 68.0 | 1632 | 1.3242 | 0.8535 |
| 0.0008 | 69.0 | 1656 | 1.3247 | 0.8535 |
| 0.0024 | 70.0 | 1680 | 1.3615 | 0.8613 |
| 0.0001 | 71.0 | 1704 | 1.3805 | 0.8574 |
| 0.0017 | 72.0 | 1728 | 1.3145 | 0.8555 |
| 0.0004 | 73.0 | 1752 | 1.3214 | 0.8613 |
| 0.0121 | 74.0 | 1776 | 1.3500 | 0.8613 |
| 0.0229 | 75.0 | 1800 | 1.3902 | 0.8516 |
| 0.0022 | 76.0 | 1824 | 1.3923 | 0.8555 |
| 0.0007 | 77.0 | 1848 | 1.3887 | 0.8496 |
| 0.0036 | 78.0 | 1872 | 1.3787 | 0.8535 |
| 0.0001 | 79.0 | 1896 | 1.3920 | 0.8535 |
| 0.0 | 80.0 | 1920 | 1.3965 | 0.8574 |
| 0.0008 | 81.0 | 1944 | 1.3935 | 0.8633 |
| 0.0 | 82.0 | 1968 | 1.3969 | 0.8594 |
| 0.0 | 83.0 | 1992 | 1.3986 | 0.8574 |
| 0.0001 | 84.0 | 2016 | 1.3891 | 0.8594 |
| 0.0017 | 85.0 | 2040 | 1.4158 | 0.8633 |
| 0.0002 | 86.0 | 2064 | 1.4081 | 0.8574 |
| 0.0054 | 87.0 | 2088 | 1.4131 | 0.8613 |
| 0.0002 | 88.0 | 2112 | 1.4065 | 0.8633 |
| 0.0108 | 89.0 | 2136 | 1.4221 | 0.8613 |
| 0.0002 | 90.0 | 2160 | 1.4166 | 0.8613 |
| 0.0 | 91.0 | 2184 | 1.4192 | 0.8555 |
| 0.0 | 92.0 | 2208 | 1.4152 | 0.8613 |
| 0.0001 | 93.0 | 2232 | 1.4160 | 0.8613 |
| 0.0412 | 94.0 | 2256 | 1.4141 | 0.8613 |
| 0.0001 | 95.0 | 2280 | 1.4159 | 0.8613 |
| 0.0073 | 96.0 | 2304 | 1.4179 | 0.8613 |
| 0.0 | 97.0 | 2328 | 1.4222 | 0.8633 |
| 0.0209 | 98.0 | 2352 | 1.4202 | 0.8594 |
| 0.0001 | 99.0 | 2376 | 1.4203 | 0.8594 |
| 0.0001 | 100.0 | 2400 | 1.4214 | 0.8574 |

### Framework versions

- PEFT 0.10.0
- Transformers 4.39.3
- Pytorch 2.2.1
- Datasets 2.16.1
- Tokenizers 0.15.2
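
The version pins above correspond to the following `requirements.txt` (the PyPI package for the PyTorch entry is `torch`; pinning all five is an optional reproducibility measure, not something the card prescribes):

```text
peft==0.10.0
transformers==4.39.3
torch==2.2.1
datasets==2.16.1
tokenizers==0.15.2
```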