---
license: mit
base_model: facebook/esm2_t12_35M_UR50D
tags:
- generated_from_trainer
metrics:
- accuracy
- precision
- recall
- f1
model-index:
- name: esm2_t12_35M_qlora_glycosylation_sites_2024-02-11_22-11-09
  results: []
---

# esm2_t12_35M_qlora_glycosylation_sites_2024-02-11_22-11-09

This model is a QLoRA fine-tuned version of [facebook/esm2_t12_35M_UR50D](https://huggingface.co/facebook/esm2_t12_35M_UR50D) for glycosylation site prediction; the training dataset is not documented in this card.
It achieves the following results on the evaluation set:
- Loss: 0.1117
- Accuracy: 0.9968
- Precision: 0.4831
- Recall: 0.9671
- F1: 0.6443
- AUC: 0.9820
- MCC: 0.6823

A minimal inference sketch is included at the end of this card.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a hedged `TrainingArguments` sketch is given after the framework versions below):
- learning_rate: 0.0003701568055793089
- train_batch_size: 36
- eval_batch_size: 36
- seed: 8893
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- num_epochs: 3
- mixed_precision_training: Native AMP

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | F1     | AUC    | MCC    |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:---------:|:------:|:------:|:------:|:------:|
| 0.1789        | 1.0   | 295  | 0.1102          | 0.9962   | 0.4391    | 0.9638 | 0.6034 | 0.9801 | 0.6492 |
| 0.0145        | 2.0   | 590  | 0.1105          | 0.9967   | 0.4776    | 0.9663 | 0.6393 | 0.9816 | 0.6782 |
| 0.0115        | 3.0   | 885  | 0.1117          | 0.9968   | 0.4831    | 0.9671 | 0.6443 | 0.9820 | 0.6823 |

### Framework versions

- Transformers 4.35.2
- PyTorch 2.1.0+cu121
- Datasets 2.17.0
- Tokenizers 0.15.1
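
For reference, the hyperparameters listed under "Training hyperparameters" could be expressed roughly as the following `transformers.TrainingArguments`. This is a hedged sketch only; the actual training script is not included in this card, the `output_dir` is illustrative, and `fp16=True` is an assumption corresponding to "Native AMP" mixed precision.

```python
# Hedged sketch: approximates the hyperparameters listed above.
# The actual training script is not part of this card; output_dir is illustrative,
# and fp16=True is assumed to correspond to "Native AMP" mixed precision.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="esm2_t12_35M_qlora_glycosylation_sites",  # illustrative path
    learning_rate=3.701568055793089e-4,
    per_device_train_batch_size=36,
    per_device_eval_batch_size=36,
    seed=8893,
    lr_scheduler_type="cosine",
    num_train_epochs=3,
    fp16=True,
)
```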
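
Below is a minimal inference sketch for per-residue prediction. It assumes this repository stores a PEFT (LoRA) adapter on top of the base ESM-2 checkpoint and that the task is binary token classification (site / not a site); the repository id used for the adapter, the example sequence, and `num_labels=2` are assumptions, not facts stated in this card.

```python
# Minimal inference sketch (assumptions: this repo holds a PEFT/LoRA adapter over the
# base ESM-2 model, used for per-residue binary classification of glycosylation sites).
import torch
from transformers import AutoTokenizer, AutoModelForTokenClassification
from peft import PeftModel

base_id = "facebook/esm2_t12_35M_UR50D"
adapter_id = "esm2_t12_35M_qlora_glycosylation_sites_2024-02-11_22-11-09"  # adjust to the full Hub id

tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForTokenClassification.from_pretrained(base_id, num_labels=2)
model = PeftModel.from_pretrained(base_model, adapter_id)
model.eval()

sequence = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ"  # example protein sequence
inputs = tokenizer(sequence, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Per-token argmax; label 1 is assumed to mark a predicted glycosylation site.
predictions = logits.argmax(dim=-1).squeeze().tolist()
print(predictions)
```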