---
license: cc-by-nc-sa-4.0
base_model: InstaDeepAI/nucleotide-transformer-v2-500m-multi-species
tags:
- generated_from_trainer
metrics:
- f1
- matthews_correlation
- accuracy
model-index:
- name: gut_1024-finetuned-lora-NT-v2-500m-multi-species
  results: []
---

# gut_1024-finetuned-lora-NT-v2-500m-multi-species

This model is a LoRA fine-tuned version of [InstaDeepAI/nucleotide-transformer-v2-500m-multi-species](https://huggingface.co/InstaDeepAI/nucleotide-transformer-v2-500m-multi-species) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4480
- F1: 0.8532
- Matthews Correlation: 0.6018
- Accuracy: 0.8091
- F1 Score: 0.8532

## Model description

More information needed

## Intended uses & limitations

More information needed (see the usage sketch below).

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0005
- train_batch_size: 8
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- training_steps: 1000

### Training results

| Training Loss | Epoch | Step | Validation Loss | F1     | Matthews Correlation | Accuracy | F1 Score |
|:-------------:|:-----:|:----:|:---------------:|:------:|:--------------------:|:--------:|:--------:|
| 0.7913        | 0.02  | 100  | 0.6865          | 0.7478 | 0.0                  | 0.5971   | 0.7478   |
| 0.6762        | 0.04  | 200  | 0.7888          | 0.6217 | 0.3157               | 0.6284   | 0.6217   |
| 0.6291        | 0.05  | 300  | 0.5765          | 0.7628 | 0.4323               | 0.7234   | 0.7628   |
| 0.563         | 0.07  | 400  | 0.5184          | 0.8304 | 0.5258               | 0.7724   | 0.8304   |
| 0.5206        | 0.09  | 500  | 0.5402          | 0.8281 | 0.5142               | 0.7580   | 0.8281   |
| 0.4639        | 0.11  | 600  | 0.4681          | 0.8461 | 0.5775               | 0.7969   | 0.8461   |
| 0.4359        | 0.12  | 700  | 0.5136          | 0.8470 | 0.5774               | 0.7918   | 0.8470   |
| 0.4861        | 0.14  | 800  | 0.4530          | 0.8365 | 0.5714               | 0.7965   | 0.8365   |
| 0.4923        | 0.16  | 900  | 0.4480          | 0.8496 | 0.5889               | 0.8024   | 0.8496   |
| 0.4369        | 0.18  | 1000 | 0.4480          | 0.8532 | 0.6018               | 0.8091   | 0.8532   |

### Framework versions

- Transformers 4.37.2
- Pytorch 2.1.0+cu121
- Datasets 2.17.1
- Tokenizers 0.15.2
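
## How to use (sketch)

The model name and tags indicate this checkpoint is a LoRA adapter trained on top of the base Nucleotide Transformer v2 model. The snippet below is a minimal sketch of loading such an adapter for sequence classification with 🤗 Transformers and PEFT, not a recorded recipe: the adapter repo id, the number of labels (binary classification is assumed from the reported F1/MCC/accuracy), and the input length (~1024 bp, assumed from the "gut_1024" name) are all assumptions.

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer
from peft import PeftModel

base_id = "InstaDeepAI/nucleotide-transformer-v2-500m-multi-species"
adapter_id = "gut_1024-finetuned-lora-NT-v2-500m-multi-species"  # assumed: local path or Hub repo id of this adapter

tokenizer = AutoTokenizer.from_pretrained(base_id, trust_remote_code=True)
# num_labels=2 is an assumption (binary task inferred from the reported metrics)
base_model = AutoModelForSequenceClassification.from_pretrained(
    base_id, num_labels=2, trust_remote_code=True
)
model = PeftModel.from_pretrained(base_model, adapter_id)
model.eval()

# Placeholder DNA sequence; real inputs are presumably ~1024 bp windows
sequence = "ATTCCGATTCCGATTCCGATTCCG"
inputs = tokenizer(sequence, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
print("predicted class:", logits.argmax(dim=-1).item())
```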
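
## Fine-tuning configuration (sketch)

For reference, the hyperparameters listed under "Training procedure" map onto `TrainingArguments` roughly as shown below. The LoRA settings (rank, alpha, dropout, target modules) and the dataset are not documented in this card, so those values are placeholders only.

```python
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)
from peft import LoraConfig, get_peft_model

base_id = "InstaDeepAI/nucleotide-transformer-v2-500m-multi-species"
tokenizer = AutoTokenizer.from_pretrained(base_id, trust_remote_code=True)
model = AutoModelForSequenceClassification.from_pretrained(
    base_id, num_labels=2, trust_remote_code=True  # num_labels is an assumption
)

# Placeholder LoRA settings; the values actually used are not recorded in this card
peft_config = LoraConfig(
    task_type="SEQ_CLS",
    r=16,
    lora_alpha=32,
    lora_dropout=0.05,
    target_modules=["query", "key", "value"],  # assumed attention projections of the ESM-style encoder
)
model = get_peft_model(model, peft_config)

# These arguments mirror the hyperparameters listed above
args = TrainingArguments(
    output_dir="gut_1024-finetuned-lora-NT-v2-500m-multi-species",
    learning_rate=5e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=64,
    seed=42,
    max_steps=1000,
    lr_scheduler_type="linear",
    evaluation_strategy="steps",
    eval_steps=100,
    logging_steps=100,
)

# The train/eval datasets are not documented in this card, so the Trainer call is left commented out:
# trainer = Trainer(model=model, args=args, train_dataset=train_dataset, eval_dataset=eval_dataset)
# trainer.train()
```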