---
library_name: transformers
license: apache-2.0
base_model: ntu-spml/distilhubert
tags:
- generated_from_trainer
metrics:
- accuracy
- f1
- precision
- recall
model-index:
- name: distilhubert-finetuned-cry-detector
  results: []
---

# distilhubert-finetuned-cry-detector

This model is a fine-tuned version of [ntu-spml/distilhubert](https://huggingface.co/ntu-spml/distilhubert) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2255
- Accuracy: 0.9883
- F1: 0.9883
- Precision: 0.9883
- Recall: 0.9883
- Confusion Matrix: [[960, 10], [6, 389]]

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 123
- gradient_accumulation_steps: 2
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine_with_restarts
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 20
- label_smoothing_factor: 0.1

### Training results

| Training Loss | Epoch   | Step | Validation Loss | Accuracy | F1     | Precision | Recall | Confusion Matrix       |
|:-------------:|:-------:|:----:|:---------------:|:--------:|:------:|:---------:|:------:|:----------------------:|
| 0.3124        | 2.3256  | 100  | 0.2739          | 0.9641   | 0.9640 | 0.9640    | 0.9641 | [[948, 22], [27, 368]] |
| 0.2337        | 4.6512  | 200  | 0.2385          | 0.9736   | 0.9737 | 0.9737    | 0.9736 | [[950, 20], [16, 379]] |
| 0.2064        | 6.9767  | 300  | 0.2295          | 0.9832   | 0.9832 | 0.9832    | 0.9832 | [[958, 12], [11, 384]] |
| 0.2023        | 9.3023  | 400  | 0.2277          | 0.9868   | 0.9869 | 0.9870    | 0.9868 | [[957, 13], [5, 390]]  |
| 0.2003        | 11.6279 | 500  | 0.2254          | 0.9875   | 0.9876 | 0.9876    | 0.9875 | [[960, 10], [7, 388]]  |
| 0.2002        | 13.9535 | 600  | 0.2259          | 0.9875   | 0.9876 | 0.9876    | 0.9875 | [[959, 11], [6, 389]]  |
| 0.1994        | 16.2791 | 700  | 0.2255          | 0.9883   | 0.9883 | 0.9883    | 0.9883 | [[960, 10], [6, 389]]  |
| 0.1997        | 18.6047 | 800  | 0.2254          | 0.9883   | 0.9883 | 0.9883    | 0.9883 | [[960, 10], [6, 389]]  |

### Framework versions

- Transformers 4.44.2
- Pytorch 2.4.1+cu121
- Tokenizers 0.19.1
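
The hyperparameters listed above map roughly onto `transformers.TrainingArguments` as sketched below. This is a hedged reconstruction, not the original training script: the output directory and the step-based evaluation schedule are assumptions (the latter inferred from the 100-step eval log in the results table), and the Adam betas/epsilon shown in the card are the optimizer defaults.

```python
from transformers import TrainingArguments

# Sketch of TrainingArguments matching the hyperparameters listed in this card.
# output_dir and the eval schedule are assumptions, not stated in the card.
training_args = TrainingArguments(
    output_dir="distilhubert-finetuned-cry-detector",  # assumed
    learning_rate=3e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    gradient_accumulation_steps=2,        # effective train batch size: 64 * 2 = 128
    seed=123,
    lr_scheduler_type="cosine_with_restarts",
    warmup_ratio=0.1,
    num_train_epochs=20,
    label_smoothing_factor=0.1,
    eval_strategy="steps",                # assumed from the per-100-step results above
    eval_steps=100,
)
```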
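
The reported accuracy, F1, precision, recall, and confusion matrix could be produced by a metric function along these lines. The `weighted` averaging is an assumption (the card does not state how the per-class scores were aggregated); this is only a sketch of how such values are typically computed with scikit-learn.

```python
import numpy as np
from sklearn.metrics import accuracy_score, confusion_matrix, precision_recall_fscore_support


def compute_metrics(eval_pred):
    """Sketch of a Trainer-style metric function for this binary classifier."""
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="weighted"  # assumed averaging scheme
    )
    return {
        "accuracy": accuracy_score(labels, preds),
        "f1": f1,
        "precision": precision,
        "recall": recall,
        "confusion_matrix": confusion_matrix(labels, preds).tolist(),
    }
```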
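
A minimal inference sketch using the `audio-classification` pipeline is shown below. The model id is a placeholder for wherever this checkpoint is hosted, and `cry.wav` is a hypothetical input file; both should be replaced with real values.

```python
from transformers import pipeline

# Load the fine-tuned checkpoint for audio classification.
# Replace the placeholder model id with the actual Hub repo or a local path.
classifier = pipeline(
    "audio-classification",
    model="your-username/distilhubert-finetuned-cry-detector",
)

# Classify a local audio clip (hypothetical file name); returns labels with scores.
predictions = classifier("cry.wav")
print(predictions)
```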