---
library_name: transformers
license: apache-2.0
base_model: google-t5/t5-base
tags:
- generated_from_trainer
metrics:
- bleu
model-index:
- name: t5-base-spanish-yoremnokki
  results: []
---

# t5-base-spanish-yoremnokki

This model is a fine-tuned version of [google-t5/t5-base](https://huggingface.co/google-t5/t5-base) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.7824
- Bleu: 12.9294
- Gen Len: 14.0092

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 6
- mixed_precision_training: Native AMP

### Training results

| Training Loss | Epoch  | Step | Validation Loss | Bleu    | Gen Len |
|:-------------:|:------:|:----:|:---------------:|:-------:|:-------:|
| 3.4916        | 0.9994 | 846  | 2.3536          | 0.21    | 14.4369 |
| 2.4194        | 2.0    | 1693 | 2.0655          | 2.0366  | 13.9808 |
| 2.1821        | 2.9994 | 2539 | 1.9102          | 7.286   | 14.0406 |
| 2.1132        | 4.0    | 3386 | 1.8290          | 12.0392 | 14.003  |
| 2.0125        | 4.9994 | 4232 | 1.7935          | 12.8393 | 14.01   |
| 1.9896        | 5.9965 | 5076 | 1.7824          | 12.9294 | 14.0092 |

### Framework versions

- Transformers 4.44.2
- Pytorch 2.5.0+cu121
- Datasets 3.1.0
- Tokenizers 0.19.1
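
### Reproducing the training configuration (sketch)

The hyperparameters listed above map directly onto `Seq2SeqTrainingArguments`. The snippet below is a minimal sketch of that mapping, not the exact script used for this run; `output_dir` and `predict_with_generate` are assumptions added for completeness.

```python
from transformers import Seq2SeqTrainingArguments

# Sketch of training arguments matching the hyperparameters listed above.
# output_dir is a placeholder; all other values come from the card.
training_args = Seq2SeqTrainingArguments(
    output_dir="t5-base-spanish-yoremnokki",
    learning_rate=1e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=2,   # total train batch size of 32
    num_train_epochs=6,
    lr_scheduler_type="linear",
    seed=42,
    fp16=True,                       # "Native AMP" mixed-precision training
    predict_with_generate=True,      # assumed, since BLEU and Gen Len are reported
)
```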
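
## Usage (illustrative)

Since the card does not yet document usage, the snippet below is a minimal sketch of how a fine-tuned T5 checkpoint like this one is typically loaded for Spanish-to-Yoremnokki translation with `transformers`. The repository id and the task prefix are assumptions; adjust them to match the actual checkpoint.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Hypothetical repository id; replace with the actual checkpoint path.
model_id = "your-username/t5-base-spanish-yoremnokki"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# T5 checkpoints are usually prompted with a task prefix; the exact prefix
# used during fine-tuning is an assumption here.
text = "translate Spanish to Yoremnokki: Buenos días."
inputs = tokenizer(text, return_tensors="pt")

# Average generation length on the eval set was ~14 tokens,
# so a modest max_new_tokens should be sufficient.
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```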