---
license: apache-2.0
base_model: google/flan-t5-base
tags:
- simplification
- generated_from_trainer
metrics:
- bleu
model-index:
- name: flant5
  results: []
---

# flant5

This model is a fine-tuned version of [google/flan-t5-base](https://huggingface.co/google/flan-t5-base) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.1632
- Bleu: 32.6829
- Gen Len: 17.6119

## Model description

FLAN-T5 Base is a variant of the T5 (Text-To-Text Transfer Transformer) language model developed by Google. T5 is a transformer-based language-model architecture that has shown strong performance across a wide range of natural language processing (NLP) tasks.

## Intended uses & limitations

In this case the model acts as an automatic translator that simplifies French, i.e. it rewrites French text into a simpler form. A minimal usage sketch is included at the end of this card.

## Training and evaluation data

266 rows of the dataset table are used for training and 67 for testing.

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a sketch of how they map onto `Seq2SeqTrainingArguments` appears after the framework versions below):
- learning_rate: 5.6e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2

### Training results

| Training Loss | Epoch | Step | Validation Loss | Bleu    | Gen Len |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:-------:|
| No log        | 1.0   | 67   | 1.1644          | 32.6829 | 17.6119 |
| No log        | 2.0   | 134  | 1.1632          | 32.6829 | 17.6119 |

### Framework versions

- Transformers 4.37.2
- Pytorch 2.1.0+cu121
- Datasets 2.17.1
- Tokenizers 0.15.2
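
For reference, the sketch below shows how the hyperparameters listed above would map onto `Seq2SeqTrainingArguments` from the Transformers library. The output directory and evaluation strategy are assumptions (the card does not state them); the other values mirror the card, and the reported Adam betas/epsilon match the Trainer defaults.

```python
from transformers import Seq2SeqTrainingArguments

# Sketch only: output_dir and evaluation_strategy are assumptions;
# the remaining values reproduce the hyperparameters reported in this card.
training_args = Seq2SeqTrainingArguments(
    output_dir="flant5",               # assumed output directory (matches the model name)
    learning_rate=5.6e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=2,
    predict_with_generate=True,        # needed so BLEU / Gen Len can be computed at eval time
    evaluation_strategy="epoch",       # assumption: the results table has one row per epoch
    # Adam with betas=(0.9, 0.999) and epsilon=1e-08 is the Trainer default, so no extra arguments are needed.
)
```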
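
## How to use

A minimal inference sketch with the Transformers library. The checkpoint path, example sentence, and generation settings are placeholders; if a task prefix was used during fine-tuning, it would need to be prepended to the input (the card does not say).

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_path = "path/to/flant5"  # placeholder: local output directory or Hub repo id of this checkpoint

tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForSeq2SeqLM.from_pretrained(model_path)

# Example French sentence to simplify (illustrative, not taken from the training data).
text = "Le gouvernement a annoncé une réforme substantielle du système de retraites."

inputs = tokenizer(text, return_tensors="pt", truncation=True)
# Generated outputs on the evaluation set average about 18 tokens (Gen Len), so a small budget suffices.
outputs = model.generate(**inputs, max_new_tokens=32, num_beams=4)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```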