
coedit-small

This model is a fine-tuned version of google/flan-t5-small (77M parameters) on the CoEdIT dataset. It achieves the following results on the evaluation set:

  • Loss: 0.8242
  • Rouge1: 58.7504
  • Rouge2: 45.1374
  • RougeL: 55.4161
  • RougeLsum: 55.4599
  • Gen Len: 16.5245

Model description

coedit-small is a sequence-to-sequence text-editing model based on google/flan-t5-small. Following the CoEdIT task format, it takes a natural-language editing instruction prepended to a source text (for example, grammar correction, simplification, paraphrasing, or coherence edits) and generates the revised text.

Intended uses & limitations

The model is intended for instruction-guided editing of short English texts. As a small model, it is likely to be less fluent and less faithful to instructions than larger CoEdIT variants, and it should not be expected to handle edit types outside those covered by the training data.
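The card does not yet include a usage example. A minimal sketch, assuming the model follows the standard CoEdIT prompt format (a natural-language edit instruction prefixed to the input text; the example sentence below is illustrative, not from the dataset):

```python
from transformers import AutoTokenizer, T5ForConditionalGeneration

model_id = "jbochi/coedit-small"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = T5ForConditionalGeneration.from_pretrained(model_id)

# CoEdIT-style prompt: an edit instruction followed by the text to revise.
text = "Fix grammatical errors in this sentence: When I grow up, I start to understand what he said is quite right."
input_ids = tokenizer(text, return_tensors="pt").input_ids
outputs = model.generate(input_ids, max_length=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```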

Training and evaluation data

The model was fine-tuned on the CoEdIT dataset, a collection of instruction-based text-editing examples that pair an edit instruction and a source text with an edited reference. The metrics above were computed on the dataset's evaluation split.

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 1e-05
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 5
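With lr_scheduler_type set to linear and no warmup steps listed, the learning rate presumably decays linearly from 1e-05 at step 0 to 0 at the final step (5 epochs × 4,317 steps per epoch = 21,585 steps, per the results table below). A pure-Python sketch of that schedule:

```python
BASE_LR = 1e-05
TOTAL_STEPS = 21585  # 5 epochs x 4317 steps per epoch

def linear_lr(step: int, base_lr: float = BASE_LR, total_steps: int = TOTAL_STEPS) -> float:
    """Linearly decay the learning rate from base_lr at step 0 to 0 at the final step."""
    return base_lr * max(0.0, (total_steps - step) / total_steps)

print(linear_lr(0))            # 1e-05 at the start of training
print(linear_lr(TOTAL_STEPS))  # 0.0 at the end
```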

Training results

| Training Loss | Epoch | Step  | Validation Loss | Rouge1  | Rouge2  | RougeL  | RougeLsum | Gen Len |
|---------------|-------|-------|-----------------|---------|---------|---------|-----------|---------|
| 0.9482        | 1.0   | 4317  | 0.8878          | 58.4501 | 44.2623 | 54.4468 | 54.51     | 16.5088 |
| 0.9155        | 2.0   | 8634  | 0.8485          | 58.6609 | 44.7759 | 54.9844 | 55.0503   | 16.5339 |
| 0.8964        | 3.0   | 12951 | 0.8402          | 58.712  | 44.9838 | 55.2171 | 55.2697   | 16.5251 |
| 0.9049        | 4.0   | 17268 | 0.8305          | 58.7767 | 45.1325 | 55.3955 | 55.4522   | 16.5181 |
| 0.8948        | 5.0   | 21585 | 0.8242          | 58.7504 | 45.1374 | 55.4161 | 55.4599   | 16.5245 |
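The step counts above are consistent with the stated batch size: 4,317 steps per epoch at a batch size of 16 implies roughly 69,000 training examples (the final batch of an epoch may be partial). A quick sanity check:

```python
steps_per_epoch = 4317   # from the training-results table
train_batch_size = 16    # from the hyperparameters above

# Bounds on the training-set size implied by the step count;
# the final batch of an epoch may contain fewer than 16 examples.
max_examples = steps_per_epoch * train_batch_size
min_examples = (steps_per_epoch - 1) * train_batch_size + 1

print(min_examples, max_examples)  # 69057 69072
```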

Framework versions

  • Transformers 4.35.2
  • Pytorch 2.1.0+cu118
  • Datasets 2.14.7
  • Tokenizers 0.15.0
