---
base_model: google/paligemma-3b-pt-224
library_name: peft
license: gemma
tags:
- generated_from_trainer
model-index:
- name: paligemma-cnmc-ft
  results: []
---

# paligemma-cnmc-ft

This model is a fine-tuned version of [google/paligemma-3b-pt-224](https://huggingface.co/google/paligemma-3b-pt-224) on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 0.1868
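
Because this repository stores only a PEFT (LoRA) adapter, inference requires loading the base model first and then attaching the adapter. Below is a minimal sketch, assuming this repo's id `dwb2023/paligemma-cnmc-ft`, a local `example.png`, and a hypothetical prompt; adapt these to your setup:

```python
import torch
from PIL import Image
from transformers import AutoProcessor, PaliGemmaForConditionalGeneration
from peft import PeftModel

base_id = "google/paligemma-3b-pt-224"
adapter_id = "dwb2023/paligemma-cnmc-ft"  # this repo

processor = AutoProcessor.from_pretrained(base_id)
model = PaliGemmaForConditionalGeneration.from_pretrained(
    base_id, torch_dtype=torch.bfloat16, device_map="auto"
)
model = PeftModel.from_pretrained(model, adapter_id)  # attach the fine-tuned adapter
model.eval()

image = Image.open("example.png").convert("RGB")   # placeholder input image
prompt = "answer en what cell type is shown?"      # hypothetical prompt; not from this repo

inputs = processor(text=prompt, images=image, return_tensors="pt").to(model.device)
with torch.no_grad():
    output = model.generate(**inputs, max_new_tokens=20)
print(processor.decode(output[0], skip_special_tokens=True))
```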

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch reproducing them follows the list):

- learning_rate: 2e-05
- train_batch_size: 2
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 170
- num_epochs: 100
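
For reference, the listed values map onto `transformers.TrainingArguments` roughly as sketched below; the `output_dir` is a placeholder, and the Adam betas/epsilon match the `TrainingArguments` defaults, so they are not set explicitly:

```python
from transformers import TrainingArguments

# Illustrative reconstruction of the hyperparameters above, not the repo's actual script.
args = TrainingArguments(
    output_dir="paligemma-cnmc-ft",   # placeholder
    learning_rate=2e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=8,    # 2 x 8 = total train batch size 16
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=170,
    num_train_epochs=100,
)
```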

### Training results

| Training Loss | Epoch   | Step | Validation Loss |
|:-------------:|:-------:|:----:|:---------------:|
| No log        | 0.9645  | 17   | 1.4167          |
| No log        | 1.9858  | 35   | 1.1455          |
| 1.2186        | 2.9504  | 52   | 0.6528          |
| 1.2186        | 3.9716  | 70   | 0.3555          |
| 1.2186        | 4.9929  | 88   | 0.2881          |
| 0.3872        | 5.9574  | 105  | 0.2618          |
| 0.3872        | 6.9787  | 123  | 0.2299          |
| 0.3872        | 8.0     | 141  | 0.1961          |
| 0.2563        | 8.9645  | 158  | 0.1834          |
| 0.2563        | 9.9858  | 176  | 0.1523          |
| 0.2563        | 10.9504 | 193  | 0.1612          |
| 0.2196        | 11.9716 | 211  | 0.1505          |
| 0.2196        | 12.9929 | 229  | 0.1868          |

### Framework versions

- PEFT 0.11.1
- Transformers 4.43.0.dev0
- Pytorch 2.3.0+cu121
- Datasets 2.20.0
- Tokenizers 0.19.1
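
When reproducing results, it can help to confirm your environment matches the versions above. A minimal runtime check (the expected versions are taken from the list; nothing else is assumed):

```python
import datasets, peft, tokenizers, torch, transformers

# Versions this adapter was trained with (see the list above).
expected = {
    "peft": "0.11.1",
    "transformers": "4.43.0.dev0",
    "torch": "2.3.0+cu121",
    "datasets": "2.20.0",
    "tokenizers": "0.19.1",
}
installed = {
    "peft": peft.__version__,
    "transformers": transformers.__version__,
    "torch": torch.__version__,
    "datasets": datasets.__version__,
    "tokenizers": tokenizers.__version__,
}
for name in expected:
    print(f"{name}: installed {installed[name]}, trained with {expected[name]}")
```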