---
license: gemma
library_name: transformers
base_model:
- google/gemma-2-27b-it
datasets:
- jondurbin/gutenberg-dpo-v0.1
---
# gemma2-gutenberg-9B
[google/gemma-2-27b-it](https://huggingface.co/google/gemma-2-27b-it) fine-tuned on [jondurbin/gutenberg-dpo-v0.1](https://huggingface.co/datasets/jondurbin/gutenberg-dpo-v0.1).
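A minimal inference sketch with `transformers`. The repo id below is a placeholder (this card doesn't state the final Hub id), and the sampling settings are just illustrative.

```python
# Minimal inference sketch. The repo id is a placeholder assumption;
# substitute the actual Hub id of this model.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-username/gemma2-gutenberg-9B"  # hypothetical repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

messages = [
    {"role": "user", "content": "Write the opening paragraph of a gothic short story."}
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256, do_sample=True, temperature=0.8)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```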
## Method
ORPO fine-tuned for 3 epochs on a single 80GB A100 via RunPod (plz sponsor me).
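For reference, a rough sketch of what such an ORPO run looks like with TRL's `ORPOTrainer`. Only the base model, dataset, and epoch count come from this card; the remaining hyperparameters are assumptions, and any memory-saving tricks the actual run used (LoRA, gradient checkpointing, offloading) are not reproduced here.

```python
# ORPO training sketch using TRL. Hyperparameters marked "assumption" are
# illustrative; only the base model, dataset, and 3 epochs come from the card.
import torch
from datasets import load_dataset
from transformers import AutoModelForCausalLM, AutoTokenizer
from trl import ORPOConfig, ORPOTrainer

base = "google/gemma-2-27b-it"
tokenizer = AutoTokenizer.from_pretrained(base)
model = AutoModelForCausalLM.from_pretrained(base, torch_dtype=torch.bfloat16)

# gutenberg-dpo-v0.1 ships prompt/chosen/rejected columns, which is the
# preference format ORPOTrainer consumes directly.
dataset = load_dataset("jondurbin/gutenberg-dpo-v0.1", split="train")

config = ORPOConfig(
    output_dir="gemma2-gutenberg",
    num_train_epochs=3,                # from the card
    per_device_train_batch_size=1,     # assumption
    gradient_accumulation_steps=8,     # assumption
    learning_rate=5e-6,                # assumption
    beta=0.1,                          # ORPO lambda; TRL default
    max_length=2048,                   # assumption
    max_prompt_length=1024,            # assumption
    gradient_checkpointing=True,
    bf16=True,
    logging_steps=10,
)

trainer = ORPOTrainer(
    model=model,
    args=config,
    train_dataset=dataset,
    tokenizer=tokenizer,  # newer TRL versions name this argument processing_class
)
trainer.train()
```

Note that a full fine-tune of a 27B-parameter model will not fit on a single 80GB card without parameter-efficient tuning or offloading, so treat this strictly as a shape of the workflow rather than an exact recipe.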