---
license: apache-2.0
library_name: transformers
pipeline_tag: text-generation
tags:
  - DPO
---

This model is released with the preprint *SimPO: Simple Preference Optimization with a Reference-Free Reward*. Please refer to our repository for more details.