MichaelFFan/dpo_model
Tags: PEFT · Safetensors
Commit History: dpo_model/README.md
Update README.md · 90276f3 (verified) · MichaelFFan committed 16 days ago
Update README.md · f94939a (verified) · MichaelFFan committed 16 days ago
Upload folder using huggingface_hub · 551e5d9 (verified) · MichaelFFan committed 16 days ago