MichaelFFan/dpo_model
Tags: PEFT, Safetensors
Branch: main

Commit History
- Update README.md (fd389ee, verified), MichaelFFan, committed 16 days ago
- Update README.md (a7056ea, verified), MichaelFFan, committed 16 days ago
- Update README.md (90276f3, verified), MichaelFFan, committed 16 days ago
- Update README.md (f94939a, verified), MichaelFFan, committed 16 days ago
- Upload folder using huggingface_hub (551e5d9, verified), MichaelFFan, committed 16 days ago
- initial commit (9c4c0fb, verified), MichaelFFan, committed 16 days ago