LongReward-llama3.1-8b-DPO / configuration.json
Upload folder using huggingface_hub
a74f280 verified
{"framework":"Pytorch","task":"text-generation"}
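The configuration above can be parsed with the standard library's `json` module; a minimal sketch (the JSON string is inlined here rather than read from the repo file):

```python
import json

# configuration.json content as shown above, inlined for illustration.
raw = '{"framework":"Pytorch","task":"text-generation"}'

config = json.loads(raw)
print(config["framework"])  # Pytorch
print(config["task"])       # text-generation
```

This kind of file is typically used by hosting platforms to tag a repository with its framework and task type rather than being consumed by the model code itself.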