Llama3-8B-CLoud-RM / config.json
Commit 4409a6a (verified): Upload folder using huggingface_hub
{
  "architectures": [
    "FeedbackRewardModel"
  ],
  "base_model_name_or_path": "meta-llama/Meta-Llama-3-8B",
  "feedback_method": "teacher",
  "torch_dtype": "bfloat16",
  "transformers_version": "4.40.2"
}
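
For reference, below is a minimal sketch of how this config might be consumed in Python: it reads the JSON fields shown above and loads the base checkpoint named in "base_model_name_or_path" with the declared dtype. Note that "FeedbackRewardModel" is a custom architecture (presumably provided by the CLoud reward-model codebase), not a built-in transformers class, so the loading call here covers only the base Llama-3 model and is illustrative rather than the actual reward-model loader.

```python
import json

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Read the raw config to inspect the fields shown above.
with open("config.json") as f:
    config = json.load(f)

print(config["architectures"])            # ["FeedbackRewardModel"] (custom class, not in transformers)
print(config["base_model_name_or_path"])  # "meta-llama/Meta-Llama-3-8B"
print(config["feedback_method"])          # "teacher"

# Load the base model referenced by the config in the declared dtype.
# This is only the underlying Llama-3 checkpoint; the reward-model head
# defined by FeedbackRewardModel would come from the CLoud code (assumption).
tokenizer = AutoTokenizer.from_pretrained(config["base_model_name_or_path"])
base_model = AutoModelForCausalLM.from_pretrained(
    config["base_model_name_or_path"],
    torch_dtype=getattr(torch, config["torch_dtype"]),  # torch.bfloat16
)
```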