nicholasKluge/Aux-RewardModel

Task: Text Classification
Library: Transformers (Safetensors weights)
Datasets: nicholasKluge/toxic-aira-dataset, Anthropic/hh-rlhf
Language: English
Architecture: roberta
Tags: reward model, alignment, preference model, RLHF, Carbon Emissions, Inference Endpoints
License: apache-2.0
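
The tags above identify the repository as a RoBERTa-based text-classification (reward) model served through the Transformers library. Below is a minimal sketch of scoring a prompt/response pair with it; the single-logit reward convention and the paired-input format are assumptions, since the model card itself is not reproduced on this page.

```python
# Minimal sketch: load the reward model and score one prompt/response pair.
# Assumption: the repo exposes a sequence-classification head whose logit is
# read as a reward, as is common for RLHF preference models.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "nicholasKluge/Aux-RewardModel"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

# Hypothetical example pair; any prompt/response strings work here.
prompt = "Can you help me write a polite email to my landlord?"
response = "Of course. Here is a short draft you can adapt."

inputs = tokenizer(prompt, response, return_tensors="pt", truncation=True)
with torch.no_grad():
    reward = model(**inputs).logits[0]
print(reward)
```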
File: Aux-RewardModel/scheduler.pt (branch: main)
Commit History

Upload folder using huggingface_hub
70e65be (verified) · nicholasKluge · committed on May 27