Tags: Text Classification · Transformers · Safetensors · llama · text-generation-inference · Inference Endpoints
FsfairX-LLaMA3-RM-v0.1 / special_tokens_map.json
Upload tokenizer (commit e20fb3b, verified)
{
  "bos_token": {
    "content": "<|begin_of_text|>",
    "lstrip": false,
    "normalized": false,
    "rstrip": false,
    "single_word": false
  },
  "eos_token": {
    "content": "<|end_of_text|>",
    "lstrip": false,
    "normalized": false,
    "rstrip": false,
    "single_word": false
  },
  "pad_token": {
    "content": "[PAD]",
    "lstrip": false,
    "normalized": false,
    "rstrip": false,
    "single_word": false
  }
}
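
For reference, these special tokens can be inspected after loading the tokenizer with the transformers library. The sketch below is a minimal example; the repo id is an assumption inferred from the file path above, so substitute the actual "<org>/FsfairX-LLaMA3-RM-v0.1" path.

from transformers import AutoTokenizer

# Repo id is an assumption based on the file path above; replace "your-org"
# with the organization that actually hosts this model.
tokenizer = AutoTokenizer.from_pretrained("your-org/FsfairX-LLaMA3-RM-v0.1")

# These attributes are populated from special_tokens_map.json:
print(tokenizer.bos_token)  # <|begin_of_text|>
print(tokenizer.eos_token)  # <|end_of_text|>
print(tokenizer.pad_token)  # [PAD]

# Because a pad token is defined, batched encoding with padding works:
batch = tokenizer(["short", "a somewhat longer input"], padding=True)
print(batch["input_ids"])

Defining "[PAD]" matters here because the base LLaMA-3 tokenizer ships without a pad token, and reward-model scoring typically batches variable-length sequences that must be padded.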