tulu-v2.5-13b-stackexchange-60k-rm / special_tokens_map.json
hamishivi: Upload folder using huggingface_hub (commit 2858b05, verified)
{"bos_token": {"content": "<s>", "lstrip": false, "normalized": true, "rstrip": false, "single_word": false}, "eos_token": {"content": "</s>", "lstrip": false, "normalized": true, "rstrip": false, "single_word": false}, "unk_token": {"content": "<unk>", "lstrip": false, "normalized": true, "rstrip": false, "single_word": false}}