zero-shot-implicit-gpt2 / special_tokens_map.json
{"bos_token": "<|endoftext|>", "eos_token": "<|endoftext|>", "unk_token": "<|endoftext|>", "pad_token": "[PAD]", "additional_special_tokens": ["<|question|>", "<|text|>", "<|answer|>", "<|answer_sep|>", "[QUES]", "[TEXT]", "[ANSW]", "[eot]", "[ASPECT_SEP]"]}