wav2vec2-large-xls-r-300m-english-colab / special_tokens_map.json
commit: add tokenizer (be94a3b)
{
  "bos_token": "<s>",
  "eos_token": "</s>",
  "unk_token": "[UNK]",
  "pad_token": "[PAD]",
  "additional_special_tokens": [
    {"content": "<s>", "single_word": false, "lstrip": false, "rstrip": false, "normalized": true},
    {"content": "</s>", "single_word": false, "lstrip": false, "rstrip": false, "normalized": true}
  ]
}
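As a minimal sketch of how this file can be consumed outside of the `transformers` library, the snippet below parses the JSON above with only the standard library and pulls out the special tokens. (In Wav2Vec2's CTC setup the pad token typically doubles as the CTC blank, but that behavior lives in the tokenizer code, not in this file.)

```python
import json

# The exact contents of special_tokens_map.json shown above.
raw = (
    '{"bos_token": "<s>", "eos_token": "</s>", "unk_token": "[UNK]", '
    '"pad_token": "[PAD]", "additional_special_tokens": ['
    '{"content": "<s>", "single_word": false, "lstrip": false, '
    '"rstrip": false, "normalized": true}, '
    '{"content": "</s>", "single_word": false, "lstrip": false, '
    '"rstrip": false, "normalized": true}]}'
)

tokens = json.loads(raw)

# Simple string-valued entries map a role name to a token.
print(tokens["pad_token"])  # -> [PAD]
print(tokens["unk_token"])  # -> [UNK]

# additional_special_tokens entries are full token objects; the
# per-token flags (lstrip, rstrip, normalized, single_word) control
# how the tokenizer treats whitespace and normalization around them.
print([t["content"] for t in tokens["additional_special_tokens"]])  # -> ['<s>', '</s>']
```

Note that `<s>`/`</s>` appear both as `bos_token`/`eos_token` and again under `additional_special_tokens` with explicit flags; the latter form carries the extra handling options.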