Wenzhong2.0-GPT2-3.5B-chinese / tokenizer_config.json
{"unk_token": "<|endoftext|>", "bos_token": "<|endoftext|>", "eos_token": "<|endoftext|>", "add_prefix_space": false, "model_max_length": 1024, "special_tokens_map_file": null, "name_or_path": "gpt2", "tokenizer_class": "GPT2Tokenizer"}