KardelRuveyda committed on
Commit a24145b
1 Parent(s): a78f23e

Upload tokenizer

merges.txt CHANGED
The diff for this file is too large to render. See raw diff
 
special_tokens_map.json CHANGED
@@ -13,7 +13,6 @@
     "rstrip": false,
     "single_word": false
   },
-  "pad_token": "<|endoftext|>",
   "unk_token": {
     "content": "<|endoftext|>",
     "lstrip": false,
tokenizer.json CHANGED
The diff for this file is too large to render. See raw diff
 
tokenizer_config.json CHANGED
@@ -16,7 +16,7 @@
   "eos_token": "<|endoftext|>",
   "errors": "replace",
   "model_max_length": 1024,
-  "pad_token": "<|endoftext|>",
+  "pad_token": null,
   "tokenizer_class": "GPT2Tokenizer",
   "unk_token": "<|endoftext|>"
 }
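
Net effect of this commit: the explicit pad_token is removed from special_tokens_map.json and set to null in tokenizer_config.json, which matches stock GPT-2 tokenizer behavior (no padding token defined). A minimal sketch of how downstream code might cope with that when batching with padding; the repository id below is a placeholder, not the actual Hub repo:

from transformers import AutoTokenizer

# Placeholder repo id -- substitute the repository this commit belongs to.
tokenizer = AutoTokenizer.from_pretrained("KardelRuveyda/your-model")

# With pad_token null/absent after this commit, padded batching would fail
# unless a pad token is assigned; reusing eos_token is the usual workaround
# for GPT-2-style tokenizers.
if tokenizer.pad_token is None:
    tokenizer.pad_token = tokenizer.eos_token

batch = tokenizer(
    ["a short example", "a somewhat longer example sentence"],
    padding=True,
    truncation=True,
    return_tensors="pt",
)
print(batch["input_ids"].shape)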
vocab.json CHANGED
The diff for this file is too large to render. See raw diff