trained_luke_large_model / added_tokens.json
Upload tokenizer
6cd700e verified
{
"<ent2>": 32771,
"<ent>": 32770
}
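In the Hugging Face `transformers` ecosystem, `added_tokens.json` records tokens appended after the base vocabulary, mapping each token string to its vocabulary ID; LUKE models use `<ent>` and `<ent2>` as entity span markers. As a minimal offline sketch (the file content is inlined here rather than downloaded from the Hub), this shows how such a mapping can be read and inverted the way a tokenizer would when encoding and decoding:

```python
import json

# Content of added_tokens.json: added token string -> vocabulary ID.
added_tokens = json.loads('{"<ent2>": 32771, "<ent>": 32770}')

# Forward lookup, as used when encoding text containing the markers.
ent_id = added_tokens["<ent>"]    # 32770

# Inverse map, as used when decoding IDs back to token strings.
id_to_token = {v: k for k, v in added_tokens.items()}
print(id_to_token[32771])  # -> <ent2>
```

In practice these IDs would be loaded automatically by `AutoTokenizer.from_pretrained(...)` for the repo (an assumption requiring network access), so manual parsing is only needed when inspecting the file directly.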