Tokenizer
#1 opened by jblazick
"Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained." Then it outputs garbage. Any ideas?
That’s weird. Could you share the full example and the traceback?
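Hard to say more without your code, but for reference, here is a minimal sketch of the usual pattern that triggers that warning (the model name and the special token are placeholders, not from your setup):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder model; substitute whatever checkpoint you are using.
tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

# Adding special tokens is what produces the warning you quoted.
num_added = tokenizer.add_special_tokens(
    {"additional_special_tokens": ["<my_token>"]}  # hypothetical token
)

# The embedding matrix must grow to match the new vocab size;
# the new rows are randomly initialized, not trained.
model.resize_token_embeddings(len(tokenizer))
```

If you generate right after this, without fine-tuning, any input containing the new tokens hits those untrained embeddings, which could explain the garbage output.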