Fix tokenizer_config.json
#9
by Xenova (HF staff) - opened
Without this fix, if you do:

    from transformers import pipeline
    pipeline('text-generation', 'JackFram/llama-160m')

you get an error:

    RecursionError: maximum recursion depth exceeded while calling a Python object
See this issue for more info.
JackFram changed pull request status to merged
Thx for the fix!
Gotcha, just got it fixed!