Thanks for all the nice models. Unfortunately all CodeLlamas throw "KeyError: 'pad_token_id'"
#2 opened by DQ83
When I try loading with ExLlama, the KeyError occurs. Any clue what's happening? I am on the latest oobabooga.
@TheBloke
I made a mistake: you have to put "pad_token_id": 0 in config.json instead
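For reference, the relevant entry in config.json looks like this (all other keys omitted for brevity):

```json
{
  "pad_token_id": 0
}
```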
@TheYuriLover Yeah, I just learned that. I think it should be in both, actually.
I have just now added it to config.json in all repos
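If you already downloaded a repo before the update, you can patch your local copy instead of re-downloading. A minimal sketch (the model path is illustrative; adjust it to your own models directory):

```python
import json
from pathlib import Path

def add_pad_token(config_path: str, pad_token_id: int = 0) -> dict:
    """Add pad_token_id to a model's config.json if it is missing.

    ExLlama reads pad_token_id from config.json; older CodeLlama uploads
    lacked the key, which caused KeyError: 'pad_token_id'.
    """
    path = Path(config_path)
    config = json.loads(path.read_text())
    # setdefault leaves an existing value untouched
    config.setdefault("pad_token_id", pad_token_id)
    path.write_text(json.dumps(config, indent=2))
    return config

# Usage (hypothetical local path):
# add_pad_token("models/CodeLlama-7B-GPTQ/config.json")
```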
TheBloke changed discussion status to closed