Maximum context length
#3
by IRITakeda - opened
What is the upper limit of the maximum context length for japanese-gpt-neox-3.6b-instruction-ppo?
In the case of ChatGPT, the total number of tokens in the prompt and the response must stay within the limit. Is the same true for Rinna-3.6B?
Yes, it is the same as for the pretrained 3.6B model, and the maximum length can be found in the config file:
https://huggingface.co/rinna/japanese-gpt-neox-3.6b-instruction-ppo/blob/2140541486bfb31269acd035edd51208da40185b/config.json#L12
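For reference, the same value can be read programmatically via `transformers.AutoConfig` (a minimal sketch; downloading the config requires network access to the Hugging Face Hub):

```python
from transformers import AutoConfig

# Fetch only the model config (no weights) and read the context limit.
config = AutoConfig.from_pretrained("rinna/japanese-gpt-neox-3.6b-instruction-ppo")
max_len = config.max_position_embeddings
print(max_len)
```
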
Thank you.
The number of tokens differs between ChatGPT and Rinna-3.6B, even for the same input.
Is there a way to calculate the number of tokens for a Rinna-3.6B input?
Yes, you can check the length of token_ids as in the README example.
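Concretely, tokenizing the prompt and taking `len(token_ids)` gives the token count. A minimal sketch following the tokenizer setup from the model README (the prompt text is just an illustration; downloading the tokenizer requires network access):

```python
from transformers import AutoTokenizer

# The rinna tokenizer is sentencepiece-based, so load the slow tokenizer,
# as shown in the model README.
tokenizer = AutoTokenizer.from_pretrained(
    "rinna/japanese-gpt-neox-3.6b-instruction-ppo", use_fast=False
)

prompt = "ユーザー: 日本のおすすめの観光地を教えてください。<NL>システム: "
token_ids = tokenizer.encode(prompt, add_special_tokens=False)
print(len(token_ids))  # number of tokens Rinna-3.6B sees for this prompt
```
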
tianyuz changed discussion status to closed