maxTokens #54
opened by islamn25
How do I set the maximum number of tokens that I get back when I prompt the model?
You can set max_new_tokens; see the documentation here: https://huggingface.co/docs/transformers/main_classes/text_generation#transformers.GenerationConfig
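If you are prompting the hosted model through the Inference API rather than running it locally with transformers, the same option is passed as max_new_tokens inside the parameters object of the request body. A minimal Node.js sketch, assuming Node 18+ for the built-in fetch, an ES module or async context for the top-level await, and a token in HUGGINGFACEHUB_API_KEY (the prompt and the 300-token limit are placeholders):

const response = await fetch(
  "https://api-inference.huggingface.co/models/google/flan-t5-xxl",
  {
    method: "POST",
    headers: {
      Authorization: `Bearer ${process.env.HUGGINGFACEHUB_API_KEY}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      inputs: "Translate to German: How old are you?",
      // max_new_tokens caps how many tokens the model generates in the reply
      parameters: { max_new_tokens: 300 },
    }),
  }
);
console.log(await response.json());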
ArthurZ changed discussion status to closed
What about in Node.js?
I have this setup in Node.js:
const model = new HuggingFaceInference({
  model: "google/flan-t5-xxl",
  apiKey: process.env.HUGGINGFACEHUB_API_KEY,
  max_new_tokens: 300,
});
and max_new_tokens has no effect on the length of the output.
islamn25 changed discussion status to open
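If that HuggingFaceInference class is LangChain's Hugging Face wrapper, the snake_case max_new_tokens field is most likely being ignored as an unknown option: the wrapper exposes its generation settings in camelCase (maxTokens, temperature, topP, ...) and translates them to max_new_tokens when it calls the API. The exact option name and import path below are assumptions, so check them against the LangChain version you have installed. A minimal sketch:

import { HuggingFaceInference } from "langchain/llms/hf";

// Assumption: the LangChain wrapper takes camelCase options and maps
// maxTokens onto max_new_tokens in the underlying Inference API call.
const model = new HuggingFaceInference({
  model: "google/flan-t5-xxl",
  apiKey: process.env.HUGGINGFACEHUB_API_KEY,
  maxTokens: 300,
});

const res = await model.call("Translate to German: How old are you?");
console.log(res);

If maxTokens does not change the output length either, the direct fetch call sketched earlier is a quick way to confirm that max_new_tokens is honoured at the API level, which narrows the problem down to the wrapper.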