How can this model (CodeLlama 7b) be used for code generation?
#16 · by Rajath-jain · opened
This is something I am wondering too. I need a lot more tokens than CodeLlama is giving me. How do I get a fuller, more complete answer from CodeLlama?
Consider adding this parameter: `model.generate(input_ids, max_length=720)`
By default, the maximum length is quite short, if I recall correctly, so responses get cut off early. Increasing that parameter should solve the problem.
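A minimal sketch of what that looks like end to end with the Transformers library, assuming the `codellama/CodeLlama-7b-hf` checkpoint on the Hugging Face Hub. Note that `max_length` counts the prompt tokens plus the generated tokens, so for long prompts it can still truncate the completion; `max_new_tokens` bounds only the continuation, which is usually what you want here:

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

MODEL_ID = "codellama/CodeLlama-7b-hf"  # assumed checkpoint name


def generate_code(prompt: str, max_new_tokens: int = 512) -> str:
    """Generate a completion for `prompt`, bounding only the new tokens."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID)

    input_ids = tokenizer(prompt, return_tensors="pt").input_ids
    # max_new_tokens limits the generated continuation only, whereas
    # max_length would also count the prompt tokens toward the budget.
    output = model.generate(input_ids, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output[0], skip_special_tokens=True)
```

Loading a 7B model needs substantial RAM/VRAM; in practice you would likely pass `device_map="auto"` and a reduced dtype to `from_pretrained`, but the `max_new_tokens` argument is the part that addresses the truncated answers.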