Spaces: Runtime error
Instruct model output #2
opened by Omnibus
I have found that the instruct (...-it) versions of the models, running via the InferenceClient in this space, will:
• hallucinate user input
• repeat itself
• freeze the space
• output gibberish if the Repetition Penalty is >= 1
The default settings in the space seem to work best across all models.
Feedback is welcome on how best to encourage valid output using the InferenceClient; note that this is not a documented way of deploying these models.
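For reference, querying an instruct model through the serverless Inference API looks roughly like the sketch below. The model id, sampling values, and role markers are illustrative assumptions, not the space's actual configuration; a repetition penalty only slightly above 1.0 plus explicit stop sequences is one common way to curb the looping and hallucinated-turn behavior described above.

```python
import os

from huggingface_hub import InferenceClient

# Hypothetical model id and sampling settings (assumptions for
# illustration, not the space's actual defaults).
MODEL_ID = "google/gemma-2b-it"
GEN_KWARGS = {
    "max_new_tokens": 256,
    "temperature": 0.7,
    "top_p": 0.95,
    # Slightly above 1.0 discourages verbatim loops; pushing it much
    # higher tends to produce the gibberish described above.
    "repetition_penalty": 1.1,
    "do_sample": True,
    # Stop sequences can cut off hallucinated follow-up user turns
    # when the prompt uses explicit role markers like "User:".
    "stop_sequences": ["\nUser:", "</s>"],
}


def generate(prompt: str) -> str:
    """Request a single completion from the serverless Inference API."""
    client = InferenceClient(model=MODEL_ID, token=os.environ.get("HF_TOKEN"))
    return client.text_generation(prompt, **GEN_KWARGS)


if __name__ == "__main__" and os.environ.get("HF_TOKEN"):
    print(generate("User: What is an instruct-tuned model?\nAssistant:"))
```

Whether these parameters help will vary by model; the point is that the generation kwargs, not the client itself, are usually where the pathological outputs come from.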
Omnibus changed discussion status to closed
Omnibus changed discussion status to open