Adding a long prompt can help you fight LLM hallucinations. However, if you know exactly how you want your LLM output constrained, there are much better strategies! 💪
Did you know you can force your LLM to ALWAYS generate a valid JSON file? Or to follow a well-defined answer template? You can do that and more with the 🤗 transformers-compatible outlines library.

It doesn't just let you master your LLM -- your text generation application will also get faster! 🔥 The more constrained your text generation is, the bigger the speedups you'll see!
Follow @remi and other outlines folks to stay on top of the constrained generation game 🧠