David Berenstein

davidberenstein1957

AI & ML interests

Everything NLP and knowledge graphs

Posts

⚡️ LLMs do a good job at NER, but don't you want to learn how to do more with less?

Go from 🐢 -> 🐇

If you want a small model to perform well on your problem, you need to fine-tune it.

Bootstrap with a teacher model.

Correct potential mistakes to get high-quality data.

Fine-tune your student model (a minimal sketch follows below).

End up with a model that is more accurate and more efficient.

Free signup: https://lu.ma/zx2t7irs
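
The sketch below illustrates the workflow described in this post; it is not taken from the linked course. It assumes an OpenAI-compatible teacher model (the `openai` client, the `gpt-4o-mini` model name, the prompt, and the PERSON/ORG/LOC label set are all placeholder choices) that proposes entity spans for unlabeled texts; the review and student fine-tuning steps are indicated in comments.

```python
import json

from openai import OpenAI

# 1. Bootstrap: ask a teacher LLM to propose entity spans for unlabeled text.
client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment
texts = [
    "Barack Obama visited Paris last week.",
    "Apple hired Jane Doe as head of research.",
]

def teacher_annotate(text: str) -> list[dict]:
    """Return teacher predictions as [{"text": ..., "label": ...}, ...]."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder: any capable teacher model
        messages=[{
            "role": "user",
            "content": (
                "Extract PERSON, ORG and LOC entities from the text below. "
                'Answer with only a JSON list of {"text": ..., "label": ...} '
                "objects.\n\n" + text
            ),
        }],
    )
    return json.loads(response.choices[0].message.content)

bootstrapped = [{"text": t, "entities": teacher_annotate(t)} for t in texts]

# 2. Correct potential mistakes: review `bootstrapped` in an annotation tool
#    (or a quick manual pass) so the student learns from clean labels.

# 3. Fine-tune the student: convert the reviewed spans to token-level tags and
#    train a small token-classification model (e.g. with `transformers`),
#    which is far cheaper and faster to run than the teacher LLM.
print(bootstrapped)
```

For the student, a compact encoder such as DistilBERT fine-tuned for token classification (or a GLiNER/SpanMarker-style model) keeps inference cheap while often staying close to teacher quality.
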
You can now build a custom text classifier without days of human labeling!

πŸ‘ LLMs work reasonably well as text classifiers.
πŸ‘Ž They are expensive to run at scale and their performance drops in specialized domains.

πŸ‘ Purpose-built classifiers have low latency and can potentially run on CPU.
πŸ‘Ž They require labeled training data.

Combine the best of both worlds: the automatic labeling capabilities of LLMs and the high-quality annotations from human experts to train and deploy a specialized model.

Blog: https://huggingface.co/blog/sdiazlor/custom-text-classifier-ai-human-feedback
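
The linked blog walks through its own stack; as a rough, generic illustration of the idea, the sketch below uses the `openai` client as the automatic labeler and `sentence-transformers` plus scikit-learn's LogisticRegression as the small CPU-friendly classifier. The model names, prompt, and toy sentiment label set are assumptions, not taken from the blog.

```python
from openai import OpenAI
from sentence_transformers import SentenceTransformer
from sklearn.linear_model import LogisticRegression

LABELS = ["positive", "negative"]  # illustrative label scheme
texts = ["Great battery life.", "The screen cracked after a week."]

# 1. LLM as the automatic labeler.
client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

def llm_label(text: str) -> str:
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder: any capable labeling model
        messages=[{
            "role": "user",
            "content": f"Classify the text as one of {LABELS}. "
                       f"Answer with the label only.\n\n{text}",
        }],
    )
    return response.choices[0].message.content.strip().lower()

records = [{"text": t, "label": llm_label(t)} for t in texts]

# 2. Human feedback: review and correct `records` in your annotation tool of
#    choice before training (skipped here).

# 3. Train a small, specialized classifier that runs comfortably on CPU.
encoder = SentenceTransformer("all-MiniLM-L6-v2")
X = encoder.encode([r["text"] for r in records])
y = [LABELS.index(r["label"]) for r in records]
classifier = LogisticRegression(max_iter=1000).fit(X, y)

# 4. Low-latency inference without calling the LLM again.
prediction = classifier.predict(encoder.encode(["Works perfectly."]))[0]
print(LABELS[prediction])
```

Swapping LogisticRegression for a few-shot trainer such as SetFit is a natural next step once you have a handful of reviewed examples per class.
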