How we leveraged distilabel to create an Argilla 2.0 Chatbot
Let's go!!! This can only mean one thing... more datasets!!! 🚀
Quite excited about Qwen1.5-MoE-A2.7B and the upcycling process they used to initialise its weights from those of Qwen1.5-1.8B
Let's see how far we can push open-source annotation!
@Blevlabs In case it helps, we're using 4x A40 (192 GB of VRAM) to serve Notux 8x7b v1. I think you need at least 2x A100 80 GB to serve it.
There you go, hope you enjoy it 🤗 https://huggingface.co/spaces/argilla/notux-chat-ui