Pretraining from scratch?
#8
by MengboZhou - opened
I would like to know whether your Mistral code can be used for pretraining from scratch, and if so, what kind of script is needed, for example something like train.py?