Continued pretraining technique
#5
by levulinh - opened
Hi, thank you for your great work. I wonder which technique you used for continued pre-training of this model, e.g. LoRA, partial freezing, or full-model training? I am currently working on a similar project, and I have found that full-model pretraining carries a risk of catastrophic forgetting. Do you have any tips?
Thanks a bunch.
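For context, here is a minimal sketch of the two lighter-weight options mentioned above (LoRA via the `peft` library, and a partial freeze of selected layers); the model name, target modules, and layer count are illustrative assumptions, not the setup used for this model:

```python
# Sketch of two lighter-weight continued-pretraining setups; model name,
# target modules, and layer count are illustrative assumptions.
import torch
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

# Option 1: LoRA -- train small adapter matrices, keep the base weights frozen.
model = AutoModelForCausalLM.from_pretrained(
    "meta-llama/Meta-Llama-3-8B", torch_dtype=torch.bfloat16
)
lora_cfg = LoraConfig(
    r=16,
    lora_alpha=32,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    task_type="CAUSAL_LM",
)
lora_model = get_peft_model(model, lora_cfg)
lora_model.print_trainable_parameters()

# Option 2: partial freeze -- update only the embeddings and the last few blocks.
model = AutoModelForCausalLM.from_pretrained(
    "meta-llama/Meta-Llama-3-8B", torch_dtype=torch.bfloat16
)
for param in model.parameters():
    param.requires_grad = False
for param in model.get_input_embeddings().parameters():
    param.requires_grad = True
for block in model.model.layers[-4:]:
    for param in block.parameters():
        param.requires_grad = True
```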
Hi, most of my continued pretraining uses full-parameter training, since it gives the best performance on the target language.
Of course there is severe catastrophic forgetting in this setup, but it can be mitigated by also training on some English or multilingual corpus (see https://huggingface.co/beomi/Llama-3-KoEn-8B-preview and https://huggingface.co/beomi/gemma-mling-7b).
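A rough sketch of that corpus-mixing idea using the `datasets` library; the specific corpora and the 9:1 mixing ratio are assumptions for illustration, not the exact recipe behind the linked models:

```python
# Interleave a target-language corpus with an English corpus so the model keeps
# seeing English during full-parameter continued pretraining.
# Dataset names and the 90/10 ratio are illustrative assumptions.
from datasets import load_dataset, interleave_datasets

korean = load_dataset("uonlp/CulturaX", "ko", split="train", streaming=True)
english = load_dataset("uonlp/CulturaX", "en", split="train", streaming=True)

# The small English share is what counteracts catastrophic forgetting.
mixed = interleave_datasets(
    [korean, english],
    probabilities=[0.9, 0.1],
    seed=42,
)
```

The mixed stream can then be tokenized and fed to a standard causal-LM training loop with all parameters trainable.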
beomi changed discussion status to closed