---
library_name: transformers
datasets:
- HuggingFaceFW/fineweb-edu
license: mit
---
This is a GPT-2 (350M) model trained with llm.c for 100B tokens on FineWeb-EDU, using a WSD (Warmup-Stable-Decay) learning rate schedule.

Much more detailed information and observations can be found here: https://x.com/Yuchenj_UW/status/1816508452518482319
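
For reference, below is a minimal sketch of a WSD-style learning rate schedule: a linear warmup, a long constant ("stable") phase, and a decay over the final portion of training. The warmup, decay, and final-LR fractions shown are illustrative placeholders, not the exact hyperparameters used for this run.

```python
def wsd_lr(step, max_steps, max_lr,
           warmup_frac=0.01, decay_frac=0.2, final_frac=0.0):
    """Warmup-Stable-Decay (WSD) learning rate schedule.

    The fractions are assumptions for illustration only; they do not
    reflect the settings used to train this model.
    """
    warmup_steps = max(1, int(warmup_frac * max_steps))
    decay_steps = max(1, int(decay_frac * max_steps))
    stable_end = max_steps - decay_steps

    if step < warmup_steps:
        # 1) Warmup: learning rate rises linearly to max_lr.
        return max_lr * (step + 1) / warmup_steps
    if step < stable_end:
        # 2) Stable: learning rate held constant at max_lr.
        return max_lr
    # 3) Decay: learning rate falls linearly to final_frac * max_lr
    #    over the last decay_frac of training.
    progress = (step - stable_end) / decay_steps
    return max_lr * (1.0 - progress * (1.0 - final_frac))
```

Unlike a cosine schedule, the stable phase means the decay does not have to be tied to a fixed total token budget in advance; the decay can be started near whatever endpoint is eventually chosen.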