SmolLM Collection
A series of smol LLMs: 135M, 360M and 1.7B. We release base and Instruct models as well as the training corpus and some WebGPU demos.
12 items • Updated Aug 18
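The checkpoints can be loaded with the standard transformers text-generation API. Below is a minimal sketch for trying one of the Instruct models; the repo id "HuggingFaceTB/SmolLM-135M-Instruct", the prompt, and the generation settings are assumptions, so swap in whichever checkpoint from the collection you want.

```python
# Minimal sketch: load an Instruct checkpoint and generate a short reply.
# The repo id below is an assumption; pick any of the 135M / 360M / 1.7B
# base or Instruct checkpoints from the collection instead.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "HuggingFaceTB/SmolLM-135M-Instruct"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Instruct checkpoints ship a chat template, so format the prompt as messages.
messages = [{"role": "user", "content": "What is gravity?"}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
)

outputs = model.generate(input_ids, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```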