Post 775
Are scaling laws over? A report from The Information announced that OpenAI is seeing diminishing returns from scaling up the new GPT models.
What are scaling laws? These are empirical laws that say "Every time you increase the compute spent on training 10-fold, your LLM's performance will go up by a predictable tick." Of course, they apply only if you train your model with the right methods.
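To make that "predictable tick" concrete, here is a minimal sketch of the kind of power-law relationship that scaling-law papers fit between training compute and loss. The functional form (loss = a * C^(-b) + irreducible loss) mirrors published fits, but the coefficients below are made up purely for illustration and are not taken from any real model.

```python
# Hypothetical scaling law: loss(C) = a * C**(-b) + L_inf
# The power-law form mirrors published scaling-law fits, but these
# coefficients are invented for illustration only.
a, b, L_inf = 12.0, 0.05, 1.7

def predicted_loss(compute_flops: float) -> float:
    """Predicted training loss for a given training compute budget (in FLOPs)."""
    return a * compute_flops ** (-b) + L_inf

# Each 10x increase in compute buys a predictable improvement in loss.
for compute in [1e21, 1e22, 1e23, 1e24, 1e25]:
    print(f"compute = {compute:.0e} FLOPs -> predicted loss = {predicted_loss(compute):.3f}")
```

The point of such a fit is that it lets you extrapolate: before spending the next 10x of compute, you can already estimate roughly how much improvement it should buy.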
The image below illustrates this: the plots are from a Google paper, "Scaling Autoregressive Models for Content-Rich Text-to-Image Generation", and they show how the quality and instruction following of models improve as you scale the model up (which is equivalent to scaling up the compute spent on training).
➡️ These scaling laws have immense impact: they triggered the largest gold rush ever, with companies pouring billions into scaling up their training. Microsoft and OpenAI are putting $100B into their "Stargate" mega training cluster, due to start running in 2028.
🤔 So, what about these reports of scaling laws slowing down?
If they are true, it would mean a gigantic paradigm shift, as the hundreds of billions poured by AI companies into scaling could be a dead end.
But I doubt it: up to the most recent publications, scaling laws showed no signs of weakness, and the researchers at the higher end of the scale-up seem to imply that scaling up continues to deliver.
Wait and see!