My quants and silly experiment.

#1
by ZeroWw - opened

This is super cool, thank you! I am super curious about the silly version. You changed the weights by 3% on average and did not detect any noticeable changes in the model? That's super cool. I want to play around with that. That's super weird to me lol I am excited to dive into it.

No. I changed them by up to 20% of the value (randomly, between -10% and +10%).
The Colab notebook shows you how: I modified the quantizer to add the divergence.
Up until 20% the models usually don't degrade... they just change their behavior slightly.
Higher than that they start writing nonsense, and above 50% they become random generators :D
But at 20% they are pretty good... I think that could be used for an evolutionary algorithm: generate models, select the best, then repeat again and again... let them evolve.
But I don't have the resources to do that.
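A minimal sketch of that kind of multiplicative perturbation, assuming plain NumPy arrays rather than the author's modified quantizer (the function name `perturb_weights` and the `max_frac` parameter are illustrative, not from the notebook):

```python
import numpy as np

def perturb_weights(weights, max_frac=0.10, seed=None):
    """Multiply each weight by a random factor drawn uniformly
    from [1 - max_frac, 1 + max_frac], i.e. +/-10% by default."""
    rng = np.random.default_rng(seed)
    noise = rng.uniform(-max_frac, max_frac, size=weights.shape)
    return weights * (1.0 + noise)

# Example: perturb a small weight tensor by up to +/-10%.
w = np.array([1.0, -2.0, 0.5, 3.25])
w_perturbed = perturb_weights(w, max_frac=0.10, seed=0)
```

In a real quantizer you would apply this per tensor before (or after) quantization; the claim above is that behavior only shifts noticeably once `max_frac` grows past roughly 0.2.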
