winner winner chicken dinner
Not surprised Microsoft deleted it. Stupid company. GG.
cohere is better anyways
They could at least say why they nuked it
Probably something dumb like ethics.
It’s OK; they all have their own processes, and in big companies things like this happen.
But this is a hell of a model! I have been playing with Q2, and I am just impressed!
Hopefully they come back quickly and also add the 70B model they missed in the first run.
I'm pretty sure this can be dewokefied: https://huggingface.co/alpindale/WizardLM-2-8x22B/discussions/5
I have Mixtral-8x22B-v0.1 downloading and got WizardLM-2-8x22B before it was pulled, but it will likely be two days before I can get the other two models to try this...
Yeah, just got this running as Q4_K_S and it is really good at coding... Crazy times; nothing for a few months, then all this lot drops at once! Three days in a row of "yep, this is the best coding model yet, oh but...". openbmb/Eurux-8x22b-nca is next I think... :D
I've uploaded the imatrix I created from FP16 using groups_merged.txt if anybody wants to use it: https://huggingface.co/jukofyork/WizardLM-2-8x22B-imatrix
Not sure how much effect it has on Q4, but for lower quants it will probably help a lot, and it is likely a better choice than wiki.train.raw if you want to use the model for coding.
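For anyone wanting to reproduce this kind of imatrix, here is a minimal sketch using llama.cpp's imatrix and quantize tools. The file names and paths are assumptions; adjust them for your own build and downloads.

```shell
# Compute an importance matrix from the FP16 GGUF using a calibration
# text file (groups_merged.txt, as above). Paths/names are assumptions.
./llama-imatrix -m WizardLM-2-8x22B-F16.gguf -f groups_merged.txt -o imatrix.dat

# Feed the imatrix into quantization. It matters most for very low-bit
# quants (IQ1/IQ2); for Q4 and above the effect is smaller.
./llama-quantize --imatrix imatrix.dat WizardLM-2-8x22B-F16.gguf WizardLM-2-8x22B-IQ2_XS.gguf IQ2_XS
```

Generating the imatrix requires running inference over the whole calibration file at FP16, which is why it can take a long time on a model this size.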
I just finished making an imatrix based on the same txt file! Uploading the IQ1 quants as we speak. I wish I had seen this first; it took forever to make that imatrix data file!