Well this is a blast from the past!

by YearZero - opened

Are you just reaching deep into the bucket now and pulling out old favorites? It's like classic games, only with LLMs. Imagine the 2030s and '40s, when people can look back and play with the LLMs that started the ASI revolution. It's like playing the SNES; it's nostalgic!


Nah, by 2030 Zoe Graystone will appear... and it's all downhill for the colonies at that point.

Hahaha, this one specifically is because a recent llama.cpp commit (https://github.com/ggerganov/llama.cpp/pull/10026) broke old 8x7B Mixtral models, and allegedly re-quantizing should fix it. Considering I never even made this one the first time around, I figured I'd see if that was true :) May need to revisit some other ones for posterity.
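
For anyone wanting to try the same fix themselves, here is a minimal sketch of the usual llama.cpp re-quantization flow. The paths, model directory, build location, and quant type are placeholders I've assumed, not anything from this thread; the script and binary names match recent llama.cpp checkouts, so adjust if yours differ.

```python
# Minimal sketch of a llama.cpp re-quantization pass.
# All paths and the quant type below are assumptions/placeholders.
import subprocess
from pathlib import Path

LLAMA_CPP = Path("~/llama.cpp").expanduser()                      # assumed local checkout
HF_MODEL = Path("~/models/some-mixtral-8x7b-finetune").expanduser()  # placeholder HF model dir
F16_GGUF = HF_MODEL / "model-f16.gguf"
OUT_GGUF = HF_MODEL / "model-Q4_K_M.gguf"

# 1) Convert the original HF weights to an f16 GGUF with the *current*
#    converter, so the tensors are laid out the way new llama.cpp expects.
subprocess.run(
    ["python", str(LLAMA_CPP / "convert_hf_to_gguf.py"),
     str(HF_MODEL), "--outtype", "f16", "--outfile", str(F16_GGUF)],
    check=True,
)

# 2) Re-quantize the fresh f16 GGUF to the desired quant type.
subprocess.run(
    [str(LLAMA_CPP / "build/bin/llama-quantize"),
     str(F16_GGUF), str(OUT_GGUF), "Q4_K_M"],
    check=True,
)
```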

Thanks to @bartowski; I am the troublemaker here, just sayin'.
This model was in my top 1% of models at the beginning of the year (out of the 600 surveyed).
Getting a "good quant" of it was critical for some work on low-BPW model operation/optimization and actual work-work.
