Any chance of a 1B/2B/3B/4B model?
Hey, we would love to release smaller, more performant versions of Aya in the future. As a lab, one of our main focuses is efficiency, so we are keen to do or support work in this direction.
There's no ongoing project for this yet, but it is definitely something we would like to explore in the future.
If you're interested in working on this, feel free to draft a research proposal and reach out to us on the Aya Discord. We'd be happy to support it and give feedback.
Closing this issue for now, but feel free to reopen it if you have more questions :)
For now, I've made my own quants and my "silly" version:
Quants:
https://huggingface.co/ZeroWw/aya-expanse-8b-GGUF
"Silly" version:
https://huggingface.co/ZeroWw/aya-expanse-8b-SILLY
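For anyone who wants to try the GGUF quants, here's a minimal sketch of loading one with llama-cpp-python. The filename glob is an assumption; check the repo's file list and point it at whichever quant file is actually there.

```python
# Minimal sketch: load an aya-expanse-8b GGUF quant with llama-cpp-python.
# Requires: pip install llama-cpp-python huggingface-hub
from llama_cpp import Llama

llm = Llama.from_pretrained(
    repo_id="ZeroWw/aya-expanse-8b-GGUF",
    filename="*q5_k.gguf",  # assumed pattern; replace with the actual quant filename in the repo
    n_ctx=4096,             # context window; adjust to your RAM/VRAM
)

# Aya Expanse is a chat model, so use the chat completion API.
out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Write a short greeting in Turkish."}],
    max_tokens=128,
)
print(out["choices"][0]["message"]["content"])
```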