Request for Mistral Large Instruct GPTQ INT4
#2
opened by sparsh35
Please include Mistral Large Instruct 2 in the collection as well. I would have pushed it myself, but I don't have enough system RAM.
I have conducted quantizations for the Mistral models. Please check here:
- shuyuej/Mistral-7B-Instruct-v0.1-GPTQ: https://huggingface.co/shuyuej/Mistral-7B-Instruct-v0.1-GPTQ
- shuyuej/Mistral-7B-Instruct-v0.2-GPTQ: https://huggingface.co/shuyuej/Mistral-7B-Instruct-v0.2-GPTQ
- shuyuej/Mistral-7B-Instruct-v0.3-GPTQ: https://huggingface.co/shuyuej/Mistral-7B-Instruct-v0.3-GPTQ
- shuyuej/Mistral-Nemo-Instruct-2407-GPTQ: https://huggingface.co/shuyuej/Mistral-Nemo-Instruct-2407-GPTQ
Sorry, maybe I was not clear enough. I meant this model: mistralai/Mistral-Large-Instruct-2407, i.e. the repo https://huggingface.co/mistralai/Mistral-Large-Instruct-2407
sparsh35 changed discussion status to closed
@sparsh35
I have conducted quantizations for the Mistral Large model. Please check here:
- shuyuej/Mistral-Large-Instruct-2407-GPTQ: https://huggingface.co/shuyuej/Mistral-Large-Instruct-2407-GPTQ
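For anyone finding this thread later, here is a minimal sketch of loading that GPTQ checkpoint with the usual `transformers` + `optimum` + `auto-gptq` stack. The package setup, prompt, and generation settings are my own assumptions for illustration, not taken from the repo's README:

```python
# Minimal sketch: load the INT4 GPTQ checkpoint and run a short generation.
# Assumes `transformers`, `optimum`, and `auto-gptq` (or `gptqmodel`) are installed,
# and that there is enough GPU memory for the quantized 123B model.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "shuyuej/Mistral-Large-Instruct-2407-GPTQ"

tokenizer = AutoTokenizer.from_pretrained(model_id)
# The quantization_config stored in the repo tells transformers to load GPTQ weights.
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

messages = [{"role": "user", "content": "Summarize GPTQ quantization in one sentence."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```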
Thanks a lot, Shuyuej!