Request for Mistral Large Instruct GPTQ INT4

#2
by sparsh35 - opened

Please include Mistral Large Instruct 2 in the collection as well. I would have pushed it myself, but I don't have enough system RAM.

I have conducted quantizations for the Mistral models. Please check here:

  1. shuyuej/Mistral-7B-Instruct-v0.1-GPTQ: https://huggingface.co/shuyuej/Mistral-7B-Instruct-v0.1-GPTQ
  2. shuyuej/Mistral-7B-Instruct-v0.2-GPTQ: https://huggingface.co/shuyuej/Mistral-7B-Instruct-v0.2-GPTQ
  3. shuyuej/Mistral-7B-Instruct-v0.3-GPTQ: https://huggingface.co/shuyuej/Mistral-7B-Instruct-v0.3-GPTQ
  4. shuyuej/Mistral-Nemo-Instruct-2407-GPTQ: https://huggingface.co/shuyuej/Mistral-Nemo-Instruct-2407-GPTQ

Sorry, maybe I was not clear enough. I meant this model: mistralai/Mistral-Large-Instruct-2407, the repo https://huggingface.co/mistralai/Mistral-Large-Instruct-2407

sparsh35 changed discussion status to closed

@sparsh35 I have conducted quantizations for the Mistral Large Model. Please check here:
shuyuej/Mistral-Large-Instruct-2407-GPTQ: https://huggingface.co/shuyuej/Mistral-Large-Instruct-2407-GPTQ
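
In case it helps others, here is a minimal sketch of loading the GPTQ checkpoint with `transformers`. It assumes a GPTQ backend (e.g. `auto-gptq` or `gptqmodel`) plus `optimum` are installed and that the repo ships its quantization config; the prompt and generation settings below are placeholders, not taken from the model card.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "shuyuej/Mistral-Large-Instruct-2407-GPTQ"

# The GPTQ quantization config stored in the repo is picked up automatically;
# device_map="auto" shards the quantized weights across available GPUs.
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Example prompt (placeholder) formatted with the model's chat template.
messages = [{"role": "user", "content": "Summarize what GPTQ quantization does."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```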

Thanks a lot, Shuyuej!
