Evaluation of Dolphin Mixtral series failed
Hello,
please check the following models:
https://huggingface.co/datasets/open-llm-leaderboard/requests/blob/main/cognitivecomputations/dolphin-2.5-mixtral-8x7b_eval_request_False_float16_Original.json
https://huggingface.co/datasets/open-llm-leaderboard/requests/blob/main/cognitivecomputations/dolphin-2.6-mixtral-8x7b_eval_request_False_bfloat16_Original.json
https://huggingface.co/datasets/open-llm-leaderboard/requests/blob/main/cognitivecomputations/dolphin-2.7-mixtral-8x7b_eval_request_False_bfloat16_Original.json
@Dampfinchen I was wondering the same about Dolphin so I went to its model card and it says "trust_remote_code is required". I've heard evaluations fail if remote code is required.
Hi, the models failed during download, probably because the download took too long. I've re-added them to the queue.
Evaluation failed again.
https://huggingface.co/datasets/open-llm-leaderboard/requests/blob/main/cognitivecomputations/dolphin-2.6-mixtral-8x7b_eval_request_False_bfloat16_Original.json
https://huggingface.co/datasets/open-llm-leaderboard/requests/blob/main/cognitivecomputations/dolphin-2.5-mixtral-8x7b_eval_request_False_float16_Original.json
https://huggingface.co/datasets/open-llm-leaderboard/requests/blob/main/cognitivecomputations/dolphin-2.7-mixtral-8x7b_eval_request_False_bfloat16_Original.json
For some reason, these Dolphin Mixtrals have trouble getting evaluated, and while they are in the queue the whole process seems to get stuck :(
Hi!
These models all failed at download again. Could you make sure their weights are in safetensors?
(As requested by the submit form :) )
That's good to know. These are not my models, but it seems they are not in the safetensors format. I guess that explains it then, thank you.
@Dampfinchen Perhaps try submitting Dolphin's GGUF versions. They should score virtually the same as their unquantized versions and safetensors shouldn't be an issue.
https://huggingface.co/TheBloke/dolphin-2.7-mixtral-8x7b-GGUF
Edit: Or GPTQ
https://huggingface.co/TheBloke/dolphin-2.7-mixtral-8x7b-GPTQ