Switch import mechanism for flash_attn
#51 opened 9 days ago by nvwilliamz

ModuleNotFoundError: No module named 'transformers_modules.microsoft.Phi-3'
#49 opened about 1 month ago by hsmanju

Model consistently gets into a loop to repeat itself if there is too much in the context window
#48 opened about 1 month ago by mstachow · 2 replies

Resource requirements to load and save the model
#47 opened about 2 months ago by nana123652

KeyError: 'factor'
#45 opened about 2 months ago by surak

How much GPU memory is needed to load the Phi-3.5-MoE-instruct model
#44 opened about 2 months ago by cyt78 · 2 replies

QAT
#42 opened 2 months ago by rezzie-rich · 3 replies

ModuleNotFoundError: No module named 'triton'
#41 opened 2 months ago by Maximum2000

Cannot use transformer library to inference the
#40 opened 2 months ago by manishbaral · 8 replies

Validation loss
#39 opened 3 months ago by Mani5112

The model 'PhiMoEForCausalLM' is not supported for text-generation. Supported models are ['BartForCausalLM', 'BertLMHeadModel', .......
#34 opened 3 months ago by xxbadarxx · 11 replies

Only CPU is used during inference.
#33 opened 3 months ago by rockcat-miao

The provided example doesn't work
#32 opened 3 months ago by kqsong · 5 replies

need gguf
#4 opened 3 months ago by windkkk · 18 replies