Quantization Error
#1 by ch4rL - opened
When I try to use the model via mlx-lm, I get the following error message:

ValueError: [quantize] All dimensions should be divisible by 32 for now
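For context, mlx's grouped quantization packs weight rows in fixed-size groups, so each weight dimension must be a multiple of the group divisor. The helper below is a hypothetical sketch (not mlx's actual code) of the shape check that produces an error like this:

```python
# Hypothetical sketch of the divisibility constraint behind the error.
# mlx quantizes weights in groups, so a layer whose dimensions are not
# multiples of the divisor (32 here, per the error message) is rejected.

def check_quantizable(shape, divisor=32):
    """Return True if every dimension in `shape` is divisible by `divisor`."""
    return all(dim % divisor == 0 for dim in shape)

# A 4096x4096 projection passes; an odd vocab size like 32001 does not.
print(check_quantizable((4096, 4096)))   # True
print(check_quantizable((32001, 4096)))  # False
```

Newer mlx releases relax or handle this constraint, which is why upgrading is the usual fix.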
Update mlx to the latest version: `pip install -U mlx`
@ch4rL have you been able to run this with the latest mlx version?