MLX / mixtral

Parameters

#1 by l0d0v1c - opened

When I use this model with lora.py, I get:

Total parameters 7411.242M
Trainable parameters 2.229M

But the complete model gives:

Total parameters 46705.579M
Trainable parameters 2.787M

Is this normal?

MLX Community org • edited Mar 14

There is a known issue in lora.py: it calculates the parameter count incorrectly for quantized models, so the total appears much smaller than it really is. Hopefully it will be fixed soon.
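
For what it's worth, the two numbers are roughly consistent with the weights being counted in their packed 4-bit form: 46705.579M / 8 ≈ 5838M packed elements, plus the per-group scales and biases (two extra values per group of 64 weights, about 1460M) and the small unquantized layers, which lands near the reported 7411.242M. Below is a minimal sketch of how the count could be corrected; this is my own illustration, not the official patch, and it assumes the quantized layers are `nn.QuantizedLinear` modules exposing `weight` (packed uint32) and `bits` attributes, as in current MLX.

```python
# Sketch only: corrected parameter count for quantized MLX models.
# Not the official lora.py fix; assumes nn.QuantizedLinear stores its
# weights packed into uint32 with a .bits attribute (as in current mlx).
import mlx.nn as nn
from mlx.utils import tree_flatten


def n_params(module: nn.Module) -> int:
    """Parameter count of one leaf module, unpacking quantized weights."""
    if isinstance(module, nn.QuantizedLinear):
        # Each stored uint32 element packs 32 // bits original weights.
        return module.weight.size * (32 // module.bits)
    return sum(v.size for _, v in tree_flatten(module.parameters()))


def total_params_m(model: nn.Module) -> float:
    """Total parameter count in millions across all leaf modules."""
    leaves = tree_flatten(
        model.leaf_modules(), is_leaf=lambda m: isinstance(m, nn.Module)
    )
    return sum(n_params(m) for _, m in leaves) / 1e6
```

With a correction along these lines, the quantized checkpoint should report roughly the same ~46.7B total as the unquantized model, while the trainable LoRA parameters stay in the low millions.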

l0d0v1c changed discussion status to closed
