config issue #3
opened by agurla
The config provided is not working; I get the following error:

[INFO] Loading model from disk.
Traceback (most recent call last):
  File "/tiny-llama-mlx/mlx-examples/llms/llama/llama.py", line 387, in <module>
    model = load_model(args.model)
  File "/tiny-llama-mlx/mlx-examples/llms/llama/llama.py", line 335, in load_model
    n_heads = config["n_heads"]
KeyError: 'n_heads'
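Note: the traceback shows that llama.py reads Llama-style parameter names (n_heads, etc.) from the config, so a Hugging Face-style config.json (num_attention_heads, num_hidden_layers, hidden_size, ...) will fail at config["n_heads"]. Below is a minimal key-mapping sketch, assuming the source file is a standard LlamaConfig-style config.json and the loader wants Llama-style params.json keys; the file names, and the exact key set your checkout of mlx-examples expects, may differ.

```python
# Sketch: translate Hugging Face LlamaConfig keys into Llama-style
# params.json keys (dim, n_layers, n_heads, ...). Assumes config.json
# and params.json live in the current directory.
import json

with open("config.json") as f:
    hf = json.load(f)

params = {
    "dim": hf["hidden_size"],
    "n_layers": hf["num_hidden_layers"],
    "n_heads": hf["num_attention_heads"],  # the key the loader failed on
    "n_kv_heads": hf.get("num_key_value_heads", hf["num_attention_heads"]),
    "norm_eps": hf["rms_norm_eps"],
    "vocab_size": hf["vocab_size"],
    "hidden_dim": hf["intermediate_size"],
}

with open("params.json", "w") as f:
    json.dump(params, f, indent=2)
```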
agurla changed discussion status to closed