Runtime error

Exit code: 1. Reason:
llm_load_print_meta: f_norm_rms_eps = 1.0e-05
llm_load_print_meta: f_clamp_kqv = 0.0e+00
llm_load_print_meta: f_max_alibi_bias = 0.0e+00
llm_load_print_meta: f_logit_scale = 0.0e+00
llm_load_print_meta: n_ff = 14336
llm_load_print_meta: n_expert = 8
llm_load_print_meta: n_expert_used = 2
llm_load_print_meta: causal attn = 1
llm_load_print_meta: pooling type = 0
llm_load_print_meta: rope type = 0
llm_load_print_meta: rope scaling = linear
llm_load_print_meta: freq_base_train = 1000000.0
llm_load_print_meta: freq_scale_train = 1
llm_load_print_meta: n_ctx_orig_yarn = 32768
llm_load_print_meta: rope_finetuned = unknown
llm_load_print_meta: ssm_d_conv = 0
llm_load_print_meta: ssm_d_inner = 0
llm_load_print_meta: ssm_d_state = 0
llm_load_print_meta: ssm_dt_rank = 0
llm_load_print_meta: ssm_dt_b_c_rms = 0
llm_load_print_meta: model type = 8x7B
llm_load_print_meta: model ftype = IQ2_XS - 2.3125 bpw
llm_load_print_meta: model params = 46.70 B
llm_load_print_meta: model size = 12.73 GiB (2.34 BPW)
llm_load_print_meta: general.name = mixtral
llm_load_print_meta: BOS token = 1 '<s>'
llm_load_print_meta: EOS token = 2 '</s>'
llm_load_print_meta: UNK token = 0 '<unk>'
llm_load_print_meta: LF token = 13 '<0x0A>'
llm_load_print_meta: EOG token = 2 '</s>'
llm_load_print_meta: max token length = 48
llama_model_load: error loading model: missing tensor 'blk.0.ffn_down_exps.weight'
llama_load_model_from_file: failed to load model
Traceback (most recent call last):
  File "/home/user/app/app.py", line 53, in <module>
    model = Llama(
  File "/home/user/.local/lib/python3.10/site-packages/llama_cpp/llama.py", line 369, in __init__
    internals.LlamaModel(
  File "/home/user/.local/lib/python3.10/site-packages/llama_cpp/_internals.py", line 56, in __init__
    raise ValueError(f"Failed to load model from file: {path_model}")
ValueError: Failed to load model from file: ggml-model-iq2_xs.gguf
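For context, here is a minimal sketch of the kind of call the traceback points to (app.py line 53), assuming a plain llama-cpp-python load. Only the model file name ggml-model-iq2_xs.gguf comes from the error output; every other parameter below is an assumption for illustration, not the app's actual configuration.

from llama_cpp import Llama

# Hypothetical reconstruction of the failing load at app.py line 53.
# The model path is taken from the traceback; n_ctx and n_gpu_layers
# are assumed values shown only to make the example self-contained.
model = Llama(
    model_path="ggml-model-iq2_xs.gguf",  # IQ2_XS quant of Mixtral 8x7B per the log
    n_ctx=4096,        # context window (assumed)
    n_gpu_layers=0,    # CPU-only load (assumed)
)

With a GGUF that the installed llama.cpp build cannot map to its expected tensor layout, this constructor raises the ValueError shown above ("Failed to load model from file").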
