|
/home/cfruan/.conda/envs/mlc-source-311/bin/python -m mlc_chat gen_config /models/gemma-2b-it --quantization q0f32 --conv-template gemma_instruction --output /tmp/tmpe_lwnsrp --context-window-size 8192 --prefill-chunk-size 1024
|
[2024-02-21 23:21:06] INFO auto_config.py:115: Found model configuration: /models/gemma-2b-it/config.json
[2024-02-21 23:21:06] INFO auto_config.py:153: Found model type: gemma. Use `--model-type` to override.
[2024-02-21 23:21:06] INFO gemma_model.py:55: context_window_size not found in config.json. Falling back to max_position_embeddings (8192)
[2024-02-21 23:21:06] INFO gemma_model.py:70: prefill_chunk_size defaults to context_window_size (8192)
[2024-02-21 23:21:06] INFO config.py:106: Overriding context_window_size from 8192 to 8192
[2024-02-21 23:21:06] INFO config.py:106: Overriding prefill_chunk_size from 8192 to 1024
[2024-02-21 23:21:06] INFO config.py:106: Overriding max_batch_size from 1 to 80
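The resolution order shown above is: read config.json, fall back to max_position_embeddings when context_window_size is absent, then apply CLI flags last. A minimal sketch of that order (a hypothetical helper, not MLC's actual gemma_model.py/config.py code):

    import json

    def resolve_context_window(config_path: str, cli_override: int | None) -> int:
        with open(config_path) as f:
            cfg = json.load(f)
        # Fall back to max_position_embeddings when context_window_size is absent.
        size = cfg.get("context_window_size", cfg.get("max_position_embeddings"))
        # CLI flags such as --context-window-size are applied last and win.
        return cli_override if cli_override is not None else size

    print(resolve_context_window("/models/gemma-2b-it/config.json", 8192))  # -> 8192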
|
[2024-02-21 23:21:06] INFO gen_config.py:121: [generation_config.json] Setting bos_token_id: 2
[2024-02-21 23:21:06] INFO gen_config.py:121: [generation_config.json] Setting eos_token_id: 1
[2024-02-21 23:21:06] INFO gen_config.py:121: [generation_config.json] Setting pad_token_id: 0
[2024-02-21 23:21:06] INFO gen_config.py:133: Found tokenizer config: /models/gemma-2b-it/tokenizer.model. Copying to /tmp/tmpe_lwnsrp/tokenizer.model
[2024-02-21 23:21:06] INFO gen_config.py:133: Found tokenizer config: /models/gemma-2b-it/tokenizer.json. Copying to /tmp/tmpe_lwnsrp/tokenizer.json
[2024-02-21 23:21:06] INFO gen_config.py:135: Not found tokenizer config: /models/gemma-2b-it/vocab.json
[2024-02-21 23:21:06] INFO gen_config.py:135: Not found tokenizer config: /models/gemma-2b-it/merges.txt
[2024-02-21 23:21:06] INFO gen_config.py:135: Not found tokenizer config: /models/gemma-2b-it/added_tokens.json
[2024-02-21 23:21:06] INFO gen_config.py:133: Found tokenizer config: /models/gemma-2b-it/tokenizer_config.json. Copying to /tmp/tmpe_lwnsrp/tokenizer_config.json
[2024-02-21 23:21:06] INFO gen_config.py:74: [System default] Setting temperature: 0.7
[2024-02-21 23:21:06] INFO gen_config.py:74: [System default] Setting presence_penalty: 0.0
[2024-02-21 23:21:06] INFO gen_config.py:74: [System default] Setting frequency_penalty: 0.0
[2024-02-21 23:21:06] INFO gen_config.py:74: [System default] Setting repetition_penalty: 1.0
[2024-02-21 23:21:06] INFO gen_config.py:74: [System default] Setting top_p: 0.95
[2024-02-21 23:21:06] INFO gen_config.py:74: [System default] Setting mean_gen_len: 128
[2024-02-21 23:21:06] INFO gen_config.py:74: [System default] Setting max_gen_len: 512
[2024-02-21 23:21:06] INFO gen_config.py:74: [System default] Setting shift_fill_factor: 0.3
[2024-02-21 23:21:06] INFO gen_config.py:186: Dumping configuration file to: /tmp/tmpe_lwnsrp/mlc-chat-config.json
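The dumped mlc-chat-config.json can be sanity-checked directly. Assuming its keys mirror the names logged above (the temp path comes from this particular run), a quick read-back with the standard library:

    import json

    with open("/tmp/tmpe_lwnsrp/mlc-chat-config.json") as f:
        chat_cfg = json.load(f)

    # Values set during gen_config should round-trip.
    assert chat_cfg["context_window_size"] == 8192
    assert chat_cfg["prefill_chunk_size"] == 1024
    print(chat_cfg["temperature"], chat_cfg["top_p"])  # 0.7 0.95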
|
/home/cfruan/.conda/envs/mlc-source-311/bin/python -m mlc_chat convert_weight /models/gemma-2b-it --quantization q0f32 --source-format auto --output /tmp/tmpe_lwnsrp
|
[2024-02-21 23:21:08] INFO auto_config.py:115: Found model configuration: /models/gemma-2b-it/config.json
[2024-02-21 23:21:09] INFO auto_device.py:76: Found device: cuda:0
[2024-02-21 23:21:09] INFO auto_device.py:76: Found device: cuda:1
[2024-02-21 23:21:10] INFO auto_device.py:85: Not found device: rocm:0
[2024-02-21 23:21:11] INFO auto_device.py:85: Not found device: metal:0
[2024-02-21 23:21:21] INFO auto_device.py:76: Found device: vulkan:0
[2024-02-21 23:21:21] INFO auto_device.py:76: Found device: vulkan:1
[2024-02-21 23:21:21] INFO auto_device.py:76: Found device: vulkan:2
[2024-02-21 23:21:23] INFO auto_device.py:85: Not found device: opencl:0
[2024-02-21 23:21:23] INFO auto_device.py:33: Using device: cuda:0
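The scan above probes each backend in a fixed order (CUDA, ROCm, Metal, Vulkan, OpenCL) and picks the first one that responds. A rough equivalent using TVM's runtime API (a sketch, not the actual auto_device.py code):

    import tvm

    for dev_type in ("cuda", "rocm", "metal", "vulkan", "opencl"):
        dev = tvm.device(dev_type, 0)
        if dev.exist:  # True when the runtime can reach device 0 of this backend
            print(f"Using device: {dev_type}:0")
            break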
|
[2024-02-21 23:21:23] INFO auto_weight.py:70: Finding weights in: /models/gemma-2b-it
[2024-02-21 23:21:23] INFO auto_weight.py:136: Not found Huggingface PyTorch
[2024-02-21 23:21:23] INFO auto_weight.py:143: Found source weight format: huggingface-safetensor. Source configuration: /models/gemma-2b-it/model.safetensors.index.json
[2024-02-21 23:21:23] INFO auto_weight.py:106: Using source weight configuration: /models/gemma-2b-it/model.safetensors.index.json. Use `--source` to override.
[2024-02-21 23:21:23] INFO auto_weight.py:110: Using source weight format: huggingface-safetensor. Use `--source-format` to override.
[2024-02-21 23:21:23] INFO auto_config.py:153: Found model type: gemma. Use `--model-type` to override.
[2024-02-21 23:21:23] INFO gemma_model.py:55: context_window_size not found in config.json. Falling back to max_position_embeddings (8192)
[2024-02-21 23:21:23] INFO gemma_model.py:70: prefill_chunk_size defaults to context_window_size (8192)
|
Weight conversion with arguments:
  --config          /models/gemma-2b-it/config.json
  --quantization    NoQuantize(name='q0f32', kind='no-quant', model_dtype='float32')
  --model-type      gemma
  --device          cuda:0
  --source          /models/gemma-2b-it/model.safetensors.index.json
  --source-format   huggingface-safetensor
  --output          /tmp/tmpe_lwnsrp
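The --source file is the standard Hugging Face sharded-checkpoint index, which maps each tensor name to the .safetensors shard holding it; this is how the loader below knows to open model-00001-of-00002.safetensors first. It can be inspected with the standard library:

    import json

    with open("/models/gemma-2b-it/model.safetensors.index.json") as f:
        index = json.load(f)

    weight_map = index["weight_map"]  # tensor name -> shard filename
    print(len(set(weight_map.values())), "shards")
    print(weight_map["model.embed_tokens.weight"])  # model-00001-of-00002.safetensors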
|
[2024-02-21 23:21:24] INFO huggingface_loader.py:182: Loading HF parameters from: /models/gemma-2b-it/model-00001-of-00002.safetensors
[2024-02-21 23:21:30] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.embed_tokens.weight", shape: (256000, 2048), dtype: float32
[2024-02-21 23:21:31] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.0.input_layernorm.weight", shape: (2048,), dtype: float32
[2024-02-21 23:21:32] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.0.mlp.down_proj.weight", shape: (2048, 16384), dtype: float32
[2024-02-21 23:21:32] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.0.mlp.gate_up_proj.weight", shape: (32768, 2048), dtype: float32
[2024-02-21 23:21:33] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.0.post_attention_layernorm.weight", shape: (2048,), dtype: float32
[2024-02-21 23:21:33] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.0.self_attn.qkv_proj.weight", shape: (2560, 2048), dtype: float32
[2024-02-21 23:21:33] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.0.self_attn.o_proj.weight", shape: (2048, 2048), dtype: float32
[2024-02-21 23:21:33] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.1.input_layernorm.weight", shape: (2048,), dtype: float32
[2024-02-21 23:21:33] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.1.mlp.down_proj.weight", shape: (2048, 16384), dtype: float32
[2024-02-21 23:21:34] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.1.mlp.gate_up_proj.weight", shape: (32768, 2048), dtype: float32
[2024-02-21 23:21:34] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.1.post_attention_layernorm.weight", shape: (2048,), dtype: float32
[2024-02-21 23:21:34] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.1.self_attn.qkv_proj.weight", shape: (2560, 2048), dtype: float32
[2024-02-21 23:21:34] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.1.self_attn.o_proj.weight", shape: (2048, 2048), dtype: float32
[2024-02-21 23:21:34] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.10.input_layernorm.weight", shape: (2048,), dtype: float32
[2024-02-21 23:21:34] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.10.mlp.down_proj.weight", shape: (2048, 16384), dtype: float32
[2024-02-21 23:21:35] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.10.mlp.gate_up_proj.weight", shape: (32768, 2048), dtype: float32
[2024-02-21 23:21:35] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.10.post_attention_layernorm.weight", shape: (2048,), dtype: float32
[2024-02-21 23:21:35] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.10.self_attn.qkv_proj.weight", shape: (2560, 2048), dtype: float32
[2024-02-21 23:21:35] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.10.self_attn.o_proj.weight", shape: (2048, 2048), dtype: float32
[2024-02-21 23:21:35] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.11.input_layernorm.weight", shape: (2048,), dtype: float32
[2024-02-21 23:21:36] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.11.mlp.down_proj.weight", shape: (2048, 16384), dtype: float32
[2024-02-21 23:21:37] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.11.mlp.gate_up_proj.weight", shape: (32768, 2048), dtype: float32
[2024-02-21 23:21:37] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.11.post_attention_layernorm.weight", shape: (2048,), dtype: float32
[2024-02-21 23:21:37] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.11.self_attn.qkv_proj.weight", shape: (2560, 2048), dtype: float32
[2024-02-21 23:21:37] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.11.self_attn.o_proj.weight", shape: (2048, 2048), dtype: float32
[2024-02-21 23:21:37] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.12.input_layernorm.weight", shape: (2048,), dtype: float32
[2024-02-21 23:21:37] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.12.mlp.down_proj.weight", shape: (2048, 16384), dtype: float32
[2024-02-21 23:21:38] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.12.mlp.gate_up_proj.weight", shape: (32768, 2048), dtype: float32
[2024-02-21 23:21:38] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.12.post_attention_layernorm.weight", shape: (2048,), dtype: float32
[2024-02-21 23:21:38] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.12.self_attn.qkv_proj.weight", shape: (2560, 2048), dtype: float32
[2024-02-21 23:21:38] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.12.self_attn.o_proj.weight", shape: (2048, 2048), dtype: float32
[2024-02-21 23:21:38] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.13.input_layernorm.weight", shape: (2048,), dtype: float32
[2024-02-21 23:21:38] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.13.mlp.down_proj.weight", shape: (2048, 16384), dtype: float32
[2024-02-21 23:21:39] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.13.mlp.gate_up_proj.weight", shape: (32768, 2048), dtype: float32
[2024-02-21 23:21:39] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.13.post_attention_layernorm.weight", shape: (2048,), dtype: float32
[2024-02-21 23:21:39] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.13.self_attn.qkv_proj.weight", shape: (2560, 2048), dtype: float32
[2024-02-21 23:21:39] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.13.self_attn.o_proj.weight", shape: (2048, 2048), dtype: float32
[2024-02-21 23:21:39] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.14.input_layernorm.weight", shape: (2048,), dtype: float32
[2024-02-21 23:21:39] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.14.mlp.down_proj.weight", shape: (2048, 16384), dtype: float32
[2024-02-21 23:21:40] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.14.mlp.gate_up_proj.weight", shape: (32768, 2048), dtype: float32
[2024-02-21 23:21:40] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.14.post_attention_layernorm.weight", shape: (2048,), dtype: float32
[2024-02-21 23:21:41] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.14.self_attn.qkv_proj.weight", shape: (2560, 2048), dtype: float32
[2024-02-21 23:21:41] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.14.self_attn.o_proj.weight", shape: (2048, 2048), dtype: float32
[2024-02-21 23:21:41] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.15.input_layernorm.weight", shape: (2048,), dtype: float32
[2024-02-21 23:21:41] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.15.mlp.down_proj.weight", shape: (2048, 16384), dtype: float32
[2024-02-21 23:21:42] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.15.mlp.gate_up_proj.weight", shape: (32768, 2048), dtype: float32
[2024-02-21 23:21:42] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.15.post_attention_layernorm.weight", shape: (2048,), dtype: float32
[2024-02-21 23:21:42] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.15.self_attn.qkv_proj.weight", shape: (2560, 2048), dtype: float32
[2024-02-21 23:21:42] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.15.self_attn.o_proj.weight", shape: (2048, 2048), dtype: float32
[2024-02-21 23:21:42] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.16.input_layernorm.weight", shape: (2048,), dtype: float32
[2024-02-21 23:21:42] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.16.mlp.down_proj.weight", shape: (2048, 16384), dtype: float32
[2024-02-21 23:21:43] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.16.mlp.gate_up_proj.weight", shape: (32768, 2048), dtype: float32
[2024-02-21 23:21:43] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.16.post_attention_layernorm.weight", shape: (2048,), dtype: float32
[2024-02-21 23:21:43] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.16.self_attn.qkv_proj.weight", shape: (2560, 2048), dtype: float32
[2024-02-21 23:21:43] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.16.self_attn.o_proj.weight", shape: (2048, 2048), dtype: float32
[2024-02-21 23:21:44] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.17.mlp.gate_up_proj.weight", shape: (32768, 2048), dtype: float32
[2024-02-21 23:21:44] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.17.self_attn.qkv_proj.weight", shape: (2560, 2048), dtype: float32
[2024-02-21 23:21:44] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.17.self_attn.o_proj.weight", shape: (2048, 2048), dtype: float32
[2024-02-21 23:21:44] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.2.input_layernorm.weight", shape: (2048,), dtype: float32
[2024-02-21 23:21:44] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.2.mlp.down_proj.weight", shape: (2048, 16384), dtype: float32
[2024-02-21 23:21:46] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.2.mlp.gate_up_proj.weight", shape: (32768, 2048), dtype: float32
[2024-02-21 23:21:46] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.2.post_attention_layernorm.weight", shape: (2048,), dtype: float32
[2024-02-21 23:21:46] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.2.self_attn.qkv_proj.weight", shape: (2560, 2048), dtype: float32
[2024-02-21 23:21:46] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.2.self_attn.o_proj.weight", shape: (2048, 2048), dtype: float32
[2024-02-21 23:21:46] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.3.input_layernorm.weight", shape: (2048,), dtype: float32
[2024-02-21 23:21:46] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.3.mlp.down_proj.weight", shape: (2048, 16384), dtype: float32
[2024-02-21 23:21:47] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.3.mlp.gate_up_proj.weight", shape: (32768, 2048), dtype: float32
[2024-02-21 23:21:48] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.3.post_attention_layernorm.weight", shape: (2048,), dtype: float32
[2024-02-21 23:21:48] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.3.self_attn.qkv_proj.weight", shape: (2560, 2048), dtype: float32
[2024-02-21 23:21:48] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.3.self_attn.o_proj.weight", shape: (2048, 2048), dtype: float32
[2024-02-21 23:21:48] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.4.input_layernorm.weight", shape: (2048,), dtype: float32
[2024-02-21 23:21:48] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.4.mlp.down_proj.weight", shape: (2048, 16384), dtype: float32
[2024-02-21 23:21:49] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.4.mlp.gate_up_proj.weight", shape: (32768, 2048), dtype: float32
[2024-02-21 23:21:49] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.4.post_attention_layernorm.weight", shape: (2048,), dtype: float32
[2024-02-21 23:21:49] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.4.self_attn.qkv_proj.weight", shape: (2560, 2048), dtype: float32
[2024-02-21 23:21:49] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.4.self_attn.o_proj.weight", shape: (2048, 2048), dtype: float32
[2024-02-21 23:21:49] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.5.input_layernorm.weight", shape: (2048,), dtype: float32
[2024-02-21 23:21:49] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.5.mlp.down_proj.weight", shape: (2048, 16384), dtype: float32
[2024-02-21 23:21:50] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.5.mlp.gate_up_proj.weight", shape: (32768, 2048), dtype: float32
[2024-02-21 23:21:50] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.5.post_attention_layernorm.weight", shape: (2048,), dtype: float32
[2024-02-21 23:21:50] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.5.self_attn.qkv_proj.weight", shape: (2560, 2048), dtype: float32
[2024-02-21 23:21:50] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.5.self_attn.o_proj.weight", shape: (2048, 2048), dtype: float32
[2024-02-21 23:21:50] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.6.input_layernorm.weight", shape: (2048,), dtype: float32
[2024-02-21 23:21:50] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.6.mlp.down_proj.weight", shape: (2048, 16384), dtype: float32
[2024-02-21 23:21:51] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.6.mlp.gate_up_proj.weight", shape: (32768, 2048), dtype: float32
[2024-02-21 23:21:51] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.6.post_attention_layernorm.weight", shape: (2048,), dtype: float32
[2024-02-21 23:21:51] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.6.self_attn.qkv_proj.weight", shape: (2560, 2048), dtype: float32
[2024-02-21 23:21:51] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.6.self_attn.o_proj.weight", shape: (2048, 2048), dtype: float32
[2024-02-21 23:21:51] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.7.input_layernorm.weight", shape: (2048,), dtype: float32
[2024-02-21 23:21:52] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.7.mlp.down_proj.weight", shape: (2048, 16384), dtype: float32
[2024-02-21 23:21:52] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.7.mlp.gate_up_proj.weight", shape: (32768, 2048), dtype: float32
[2024-02-21 23:21:52] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.7.post_attention_layernorm.weight", shape: (2048,), dtype: float32
[2024-02-21 23:21:52] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.7.self_attn.qkv_proj.weight", shape: (2560, 2048), dtype: float32
[2024-02-21 23:21:52] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.7.self_attn.o_proj.weight", shape: (2048, 2048), dtype: float32
[2024-02-21 23:21:52] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.8.input_layernorm.weight", shape: (2048,), dtype: float32
[2024-02-21 23:21:53] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.8.mlp.down_proj.weight", shape: (2048, 16384), dtype: float32
[2024-02-21 23:21:54] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.8.mlp.gate_up_proj.weight", shape: (32768, 2048), dtype: float32
[2024-02-21 23:21:54] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.8.post_attention_layernorm.weight", shape: (2048,), dtype: float32
[2024-02-21 23:21:54] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.8.self_attn.qkv_proj.weight", shape: (2560, 2048), dtype: float32
[2024-02-21 23:21:54] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.8.self_attn.o_proj.weight", shape: (2048, 2048), dtype: float32
[2024-02-21 23:21:54] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.9.input_layernorm.weight", shape: (2048,), dtype: float32
[2024-02-21 23:21:54] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.9.mlp.down_proj.weight", shape: (2048, 16384), dtype: float32
[2024-02-21 23:21:55] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.9.mlp.gate_up_proj.weight", shape: (32768, 2048), dtype: float32
[2024-02-21 23:21:55] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.9.post_attention_layernorm.weight", shape: (2048,), dtype: float32
[2024-02-21 23:21:55] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.9.self_attn.qkv_proj.weight", shape: (2560, 2048), dtype: float32
[2024-02-21 23:21:55] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.9.self_attn.o_proj.weight", shape: (2048, 2048), dtype: float32
[2024-02-21 23:21:55] INFO huggingface_loader.py:194: Unloading HF weight file: /models/gemma-2b-it/model-00001-of-00002.safetensors
[2024-02-21 23:21:56] INFO huggingface_loader.py:182: Loading HF parameters from: /models/gemma-2b-it/model-00002-of-00002.safetensors
[2024-02-21 23:21:56] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.17.input_layernorm.weight", shape: (2048,), dtype: float32
[2024-02-21 23:21:56] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.17.mlp.down_proj.weight", shape: (2048, 16384), dtype: float32
[2024-02-21 23:21:56] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.17.post_attention_layernorm.weight", shape: (2048,), dtype: float32
[2024-02-21 23:21:56] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.norm.weight", shape: (2048,), dtype: float32
100%|██████████| 110/110 [00:32<00:00, 3.40it/s]
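The fused shapes in the dump above follow from Gemma-2B's published configuration (hidden size 2048, 8 query heads and 1 key-value head of dimension 256, intermediate size 16384). The arithmetic below reproduces them:

    # Q, K, and V are fused into one qkv_proj; gate and up into one gate_up_proj.
    head_dim, num_q_heads, num_kv_heads = 256, 8, 1
    intermediate_size = 16384

    qkv_rows = (num_q_heads + 2 * num_kv_heads) * head_dim
    gate_up_rows = 2 * intermediate_size
    print(qkv_rows, gate_up_rows)  # 2560 32768, matching the shapes logged above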
|
[2024-02-21 23:21:56] INFO huggingface_loader.py:194: Unloading HF weight file: /models/gemma-2b-it/model-00002-of-00002.safetensors
[2024-02-21 23:21:56] INFO stats.py:76: Time usage: HF loading: 3.785 sec; Pre-quantization mapping: 20.160 sec; Quantization: 0.000 sec
[2024-02-21 23:21:56] INFO stats.py:90: RAM usage: Peak RAM: 9.211 GB. Total bytes loaded from disk: 9.336 GB
[2024-02-21 23:21:56] INFO convert_weight.py:132: Parameter size after quantization: 9.336 GB
[2024-02-21 23:21:56] INFO convert_weight.py:137: Total parameters: 2,506,172,416
[2024-02-21 23:21:56] INFO convert_weight.py:138: Bits per parameter: 32.000
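These figures are mutually consistent: q0f32 performs no quantization, so each of the 2,506,172,416 parameters stays a 4-byte float32. A quick check:

    params = 2_506_172_416
    nbytes = params * 4          # float32 = 32 bits = 4 bytes per parameter
    print(nbytes / 1024**3)      # ~9.336, matching the reported 9.336 GB
    print(nbytes * 8 / params)   # 32.0 bits per parameter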
|
/home/cfruan/.conda/envs/mlc-source-311/lib/python3.11/site-packages/numpy/core/getlimits.py:549: UserWarning: The value of the smallest subnormal for <class 'numpy.float32'> type is zero.
  setattr(self, word, getattr(machar, word).flat[0])
/home/cfruan/.conda/envs/mlc-source-311/lib/python3.11/site-packages/numpy/core/getlimits.py:89: UserWarning: The value of the smallest subnormal for <class 'numpy.float32'> type is zero.
  return self._float_to_str(self.smallest_subnormal)
Start storing to cache /tmp/tmpe_lwnsrp
|
[0001/0110] saving model.embed_tokens.weight
[0002/0110] saving model.layers.0.input_layernorm.weight
[0003/0110] saving model.layers.0.mlp.down_proj.weight
[0004/0110] saving model.layers.0.mlp.gate_up_proj.weight
[0005/0110] saving model.layers.0.post_attention_layernorm.weight
[0006/0110] saving model.layers.0.self_attn.qkv_proj.weight
[0007/0110] saving model.layers.0.self_attn.o_proj.weight
[0008/0110] saving model.layers.1.input_layernorm.weight
[0009/0110] saving model.layers.1.mlp.down_proj.weight
[0010/0110] saving model.layers.1.mlp.gate_up_proj.weight
[0011/0110] saving model.layers.1.post_attention_layernorm.weight
[0012/0110] saving model.layers.1.self_attn.qkv_proj.weight
[0013/0110] saving model.layers.1.self_attn.o_proj.weight
[0014/0110] saving model.layers.10.input_layernorm.weight
[0015/0110] saving model.layers.10.mlp.down_proj.weight
[0016/0110] saving model.layers.10.mlp.gate_up_proj.weight
[0017/0110] saving model.layers.10.post_attention_layernorm.weight
[0018/0110] saving model.layers.10.self_attn.qkv_proj.weight
[0019/0110] saving model.layers.10.self_attn.o_proj.weight
[0020/0110] saving model.layers.11.input_layernorm.weight
[0021/0110] saving model.layers.11.mlp.down_proj.weight
[0022/0110] saving model.layers.11.mlp.gate_up_proj.weight
[0023/0110] saving model.layers.11.post_attention_layernorm.weight
[0024/0110] saving model.layers.11.self_attn.qkv_proj.weight
[0025/0110] saving model.layers.11.self_attn.o_proj.weight
[0026/0110] saving model.layers.12.input_layernorm.weight
[0027/0110] saving model.layers.12.mlp.down_proj.weight
[0028/0110] saving model.layers.12.mlp.gate_up_proj.weight
[0029/0110] saving model.layers.12.post_attention_layernorm.weight
[0030/0110] saving model.layers.12.self_attn.qkv_proj.weight
[0031/0110] saving model.layers.12.self_attn.o_proj.weight
[0032/0110] saving model.layers.13.input_layernorm.weight
[0033/0110] saving model.layers.13.mlp.down_proj.weight
[0034/0110] saving model.layers.13.mlp.gate_up_proj.weight
[0035/0110] saving model.layers.13.post_attention_layernorm.weight
[0036/0110] saving model.layers.13.self_attn.qkv_proj.weight
[0037/0110] saving model.layers.13.self_attn.o_proj.weight
[0038/0110] saving model.layers.14.input_layernorm.weight
[0039/0110] saving model.layers.14.mlp.down_proj.weight
[0040/0110] saving model.layers.14.mlp.gate_up_proj.weight
[0041/0110] saving model.layers.14.post_attention_layernorm.weight
[0042/0110] saving model.layers.14.self_attn.qkv_proj.weight
[0043/0110] saving model.layers.14.self_attn.o_proj.weight
[0044/0110] saving model.layers.15.input_layernorm.weight
[0045/0110] saving model.layers.15.mlp.down_proj.weight
[0046/0110] saving model.layers.15.mlp.gate_up_proj.weight
[0047/0110] saving model.layers.15.post_attention_layernorm.weight
[0048/0110] saving model.layers.15.self_attn.qkv_proj.weight
[0049/0110] saving model.layers.15.self_attn.o_proj.weight
[0050/0110] saving model.layers.16.input_layernorm.weight
[0051/0110] saving model.layers.16.mlp.down_proj.weight
[0052/0110] saving model.layers.16.mlp.gate_up_proj.weight
[0053/0110] saving model.layers.16.post_attention_layernorm.weight
[0054/0110] saving model.layers.16.self_attn.qkv_proj.weight
[0055/0110] saving model.layers.16.self_attn.o_proj.weight
[0056/0110] saving model.layers.17.mlp.gate_up_proj.weight
[0057/0110] saving model.layers.17.self_attn.qkv_proj.weight
[0058/0110] saving model.layers.17.self_attn.o_proj.weight
[0059/0110] saving model.layers.2.input_layernorm.weight
[0060/0110] saving model.layers.2.mlp.down_proj.weight
[0061/0110] saving model.layers.2.mlp.gate_up_proj.weight
[0062/0110] saving model.layers.2.post_attention_layernorm.weight
[0063/0110] saving model.layers.2.self_attn.qkv_proj.weight
[0064/0110] saving model.layers.2.self_attn.o_proj.weight
[0065/0110] saving model.layers.3.input_layernorm.weight
[2024-02-21 23:22:56] INFO convert_weight.py:154: Saved to directory: /tmp/tmpe_lwnsrp
|
[0066/0110] saving model.layers.3.mlp.down_proj.weight
[0067/0110] saving model.layers.3.mlp.gate_up_proj.weight
[0068/0110] saving model.layers.3.post_attention_layernorm.weight
[0069/0110] saving model.layers.3.self_attn.qkv_proj.weight
[0070/0110] saving model.layers.3.self_attn.o_proj.weight
[0071/0110] saving model.layers.4.input_layernorm.weight
[0072/0110] saving model.layers.4.mlp.down_proj.weight
[0073/0110] saving model.layers.4.mlp.gate_up_proj.weight
[0074/0110] saving model.layers.4.post_attention_layernorm.weight
[0075/0110] saving model.layers.4.self_attn.qkv_proj.weight
[0076/0110] saving model.layers.4.self_attn.o_proj.weight
[0077/0110] saving model.layers.5.input_layernorm.weight
[0078/0110] saving model.layers.5.mlp.down_proj.weight
[0079/0110] saving model.layers.5.mlp.gate_up_proj.weight
[0080/0110] saving model.layers.5.post_attention_layernorm.weight
[0081/0110] saving model.layers.5.self_attn.qkv_proj.weight
[0082/0110] saving model.layers.5.self_attn.o_proj.weight
[0083/0110] saving model.layers.6.input_layernorm.weight
[0084/0110] saving model.layers.6.mlp.down_proj.weight
[0085/0110] saving model.layers.6.mlp.gate_up_proj.weight
[0086/0110] saving model.layers.6.post_attention_layernorm.weight
[0087/0110] saving model.layers.6.self_attn.qkv_proj.weight
[0088/0110] saving model.layers.6.self_attn.o_proj.weight
[0089/0110] saving model.layers.7.input_layernorm.weight
[0090/0110] saving model.layers.7.mlp.down_proj.weight
[0091/0110] saving model.layers.7.mlp.gate_up_proj.weight
[0092/0110] saving model.layers.7.post_attention_layernorm.weight
[0093/0110] saving model.layers.7.self_attn.qkv_proj.weight
[0094/0110] saving model.layers.7.self_attn.o_proj.weight
[0095/0110] saving model.layers.8.input_layernorm.weight
[0096/0110] saving model.layers.8.mlp.down_proj.weight
[0097/0110] saving model.layers.8.mlp.gate_up_proj.weight
[0098/0110] saving model.layers.8.post_attention_layernorm.weight
[0099/0110] saving model.layers.8.self_attn.qkv_proj.weight
[0100/0110] saving model.layers.8.self_attn.o_proj.weight
[0101/0110] saving model.layers.9.input_layernorm.weight
[0102/0110] saving model.layers.9.mlp.down_proj.weight
[0103/0110] saving model.layers.9.mlp.gate_up_proj.weight
[0104/0110] saving model.layers.9.post_attention_layernorm.weight
[0105/0110] saving model.layers.9.self_attn.qkv_proj.weight
[0106/0110] saving model.layers.9.self_attn.o_proj.weight
[0107/0110] saving model.layers.17.input_layernorm.weight
[0108/0110] saving model.layers.17.mlp.down_proj.weight
[0109/0110] saving model.layers.17.post_attention_layernorm.weight
[0110/0110] saving model.norm.weight
|
All finished, 49 total shards committed, record saved to /tmp/tmpe_lwnsrp/ndarray-cache.json |
|
Also saved a bf16 record to /tmp/tmpe_lwnsrp/ndarray-cache-b16.json |
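Both record files index the committed shards. Assuming the usual MLC ndarray-cache layout (a top-level "records" list with one entry per params_shard_*.bin, each listing the tensors packed into it), the shard and tensor counts can be cross-checked:

    import json

    with open("/tmp/tmpe_lwnsrp/ndarray-cache.json") as f:
        cache = json.load(f)

    shards = cache["records"]
    tensors = [t for shard in shards for t in shard["records"]]
    print(len(shards), len(tensors))  # expect 49 shards, 110 tensors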
|
|