  0%|                                                                          | 0/10 [00:00<?, ?it/s]
 40%|█████████████████████████████                                             | 4/10 [02:34<02:48, 28.03s/it]
 50%|████████████████████████████████████                                      | 5/10 [02:51<02:00, 24.00s/it]
[INFO|configuration_utils.py:457] 2023-04-21 04:01:37,211 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-5\config.json
[INFO|configuration_utils.py:362] 2023-04-21 04:01:37,214 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-5\generation_config.json
[INFO|modeling_utils.py:1762] 2023-04-21 04:01:37,438 >> Model weights saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-5\pytorch_model.bin
[INFO|tokenization_utils_base.py:2163] 2023-04-21 04:01:37,442 >> tokenizer config file saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-5\tokenizer_config.json
[INFO|tokenization_utils_base.py:2170] 2023-04-21 04:01:37,443 >> Special tokens file saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-5\special_tokens_map.json
 90%|█████████████████████████████████████████████████████████████████        | 9/10 [03:51<00:16, 16.71s/it]
{'loss': 5.144, 'learning_rate': 0.0, 'epoch': 0.0}
Saving PrefixEncoder
{'train_runtime': 251.9148, 'train_samples_per_second': 0.635, 'train_steps_per_second': 0.04, 'train_loss': 5.143994140625, 'epoch': 0.0}
***** train metrics *****
  epoch                    =        0.0
  train_loss               =      5.144
  train_runtime            = 0:04:11.91
  train_samples            =     114599
  train_samples_per_second =      0.635
100%|██████████████████████████████████████████████████████████████████████████| 10/10 [04:05<00:00, 16.01s/it]
[INFO|configuration_utils.py:457] 2023-04-21 04:02:51,174 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-10\config.json
[INFO|configuration_utils.py:362] 2023-04-21 04:02:51,176 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-10\generation_config.json
[INFO|modeling_utils.py:1762] 2023-04-21 04:02:51,355 >> Model weights saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-10\pytorch_model.bin
[INFO|tokenization_utils_base.py:2163] 2023-04-21 04:02:51,360 >> tokenizer config file saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-10\tokenizer_config.json
[INFO|tokenization_utils_base.py:2170] 2023-04-21 04:02:51,361 >> Special tokens file saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-10\special_tokens_map.json
100%|██████████████████████████████████████████████████████████████████████████| 10/10 [04:06<00:00, 24.61s/it]
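The "Saving PrefixEncoder" line indicates that checkpoint-10's `pytorch_model.bin` holds only the trained prefix-encoder weights, not the full model. A minimal sketch for reusing that checkpoint, in the style of the ChatGLM-6B P-Tuning README, might look like the following; it assumes the run used `pre_seq_len=128` (inferred from the `pt-128` part of the output directory name) and that the base model is `THUDM/chatglm-6b`:

```python
import os
import torch
from transformers import AutoConfig, AutoModel, AutoTokenizer

# Checkpoint directory taken from the log above; adjust if your output_dir differs.
CHECKPOINT_PATH = os.path.join("output", "adgen-chatglm-6b-pt-128-2e-2", "checkpoint-10")

# pre_seq_len=128 is an assumption based on the "pt-128" in the output directory name.
config = AutoConfig.from_pretrained("THUDM/chatglm-6b", trust_remote_code=True, pre_seq_len=128)
tokenizer = AutoTokenizer.from_pretrained("THUDM/chatglm-6b", trust_remote_code=True)
model = AutoModel.from_pretrained("THUDM/chatglm-6b", config=config, trust_remote_code=True)

# The checkpoint contains only PrefixEncoder weights, so load just those
# into the prefix encoder of the freshly loaded base model.
prefix_state_dict = torch.load(os.path.join(CHECKPOINT_PATH, "pytorch_model.bin"))
new_prefix_state_dict = {}
for k, v in prefix_state_dict.items():
    if k.startswith("transformer.prefix_encoder."):
        new_prefix_state_dict[k[len("transformer.prefix_encoder."):]] = v
model.transformer.prefix_encoder.load_state_dict(new_prefix_state_dict)

# fp16 for the frozen base model, fp32 for the small prefix encoder,
# mirroring the convention used by the P-Tuning scripts.
model = model.half().cuda()
model.transformer.prefix_encoder.float()
model = model.eval()

response, history = model.chat(tokenizer, "Hello", history=[])
print(response)
```

Note that with only 10 optimizer steps (epoch 0.0, final loss ~5.14) the prefix encoder is barely trained, so generations from this checkpoint would mostly reflect the base model.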