[2023-09-01 16:28:34,278::train::INFO] Namespace(config='/home/data/t030413/AlphaPanda_v3_Vcnn/diffab-main/configs/YueTrain/codesign_single_yue100.yml', debug=False, device='cpu', finetune=None, logdir='/home/data/t030413/AlphaPanda_v3_Vcnn/diffab-main/logsHu', num_workers=0, resume=None, tag='')
[2023-09-01 16:28:34,279::train::INFO] {'model': {'type': 'diffab', 'res_feat_dim': 128, 'pair_feat_dim': 64, 'diffusion': {'num_steps': 100, 'eps_net_opt': {'num_layers': 6}}, 'train_structure': True, 'train_sequence': True}, 'train': {'loss_weights': {'rot': 1.0, 'pos': 1.0, 'seq': 1.0}, 'max_iters': 100000, 'val_freq': 1000, 'batch_size': 1, 'seed': 2023, 'max_grad_norm': 100.0, 'optimizer': {'type': 'adam', 'lr': 0.001, 'weight_decay': 0.0, 'beta1': 0.9, 'beta2': 0.999}, 'scheduler': {'type': 'plateau', 'factor': 0.8, 'patience': 10, 'min_lr': 5e-06}}, 'dataset': {'train': {'type': 'sabdab', 'summary_path': './data/sabdab_summary_all.tsv', 'chothia_dir': './data/all_structures/chothia', 'processed_dir': './data/processedHu', 'split': 'train', 'transform': [{'type': 'mask_single_cdr'}, {'type': 'merge_chains'}, {'type': 'patch_around_anchor'}]}, 'val': {'type': 'sabdab', 'summary_path': './data/sabdab_summary_all.tsv', 'chothia_dir': './data/all_structures/chothia', 'processed_dir': './data/processedHu', 'split': 'val', 'transform': [{'type': 'mask_single_cdr', 'selection': 'CDR3'}, {'type': 'merge_chains'}, {'type': 'patch_around_anchor'}]}}}
[2023-09-01 16:28:34,279::train::INFO] Loading dataset...
[2023-09-01 16:28:38,100::train::INFO] Train 9687 | Val 20
[2023-09-01 16:28:38,101::train::INFO] Building model...
[2023-09-01 16:28:50,613::train::INFO] Number of parameters: 14134790
[2023-09-01 16:28:53,473::train::INFO] [train] Iter 00001 | loss 3.2466 | loss(rot) 0.1424 | loss(pos) 3.1035 | loss(seq) 0.0007 | grad 22.9694 | lr 0.0010 | time_forward 1.4290 | time_backward 1.4300
[2023-09-01 16:29:06,564::train::INFO] [train] Iter 00002 | loss 5.0191 | loss(rot) 1.8100 | loss(pos) 2.8324 | loss(seq) 0.3767 | grad 32.5125 | lr 0.0010 | time_forward 6.9060 | time_backward 6.1810
[2023-09-01 16:29:09,130::train::INFO] [train] Iter 00003 | loss 6.0240 | loss(rot) 3.0528 | loss(pos) 2.9712 | loss(seq) 0.0000 | grad 21.1379 | lr 0.0010 | time_forward 1.1850 | time_backward 1.3770
[2023-09-01 16:29:16,112::train::INFO] [train] Iter 00004 | loss 4.4556 | loss(rot) 1.0654 | loss(pos) 2.9725 | loss(seq) 0.4176 | grad 66.1786 | lr 0.0010 | time_forward 2.8180 | time_backward 4.1620
[2023-09-01 16:29:18,736::train::INFO] [train] Iter 00005 | loss 4.1319 | loss(rot) 1.5390 | loss(pos) 2.0855 | loss(seq) 0.5074 | grad 24.4627 | lr 0.0010 | time_forward 1.1900 | time_backward 1.4310
[2023-09-01 16:29:21,093::train::INFO] [train] Iter 00006 | loss 5.3315 | loss(rot) 2.4637 | loss(pos) 2.3593 | loss(seq) 0.5084 | grad 23.3018 | lr 0.0010 | time_forward 1.1100 | time_backward 1.2430
[2023-09-01 16:29:27,514::train::INFO] [train] Iter 00007 | loss 5.9398 | loss(rot) 2.6984 | loss(pos) 3.2414 | loss(seq) 0.0000 | grad 21.3969 | lr 0.0010 | time_forward 2.7250 | time_backward 3.6930
[2023-09-01 16:29:29,721::train::INFO] [train] Iter 00008 | loss 6.5481 | loss(rot) 2.1193 | loss(pos) 3.8446 | loss(seq) 0.5842 | grad 14.6934 | lr 0.0010 | time_forward 0.9970 | time_backward 1.2070
[2023-09-01 16:29:32,403::train::INFO] [train] Iter 00009 | loss 5.0047 | loss(rot) 2.1163 | loss(pos) 2.8812 | loss(seq) 0.0072 | grad 13.4356 | lr 0.0010 | time_forward 1.2530 | time_backward 1.4260
[2023-09-01 16:29:35,179::train::INFO] [train] Iter 00010 | loss 2.7017 | loss(rot) 0.3190 | loss(pos) 2.3774 | loss(seq) 0.0052 | grad 10.9448 | lr 0.0010 | time_forward 1.2970 | time_backward 1.4520
[2023-09-01 16:29:42,266::train::INFO] [train] Iter 00011 | loss 3.4911 | loss(rot) 0.5244 | loss(pos) 2.5443 | loss(seq) 0.4224 | grad 8.2482 | lr 0.0010 | time_forward 2.8400 | time_backward 4.2150
[2023-09-01 16:29:45,158::train::INFO] [train] Iter 00012 | loss 4.1567 | loss(rot) 1.9011 | loss(pos) 1.6802 | loss(seq) 0.5754 | grad 9.2517 | lr 0.0010 | time_forward 1.4450 | time_backward 1.4440
[2023-09-01 16:29:53,975::train::INFO] [train] Iter 00013 | loss 6.0666 | loss(rot) 2.1476 | loss(pos) 3.1638 | loss(seq) 0.7552 | grad 14.6003 | lr 0.0010 | time_forward 3.6510 | time_backward 5.1640
[2023-09-01 16:30:00,700::train::INFO] [train] Iter 00014 | loss 4.9507 | loss(rot) 2.9832 | loss(pos) 1.5050 | loss(seq) 0.4625 | grad 8.4124 | lr 0.0010 | time_forward 2.8920 | time_backward 3.8300
[2023-09-01 16:30:09,536::train::INFO] [train] Iter 00015 | loss 4.3913 | loss(rot) 2.1631 | loss(pos) 1.7636 | loss(seq) 0.4646 | grad 6.0403 | lr 0.0010 | time_forward 3.8660 | time_backward 4.9670
[2023-09-01 16:30:18,095::train::INFO] [train] Iter 00016 | loss 6.4243 | loss(rot) 1.4151 | loss(pos) 4.3550 | loss(seq) 0.6542 | grad 11.4697 | lr 0.0010 | time_forward 3.3010 | time_backward 5.2540
[2023-09-01 16:30:26,842::train::INFO] [train] Iter 00017 | loss 5.3693 | loss(rot) 3.0354 | loss(pos) 2.2673 | loss(seq) 0.0666 | grad 7.3095 | lr 0.0010 | time_forward 3.5250 | time_backward 5.2190
[2023-09-01 16:30:29,338::train::INFO] [train] Iter 00018 | loss 3.8467 | loss(rot) 2.0856 | loss(pos) 1.2266 | loss(seq) 0.5345 | grad 6.0825 | lr 0.0010 | time_forward 1.2740 | time_backward 1.2180
[2023-09-01 16:30:38,078::train::INFO] [train] Iter 00019 | loss 4.4221 | loss(rot) 2.1815 | loss(pos) 1.6512 | loss(seq) 0.5894 | grad 6.7195 | lr 0.0010 | time_forward 3.4100 | time_backward 5.3270
[2023-09-01 16:30:40,798::train::INFO] [train] Iter 00020 | loss 3.8112 | loss(rot) 0.3527 | loss(pos) 3.4517 | loss(seq) 0.0068 | grad 9.6219 | lr 0.0010 | time_forward 1.3100 | time_backward 1.4080
[2023-09-01 16:30:44,986::train::INFO] [train] Iter 00021 | loss 3.8943 | loss(rot) 2.2940 | loss(pos) 1.0156 | loss(seq) 0.5847 | grad 6.4022 | lr 0.0010 | time_forward 1.3590 | time_backward 1.5100
[2023-09-01 16:30:53,625::train::INFO] [train] Iter 00022 | loss 4.8863 | loss(rot) 2.6016 | loss(pos) 2.2594 | loss(seq) 0.0253 | grad 8.7441 | lr 0.0010 | time_forward 3.8380 | time_backward 4.7830
[2023-09-01 16:31:00,909::train::INFO] [train] Iter 00023 | loss 3.8815 | loss(rot) 2.1334 | loss(pos) 1.5429 | loss(seq) 0.2052 | grad 9.0429 | lr 0.0010 | time_forward 3.1940 | time_backward 4.0860
[2023-09-01 16:31:03,653::train::INFO] [train] Iter 00024 | loss 4.4063 | loss(rot) 3.4095 | loss(pos) 0.9956 | loss(seq) 0.0012 | grad 6.9191 | lr 0.0010 | time_forward 1.2850 | time_backward 1.4570
[2023-09-01 16:31:06,454::train::INFO] [train] Iter 00025 | loss 3.6243 | loss(rot) 2.0529 | loss(pos) 1.1698 | loss(seq) 0.4017 | grad 7.3486 | lr 0.0010 | time_forward 1.3190 | time_backward 1.4780
[2023-09-01 16:31:17,149::train::INFO] [train] Iter 00026 | loss 2.9889 | loss(rot) 1.2857 | loss(pos) 1.3436 | loss(seq) 0.3597 | grad 6.4844 | lr 0.0010 | time_forward 5.1720 | time_backward 5.4900
[2023-09-01 16:31:24,863::train::INFO] [train] Iter 00027 | loss 3.9652 | loss(rot) 2.6753 | loss(pos) 0.8095 | loss(seq) 0.4803 | grad 7.4474 | lr 0.0010 | time_forward 3.3990 | time_backward 4.3120
[2023-09-01 16:31:32,488::train::INFO] [train] Iter 00028 | loss 4.6490 | loss(rot) 1.7827 | loss(pos) 2.3107 | loss(seq) 0.5557 | grad 7.6016 | lr 0.0010 | time_forward 3.3390 | time_backward 4.2820
[2023-09-01 16:31:35,167::train::INFO] [train] Iter 00029 | loss 3.8354 | loss(rot) 3.0157 | loss(pos) 0.8191 | loss(seq) 0.0007 | grad 3.7961 | lr 0.0010 | time_forward 1.2650 | time_backward 1.4110
[2023-09-01 16:31:42,602::train::INFO] [train] Iter 00030 | loss 2.4241 | loss(rot) 0.1600 | loss(pos) 2.2593 | loss(seq) 0.0048 | grad 5.4892 | lr 0.0010 | time_forward 2.9620 | time_backward 4.4480
[2023-09-01 16:31:45,276::train::INFO] [train] Iter 00031 | loss 3.3618 | loss(rot) 0.4053 | loss(pos) 2.9086 | loss(seq) 0.0479 | grad 7.3094 | lr 0.0010 | time_forward 1.2720 | time_backward 1.3990
[2023-09-01 16:31:48,372::train::INFO] [train] Iter 00032 | loss 4.8440 | loss(rot) 2.0839 | loss(pos) 2.2669 | loss(seq) 0.4932 | grad 5.9623 | lr 0.0010 | time_forward 1.6940 | time_backward 1.3820
[2023-09-01 16:31:56,184::train::INFO] [train] Iter 00033 | loss 5.0606 | loss(rot) 2.6036 | loss(pos) 1.9495 | loss(seq) 0.5075 | grad 5.9942 | lr 0.0010 | time_forward 3.6140 | time_backward 4.1950
[2023-09-01 16:31:58,833::train::INFO] [train] Iter 00034 | loss 4.1830 | loss(rot) 1.6654 | loss(pos) 1.9751 | loss(seq) 0.5425 | grad 7.9018 | lr 0.0010 | time_forward 1.2400 | time_backward 1.4060
[2023-09-01 16:32:05,529::train::INFO] [train] Iter 00035 | loss 2.7268 | loss(rot) 1.0602 | loss(pos) 1.0988 | loss(seq) 0.5678 | grad 7.9811 | lr 0.0010 | time_forward 2.9570 | time_backward 3.7360
[2023-09-01 16:32:13,410::train::INFO] [train] Iter 00036 | loss 4.3118 | loss(rot) 0.1960 | loss(pos) 3.8058 | loss(seq) 0.3100 | grad 5.5252 | lr 0.0010 | time_forward 3.3170 | time_backward 4.5620
[2023-09-01 16:32:16,201::train::INFO] [train] Iter 00037 | loss 4.3215 | loss(rot) 0.6903 | loss(pos) 3.3156 | loss(seq) 0.3156 | grad 5.6704 | lr 0.0010 | time_forward 1.3210 | time_backward 1.4660
[2023-09-01 16:32:22,089::train::INFO] [train] Iter 00038 | loss 3.2502 | loss(rot) 1.4569 | loss(pos) 1.3844 | loss(seq) 0.4089 | grad 3.9446 | lr 0.0010 | time_forward 2.5720 | time_backward 3.3130
[2023-09-01 16:32:24,509::train::INFO] [train] Iter 00039 | loss 4.4167 | loss(rot) 3.0886 | loss(pos) 1.2881 | loss(seq) 0.0400 | grad 3.9496 | lr 0.0010 | time_forward 1.1230 | time_backward 1.2800
[2023-09-01 16:32:27,470::train::INFO] [train] Iter 00040 | loss 3.9464 | loss(rot) 1.1403 | loss(pos) 2.4565 | loss(seq) 0.3496 | grad 5.2732 | lr 0.0010 | time_forward 1.4380 | time_backward 1.5200
[2023-09-01 16:32:36,766::train::INFO] [train] Iter 00041 | loss 3.4577 | loss(rot) 0.8982 | loss(pos) 2.0753 | loss(seq) 0.4842 | grad 4.2191 | lr 0.0010 | time_forward 4.0040 | time_backward 5.2700
[2023-09-01 16:32:39,038::train::INFO] [train] Iter 00042 | loss 3.7232 | loss(rot) 2.5492 | loss(pos) 1.1332 | loss(seq) 0.0408 | grad 4.7410 | lr 0.0010 | time_forward 1.0460 | time_backward 1.2240
[2023-09-01 16:32:46,779::train::INFO] [train] Iter 00043 | loss 5.3583 | loss(rot) 2.4661 | loss(pos) 2.4435 | loss(seq) 0.4487 | grad 6.3814 | lr 0.0010 | time_forward 3.1600 | time_backward 4.5700
[2023-09-01 16:32:49,514::train::INFO] [train] Iter 00044 | loss 4.8983 | loss(rot) 2.1679 | loss(pos) 2.2550 | loss(seq) 0.4754 | grad 6.6691 | lr 0.0010 | time_forward 1.2710 | time_backward 1.4620
[2023-09-01 16:32:52,302::train::INFO] [train] Iter 00045 | loss 2.5828 | loss(rot) 0.3075 | loss(pos) 2.2731 | loss(seq) 0.0022 | grad 7.3576 | lr 0.0010 | time_forward 1.3290 | time_backward 1.4550
[2023-09-01 16:32:59,137::train::INFO] [train] Iter 00046 | loss 4.1013 | loss(rot) 3.0821 | loss(pos) 1.0191 | loss(seq) 0.0001 | grad 5.6064 | lr 0.0010 | time_forward 2.9500 | time_backward 3.8630
[2023-09-01 16:33:07,206::train::INFO] [train] Iter 00047 | loss 3.0552 | loss(rot) 0.7937 | loss(pos) 1.9010 | loss(seq) 0.3605 | grad 5.7778 | lr 0.0010 | time_forward 3.5200 | time_backward 4.5470
[2023-09-01 16:33:10,015::train::INFO] [train] Iter 00048 | loss 4.7430 | loss(rot) 0.2572 | loss(pos) 4.4682 | loss(seq) 0.0176 | grad 13.1885 | lr 0.0010 | time_forward 1.4680 | time_backward 1.3360
[2023-09-01 16:33:17,494::train::INFO] [train] Iter 00049 | loss 4.7764 | loss(rot) 0.9383 | loss(pos) 3.7339 | loss(seq) 0.1043 | grad 7.6523 | lr 0.0010 | time_forward 3.2190 | time_backward 4.2570
[2023-09-01 16:33:20,253::train::INFO] [train] Iter 00050 | loss 5.0350 | loss(rot) 3.3438 | loss(pos) 1.6270 | loss(seq) 0.0641 | grad 7.6847 | lr 0.0010 | time_forward 1.3160 | time_backward 1.4390
[2023-09-01 16:33:22,925::train::INFO] [train] Iter 00051 | loss 3.9490 | loss(rot) 2.9721 | loss(pos) 0.9769 | loss(seq) 0.0000 | grad 5.7281 | lr 0.0010 | time_forward 1.2200 | time_backward 1.4360
[2023-09-01 16:33:25,736::train::INFO] [train] Iter 00052 | loss 4.2192 | loss(rot) 2.4647 | loss(pos) 1.2013 | loss(seq) 0.5532 | grad 4.1914 | lr 0.0010 | time_forward 1.3000 | time_backward 1.4750
[2023-09-01 16:33:32,977::train::INFO] [train] Iter 00053 | loss 3.0426 | loss(rot) 0.3815 | loss(pos) 2.3681 | loss(seq) 0.2930 | grad 4.6980 | lr 0.0010 | time_forward 2.9880 | time_backward 4.2500
[2023-09-01 16:33:41,623::train::INFO] [train] Iter 00054 | loss 3.8058 | loss(rot) 2.8869 | loss(pos) 0.9141 | loss(seq) 0.0048 | grad 4.5150 | lr 0.0010 | time_forward 4.0170 | time_backward 4.6260
[2023-09-01 16:33:49,647::train::INFO] [train] Iter 00055 | loss 5.9857 | loss(rot) 0.4627 | loss(pos) 5.4702 | loss(seq) 0.0528 | grad 11.6812 | lr 0.0010 | time_forward 3.3580 | time_backward 4.6630
[2023-09-01 16:33:58,994::train::INFO] [train] Iter 00056 | loss 4.9236 | loss(rot) 2.7077 | loss(pos) 1.8736 | loss(seq) 0.3423 | grad 5.3009 | lr 0.0010 | time_forward 3.9330 | time_backward 5.3910
[2023-09-01 16:34:06,791::train::INFO] [train] Iter 00057 | loss 4.2824 | loss(rot) 2.9730 | loss(pos) 1.0757 | loss(seq) 0.2338 | grad 3.4339 | lr 0.0010 | time_forward 3.3920 | time_backward 4.4020
[2023-09-01 16:34:14,495::train::INFO] [train] Iter 00058 | loss 4.3005 | loss(rot) 1.5870 | loss(pos) 2.1381 | loss(seq) 0.5755 | grad 5.3295 | lr 0.0010 | time_forward 3.4050 | time_backward 4.2920
[2023-09-01 16:34:21,611::train::INFO] [train] Iter 00059 | loss 4.0332 | loss(rot) 0.3598 | loss(pos) 3.4969 | loss(seq) 0.1765 | grad 4.6435 | lr 0.0010 | time_forward 2.8530 | time_backward 4.2590
[2023-09-01 16:34:30,520::train::INFO] [train] Iter 00060 | loss 4.6494 | loss(rot) 2.2789 | loss(pos) 1.9613 | loss(seq) 0.4091 | grad 4.4734 | lr 0.0010 | time_forward 3.5590 | time_backward 5.3470
[2023-09-01 16:34:39,652::train::INFO] [train] Iter 00061 | loss 4.6419 | loss(rot) 2.6303 | loss(pos) 2.0069 | loss(seq) 0.0047 | grad 6.2726 | lr 0.0010 | time_forward 3.7150 | time_backward 5.4130
[2023-09-01 16:34:42,578::train::INFO] [train] Iter 00062 | loss 4.4247 | loss(rot) 3.0453 | loss(pos) 1.3564 | loss(seq) 0.0230 | grad 5.4566 | lr 0.0010 | time_forward 1.3970 | time_backward 1.5270
[2023-09-01 16:34:45,983::train::INFO] [train] Iter 00063 | loss 3.5187 | loss(rot) 0.9704 | loss(pos) 2.0913 | loss(seq) 0.4569 | grad 4.4450 | lr 0.0010 | time_forward 1.5610 | time_backward 1.8410
[2023-09-01 16:34:53,616::train::INFO] [train] Iter 00064 | loss 2.8064 | loss(rot) 0.3005 | loss(pos) 2.2342 | loss(seq) 0.2717 | grad 5.9054 | lr 0.0010 | time_forward 3.2900 | time_backward 4.3410
[2023-09-01 16:35:02,426::train::INFO] [train] Iter 00065 | loss 2.9368 | loss(rot) 0.4103 | loss(pos) 2.2431 | loss(seq) 0.2834 | grad 5.9685 | lr 0.0010 | time_forward 3.4740 | time_backward 5.3330
[2023-09-01 16:35:11,179::train::INFO] [train] Iter 00066 | loss 3.8592 | loss(rot) 3.0003 | loss(pos) 0.7490 | loss(seq) 0.1099 | grad 3.7200 | lr 0.0010 | time_forward 3.6570 | time_backward 5.0920
[2023-09-01 16:35:20,247::train::INFO] [train] Iter 00067 | loss 5.0035 | loss(rot) 2.8668 | loss(pos) 1.6306 | loss(seq) 0.5061 | grad 7.3381 | lr 0.0010 | time_forward 3.7890 | time_backward 5.2660
[2023-09-01 16:35:27,318::train::INFO] [train] Iter 00068 | loss 3.7751 | loss(rot) 2.9409 | loss(pos) 0.7703 | loss(seq) 0.0640 | grad 5.2551 | lr 0.0010 | time_forward 3.1080 | time_backward 3.9590
[2023-09-01 16:35:30,324::train::INFO] [train] Iter 00069 | loss 4.1112 | loss(rot) 2.2748 | loss(pos) 1.2471 | loss(seq) 0.5893 | grad 5.3365 | lr 0.0010 | time_forward 1.4050 | time_backward 1.5970
[2023-09-01 16:35:38,063::train::INFO] [train] Iter 00070 | loss 3.6394 | loss(rot) 3.0268 | loss(pos) 0.6059 | loss(seq) 0.0066 | grad 3.8204 | lr 0.0010 | time_forward 3.3730 | time_backward 4.3620
[2023-09-01 16:35:46,705::train::INFO] [train] Iter 00071 | loss 3.6873 | loss(rot) 1.7065 | loss(pos) 1.8088 | loss(seq) 0.1720 | grad 11.3929 | lr 0.0010 | time_forward 3.5750 | time_backward 5.0650
[2023-09-01 16:35:54,304::train::INFO] [train] Iter 00072 | loss 3.5238 | loss(rot) 2.5693 | loss(pos) 0.5954 | loss(seq) 0.3590 | grad 5.0833 | lr 0.0010 | time_forward 3.0270 | time_backward 4.5680
[2023-09-01 16:36:01,743::train::INFO] [train] Iter 00073 | loss 3.6389 | loss(rot) 2.7766 | loss(pos) 0.7775 | loss(seq) 0.0849 | grad 7.1801 | lr 0.0010 | time_forward 3.1170 | time_backward 4.3180
[2023-09-01 16:36:08,897::train::INFO] [train] Iter 00074 | loss 3.7372 | loss(rot) 1.5352 | loss(pos) 1.8164 | loss(seq) 0.3856 | grad 8.0746 | lr 0.0010 | time_forward 3.1180 | time_backward 4.0310
[2023-09-01 16:36:21,054::train::INFO] [train] Iter 00075 | loss 4.1484 | loss(rot) 2.6074 | loss(pos) 1.0961 | loss(seq) 0.4449 | grad 11.4461 | lr 0.0010 | time_forward 7.8400 | time_backward 4.3140
[2023-09-01 16:36:23,708::train::INFO] [train] Iter 00076 | loss 3.4629 | loss(rot) 2.4653 | loss(pos) 0.7821 | loss(seq) 0.2155 | grad 7.2821 | lr 0.0010 | time_forward 1.2470 | time_backward 1.3990
[2023-09-01 16:36:26,377::train::INFO] [train] Iter 00077 | loss 4.3846 | loss(rot) 0.1642 | loss(pos) 4.2204 | loss(seq) 0.0000 | grad 12.7275 | lr 0.0010 | time_forward 1.2600 | time_backward 1.3890
[2023-09-01 16:36:34,268::train::INFO] [train] Iter 00078 | loss 4.3501 | loss(rot) 3.6306 | loss(pos) 0.7174 | loss(seq) 0.0021 | grad 5.1582 | lr 0.0010 | time_forward 3.3540 | time_backward 4.5320
[2023-09-01 16:36:41,975::train::INFO] [train] Iter 00079 | loss 4.1108 | loss(rot) 2.6770 | loss(pos) 1.4246 | loss(seq) 0.0093 | grad 7.3064 | lr 0.0010 | time_forward 3.3520 | time_backward 4.3510
[2023-09-01 16:36:44,710::train::INFO] [train] Iter 00080 | loss 3.3868 | loss(rot) 1.7656 | loss(pos) 1.2470 | loss(seq) 0.3742 | grad 6.1790 | lr 0.0010 | time_forward 1.2710 | time_backward 1.4610
[2023-09-01 16:36:47,533::train::INFO] [train] Iter 00081 | loss 3.8104 | loss(rot) 2.5927 | loss(pos) 1.1652 | loss(seq) 0.0526 | grad 7.3966 | lr 0.0010 | time_forward 1.2980 | time_backward 1.4990
[2023-09-01 16:36:56,267::train::INFO] [train] Iter 00082 | loss 3.7932 | loss(rot) 3.1104 | loss(pos) 0.6807 | loss(seq) 0.0021 | grad 5.3436 | lr 0.0010 | time_forward 3.4330 | time_backward 5.2980
[2023-09-01 16:37:04,625::train::INFO] [train] Iter 00083 | loss 3.7546 | loss(rot) 2.2730 | loss(pos) 1.0264 | loss(seq) 0.4552 | grad 6.5783 | lr 0.0010 | time_forward 3.5360 | time_backward 4.8190
[2023-09-01 16:37:11,622::train::INFO] [train] Iter 00084 | loss 3.8512 | loss(rot) 1.9338 | loss(pos) 1.6006 | loss(seq) 0.3167 | grad 7.9409 | lr 0.0010 | time_forward 2.8150 | time_backward 4.1700
[2023-09-01 16:37:14,928::train::INFO] [train] Iter 00085 | loss 4.8864 | loss(rot) 3.1032 | loss(pos) 1.4841 | loss(seq) 0.2992 | grad 10.2933 | lr 0.0010 | time_forward 2.0020 | time_backward 1.3000
[2023-09-01 16:37:22,667::train::INFO] [train] Iter 00086 | loss 3.4242 | loss(rot) 2.3360 | loss(pos) 0.8031 | loss(seq) 0.2851 | grad 6.7243 | lr 0.0010 | time_forward 3.0360 | time_backward 4.6800
[2023-09-01 16:37:31,339::train::INFO] [train] Iter 00087 | loss 3.6123 | loss(rot) 2.8403 | loss(pos) 0.7592 | loss(seq) 0.0128 | grad 5.9773 | lr 0.0010 | time_forward 3.2890 | time_backward 5.3790
[2023-09-01 16:37:39,235::train::INFO] [train] Iter 00088 | loss 3.8659 | loss(rot) 0.4632 | loss(pos) 3.3559 | loss(seq) 0.0468 | grad 7.3783 | lr 0.0010 | time_forward 3.4110 | time_backward 4.4820
[2023-09-01 16:37:42,434::train::INFO] [train] Iter 00089 | loss 4.1910 | loss(rot) 1.0379 | loss(pos) 2.8015 | loss(seq) 0.3516 | grad 7.0885 | lr 0.0010 | time_forward 1.4360 | time_backward 1.7590
[2023-09-01 16:37:49,982::train::INFO] [train] Iter 00090 | loss 3.3969 | loss(rot) 2.5976 | loss(pos) 0.7287 | loss(seq) 0.0706 | grad 4.7910 | lr 0.0010 | time_forward 2.9230 | time_backward 4.6220
[2023-09-01 16:37:59,202::train::INFO] [train] Iter 00091 | loss 3.5184 | loss(rot) 1.9399 | loss(pos) 1.0758 | loss(seq) 0.5026 | grad 4.3949 | lr 0.0010 | time_forward 3.5500 | time_backward 5.6660
[2023-09-01 16:38:09,275::train::INFO] [train] Iter 00092 | loss 4.0955 | loss(rot) 2.6806 | loss(pos) 0.9617 | loss(seq) 0.4532 | grad 4.0232 | lr 0.0010 | time_forward 3.5330 | time_backward 6.3560
[2023-09-01 16:38:11,677::train::INFO] [train] Iter 00093 | loss 3.1255 | loss(rot) 2.1171 | loss(pos) 0.9404 | loss(seq) 0.0680 | grad 5.0513 | lr 0.0010 | time_forward 1.1550 | time_backward 1.2420
[2023-09-01 16:38:21,804::train::INFO] [train] Iter 00094 | loss 3.5853 | loss(rot) 0.6658 | loss(pos) 2.7034 | loss(seq) 0.2161 | grad 4.8818 | lr 0.0010 | time_forward 3.6960 | time_backward 5.5870
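The `[train]` lines above follow a fixed `key value` layout separated by ` | `, so they can be turned into structured records for plotting or analysis. Below is a minimal sketch of such a parser; the regex and field handling are assumptions inferred from the log format shown here, not part of the diffab codebase.

```python
import re

# Matches lines like:
# [2023-09-01 16:28:53,473::train::INFO] [train] Iter 00001 | loss 3.2466 | ...
LINE_RE = re.compile(
    r"\[(?P<ts>[^\]]+)::train::INFO\] \[train\] Iter (?P<iter>\d+) \| (?P<rest>.+)"
)

def parse_train_line(line: str):
    """Parse one training-log line into a dict, or return None if it doesn't match."""
    m = LINE_RE.search(line)
    if m is None:
        return None
    rec = {"timestamp": m.group("ts"), "iter": int(m.group("iter"))}
    # Remaining fields are "name value" pairs separated by " | ",
    # e.g. "loss(rot) 0.1424"; split each on the last space.
    for field in m.group("rest").split(" | "):
        key, val = field.rsplit(" ", 1)
        rec[key] = float(val)
    return rec

sample = ("[2023-09-01 16:28:53,473::train::INFO] [train] Iter 00001 | "
          "loss 3.2466 | loss(rot) 0.1424 | loss(pos) 3.1035 | loss(seq) 0.0007 | "
          "grad 22.9694 | lr 0.0010 | time_forward 1.4290 | time_backward 1.4300")
rec = parse_train_line(sample)
# rec["iter"] is 1 and rec["loss"] is 3.2466
```

Applied over the whole log, this yields one record per iteration with the rotation, position, and sequence loss components (`loss(rot)`, `loss(pos)`, `loss(seq)`) as separate keys.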