2023-10-20 09:06:35,409 ----------------------------------------------------------------------------------------------------
2023-10-20 09:06:35,409 Model: "SequenceTagger(
  (embeddings): TransformerWordEmbeddings(
    (model): BertModel(
      (embeddings): BertEmbeddings(
        (word_embeddings): Embedding(32001, 128)
        (position_embeddings): Embedding(512, 128)
        (token_type_embeddings): Embedding(2, 128)
        (LayerNorm): LayerNorm((128,), eps=1e-12, elementwise_affine=True)
        (dropout): Dropout(p=0.1, inplace=False)
      )
      (encoder): BertEncoder(
        (layer): ModuleList(
          (0-1): 2 x BertLayer(
            (attention): BertAttention(
              (self): BertSelfAttention(
                (query): Linear(in_features=128, out_features=128, bias=True)
                (key): Linear(in_features=128, out_features=128, bias=True)
                (value): Linear(in_features=128, out_features=128, bias=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (output): BertSelfOutput(
                (dense): Linear(in_features=128, out_features=128, bias=True)
                (LayerNorm): LayerNorm((128,), eps=1e-12, elementwise_affine=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
            (intermediate): BertIntermediate(
              (dense): Linear(in_features=128, out_features=512, bias=True)
              (intermediate_act_fn): GELUActivation()
            )
            (output): BertOutput(
              (dense): Linear(in_features=512, out_features=128, bias=True)
              (LayerNorm): LayerNorm((128,), eps=1e-12, elementwise_affine=True)
              (dropout): Dropout(p=0.1, inplace=False)
            )
          )
        )
      )
      (pooler): BertPooler(
        (dense): Linear(in_features=128, out_features=128, bias=True)
        (activation): Tanh()
      )
    )
  )
  (locked_dropout): LockedDropout(p=0.5)
  (linear): Linear(in_features=128, out_features=13, bias=True)
  (loss_function): CrossEntropyLoss()
)"
2023-10-20 09:06:35,410 ----------------------------------------------------------------------------------------------------
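The module tree above can be assembled with Flair's standard API. The following is a minimal sketch, not the exact hmBench training script: the backbone name is taken from the base path logged further below, and the NER_HIPE_2022 constructor arguments are assumptions that may differ from the loader's actual signature.

```python
# Sketch only: build a SequenceTagger whose module layout matches the printout above.
from flair.datasets import NER_HIPE_2022
from flair.embeddings import TransformerWordEmbeddings
from flair.models import SequenceTagger

# Corpus named in the log: HIPE-2022 "topres19th", English, with document separators.
# (Constructor argument names are assumptions.)
corpus = NER_HIPE_2022(dataset_name="topres19th", language="en", add_document_separator=True)
label_dict = corpus.make_label_dictionary(label_type="ner")  # 13 BIOES tags over LOC/BUILDING/STREET

# Last transformer layer only, first-subtoken pooling, fine-tuned backbone
# (matching "layers-1" and "poolingfirst" in the training base path).
embeddings = TransformerWordEmbeddings(
    model="dbmdz/bert-tiny-historic-multilingual-cased",
    layers="-1",
    subtoken_pooling="first",
    fine_tune=True,
)

# No CRF and no RNN ("crfFalse" in the base path): the tagger is the transformer,
# locked dropout, and a 128 -> 13 linear head with cross-entropy loss, as printed above.
tagger = SequenceTagger(
    hidden_size=256,  # ignored when use_rnn=False
    embeddings=embeddings,
    tag_dictionary=label_dict,
    tag_type="ner",
    use_crf=False,
    use_rnn=False,
    reproject_embeddings=False,
)
print(tagger)  # prints a module tree like the one logged above
```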
|
2023-10-20 09:06:35,410 MultiCorpus: 6183 train + 680 dev + 2113 test sentences |
|
- NER_HIPE_2022 Corpus: 6183 train + 680 dev + 2113 test sentences - /root/.flair/datasets/ner_hipe_2022/v2.1/topres19th/en/with_doc_seperator |
|
2023-10-20 09:06:35,410 ---------------------------------------------------------------------------------------------------- |
|
2023-10-20 09:06:35,410 Train: 6183 sentences |
|
2023-10-20 09:06:35,410 (train_with_dev=False, train_with_test=False) |
|
2023-10-20 09:06:35,410 ---------------------------------------------------------------------------------------------------- |
|
2023-10-20 09:06:35,410 Training Params: |
|
2023-10-20 09:06:35,410 - learning_rate: "3e-05" |
|
2023-10-20 09:06:35,410 - mini_batch_size: "4" |
|
2023-10-20 09:06:35,410 - max_epochs: "10" |
|
2023-10-20 09:06:35,410 - shuffle: "True" |
|
2023-10-20 09:06:35,410 ---------------------------------------------------------------------------------------------------- |
|
2023-10-20 09:06:35,411 Plugins: |
|
2023-10-20 09:06:35,411 - TensorboardLogger |
|
2023-10-20 09:06:35,411 - LinearScheduler | warmup_fraction: '0.1' |
|
2023-10-20 09:06:35,411 ---------------------------------------------------------------------------------------------------- |
|
2023-10-20 09:06:35,411 Final evaluation on model from best epoch (best-model.pt) |
|
2023-10-20 09:06:35,411 - metric: "('micro avg', 'f1-score')" |
|
2023-10-20 09:06:35,411 ---------------------------------------------------------------------------------------------------- |
|
2023-10-20 09:06:35,411 Computation: |
|
2023-10-20 09:06:35,411 - compute on device: cuda:0 |
|
2023-10-20 09:06:35,411 - embedding storage: none |
|
2023-10-20 09:06:35,411 ---------------------------------------------------------------------------------------------------- |
|
2023-10-20 09:06:35,411 Model training base path: "hmbench-topres19th/en-dbmdz/bert-tiny-historic-multilingual-cased-bs4-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-1" |
|
2023-10-20 09:06:35,411 ---------------------------------------------------------------------------------------------------- |
|
2023-10-20 09:06:35,411 ---------------------------------------------------------------------------------------------------- |
|
2023-10-20 09:06:35,411 Logging anything other than scalars to TensorBoard is currently not supported. |
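The parameter dump above corresponds to a standard Flair fine-tuning run. Continuing the sketch from earlier, a minimal version of the call might look as follows; plugin wiring (TensorboardLogger) is omitted, and the linear warmup with fraction 0.1 reflects fine_tune's default scheduling rather than an explicit argument here. The exact hmBench invocation is not shown in the log.

```python
# Sketch only, continuing from the tagger/corpus built above: fine-tune with the
# hyperparameters recorded under "Training Params" (lr 3e-05, batch size 4, 10 epochs).
from flair.trainers import ModelTrainer

trainer = ModelTrainer(tagger, corpus)
trainer.fine_tune(
    "hmbench-topres19th/en-dbmdz/bert-tiny-historic-multilingual-cased-bs4-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-1",
    learning_rate=3e-05,
    mini_batch_size=4,
    max_epochs=10,
)
```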
2023-10-20 09:06:37,833 epoch 1 - iter 154/1546 - loss 3.28397095 - time (sec): 2.42 - samples/sec: 5244.37 - lr: 0.000003 - momentum: 0.000000
2023-10-20 09:06:40,154 epoch 1 - iter 308/1546 - loss 2.99105285 - time (sec): 4.74 - samples/sec: 5139.46 - lr: 0.000006 - momentum: 0.000000
2023-10-20 09:06:42,531 epoch 1 - iter 462/1546 - loss 2.52293414 - time (sec): 7.12 - samples/sec: 5133.11 - lr: 0.000009 - momentum: 0.000000
2023-10-20 09:06:44,753 epoch 1 - iter 616/1546 - loss 2.02560749 - time (sec): 9.34 - samples/sec: 5282.45 - lr: 0.000012 - momentum: 0.000000
2023-10-20 09:06:46,974 epoch 1 - iter 770/1546 - loss 1.67950423 - time (sec): 11.56 - samples/sec: 5315.86 - lr: 0.000015 - momentum: 0.000000
2023-10-20 09:06:49,149 epoch 1 - iter 924/1546 - loss 1.45624081 - time (sec): 13.74 - samples/sec: 5319.42 - lr: 0.000018 - momentum: 0.000000
2023-10-20 09:06:51,518 epoch 1 - iter 1078/1546 - loss 1.28121677 - time (sec): 16.11 - samples/sec: 5324.77 - lr: 0.000021 - momentum: 0.000000
2023-10-20 09:06:53,979 epoch 1 - iter 1232/1546 - loss 1.14682597 - time (sec): 18.57 - samples/sec: 5319.74 - lr: 0.000024 - momentum: 0.000000
2023-10-20 09:06:56,359 epoch 1 - iter 1386/1546 - loss 1.05075802 - time (sec): 20.95 - samples/sec: 5283.56 - lr: 0.000027 - momentum: 0.000000
2023-10-20 09:06:58,781 epoch 1 - iter 1540/1546 - loss 0.96746300 - time (sec): 23.37 - samples/sec: 5294.97 - lr: 0.000030 - momentum: 0.000000
2023-10-20 09:06:58,888 ----------------------------------------------------------------------------------------------------
2023-10-20 09:06:58,888 EPOCH 1 done: loss 0.9635 - lr: 0.000030
2023-10-20 09:06:59,562 DEV : loss 0.1460493505001068 - f1-score (micro avg) 0.0
2023-10-20 09:06:59,573 ----------------------------------------------------------------------------------------------------
2023-10-20 09:07:01,954 epoch 2 - iter 154/1546 - loss 0.23058634 - time (sec): 2.38 - samples/sec: 5219.28 - lr: 0.000030 - momentum: 0.000000
2023-10-20 09:07:04,312 epoch 2 - iter 308/1546 - loss 0.22254356 - time (sec): 4.74 - samples/sec: 5059.52 - lr: 0.000029 - momentum: 0.000000
2023-10-20 09:07:06,666 epoch 2 - iter 462/1546 - loss 0.22366210 - time (sec): 7.09 - samples/sec: 4999.49 - lr: 0.000029 - momentum: 0.000000
2023-10-20 09:07:09,084 epoch 2 - iter 616/1546 - loss 0.21092520 - time (sec): 9.51 - samples/sec: 5084.24 - lr: 0.000029 - momentum: 0.000000
2023-10-20 09:07:11,446 epoch 2 - iter 770/1546 - loss 0.20806216 - time (sec): 11.87 - samples/sec: 5152.85 - lr: 0.000028 - momentum: 0.000000
2023-10-20 09:07:13,762 epoch 2 - iter 924/1546 - loss 0.20526902 - time (sec): 14.19 - samples/sec: 5138.44 - lr: 0.000028 - momentum: 0.000000
2023-10-20 09:07:16,170 epoch 2 - iter 1078/1546 - loss 0.20379508 - time (sec): 16.60 - samples/sec: 5111.99 - lr: 0.000028 - momentum: 0.000000
2023-10-20 09:07:18,549 epoch 2 - iter 1232/1546 - loss 0.20172155 - time (sec): 18.98 - samples/sec: 5138.82 - lr: 0.000027 - momentum: 0.000000
2023-10-20 09:07:20,996 epoch 2 - iter 1386/1546 - loss 0.20086049 - time (sec): 21.42 - samples/sec: 5126.79 - lr: 0.000027 - momentum: 0.000000
2023-10-20 09:07:23,418 epoch 2 - iter 1540/1546 - loss 0.19704649 - time (sec): 23.84 - samples/sec: 5183.51 - lr: 0.000027 - momentum: 0.000000
2023-10-20 09:07:23,513 ----------------------------------------------------------------------------------------------------
2023-10-20 09:07:23,513 EPOCH 2 done: loss 0.1964 - lr: 0.000027
2023-10-20 09:07:24,832 DEV : loss 0.09642348438501358 - f1-score (micro avg) 0.452
2023-10-20 09:07:24,844 saving best model
2023-10-20 09:07:24,878 ----------------------------------------------------------------------------------------------------
2023-10-20 09:07:27,163 epoch 3 - iter 154/1546 - loss 0.16756513 - time (sec): 2.28 - samples/sec: 5008.14 - lr: 0.000026 - momentum: 0.000000
2023-10-20 09:07:29,473 epoch 3 - iter 308/1546 - loss 0.15306420 - time (sec): 4.59 - samples/sec: 5243.02 - lr: 0.000026 - momentum: 0.000000
2023-10-20 09:07:31,759 epoch 3 - iter 462/1546 - loss 0.14828552 - time (sec): 6.88 - samples/sec: 5298.73 - lr: 0.000026 - momentum: 0.000000
2023-10-20 09:07:34,118 epoch 3 - iter 616/1546 - loss 0.15847543 - time (sec): 9.24 - samples/sec: 5349.69 - lr: 0.000025 - momentum: 0.000000
2023-10-20 09:07:36,476 epoch 3 - iter 770/1546 - loss 0.15836327 - time (sec): 11.60 - samples/sec: 5290.33 - lr: 0.000025 - momentum: 0.000000
2023-10-20 09:07:38,811 epoch 3 - iter 924/1546 - loss 0.15990269 - time (sec): 13.93 - samples/sec: 5374.68 - lr: 0.000025 - momentum: 0.000000
2023-10-20 09:07:41,185 epoch 3 - iter 1078/1546 - loss 0.16137110 - time (sec): 16.31 - samples/sec: 5359.58 - lr: 0.000024 - momentum: 0.000000
2023-10-20 09:07:43,553 epoch 3 - iter 1232/1546 - loss 0.16066108 - time (sec): 18.67 - samples/sec: 5353.33 - lr: 0.000024 - momentum: 0.000000
2023-10-20 09:07:45,922 epoch 3 - iter 1386/1546 - loss 0.16038985 - time (sec): 21.04 - samples/sec: 5281.58 - lr: 0.000024 - momentum: 0.000000
2023-10-20 09:07:48,314 epoch 3 - iter 1540/1546 - loss 0.15944440 - time (sec): 23.44 - samples/sec: 5277.46 - lr: 0.000023 - momentum: 0.000000
2023-10-20 09:07:48,405 ----------------------------------------------------------------------------------------------------
2023-10-20 09:07:48,405 EPOCH 3 done: loss 0.1593 - lr: 0.000023
2023-10-20 09:07:49,466 DEV : loss 0.0896943062543869 - f1-score (micro avg) 0.5099
2023-10-20 09:07:49,477 saving best model
2023-10-20 09:07:49,512 ----------------------------------------------------------------------------------------------------
2023-10-20 09:07:51,999 epoch 4 - iter 154/1546 - loss 0.15769143 - time (sec): 2.49 - samples/sec: 5070.16 - lr: 0.000023 - momentum: 0.000000
2023-10-20 09:07:54,306 epoch 4 - iter 308/1546 - loss 0.14433493 - time (sec): 4.79 - samples/sec: 5182.99 - lr: 0.000023 - momentum: 0.000000
2023-10-20 09:07:56,624 epoch 4 - iter 462/1546 - loss 0.15210379 - time (sec): 7.11 - samples/sec: 5030.59 - lr: 0.000022 - momentum: 0.000000
2023-10-20 09:07:58,995 epoch 4 - iter 616/1546 - loss 0.15295122 - time (sec): 9.48 - samples/sec: 5109.69 - lr: 0.000022 - momentum: 0.000000
2023-10-20 09:08:01,502 epoch 4 - iter 770/1546 - loss 0.15434920 - time (sec): 11.99 - samples/sec: 5061.65 - lr: 0.000022 - momentum: 0.000000
2023-10-20 09:08:03,980 epoch 4 - iter 924/1546 - loss 0.15086170 - time (sec): 14.47 - samples/sec: 5039.38 - lr: 0.000021 - momentum: 0.000000
2023-10-20 09:08:06,343 epoch 4 - iter 1078/1546 - loss 0.14718520 - time (sec): 16.83 - samples/sec: 5118.47 - lr: 0.000021 - momentum: 0.000000
2023-10-20 09:08:08,710 epoch 4 - iter 1232/1546 - loss 0.14718548 - time (sec): 19.20 - samples/sec: 5163.73 - lr: 0.000021 - momentum: 0.000000
2023-10-20 09:08:11,129 epoch 4 - iter 1386/1546 - loss 0.14835468 - time (sec): 21.62 - samples/sec: 5143.52 - lr: 0.000020 - momentum: 0.000000
2023-10-20 09:08:13,586 epoch 4 - iter 1540/1546 - loss 0.14859824 - time (sec): 24.07 - samples/sec: 5144.69 - lr: 0.000020 - momentum: 0.000000
2023-10-20 09:08:13,668 ----------------------------------------------------------------------------------------------------
2023-10-20 09:08:13,669 EPOCH 4 done: loss 0.1485 - lr: 0.000020
2023-10-20 09:08:14,728 DEV : loss 0.08766192942857742 - f1-score (micro avg) 0.5747
2023-10-20 09:08:14,738 saving best model
2023-10-20 09:08:14,771 ----------------------------------------------------------------------------------------------------
2023-10-20 09:08:17,083 epoch 5 - iter 154/1546 - loss 0.12769709 - time (sec): 2.31 - samples/sec: 5337.69 - lr: 0.000020 - momentum: 0.000000
2023-10-20 09:08:19,442 epoch 5 - iter 308/1546 - loss 0.13094414 - time (sec): 4.67 - samples/sec: 5128.06 - lr: 0.000019 - momentum: 0.000000
2023-10-20 09:08:21,893 epoch 5 - iter 462/1546 - loss 0.13300073 - time (sec): 7.12 - samples/sec: 5051.31 - lr: 0.000019 - momentum: 0.000000
2023-10-20 09:08:24,311 epoch 5 - iter 616/1546 - loss 0.13238196 - time (sec): 9.54 - samples/sec: 5160.07 - lr: 0.000019 - momentum: 0.000000
2023-10-20 09:08:26,690 epoch 5 - iter 770/1546 - loss 0.13428010 - time (sec): 11.92 - samples/sec: 5219.68 - lr: 0.000018 - momentum: 0.000000
2023-10-20 09:08:29,059 epoch 5 - iter 924/1546 - loss 0.13243603 - time (sec): 14.29 - samples/sec: 5203.51 - lr: 0.000018 - momentum: 0.000000
2023-10-20 09:08:31,427 epoch 5 - iter 1078/1546 - loss 0.13095759 - time (sec): 16.66 - samples/sec: 5184.83 - lr: 0.000018 - momentum: 0.000000
2023-10-20 09:08:33,859 epoch 5 - iter 1232/1546 - loss 0.13464576 - time (sec): 19.09 - samples/sec: 5174.45 - lr: 0.000017 - momentum: 0.000000
2023-10-20 09:08:36,291 epoch 5 - iter 1386/1546 - loss 0.13608735 - time (sec): 21.52 - samples/sec: 5199.06 - lr: 0.000017 - momentum: 0.000000
2023-10-20 09:08:38,636 epoch 5 - iter 1540/1546 - loss 0.13658634 - time (sec): 23.86 - samples/sec: 5187.95 - lr: 0.000017 - momentum: 0.000000
2023-10-20 09:08:38,723 ----------------------------------------------------------------------------------------------------
2023-10-20 09:08:38,723 EPOCH 5 done: loss 0.1363 - lr: 0.000017
2023-10-20 09:08:39,806 DEV : loss 0.08561883121728897 - f1-score (micro avg) 0.588
2023-10-20 09:08:39,817 saving best model
2023-10-20 09:08:39,849 ----------------------------------------------------------------------------------------------------
2023-10-20 09:08:42,216 epoch 6 - iter 154/1546 - loss 0.10872815 - time (sec): 2.37 - samples/sec: 5045.83 - lr: 0.000016 - momentum: 0.000000
2023-10-20 09:08:44,590 epoch 6 - iter 308/1546 - loss 0.11960726 - time (sec): 4.74 - samples/sec: 5003.13 - lr: 0.000016 - momentum: 0.000000
2023-10-20 09:08:46,973 epoch 6 - iter 462/1546 - loss 0.13441599 - time (sec): 7.12 - samples/sec: 5024.85 - lr: 0.000016 - momentum: 0.000000
2023-10-20 09:08:49,131 epoch 6 - iter 616/1546 - loss 0.13530848 - time (sec): 9.28 - samples/sec: 5247.57 - lr: 0.000015 - momentum: 0.000000
2023-10-20 09:08:51,443 epoch 6 - iter 770/1546 - loss 0.14041532 - time (sec): 11.59 - samples/sec: 5202.06 - lr: 0.000015 - momentum: 0.000000
2023-10-20 09:08:53,814 epoch 6 - iter 924/1546 - loss 0.13528183 - time (sec): 13.96 - samples/sec: 5249.94 - lr: 0.000015 - momentum: 0.000000
2023-10-20 09:08:56,177 epoch 6 - iter 1078/1546 - loss 0.13227395 - time (sec): 16.33 - samples/sec: 5253.32 - lr: 0.000014 - momentum: 0.000000
2023-10-20 09:08:58,522 epoch 6 - iter 1232/1546 - loss 0.13103216 - time (sec): 18.67 - samples/sec: 5297.52 - lr: 0.000014 - momentum: 0.000000
2023-10-20 09:09:00,904 epoch 6 - iter 1386/1546 - loss 0.12991021 - time (sec): 21.05 - samples/sec: 5249.58 - lr: 0.000014 - momentum: 0.000000
2023-10-20 09:09:03,307 epoch 6 - iter 1540/1546 - loss 0.13230097 - time (sec): 23.46 - samples/sec: 5276.90 - lr: 0.000013 - momentum: 0.000000
2023-10-20 09:09:03,406 ----------------------------------------------------------------------------------------------------
2023-10-20 09:09:03,406 EPOCH 6 done: loss 0.1321 - lr: 0.000013
2023-10-20 09:09:04,491 DEV : loss 0.08819162845611572 - f1-score (micro avg) 0.6039
2023-10-20 09:09:04,502 saving best model
2023-10-20 09:09:04,536 ----------------------------------------------------------------------------------------------------
2023-10-20 09:09:07,006 epoch 7 - iter 154/1546 - loss 0.11723149 - time (sec): 2.47 - samples/sec: 5437.44 - lr: 0.000013 - momentum: 0.000000
2023-10-20 09:09:09,369 epoch 7 - iter 308/1546 - loss 0.11870071 - time (sec): 4.83 - samples/sec: 5146.87 - lr: 0.000013 - momentum: 0.000000
2023-10-20 09:09:11,751 epoch 7 - iter 462/1546 - loss 0.11534894 - time (sec): 7.21 - samples/sec: 5251.42 - lr: 0.000012 - momentum: 0.000000
2023-10-20 09:09:14,090 epoch 7 - iter 616/1546 - loss 0.12686834 - time (sec): 9.55 - samples/sec: 5179.35 - lr: 0.000012 - momentum: 0.000000
2023-10-20 09:09:16,467 epoch 7 - iter 770/1546 - loss 0.12653362 - time (sec): 11.93 - samples/sec: 5208.68 - lr: 0.000012 - momentum: 0.000000
2023-10-20 09:09:18,835 epoch 7 - iter 924/1546 - loss 0.12736348 - time (sec): 14.30 - samples/sec: 5228.86 - lr: 0.000011 - momentum: 0.000000
2023-10-20 09:09:21,220 epoch 7 - iter 1078/1546 - loss 0.12994923 - time (sec): 16.68 - samples/sec: 5236.09 - lr: 0.000011 - momentum: 0.000000
2023-10-20 09:09:23,602 epoch 7 - iter 1232/1546 - loss 0.12775059 - time (sec): 19.06 - samples/sec: 5245.07 - lr: 0.000011 - momentum: 0.000000
2023-10-20 09:09:26,092 epoch 7 - iter 1386/1546 - loss 0.12761801 - time (sec): 21.56 - samples/sec: 5181.77 - lr: 0.000010 - momentum: 0.000000
2023-10-20 09:09:28,442 epoch 7 - iter 1540/1546 - loss 0.12680413 - time (sec): 23.91 - samples/sec: 5179.14 - lr: 0.000010 - momentum: 0.000000
2023-10-20 09:09:28,532 ----------------------------------------------------------------------------------------------------
2023-10-20 09:09:28,532 EPOCH 7 done: loss 0.1266 - lr: 0.000010
2023-10-20 09:09:29,594 DEV : loss 0.08853663504123688 - f1-score (micro avg) 0.5949
2023-10-20 09:09:29,605 ----------------------------------------------------------------------------------------------------
2023-10-20 09:09:31,902 epoch 8 - iter 154/1546 - loss 0.10560379 - time (sec): 2.30 - samples/sec: 5298.60 - lr: 0.000010 - momentum: 0.000000
2023-10-20 09:09:34,349 epoch 8 - iter 308/1546 - loss 0.12872546 - time (sec): 4.74 - samples/sec: 5252.11 - lr: 0.000009 - momentum: 0.000000
2023-10-20 09:09:36,720 epoch 8 - iter 462/1546 - loss 0.13119470 - time (sec): 7.11 - samples/sec: 5187.26 - lr: 0.000009 - momentum: 0.000000
2023-10-20 09:09:39,051 epoch 8 - iter 616/1546 - loss 0.12502116 - time (sec): 9.45 - samples/sec: 5215.10 - lr: 0.000009 - momentum: 0.000000
2023-10-20 09:09:41,413 epoch 8 - iter 770/1546 - loss 0.12112850 - time (sec): 11.81 - samples/sec: 5278.34 - lr: 0.000008 - momentum: 0.000000
2023-10-20 09:09:43,859 epoch 8 - iter 924/1546 - loss 0.12463981 - time (sec): 14.25 - samples/sec: 5309.41 - lr: 0.000008 - momentum: 0.000000
2023-10-20 09:09:46,162 epoch 8 - iter 1078/1546 - loss 0.12423906 - time (sec): 16.56 - samples/sec: 5263.09 - lr: 0.000008 - momentum: 0.000000
2023-10-20 09:09:48,482 epoch 8 - iter 1232/1546 - loss 0.12613505 - time (sec): 18.88 - samples/sec: 5225.70 - lr: 0.000007 - momentum: 0.000000
2023-10-20 09:09:50,918 epoch 8 - iter 1386/1546 - loss 0.12382574 - time (sec): 21.31 - samples/sec: 5199.99 - lr: 0.000007 - momentum: 0.000000
2023-10-20 09:09:53,355 epoch 8 - iter 1540/1546 - loss 0.12351904 - time (sec): 23.75 - samples/sec: 5219.53 - lr: 0.000007 - momentum: 0.000000
2023-10-20 09:09:53,443 ----------------------------------------------------------------------------------------------------
2023-10-20 09:09:53,443 EPOCH 8 done: loss 0.1233 - lr: 0.000007
2023-10-20 09:09:54,533 DEV : loss 0.09046540409326553 - f1-score (micro avg) 0.6062
2023-10-20 09:09:54,545 saving best model
2023-10-20 09:09:54,583 ----------------------------------------------------------------------------------------------------
2023-10-20 09:09:56,973 epoch 9 - iter 154/1546 - loss 0.12165880 - time (sec): 2.39 - samples/sec: 5120.10 - lr: 0.000006 - momentum: 0.000000
2023-10-20 09:09:59,354 epoch 9 - iter 308/1546 - loss 0.11720672 - time (sec): 4.77 - samples/sec: 5174.59 - lr: 0.000006 - momentum: 0.000000
2023-10-20 09:10:01,652 epoch 9 - iter 462/1546 - loss 0.10950130 - time (sec): 7.07 - samples/sec: 5397.14 - lr: 0.000006 - momentum: 0.000000
2023-10-20 09:10:03,813 epoch 9 - iter 616/1546 - loss 0.11189291 - time (sec): 9.23 - samples/sec: 5423.30 - lr: 0.000005 - momentum: 0.000000
2023-10-20 09:10:05,958 epoch 9 - iter 770/1546 - loss 0.11566478 - time (sec): 11.37 - samples/sec: 5567.08 - lr: 0.000005 - momentum: 0.000000
2023-10-20 09:10:08,178 epoch 9 - iter 924/1546 - loss 0.12061263 - time (sec): 13.59 - samples/sec: 5535.08 - lr: 0.000005 - momentum: 0.000000
2023-10-20 09:10:10,589 epoch 9 - iter 1078/1546 - loss 0.12124174 - time (sec): 16.01 - samples/sec: 5494.20 - lr: 0.000004 - momentum: 0.000000
2023-10-20 09:10:12,970 epoch 9 - iter 1232/1546 - loss 0.12235867 - time (sec): 18.39 - samples/sec: 5445.86 - lr: 0.000004 - momentum: 0.000000
2023-10-20 09:10:15,300 epoch 9 - iter 1386/1546 - loss 0.12037485 - time (sec): 20.72 - samples/sec: 5384.65 - lr: 0.000004 - momentum: 0.000000
2023-10-20 09:10:17,700 epoch 9 - iter 1540/1546 - loss 0.11940803 - time (sec): 23.12 - samples/sec: 5359.10 - lr: 0.000003 - momentum: 0.000000
2023-10-20 09:10:17,787 ----------------------------------------------------------------------------------------------------
2023-10-20 09:10:17,787 EPOCH 9 done: loss 0.1193 - lr: 0.000003
2023-10-20 09:10:18,860 DEV : loss 0.09189001470804214 - f1-score (micro avg) 0.6117
2023-10-20 09:10:18,871 saving best model
2023-10-20 09:10:18,902 ----------------------------------------------------------------------------------------------------
2023-10-20 09:10:21,118 epoch 10 - iter 154/1546 - loss 0.12687807 - time (sec): 2.22 - samples/sec: 5422.87 - lr: 0.000003 - momentum: 0.000000
2023-10-20 09:10:23,371 epoch 10 - iter 308/1546 - loss 0.12068074 - time (sec): 4.47 - samples/sec: 5568.10 - lr: 0.000003 - momentum: 0.000000
2023-10-20 09:10:25,575 epoch 10 - iter 462/1546 - loss 0.12030419 - time (sec): 6.67 - samples/sec: 5713.42 - lr: 0.000002 - momentum: 0.000000
2023-10-20 09:10:27,713 epoch 10 - iter 616/1546 - loss 0.11605372 - time (sec): 8.81 - samples/sec: 5773.02 - lr: 0.000002 - momentum: 0.000000
2023-10-20 09:10:30,067 epoch 10 - iter 770/1546 - loss 0.11801476 - time (sec): 11.16 - samples/sec: 5649.92 - lr: 0.000002 - momentum: 0.000000
2023-10-20 09:10:32,425 epoch 10 - iter 924/1546 - loss 0.11442404 - time (sec): 13.52 - samples/sec: 5565.71 - lr: 0.000001 - momentum: 0.000000
2023-10-20 09:10:34,785 epoch 10 - iter 1078/1546 - loss 0.11142937 - time (sec): 15.88 - samples/sec: 5542.35 - lr: 0.000001 - momentum: 0.000000
2023-10-20 09:10:37,138 epoch 10 - iter 1232/1546 - loss 0.11098365 - time (sec): 18.24 - samples/sec: 5458.00 - lr: 0.000001 - momentum: 0.000000
2023-10-20 09:10:39,490 epoch 10 - iter 1386/1546 - loss 0.11535574 - time (sec): 20.59 - samples/sec: 5430.96 - lr: 0.000000 - momentum: 0.000000
2023-10-20 09:10:41,875 epoch 10 - iter 1540/1546 - loss 0.11683715 - time (sec): 22.97 - samples/sec: 5397.34 - lr: 0.000000 - momentum: 0.000000
2023-10-20 09:10:41,963 ----------------------------------------------------------------------------------------------------
2023-10-20 09:10:41,963 EPOCH 10 done: loss 0.1167 - lr: 0.000000
2023-10-20 09:10:43,032 DEV : loss 0.0931963250041008 - f1-score (micro avg) 0.6154
2023-10-20 09:10:43,043 saving best model
2023-10-20 09:10:43,106 ----------------------------------------------------------------------------------------------------
2023-10-20 09:10:43,106 Loading model from best epoch ...
2023-10-20 09:10:43,182 SequenceTagger predicts: Dictionary with 13 tags: O, S-LOC, B-LOC, E-LOC, I-LOC, S-BUILDING, B-BUILDING, E-BUILDING, I-BUILDING, S-STREET, B-STREET, E-STREET, I-STREET
2023-10-20 09:10:46,022
Results:
- F-score (micro) 0.5552
- F-score (macro) 0.2272
- Accuracy 0.4001

By class:
              precision    recall  f1-score   support

         LOC     0.6223    0.6321    0.6272       946
    BUILDING     0.1333    0.0108    0.0200       185
      STREET     0.5000    0.0179    0.0345        56

   micro avg     0.6145    0.5063    0.5552      1187
   macro avg     0.4185    0.2203    0.2272      1187
weighted avg     0.5403    0.5063    0.5046      1187

2023-10-20 09:10:46,022 ----------------------------------------------------------------------------------------------------
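For completeness, a hedged sketch of how the saved best-model.pt can be loaded for inference with Flair. The checkpoint path layout and the example sentence are illustrative; only the tag set (LOC/BUILDING/STREET in BIOES encoding) and the base path come from the log above.

```python
# Sketch only: load the best checkpoint selected during training and tag a sentence.
from flair.data import Sentence
from flair.models import SequenceTagger

tagger = SequenceTagger.load(
    "hmbench-topres19th/en-dbmdz/bert-tiny-historic-multilingual-cased-bs4-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-1/best-model.pt"
)

# Made-up example sentence; any historical-newspaper text would do.
sentence = Sentence("The new library was built on Oxford Street in London .")
tagger.predict(sentence)

# Print recognised toponym spans, assuming the tagger's label type is "ner".
for span in sentence.get_spans("ner"):
    print(span.text, span.get_label("ner").value, round(span.get_label("ner").score, 2))
```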
|