2022-04-03 20:45:35,951 ----------------------------------------------------------------------------------------------------
2022-04-03 20:45:35,958 Model: "SequenceTagger(
  (embeddings): StackedEmbeddings(
    (list_embedding_0): WordEmbeddings('fa')
    (list_embedding_1): FlairEmbeddings(
      (lm): LanguageModel(
        (drop): Dropout(p=0.1, inplace=False)
        (encoder): Embedding(5105, 100)
        (rnn): LSTM(100, 2048)
        (decoder): Linear(in_features=2048, out_features=5105, bias=True)
      )
    )
    (list_embedding_2): FlairEmbeddings(
      (lm): LanguageModel(
        (drop): Dropout(p=0.1, inplace=False)
        (encoder): Embedding(5105, 100)
        (rnn): LSTM(100, 2048)
        (decoder): Linear(in_features=2048, out_features=5105, bias=True)
      )
    )
  )
  (word_dropout): WordDropout(p=0.05)
  (locked_dropout): LockedDropout(p=0.5)
  (embedding2nn): Linear(in_features=4396, out_features=4396, bias=True)
  (rnn): LSTM(4396, 256, batch_first=True, bidirectional=True)
  (linear): Linear(in_features=512, out_features=17, bias=True)
  (beta): 1.0
  (weights): None
  (weight_tensor) None
)"
2022-04-03 20:45:35,962 ----------------------------------------------------------------------------------------------------
2022-04-03 20:45:35,967 Corpus: "Corpus: 23060 train + 4070 dev + 4150 test sentences"
2022-04-03 20:45:35,971 ----------------------------------------------------------------------------------------------------
2022-04-03 20:45:35,973 Parameters:
2022-04-03 20:45:35,975 - learning_rate: "0.05"
2022-04-03 20:45:35,977 - mini_batch_size: "4"
2022-04-03 20:45:35,980 - patience: "3"
2022-04-03 20:45:35,982 - anneal_factor: "0.5"
2022-04-03 20:45:35,985 - max_epochs: "40"
2022-04-03 20:45:35,988 - shuffle: "True"
2022-04-03 20:45:35,991 - train_with_dev: "False"
2022-04-03 20:45:35,996 - batch_growth_annealing: "False"
2022-04-03 20:45:35,998 ----------------------------------------------------------------------------------------------------
2022-04-03 20:45:36,001 Model training base path: "/content/gdrive/MyDrive/project/data/ner/model2"
2022-04-03 20:45:36,004 ----------------------------------------------------------------------------------------------------
2022-04-03 20:45:36,006 Device: cuda:0
2022-04-03 20:45:36,007 ----------------------------------------------------------------------------------------------------
2022-04-03 20:45:36,009 Embeddings storage mode: none
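These parameters map directly onto a Flair ModelTrainer.train(...) call. A minimal sketch, assuming the tagger and corpus objects from the sketch above; the base path is the one printed in the log and the remaining arguments mirror the parameter list:

```python
from flair.trainers import ModelTrainer

trainer = ModelTrainer(tagger, corpus)
trainer.train(
    '/content/gdrive/MyDrive/project/data/ner/model2',  # base path from the log
    learning_rate=0.05,
    mini_batch_size=4,
    max_epochs=40,
    patience=3,                     # dev epochs without improvement before annealing
    anneal_factor=0.5,              # halve the learning rate when annealing triggers
    shuffle=True,
    train_with_dev=False,           # keep the dev split for model selection
    embeddings_storage_mode='none',
)
```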
2022-04-03 20:45:36,559 ----------------------------------------------------------------------------------------------------
2022-04-03 20:49:55,248 epoch 40 - iter 576/5765 - loss 0.05129424 - samples/sec: 8.91 - lr: 0.050000
2022-04-03 20:54:12,817 epoch 40 - iter 1152/5765 - loss 0.05045109 - samples/sec: 8.98 - lr: 0.050000
2022-04-03 20:58:35,265 epoch 40 - iter 1728/5765 - loss 0.05189116 - samples/sec: 8.81 - lr: 0.050000
2022-04-03 21:03:11,325 epoch 40 - iter 2304/5765 - loss 0.05151945 - samples/sec: 8.38 - lr: 0.050000
2022-04-03 21:07:46,802 epoch 40 - iter 2880/5765 - loss 0.05105861 - samples/sec: 8.40 - lr: 0.050000
2022-04-03 21:12:16,061 epoch 40 - iter 3456/5765 - loss 0.05160696 - samples/sec: 8.59 - lr: 0.050000
2022-04-03 21:16:46,997 epoch 40 - iter 4032/5765 - loss 0.05158343 - samples/sec: 8.54 - lr: 0.050000
2022-04-03 21:21:12,246 epoch 40 - iter 4608/5765 - loss 0.05160290 - samples/sec: 8.72 - lr: 0.050000
2022-04-03 21:25:34,335 epoch 40 - iter 5184/5765 - loss 0.05188003 - samples/sec: 8.83 - lr: 0.050000
2022-04-03 21:30:00,227 epoch 40 - iter 5760/5765 - loss 0.05183257 - samples/sec: 8.70 - lr: 0.050000
2022-04-03 21:30:03,367 ----------------------------------------------------------------------------------------------------
2022-04-03 21:30:03,370 EPOCH 40 done: loss 0.0519 - lr 0.0500000
2022-04-03 21:36:15,762 DEV : loss 0.05283118411898613 - f1-score (micro avg) 0.828
2022-04-03 21:36:15,836 BAD EPOCHS (no improvement): 0
2022-04-03 21:36:18,064 saving best model
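The 5,765 iterations per epoch follow from the corpus and batch size: 23,060 training sentences in mini-batches of 4 give 5,765 optimizer steps, which is why the last logged step reads 5760/5765. With patience 3 and anneal_factor 0.5, the learning rate would be halved only after three epochs without a dev improvement; the dev micro-F1 of 0.828 above counts as an improvement, so the bad-epoch counter resets and the checkpoint is saved as the new best model. A quick check of the step count:

```python
import math

train_sentences = 23060      # from the corpus line in the log
mini_batch_size = 4          # from the parameter list
print(math.ceil(train_sentences / mini_batch_size))  # -> 5765 steps per epoch
```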
2022-04-03 21:36:29,253 ----------------------------------------------------------------------------------------------------
2022-04-03 21:36:29,271 loading file /content/gdrive/MyDrive/project/data/ner/model2/best-model.pt
2022-04-03 21:43:00,026 0.8616 0.82 0.8403 0.7357
2022-04-03 21:43:00,030
Results:
- F-score (micro) 0.8403
- F-score (macro) 0.8656
- Accuracy 0.7357
By class:
               precision    recall  f1-score   support

         LOC      0.8789    0.8589    0.8688      4083
         ORG      0.8390    0.7653    0.8005      3166
         PER      0.8395    0.8169    0.8280      2741
         DAT      0.8648    0.7957    0.8288      1150
         MON      0.9758    0.9020    0.9374       357
         TIM      0.8500    0.8193    0.8344       166
         PCT      0.9615    0.9615    0.9615       156

   micro avg      0.8616    0.8200    0.8403     11819
   macro avg      0.8871    0.8456    0.8656     11819
weighted avg      0.8613    0.8200    0.8400     11819
 samples avg      0.7357    0.7357    0.7357     11819
2022-04-03 21:43:00,035 ----------------------------------------------------------------------------------------------------
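The block above is the held-out test evaluation Flair runs after training finishes: best-model.pt is reloaded and micro/macro F1 plus a per-class breakdown are reported over the 4,150 test sentences. A minimal sketch of using the saved checkpoint for inference; the input sentence is a hypothetical example, only the model path comes from the log:

```python
from flair.data import Sentence
from flair.models import SequenceTagger

# Reload the checkpoint that achieved the best dev score during training.
tagger = SequenceTagger.load(
    '/content/gdrive/MyDrive/project/data/ner/model2/best-model.pt'
)

# Hypothetical Persian input: "Tehran is the capital of Iran."
sentence = Sentence('تهران پایتخت ایران است')
tagger.predict(sentence)

# Print the predicted entity spans, e.g. a LOC span over the first token.
for span in sentence.get_spans('ner'):
    print(span)
```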