# facebook-xlm-roberta-large-finetuned-ner-vlsp2021-3090-15June-2
This model is a fine-tuned version of [FacebookAI/xlm-roberta-large](https://huggingface.co/FacebookAI/xlm-roberta-large) on the VLSP 2021 NER dataset. It achieves the following results on the evaluation set:
- Loss: 0.1066
- Datetime: precision 0.8668, recall 0.8832, F1 0.8749 (1,002 entities)
- Address: precision 1.0000, recall 1.0000, F1 1.0000 (29 entities)
- Person: precision 0.9608, recall 0.9552, F1 0.9580 (1,899 entities)
- Persontype: precision 0.7647, recall 0.7412, F1 0.7528 (684 entities)
- Phonenumber: precision 0.8182, recall 1.0000, F1 0.9000 (9 entities)
- Miscellaneous: precision 0.5472, recall 0.5472, F1 0.5472 (159 entities)
- Email: precision 1.0000, recall 1.0000, F1 1.0000 (51 entities)
- Location: precision 0.8797, recall 0.8939, F1 0.8868 (1,301 entities)
- IP: precision 1.0000, recall 1.0000, F1 1.0000 (11 entities)
- URL: precision 1.0000, recall 1.0000, F1 1.0000 (15 entities)
- Product: precision 0.7075, recall 0.6928, F1 0.7001 (625 entities)
- Overall Precision: 0.8654
- Overall Recall: 0.8650
- Overall F1: 0.8652
- Overall Accuracy: 0.9804
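
The per-entity figures above match the output format of the seqeval metric used throughout the Hugging Face token-classification examples. A minimal sketch of how such metrics are computed (the tag sequences below are illustrative, not drawn from the VLSP 2021 data):

```python
import evaluate

# Load the seqeval metric; it scores entity spans rather than individual tokens.
seqeval = evaluate.load("seqeval")

# Toy IOB2-tagged sequences, for illustration only.
references = [["B-PERSON", "I-PERSON", "O", "B-LOCATION"]]
predictions = [["B-PERSON", "I-PERSON", "O", "O"]]

results = seqeval.compute(predictions=predictions, references=references)
# Each entity type maps to a dict with 'precision', 'recall', 'f1' and
# 'number' (support), alongside overall_precision/recall/f1/accuracy keys,
# the same shape as the figures reported in this card.
print(results["PERSON"], results["overall_f1"])
```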
## Model description
More information needed
## Intended uses & limitations
More information needed
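
No usage notes were provided with the card; below is a minimal inference sketch using the `transformers` pipeline. The `aggregation_strategy` setting and the example sentence are assumptions, not part of the original card.

```python
from transformers import pipeline

# Load the fine-tuned checkpoint from the Hub.
ner = pipeline(
    "token-classification",
    model="Kudod/facebook-xlm-roberta-large-finetuned-ner-vlsp2021-3090-15June-2",
    aggregation_strategy="simple",  # assumed: merge word pieces into entity spans
)

# VLSP 2021 is a Vietnamese NER benchmark, so a Vietnamese sentence is used here.
print(ner("Ông Nguyễn Văn A sống tại Hà Nội."))
```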
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a reproduction sketch follows the list):
- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
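
A minimal sketch of a `TrainingArguments` configuration matching the hyperparameters above; all other arguments are assumed to keep their library defaults, since the original training script is not included in this card.

```python
from transformers import TrainingArguments

# Mirrors the hyperparameters listed above (Transformers 4.40.2).
training_args = TrainingArguments(
    output_dir="facebook-xlm-roberta-large-finetuned-ner-vlsp2021-3090-15June-2",
    learning_rate=2e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    seed=42,
    num_train_epochs=5,
    lr_scheduler_type="linear",  # linear decay, as listed
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```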
### Training results
Per-entity columns report precision/recall/F1; entity counts per class are constant across epochs and are listed with the evaluation results above.

| Training Loss | Epoch | Step | Validation Loss | Datetime | Address | Person | Persontype | Phonenumber | Miscellaneous | Email | Location | IP | URL | Product | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0.0937 | 1.0 | 3263 | 0.0795 | 0.8605/0.8743/0.8673 | 0.5278/0.6552/0.5846 | 0.9568/0.9458/0.9513 | 0.7158/0.7105/0.7131 | 0.8182/1.0000/0.9000 | 0.5068/0.4654/0.4852 | 0.8421/0.9412/0.8889 | 0.8388/0.8878/0.8626 | 0.7000/0.6364/0.6667 | 0.4286/0.6000/0.5000 | 0.5705/0.5888/0.5795 | 0.8247 | 0.8379 | 0.8312 | 0.9776 |
| 0.0607 | 2.0 | 6526 | 0.0845 | 0.8528/0.8733/0.8629 | 0.6842/0.8966/0.7761 | 0.9516/0.9516/0.9516 | 0.8088/0.5936/0.6847 | 0.8000/0.8889/0.8421 | 0.5735/0.4906/0.5288 | 1.0000/0.9412/0.9697 | 0.8480/0.8832/0.8652 | 0.7778/0.6364/0.7000 | 0.7647/0.8667/0.8125 | 0.5900/0.5664/0.5780 | 0.8459 | 0.8247 | 0.8352 | 0.9780 |
| 0.0399 | 3.0 | 9789 | 0.0881 | 0.8661/0.8713/0.8687 | 0.9000/0.9310/0.9153 | 0.9577/0.9537/0.9557 | 0.7612/0.6944/0.7263 | 0.9000/1.0000/0.9474 | 0.4724/0.5912/0.5251 | 1.0000/0.9412/0.9697 | 0.8602/0.8893/0.8745 | 1.0000/1.0000/1.0000 | 0.5789/0.7333/0.6471 | 0.7083/0.6800/0.6939 | 0.8541 | 0.8541 | 0.8541 | 0.9795 |
| 0.0253 | 4.0 | 13052 | 0.0961 | 0.8687/0.8713/0.8700 | 1.0000/1.0000/1.0000 | 0.9528/0.9568/0.9548 | 0.7785/0.7091/0.7422 | 0.8182/1.0000/0.9000 | 0.4712/0.5660/0.5143 | 1.0000/1.0000/1.0000 | 0.8852/0.9008/0.8930 | 1.0000/1.0000/1.0000 | 0.8750/0.9333/0.9032 | 0.6703/0.6864/0.6783 | 0.8574 | 0.8608 | 0.8591 | 0.9799 |
| 0.0161 | 5.0 | 16315 | 0.1066 | 0.8668/0.8832/0.8749 | 1.0000/1.0000/1.0000 | 0.9608/0.9552/0.9580 | 0.7647/0.7412/0.7528 | 0.8182/1.0000/0.9000 | 0.5472/0.5472/0.5472 | 1.0000/1.0000/1.0000 | 0.8797/0.8939/0.8868 | 1.0000/1.0000/1.0000 | 1.0000/1.0000/1.0000 | 0.7075/0.6928/0.7001 | 0.8654 | 0.8650 | 0.8652 | 0.9804 |
### Framework versions
- Transformers 4.40.2
- PyTorch 2.3.1+cu121
- Datasets 2.19.1
- Tokenizers 0.19.1