Kudod committed a255b66 (1 parent: f77a593): End of training

Files changed (1): README.md added (+56 lines)
---
library_name: transformers
license: mit
base_model: vinai/phobert-large
tags:
- generated_from_trainer
model-index:
- name: grab-ner-ghtk-ai-fluent-segmented-21-label-new-data-3090-6Obt-1
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# grab-ner-ghtk-ai-fluent-segmented-21-label-new-data-3090-6Obt-1

This model is a fine-tuned version of [vinai/phobert-large](https://huggingface.co/vinai/phobert-large) on an unspecified dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2.5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 1
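With `lr_scheduler_type: linear`, one epoch, and 147 optimizer steps (see the results table), the learning rate decays linearly from 2.5e-05 toward zero over training. A minimal sketch of that schedule, assuming zero warmup steps (the card does not report a warmup value):

```python
def linear_lr(step, total_steps=147, base_lr=2.5e-05, warmup_steps=0):
    """Learning rate at a given optimizer step under a linear schedule.

    Mirrors the shape of the `linear` lr_scheduler_type: optional linear
    warmup from 0 to base_lr, then linear decay from base_lr to 0.
    warmup_steps=0 is an assumption; the card does not state it.
    """
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = max(0.0, float(total_steps - step))
    return base_lr * remaining / max(1.0, float(total_steps - warmup_steps))

# Starts at 2.5e-05 at step 0 and reaches 0.0 at the final step.
```

With only one epoch, the rate is still relatively high at the end of the run's early steps, which may partly explain why only the high-support labels learned anything.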

### Training results

| Training Loss | Epoch | Step | Validation Loss | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:-----------------:|:--------------:|:----------:|:----------------:|
| No log        | 1.0   | 147  | 0.4119          | 0.5622            | 0.4864         | 0.5215     | 0.8912           |

Per-label validation metrics at epoch 1 (support = number of gold entities):

| Label             | Precision | Recall | F1     | Support |
|:------------------|----------:|-------:|-------:|--------:|
| Ho                | 0.0       | 0.0    | 0.0    | 3       |
| Hoảng thời gian   | 0.0       | 0.0    | 0.0    | 5       |
| Háng trừu tượng   | 0.0       | 0.0    | 0.0    | 10      |
| Hông tin ctt      | 0.0       | 0.0    | 0.0    | 6       |
| Hụ cấp            | 0.5658    | 0.6825 | 0.6187 | 63      |
| Hứ                | 0.0       | 0.0    | 0.0    | 9       |
| Iấy tờ            | 0.0       | 0.0    | 0.0    | 8       |
| Iền cụ thể        | 0.0       | 0.0    | 0.0    | 31      |
| Iền trừu tượng    | 0.0       | 0.0    | 0.0    | 5       |
| Ã số thuế         | 0.0       | 0.0    | 0.0    | 2       |
| Ã đơn             | 0.0       | 0.0    | 0.0    | 22      |
| Ình thức làm việc | 0.0       | 0.0    | 0.0    | 2       |
| Ông               | 0.2687    | 0.4390 | 0.3333 | 82      |
| Ương              | 0.0       | 0.0    | 0.0    | 54      |
| Ị trí             | 0.0       | 0.0    | 0.0    | 16      |
| Ố công            | 0.6809    | 0.9412 | 0.7901 | 238     |
| Ố giờ             | 0.0       | 0.0    | 0.0    | 4       |
| Ố điểm            | 0.0       | 0.0    | 0.0    | 42      |
| Ố đơn             | 0.0       | 0.0    | 0.0    | 17      |
| Ợt                | 0.0       | 0.0    | 0.0    | 3       |
| Ỷ lệ              | 0.0       | 0.0    | 0.0    | 1       |
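Each per-label F1 in the table above is the harmonic mean of that label's precision and recall, and the same relation links the Overall columns (entity-level, seqeval-style scoring). A quick spot-check of the reported numbers, using only values printed in the table:

```python
def f1(precision, recall):
    """Harmonic mean of precision and recall (0.0 when both are 0)."""
    if precision + recall == 0.0:
        return 0.0
    return 2.0 * precision * recall / (precision + recall)

# Values taken from the results table.
overall_f1 = f1(0.5622, 0.4864)  # ~0.5216 vs reported 0.5215
# (The small gap comes from the Overall columns being rounded to
# four decimals before this check, while the Trainer computed F1
# from the unrounded precision and recall.)
```

The per-label cells reproduce exactly, e.g. `f1(0.5657894736842105, 0.6825396825396826)` matches the reported 0.6187050359712231.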


### Framework versions

- Transformers 4.44.2
- Pytorch 2.4.1+cu121
- Datasets 3.0.0
- Tokenizers 0.19.1