joe611 committed
Commit bf1cc74
1 Parent(s): 64087f9

End of training

README.md ADDED
@@ -0,0 +1,135 @@
---
library_name: transformers
license: apache-2.0
base_model: facebook/detr-resnet-50
tags:
- generated_from_trainer
model-index:
- name: chickens-60-epoch-1000-images-aug
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# chickens-60-epoch-1000-images-aug

This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on an unknown dataset.
It achieves the following results on the evaluation set (COCO-style detection metrics; a hedged torchmetrics sketch follows the list):
- Loss: 0.2271
- Map: 0.8302
- Map 50: 0.9765
- Map 75: 0.9394
- Map Small: 0.3768
- Map Medium: 0.8372
- Map Large: 0.9097
- Mar 1: 0.3092
- Mar 10: 0.8633
- Mar 100: 0.8673
- Mar Small: 0.4523
- Mar Medium: 0.8825
- Mar Large: 0.9469
- Map Chicken: 0.8148
- Mar 100 Chicken: 0.852
- Map Duck: 0.8005
- Mar 100 Duck: 0.832
- Map Plant: 0.8752
- Mar 100 Plant: 0.9179

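These are COCO-style detection metrics: mean average precision (Map) and mean average recall (Mar) at several IoU thresholds and object sizes, with per-class breakdowns for chicken, duck, and plant. A minimal sketch of how such numbers can be computed with `torchmetrics` is shown below; the boxes, scores, and labels are dummy placeholders, not the actual evaluation data.

```python
# Hedged sketch only: dummy predictions/targets, not the real evaluation set.
import torch
from torchmetrics.detection import MeanAveragePrecision

# class_metrics=True adds per-class map/mar, analogous to Map Chicken / Map Duck / Map Plant.
metric = MeanAveragePrecision(box_format="xyxy", class_metrics=True)

preds = [{
    "boxes": torch.tensor([[10.0, 20.0, 110.0, 220.0]]),
    "scores": torch.tensor([0.92]),
    "labels": torch.tensor([0]),  # e.g. 0 = chicken (label ids here are an assumption)
}]
targets = [{
    "boxes": torch.tensor([[12.0, 18.0, 108.0, 225.0]]),
    "labels": torch.tensor([0]),
}]

metric.update(preds, targets)
print(metric.compute())  # keys include map, map_50, map_75, map_small, mar_1, mar_10, mar_100, ...
```
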
## Model description

More information needed

## Intended uses & limitations

More information needed

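Since the auto-generated sections above are empty, here is a minimal inference sketch using the standard `transformers` object-detection API. The hub id is assumed from the commit author and the model name on this card, and the image path is a placeholder.

```python
# Minimal inference sketch; the checkpoint id and image path below are assumptions.
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

checkpoint = "joe611/chickens-60-epoch-1000-images-aug"  # assumed hub id
processor = AutoImageProcessor.from_pretrained(checkpoint)
model = AutoModelForObjectDetection.from_pretrained(checkpoint)

image = Image.open("example.jpg")  # placeholder image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Convert raw logits/boxes into scored, labelled boxes in image coordinates.
target_sizes = torch.tensor([image.size[::-1]])  # (height, width)
results = processor.post_process_object_detection(
    outputs, threshold=0.5, target_sizes=target_sizes
)[0]

for score, label, box in zip(results["scores"], results["labels"], results["boxes"]):
    print(f"{model.config.id2label[label.item()]}: {score:.2f} at {box.tolist()}")
```
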
## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a hedged `TrainingArguments` sketch follows the list):
- learning_rate: 1e-05
- train_batch_size: 2
- eval_batch_size: 8
- seed: 42
- optimizer: adamw_torch with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: cosine
- num_epochs: 60

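A hedged `TrainingArguments` sketch that mirrors the settings above; `output_dir` and everything not listed on this card (dataset, collator, image processor, `Trainer` wiring) are assumptions.

```python
# Sketch of TrainingArguments matching the listed hyperparameters; output_dir is a placeholder.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="chickens-60-epoch-1000-images-aug",  # placeholder
    learning_rate=1e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine",
    num_train_epochs=60,
)
```
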
### Training results

| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Chicken | Mar 100 Chicken | Map Duck | Mar 100 Duck | Map Plant | Mar 100 Plant |
|:-------------:|:-----:|:-----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:-----------:|:---------------:|:--------:|:------------:|:---------:|:-------------:|
| 1.4444 | 1.0 | 500 | 1.2809 | 0.1687 | 0.2287 | 0.1896 | 0.0084 | 0.082 | 0.6067 | 0.0776 | 0.2271 | 0.2799 | 0.0955 | 0.2549 | 0.8548 | 0.0137 | 0.0302 | 0.0015 | 0.0041 | 0.4908 | 0.8055 |
| 1.161 | 2.0 | 1000 | 1.0354 | 0.2627 | 0.3469 | 0.3021 | 0.0103 | 0.1913 | 0.7773 | 0.1133 | 0.3438 | 0.3635 | 0.1733 | 0.3341 | 0.8498 | 0.1001 | 0.2751 | 0.003 | 0.0186 | 0.6849 | 0.7968 |
| 0.9626 | 3.0 | 1500 | 0.9506 | 0.3524 | 0.4873 | 0.3912 | 0.064 | 0.3292 | 0.7269 | 0.1438 | 0.4596 | 0.4709 | 0.2636 | 0.4525 | 0.795 | 0.3214 | 0.5764 | 0.058 | 0.0804 | 0.6779 | 0.7559 |
| 0.9326 | 4.0 | 2000 | 0.8692 | 0.3709 | 0.5154 | 0.4364 | 0.0691 | 0.3451 | 0.7301 | 0.1382 | 0.4762 | 0.4846 | 0.1578 | 0.4744 | 0.7854 | 0.4049 | 0.6778 | 0.0261 | 0.033 | 0.6816 | 0.7429 |
| 0.6659 | 5.0 | 2500 | 0.6837 | 0.413 | 0.5961 | 0.4705 | 0.1159 | 0.3773 | 0.7666 | 0.1422 | 0.4859 | 0.491 | 0.1725 | 0.4766 | 0.8268 | 0.5092 | 0.6778 | 0.0174 | 0.0155 | 0.7122 | 0.7798 |
| 0.7296 | 6.0 | 3000 | 0.6072 | 0.5372 | 0.7429 | 0.6386 | 0.0559 | 0.5243 | 0.7879 | 0.1926 | 0.5855 | 0.5897 | 0.185 | 0.5818 | 0.8477 | 0.6528 | 0.7191 | 0.2271 | 0.2546 | 0.7316 | 0.7954 |
| 0.6554 | 7.0 | 3500 | 0.5324 | 0.6558 | 0.9113 | 0.7997 | 0.1299 | 0.6447 | 0.8007 | 0.2578 | 0.7171 | 0.7222 | 0.239 | 0.7284 | 0.8607 | 0.6517 | 0.7164 | 0.5665 | 0.6371 | 0.7491 | 0.813 |
| 0.5715 | 8.0 | 4000 | 0.4872 | 0.6867 | 0.9358 | 0.8206 | 0.1134 | 0.6818 | 0.7926 | 0.2647 | 0.7397 | 0.7454 | 0.3002 | 0.7551 | 0.851 | 0.7093 | 0.7609 | 0.603 | 0.6639 | 0.7478 | 0.8115 |
| 0.5274 | 9.0 | 4500 | 0.4499 | 0.7146 | 0.9336 | 0.8676 | 0.1682 | 0.7109 | 0.8252 | 0.2755 | 0.7668 | 0.7717 | 0.2519 | 0.7833 | 0.8757 | 0.7187 | 0.7756 | 0.6537 | 0.7103 | 0.7715 | 0.8291 |
| 0.5099 | 10.0 | 5000 | 0.4246 | 0.6996 | 0.9397 | 0.8741 | 0.1082 | 0.6981 | 0.8265 | 0.2724 | 0.7506 | 0.7569 | 0.2267 | 0.7678 | 0.8778 | 0.6951 | 0.752 | 0.6301 | 0.6887 | 0.7736 | 0.83 |
| 0.5061 | 11.0 | 5500 | 0.4607 | 0.6805 | 0.9531 | 0.8266 | 0.1386 | 0.6753 | 0.7997 | 0.2683 | 0.7245 | 0.732 | 0.3051 | 0.739 | 0.8464 | 0.6733 | 0.7191 | 0.6168 | 0.6711 | 0.7515 | 0.8058 |
| 0.4486 | 12.0 | 6000 | 0.3777 | 0.7374 | 0.9521 | 0.8619 | 0.1533 | 0.7431 | 0.8409 | 0.2878 | 0.7829 | 0.7897 | 0.279 | 0.8013 | 0.8958 | 0.7396 | 0.788 | 0.6804 | 0.732 | 0.7923 | 0.8493 |
| 0.4797 | 13.0 | 6500 | 0.3840 | 0.7362 | 0.9536 | 0.882 | 0.1523 | 0.7366 | 0.8381 | 0.2826 | 0.7809 | 0.7859 | 0.2932 | 0.8036 | 0.8791 | 0.7586 | 0.8022 | 0.66 | 0.7124 | 0.79 | 0.8432 |
| 0.4406 | 14.0 | 7000 | 0.3612 | 0.7311 | 0.954 | 0.8672 | 0.1824 | 0.7351 | 0.8355 | 0.2865 | 0.7769 | 0.7809 | 0.3004 | 0.7962 | 0.8766 | 0.7316 | 0.78 | 0.67 | 0.7237 | 0.7918 | 0.8389 |
| 0.4381 | 15.0 | 7500 | 0.3645 | 0.7384 | 0.9535 | 0.886 | 0.1814 | 0.7373 | 0.863 | 0.2885 | 0.7863 | 0.7908 | 0.3093 | 0.8026 | 0.9084 | 0.7349 | 0.7822 | 0.6638 | 0.7227 | 0.8164 | 0.8674 |
| 0.4486 | 16.0 | 8000 | 0.3342 | 0.7556 | 0.9581 | 0.8989 | 0.1665 | 0.7565 | 0.86 | 0.2912 | 0.7947 | 0.8033 | 0.3473 | 0.814 | 0.9004 | 0.7666 | 0.8049 | 0.6878 | 0.7443 | 0.8124 | 0.8608 |
| 0.4384 | 17.0 | 8500 | 0.3604 | 0.745 | 0.9498 | 0.8935 | 0.1927 | 0.7493 | 0.8457 | 0.2883 | 0.7897 | 0.7967 | 0.3438 | 0.8084 | 0.8958 | 0.7451 | 0.7951 | 0.686 | 0.7381 | 0.804 | 0.8568 |
| 0.4309 | 18.0 | 9000 | 0.3503 | 0.7318 | 0.9543 | 0.8896 | 0.2168 | 0.7352 | 0.8469 | 0.2808 | 0.7824 | 0.7888 | 0.3881 | 0.7978 | 0.8967 | 0.7193 | 0.7689 | 0.667 | 0.7351 | 0.809 | 0.8625 |
| 0.4402 | 19.0 | 9500 | 0.3163 | 0.7693 | 0.9562 | 0.9143 | 0.2008 | 0.7762 | 0.8668 | 0.2952 | 0.8127 | 0.8208 | 0.3835 | 0.8318 | 0.9105 | 0.7577 | 0.8018 | 0.7254 | 0.7856 | 0.8249 | 0.8749 |
| 0.3739 | 20.0 | 10000 | 0.2994 | 0.7837 | 0.9563 | 0.9073 | 0.2549 | 0.7792 | 0.8739 | 0.3008 | 0.8237 | 0.831 | 0.4373 | 0.8354 | 0.9167 | 0.7881 | 0.8324 | 0.7327 | 0.7825 | 0.8301 | 0.8781 |
| 0.3943 | 21.0 | 10500 | 0.2955 | 0.7855 | 0.9645 | 0.9146 | 0.2567 | 0.7939 | 0.8751 | 0.2987 | 0.8231 | 0.8274 | 0.3778 | 0.8442 | 0.9105 | 0.7776 | 0.8227 | 0.7411 | 0.7794 | 0.8377 | 0.8801 |
| 0.3686 | 22.0 | 11000 | 0.3065 | 0.7778 | 0.9676 | 0.9136 | 0.2991 | 0.7749 | 0.869 | 0.2964 | 0.8192 | 0.826 | 0.4436 | 0.8329 | 0.9092 | 0.7829 | 0.8236 | 0.7229 | 0.7794 | 0.8277 | 0.8749 |
| 0.3679 | 23.0 | 11500 | 0.3141 | 0.7706 | 0.9626 | 0.9035 | 0.23 | 0.7822 | 0.8717 | 0.2945 | 0.8059 | 0.8127 | 0.3434 | 0.8287 | 0.9105 | 0.7469 | 0.7947 | 0.7305 | 0.768 | 0.8343 | 0.8755 |
| 0.348 | 24.0 | 12000 | 0.3200 | 0.76 | 0.9652 | 0.9081 | 0.2134 | 0.7668 | 0.8777 | 0.2899 | 0.8019 | 0.81 | 0.3676 | 0.8234 | 0.9167 | 0.7344 | 0.7813 | 0.7078 | 0.766 | 0.8378 | 0.8827 |
| 0.384 | 25.0 | 12500 | 0.3128 | 0.7665 | 0.9682 | 0.9198 | 0.2723 | 0.7719 | 0.8643 | 0.2923 | 0.8056 | 0.8112 | 0.3345 | 0.8237 | 0.9054 | 0.7448 | 0.7911 | 0.7316 | 0.7742 | 0.8231 | 0.8683 |
| 0.3814 | 26.0 | 13000 | 0.2909 | 0.7848 | 0.9693 | 0.9247 | 0.2777 | 0.7863 | 0.8762 | 0.2977 | 0.8227 | 0.8282 | 0.3674 | 0.8398 | 0.9176 | 0.7803 | 0.8204 | 0.7407 | 0.7845 | 0.8334 | 0.8795 |
| 0.333 | 27.0 | 13500 | 0.2687 | 0.7993 | 0.9699 | 0.9289 | 0.2639 | 0.8064 | 0.8781 | 0.3001 | 0.8342 | 0.8411 | 0.4169 | 0.8586 | 0.9176 | 0.8054 | 0.8453 | 0.7515 | 0.7887 | 0.841 | 0.8893 |
| 0.3533 | 28.0 | 14000 | 0.2771 | 0.799 | 0.9729 | 0.924 | 0.3156 | 0.8003 | 0.8806 | 0.3012 | 0.8326 | 0.8383 | 0.3983 | 0.8501 | 0.9205 | 0.8064 | 0.8431 | 0.7507 | 0.7866 | 0.8401 | 0.8853 |
| 0.3131 | 29.0 | 14500 | 0.2643 | 0.8031 | 0.9695 | 0.9236 | 0.3254 | 0.8079 | 0.8896 | 0.3002 | 0.839 | 0.844 | 0.4453 | 0.8568 | 0.9289 | 0.813 | 0.8444 | 0.7431 | 0.7887 | 0.8531 | 0.8988 |
| 0.3354 | 30.0 | 15000 | 0.2846 | 0.7941 | 0.9681 | 0.9167 | 0.241 | 0.802 | 0.8789 | 0.3017 | 0.8308 | 0.8369 | 0.3652 | 0.8543 | 0.9192 | 0.7905 | 0.8338 | 0.7509 | 0.7918 | 0.8411 | 0.8853 |
| 0.3292 | 31.0 | 15500 | 0.2748 | 0.7903 | 0.9747 | 0.9182 | 0.2757 | 0.7904 | 0.8856 | 0.2975 | 0.8305 | 0.839 | 0.4532 | 0.8483 | 0.9259 | 0.7628 | 0.8116 | 0.7611 | 0.8124 | 0.8468 | 0.8931 |
| 0.3226 | 32.0 | 16000 | 0.2699 | 0.801 | 0.9673 | 0.915 | 0.2847 | 0.8065 | 0.8933 | 0.3033 | 0.8372 | 0.842 | 0.4025 | 0.8578 | 0.9297 | 0.7939 | 0.8298 | 0.7518 | 0.7969 | 0.8574 | 0.8994 |
| 0.3251 | 33.0 | 16500 | 0.2737 | 0.8015 | 0.9702 | 0.923 | 0.2881 | 0.8074 | 0.8886 | 0.3001 | 0.8393 | 0.8459 | 0.3864 | 0.8609 | 0.9318 | 0.8008 | 0.8382 | 0.7509 | 0.8 | 0.8527 | 0.8994 |
| 0.3248 | 34.0 | 17000 | 0.2550 | 0.8027 | 0.9753 | 0.9349 | 0.3638 | 0.8032 | 0.8948 | 0.3017 | 0.8396 | 0.8456 | 0.4866 | 0.8535 | 0.936 | 0.797 | 0.8356 | 0.7512 | 0.7969 | 0.8599 | 0.9043 |
| 0.334 | 35.0 | 17500 | 0.2558 | 0.8069 | 0.9762 | 0.929 | 0.3566 | 0.8067 | 0.8921 | 0.3014 | 0.8434 | 0.8476 | 0.4678 | 0.856 | 0.9322 | 0.7941 | 0.8356 | 0.7731 | 0.8082 | 0.8536 | 0.8988 |
| 0.3091 | 36.0 | 18000 | 0.2558 | 0.8059 | 0.9753 | 0.9318 | 0.3609 | 0.8096 | 0.8829 | 0.3038 | 0.8459 | 0.8499 | 0.4648 | 0.8625 | 0.9255 | 0.8031 | 0.844 | 0.7637 | 0.8093 | 0.8508 | 0.8965 |
| 0.3121 | 37.0 | 18500 | 0.2589 | 0.8018 | 0.9754 | 0.9377 | 0.4216 | 0.8067 | 0.8795 | 0.3019 | 0.8387 | 0.8438 | 0.5091 | 0.856 | 0.9205 | 0.7785 | 0.8249 | 0.7745 | 0.8093 | 0.8525 | 0.8971 |
| 0.2849 | 38.0 | 19000 | 0.2546 | 0.8063 | 0.9763 | 0.9351 | 0.3452 | 0.8083 | 0.8923 | 0.3014 | 0.8426 | 0.8477 | 0.4366 | 0.8598 | 0.9318 | 0.8002 | 0.84 | 0.7593 | 0.8031 | 0.8593 | 0.9 |
| 0.3041 | 39.0 | 19500 | 0.2504 | 0.8077 | 0.9767 | 0.9357 | 0.3586 | 0.8122 | 0.8915 | 0.3046 | 0.8413 | 0.8456 | 0.4557 | 0.8583 | 0.9301 | 0.7997 | 0.8351 | 0.7627 | 0.801 | 0.8607 | 0.9006 |
| 0.2881 | 40.0 | 20000 | 0.2482 | 0.8206 | 0.9758 | 0.9352 | 0.3451 | 0.8233 | 0.8946 | 0.3054 | 0.8519 | 0.8558 | 0.45 | 0.8662 | 0.9351 | 0.8247 | 0.8542 | 0.7777 | 0.8113 | 0.8595 | 0.9017 |
| 0.275 | 41.0 | 20500 | 0.2482 | 0.8194 | 0.9758 | 0.9381 | 0.4187 | 0.8225 | 0.8947 | 0.3054 | 0.8546 | 0.8592 | 0.5203 | 0.8681 | 0.9339 | 0.806 | 0.8449 | 0.789 | 0.8268 | 0.8631 | 0.9061 |
| 0.2844 | 42.0 | 21000 | 0.2358 | 0.8181 | 0.976 | 0.9379 | 0.3793 | 0.8228 | 0.899 | 0.3077 | 0.8548 | 0.8592 | 0.4458 | 0.8715 | 0.9393 | 0.8137 | 0.8516 | 0.7783 | 0.8186 | 0.8624 | 0.9075 |
| 0.3045 | 43.0 | 21500 | 0.2451 | 0.8111 | 0.9759 | 0.9413 | 0.366 | 0.8188 | 0.888 | 0.3036 | 0.8474 | 0.8515 | 0.4544 | 0.8667 | 0.9322 | 0.8011 | 0.8418 | 0.7722 | 0.8072 | 0.8599 | 0.9055 |
| 0.2861 | 44.0 | 22000 | 0.2326 | 0.8268 | 0.9763 | 0.946 | 0.4139 | 0.8327 | 0.8961 | 0.3087 | 0.8605 | 0.8651 | 0.5144 | 0.8771 | 0.9343 | 0.8171 | 0.8551 | 0.7976 | 0.832 | 0.8657 | 0.9084 |
| 0.3017 | 45.0 | 22500 | 0.2336 | 0.8234 | 0.9758 | 0.9394 | 0.3642 | 0.8273 | 0.8987 | 0.3078 | 0.8578 | 0.8621 | 0.4604 | 0.8748 | 0.9385 | 0.8036 | 0.8431 | 0.8029 | 0.8351 | 0.8638 | 0.9081 |
| 0.2834 | 46.0 | 23000 | 0.2332 | 0.8227 | 0.9767 | 0.9414 | 0.3694 | 0.8292 | 0.8979 | 0.3066 | 0.8568 | 0.861 | 0.4523 | 0.8761 | 0.9385 | 0.8082 | 0.8493 | 0.7923 | 0.8237 | 0.8677 | 0.9101 |
| 0.304 | 47.0 | 23500 | 0.2359 | 0.8241 | 0.9743 | 0.94 | 0.3466 | 0.8338 | 0.9024 | 0.3113 | 0.8608 | 0.8656 | 0.44 | 0.8817 | 0.9418 | 0.8067 | 0.8484 | 0.7969 | 0.8361 | 0.8687 | 0.9124 |
| 0.2794 | 48.0 | 24000 | 0.2357 | 0.8211 | 0.9714 | 0.9358 | 0.3528 | 0.8296 | 0.9014 | 0.3084 | 0.8558 | 0.8601 | 0.4366 | 0.8752 | 0.9414 | 0.8045 | 0.8427 | 0.7913 | 0.8258 | 0.8677 | 0.9118 |
| 0.2895 | 49.0 | 24500 | 0.2356 | 0.825 | 0.9762 | 0.9389 | 0.3688 | 0.8331 | 0.8947 | 0.3083 | 0.8569 | 0.8624 | 0.4655 | 0.8772 | 0.9364 | 0.8107 | 0.8502 | 0.799 | 0.8278 | 0.8654 | 0.9092 |
| 0.2658 | 50.0 | 25000 | 0.2253 | 0.8353 | 0.9763 | 0.9393 | 0.3873 | 0.8427 | 0.911 | 0.31 | 0.8646 | 0.87 | 0.4782 | 0.8849 | 0.9473 | 0.8288 | 0.8618 | 0.7974 | 0.8278 | 0.8798 | 0.9205 |
| 0.2705 | 51.0 | 25500 | 0.2291 | 0.8305 | 0.9764 | 0.9388 | 0.3847 | 0.837 | 0.9024 | 0.3111 | 0.8633 | 0.8686 | 0.4731 | 0.8837 | 0.9431 | 0.8143 | 0.8538 | 0.8045 | 0.8361 | 0.8726 | 0.9159 |
| 0.2853 | 52.0 | 26000 | 0.2282 | 0.8302 | 0.9764 | 0.941 | 0.3858 | 0.8375 | 0.9028 | 0.3107 | 0.8628 | 0.8677 | 0.4606 | 0.8839 | 0.9418 | 0.8192 | 0.8582 | 0.7982 | 0.8299 | 0.873 | 0.915 |
| 0.3174 | 53.0 | 26500 | 0.2314 | 0.8276 | 0.9764 | 0.9424 | 0.3758 | 0.8349 | 0.9037 | 0.3092 | 0.8612 | 0.8655 | 0.4481 | 0.8812 | 0.9418 | 0.8143 | 0.8533 | 0.7975 | 0.8299 | 0.871 | 0.9133 |
| 0.2642 | 54.0 | 27000 | 0.2279 | 0.828 | 0.9764 | 0.9394 | 0.349 | 0.8365 | 0.9092 | 0.3103 | 0.8627 | 0.8666 | 0.4326 | 0.8827 | 0.946 | 0.8153 | 0.8538 | 0.7953 | 0.8299 | 0.8735 | 0.9161 |
| 0.268 | 55.0 | 27500 | 0.2284 | 0.8278 | 0.9766 | 0.9423 | 0.3701 | 0.8341 | 0.9097 | 0.3089 | 0.8607 | 0.8653 | 0.4638 | 0.8803 | 0.9464 | 0.8115 | 0.8484 | 0.7948 | 0.8289 | 0.8771 | 0.9184 |
| 0.2537 | 56.0 | 28000 | 0.2299 | 0.8275 | 0.9763 | 0.9392 | 0.3759 | 0.8342 | 0.9045 | 0.309 | 0.861 | 0.8652 | 0.4576 | 0.8809 | 0.9431 | 0.8107 | 0.8484 | 0.7983 | 0.832 | 0.8734 | 0.9153 |
| 0.2477 | 57.0 | 28500 | 0.2278 | 0.8285 | 0.9764 | 0.9393 | 0.3748 | 0.8357 | 0.9069 | 0.3087 | 0.8621 | 0.8663 | 0.4585 | 0.8816 | 0.9452 | 0.8151 | 0.852 | 0.7961 | 0.8299 | 0.8742 | 0.917 |
| 0.2838 | 58.0 | 29000 | 0.2274 | 0.8306 | 0.9765 | 0.9394 | 0.377 | 0.837 | 0.9102 | 0.3089 | 0.8634 | 0.8674 | 0.4553 | 0.8822 | 0.9473 | 0.8151 | 0.8511 | 0.8011 | 0.833 | 0.8756 | 0.9182 |
| 0.2693 | 59.0 | 29500 | 0.2272 | 0.8302 | 0.9765 | 0.9394 | 0.3768 | 0.8372 | 0.9097 | 0.3092 | 0.8633 | 0.8673 | 0.4523 | 0.8825 | 0.9469 | 0.8148 | 0.852 | 0.8005 | 0.832 | 0.8752 | 0.9179 |
| 0.3082 | 60.0 | 30000 | 0.2271 | 0.8302 | 0.9765 | 0.9394 | 0.3768 | 0.8372 | 0.9097 | 0.3092 | 0.8633 | 0.8673 | 0.4523 | 0.8825 | 0.9469 | 0.8148 | 0.852 | 0.8005 | 0.832 | 0.8752 | 0.9179 |

### Framework versions

- Transformers 4.46.1
- Pytorch 2.5.0+cu121
- Datasets 2.19.2
- Tokenizers 0.20.1
model.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:64336f50f0350e90ae8276c3b9b65c462a4529d673bed021f41072f7f756eb57
+ oid sha256:bbc0c24fc62772d86b56c21b637c281e728d2f259b0d83a03765a761e4d55d01
  size 166496880
runs/Oct30_22-08-59_3cc3d721e222/events.out.tfevents.1730326140.3cc3d721e222.737.1 CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:8d6b1188ae4d3d492826c374af155757a64ad53f73817c70c02bc4c5da64e99d
- size 291246
+ oid sha256:778895f4ca16bdf2dda3abc8ab8aa4f1ec2c483513fb46d36923b1ebcbde49bb
+ size 292849