---
license: apache-2.0
base_model: microsoft/resnet-50
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: resnet101_rvl-cdip-cnn_rvl_cdip-NK1000_hint
  results: []
---
# resnet101_rvl-cdip-cnn_rvl_cdip-NK1000_hint

This model is a fine-tuned version of [microsoft/resnet-50](https://huggingface.co/microsoft/resnet-50) on the None dataset. It achieves the following results on the evaluation set (the calibration metrics are described briefly after the list):
- Loss: 20.4893
- Accuracy: 0.7622
- Brier Loss: 0.3995
- NLL: 2.6673
- F1 Micro: 0.7622
- F1 Macro: 0.7619
- ECE: 0.1742
- AURC: 0.0853
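
Besides accuracy, the evaluation reports Brier Loss, NLL (negative log-likelihood), ECE (expected calibration error), and AURC (area under the risk-coverage curve), i.e. calibration and selective-prediction metrics produced by the training script; the card does not document how they were computed. For reference only, a minimal sketch of the commonly used equal-width binned ECE estimate (the 15-bin setting is an assumption, not taken from this card) looks like this:

```python
import torch

def expected_calibration_error(probs: torch.Tensor, labels: torch.Tensor, n_bins: int = 15) -> float:
    """Binned ECE: average |accuracy - confidence| gap, weighted by bin mass.

    probs:  (N, C) softmax probabilities
    labels: (N,)   integer class ids
    """
    confidences, predictions = probs.max(dim=1)
    accuracies = predictions.eq(labels).float()
    edges = torch.linspace(0.0, 1.0, n_bins + 1)
    ece = torch.zeros(())
    for lo, hi in zip(edges[:-1], edges[1:]):
        in_bin = (confidences > lo) & (confidences <= hi)
        weight = in_bin.float().mean()  # fraction of samples falling in this confidence bin
        if weight > 0:
            gap = (confidences[in_bin].mean() - accuracies[in_bin].mean()).abs()
            ece += weight * gap
    return ece.item()
```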
## Model description
More information needed
## Intended uses & limitations
More information needed
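
The intended use is not documented by the model authors. For illustration only, a minimal sketch of loading the checkpoint for image classification with `transformers` is shown below; the repository path and input image are placeholders, not values from this card:

```python
from PIL import Image
import torch
from transformers import AutoImageProcessor, AutoModelForImageClassification

# Placeholder id: replace with the actual Hub repo id or a local checkpoint directory.
checkpoint = "path/to/resnet101_rvl-cdip-cnn_rvl_cdip-NK1000_hint"

processor = AutoImageProcessor.from_pretrained(checkpoint)
model = AutoModelForImageClassification.from_pretrained(checkpoint)
model.eval()

image = Image.open("example.png").convert("RGB")  # placeholder input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

predicted = logits.argmax(-1).item()
print(model.config.id2label[predicted])
```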
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a sketch reproducing this configuration follows the list):
- learning_rate: 0.0001
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 50
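
The training script itself is not part of this card. Assuming the standard `transformers` `Trainer` was used, the hyperparameters above correspond roughly to the following `TrainingArguments`; the output directory and evaluation strategy are assumptions, not taken from the card:

```python
from transformers import TrainingArguments

# Values mirrored from the hyperparameter list above; output_dir and
# evaluation_strategy are assumptions.
training_args = TrainingArguments(
    output_dir="resnet101_rvl-cdip-cnn_rvl_cdip-NK1000_hint",
    learning_rate=1e-4,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=50,
    evaluation_strategy="epoch",  # the results table reports one validation row per epoch
)
```

These arguments would then be passed to `Trainer` together with the model, datasets, and metric function.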
### Training results
| Training Loss | Epoch | Step  | Validation Loss | Accuracy | Brier Loss | NLL    | F1 Micro | F1 Macro | ECE    | AURC   |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log        | 1.0   | 250   | 27.0152         | 0.144    | 0.9329     | 8.3774 | 0.144    | 0.1293   | 0.0760 | 0.8496 |
| 26.9201       | 2.0   | 500   | 25.8022         | 0.4547   | 0.8625     | 4.1098 | 0.4547   | 0.4194   | 0.3292 | 0.3673 |
| 26.9201       | 3.0   | 750   | 24.5485         | 0.5617   | 0.6135     | 3.0722 | 0.5617   | 0.5439   | 0.1557 | 0.2257 |
| 24.565        | 4.0   | 1000  | 23.9825         | 0.6388   | 0.5062     | 2.7343 | 0.6388   | 0.6354   | 0.1084 | 0.1537 |
| 24.565        | 5.0   | 1250  | 23.8483         | 0.6747   | 0.4518     | 2.5930 | 0.6747   | 0.6686   | 0.0597 | 0.1289 |
| 23.3904       | 6.0   | 1500  | 23.2280         | 0.7137   | 0.3953     | 2.4736 | 0.7138   | 0.7117   | 0.0486 | 0.0997 |
| 23.3904       | 7.0   | 1750  | 23.0275         | 0.725    | 0.3781     | 2.3823 | 0.7250   | 0.7238   | 0.0414 | 0.0911 |
| 22.6462       | 8.0   | 2000  | 22.8213         | 0.7358   | 0.3699     | 2.3745 | 0.7358   | 0.7351   | 0.0539 | 0.0881 |
| 22.6462       | 9.0   | 2250  | 22.6219         | 0.7468   | 0.3629     | 2.3056 | 0.7468   | 0.7465   | 0.0617 | 0.0852 |
| 22.0944       | 10.0  | 2500  | 22.4746         | 0.751    | 0.3593     | 2.3500 | 0.751    | 0.7523   | 0.0637 | 0.0846 |
| 22.0944       | 11.0  | 2750  | 22.3503         | 0.752    | 0.3624     | 2.4245 | 0.752    | 0.7533   | 0.0810 | 0.0834 |
| 21.6411       | 12.0  | 3000  | 22.2263         | 0.7545   | 0.3693     | 2.4277 | 0.7545   | 0.7547   | 0.0972 | 0.0885 |
| 21.6411       | 13.0  | 3250  | 22.1353         | 0.7522   | 0.3740     | 2.4647 | 0.7522   | 0.7532   | 0.1141 | 0.0862 |
| 21.2742       | 14.0  | 3500  | 22.1122         | 0.7475   | 0.3868     | 2.5369 | 0.7475   | 0.7495   | 0.1250 | 0.0922 |
| 21.2742       | 15.0  | 3750  | 22.0040         | 0.7508   | 0.3842     | 2.5364 | 0.7508   | 0.7501   | 0.1304 | 0.0911 |
| 20.9515       | 16.0  | 4000  | 21.8795         | 0.758    | 0.3772     | 2.5474 | 0.7580   | 0.7578   | 0.1324 | 0.0846 |
| 20.9515       | 17.0  | 4250  | 21.7554         | 0.754    | 0.3892     | 2.5498 | 0.754    | 0.7543   | 0.1420 | 0.0923 |
| 20.6695       | 18.0  | 4500  | 21.6863         | 0.749    | 0.3981     | 2.6337 | 0.749    | 0.7507   | 0.1510 | 0.0922 |
| 20.6695       | 19.0  | 4750  | 21.6123         | 0.7498   | 0.4007     | 2.5993 | 0.7498   | 0.7499   | 0.1551 | 0.0921 |
| 20.4239       | 20.0  | 5000  | 21.5128         | 0.7595   | 0.3845     | 2.5510 | 0.7595   | 0.7590   | 0.1498 | 0.0870 |
| 20.4239       | 21.0  | 5250  | 21.4770         | 0.7542   | 0.4005     | 2.6396 | 0.7542   | 0.7547   | 0.1623 | 0.0932 |
| 20.2131       | 22.0  | 5500  | 21.3497         | 0.7612   | 0.3892     | 2.5117 | 0.7612   | 0.7609   | 0.1539 | 0.0891 |
| 20.2131       | 23.0  | 5750  | 21.3489         | 0.7572   | 0.3956     | 2.5227 | 0.7572   | 0.7570   | 0.1608 | 0.0883 |
| 20.0332       | 24.0  | 6000  | 21.2609         | 0.7585   | 0.3939     | 2.5487 | 0.7585   | 0.7595   | 0.1629 | 0.0860 |
| 20.0332       | 25.0  | 6250  | 21.2046         | 0.7552   | 0.3982     | 2.6283 | 0.7552   | 0.7559   | 0.1663 | 0.0878 |
| 19.8699       | 26.0  | 6500  | 21.1515         | 0.7528   | 0.4038     | 2.6730 | 0.7528   | 0.7536   | 0.1721 | 0.0858 |
| 19.8699       | 27.0  | 6750  | 21.0789         | 0.7562   | 0.4003     | 2.6027 | 0.7562   | 0.7575   | 0.1683 | 0.0876 |
| 19.7228       | 28.0  | 7000  | 21.0357         | 0.7565   | 0.3996     | 2.6490 | 0.7565   | 0.7561   | 0.1707 | 0.0844 |
| 19.7228       | 29.0  | 7250  | 20.9975         | 0.758    | 0.3971     | 2.6300 | 0.7580   | 0.7574   | 0.1704 | 0.0835 |
| 19.589        | 30.0  | 7500  | 20.9221         | 0.7568   | 0.4007     | 2.5841 | 0.7568   | 0.7567   | 0.1714 | 0.0860 |
| 19.589        | 31.0  | 7750  | 20.8725         | 0.7562   | 0.3996     | 2.5775 | 0.7562   | 0.7562   | 0.1752 | 0.0847 |
| 19.4738       | 32.0  | 8000  | 20.8438         | 0.7572   | 0.3999     | 2.6441 | 0.7572   | 0.7570   | 0.1693 | 0.0877 |
| 19.4738       | 33.0  | 8250  | 20.8337         | 0.755    | 0.4052     | 2.6660 | 0.755    | 0.7555   | 0.1743 | 0.0868 |
| 19.3704       | 34.0  | 8500  | 20.7635         | 0.7575   | 0.4022     | 2.6885 | 0.7575   | 0.7583   | 0.1764 | 0.0868 |
| 19.3704       | 35.0  | 8750  | 20.7705         | 0.7608   | 0.4001     | 2.6415 | 0.7608   | 0.7601   | 0.1735 | 0.0856 |
| 19.2791       | 36.0  | 9000  | 20.7221         | 0.7632   | 0.3984     | 2.7139 | 0.7632   | 0.7640   | 0.1706 | 0.0857 |
| 19.2791       | 37.0  | 9250  | 20.6873         | 0.7622   | 0.3986     | 2.6743 | 0.7622   | 0.7625   | 0.1715 | 0.0838 |
| 19.2036       | 38.0  | 9500  | 20.6757         | 0.7618   | 0.3990     | 2.6225 | 0.7618   | 0.7620   | 0.1735 | 0.0852 |
| 19.2036       | 39.0  | 9750  | 20.6421         | 0.7588   | 0.4018     | 2.6342 | 0.7588   | 0.7579   | 0.1761 | 0.0870 |
| 19.1398       | 40.0  | 10000 | 20.6432         | 0.761    | 0.4057     | 2.6595 | 0.761    | 0.7610   | 0.1760 | 0.0868 |
| 19.1398       | 41.0  | 10250 | 20.5778         | 0.7672   | 0.3981     | 2.6180 | 0.7672   | 0.7674   | 0.1680 | 0.0850 |
| 19.0835       | 42.0  | 10500 | 20.5628         | 0.764    | 0.3981     | 2.6309 | 0.764    | 0.7625   | 0.1726 | 0.0851 |
| 19.0835       | 43.0  | 10750 | 20.5530         | 0.7632   | 0.3995     | 2.6470 | 0.7632   | 0.7628   | 0.1733 | 0.0868 |
| 19.0398       | 44.0  | 11000 | 20.5625         | 0.761    | 0.4029     | 2.6650 | 0.761    | 0.7608   | 0.1764 | 0.0864 |
| 19.0398       | 45.0  | 11250 | 20.5637         | 0.7628   | 0.4010     | 2.6709 | 0.7628   | 0.7623   | 0.1760 | 0.0850 |
| 19.0073       | 46.0  | 11500 | 20.5378         | 0.7628   | 0.3998     | 2.6522 | 0.7628   | 0.7631   | 0.1749 | 0.0859 |
| 19.0073       | 47.0  | 11750 | 20.5199         | 0.7615   | 0.4010     | 2.6406 | 0.7615   | 0.7619   | 0.1748 | 0.0867 |
| 18.9818       | 48.0  | 12000 | 20.5378         | 0.761    | 0.4031     | 2.6434 | 0.761    | 0.7616   | 0.1767 | 0.0856 |
| 18.9818       | 49.0  | 12250 | 20.4962         | 0.7652   | 0.3962     | 2.6250 | 0.7652   | 0.7653   | 0.1720 | 0.0853 |
| 18.9734       | 50.0  | 12500 | 20.4893         | 0.7622   | 0.3995     | 2.6673 | 0.7622   | 0.7619   | 0.1742 | 0.0853 |
### Framework versions
- Transformers 4.33.3
- Pytorch 2.2.0.dev20231002
- Datasets 2.7.1
- Tokenizers 0.13.3