---
license: apache-2.0
base_model: hustvl/yolos-tiny
tags:
- generated_from_trainer
model-index:
- name: Yolo_test
  results: []
---

# Yolo_test

This model is a fine-tuned version of hustvl/yolos-tiny on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 1.3290
- Map: 0.1338
- Map 50: 0.2829
- Map 75: 0.1119
- Map Small: 0.0653
- Map Medium: 0.1961
- Map Large: -1.0
- Mar 1: 0.1889
- Mar 10: 0.3333
- Mar 100: 0.3815
- Mar Small: 0.38
- Mar Medium: 0.3824
- Mar Large: -1.0
- Map Background: -1.0
- Mar 100 Background: -1.0
- Map Score: 0.1338
- Mar 100 Score: 0.3815
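
The Map 50 and Map 75 figures above are COCO-style average precision at IoU thresholds of 0.50 and 0.75 respectively. As a minimal sketch of the intersection-over-union computation those thresholds are applied to (the box coordinates below are illustrative, not from this dataset):

```python
def iou(box_a, box_b):
    """Intersection over union of two (x1, y1, x2, y2) boxes."""
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)

# Overlap of ~0.538: a true positive under Map 50 (IoU >= 0.50)
# but a miss under the stricter Map 75 threshold (IoU >= 0.75).
print(round(iou((0, 0, 10, 10), (3, 0, 13, 10)), 3))
```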

## Model description

More information needed

## Intended uses & limitations

More information needed
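
No usage details were provided with this card. As a starting point, a fine-tuned YOLOS checkpoint can typically be loaded through the transformers object-detection pipeline; the repository id `SmallPS/Yolo_test` and the image path below are assumptions, not confirmed by the card:

```python
from transformers import pipeline

# Assumed repo id; replace with the actual location of this checkpoint.
detector = pipeline("object-detection", model="SmallPS/Yolo_test")

# "street.jpg" is a placeholder image path.
for det in detector("street.jpg"):
    print(det["label"], round(det["score"], 3), det["box"])
```

Note that this downloads the checkpoint on first use and requires `torch` and `Pillow` installed alongside `transformers`.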

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- num_epochs: 30
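
The `cosine` scheduler decays the learning rate from its initial value toward zero over the run. A minimal sketch of that decay in plain Python, ignoring any warmup phase (the 2790 total steps come from the results table: 30 epochs at 93 steps each):

```python
import math

def cosine_lr(step, total_steps, base_lr=5e-05):
    """Cosine decay from base_lr at step 0 down to 0 at total_steps (no warmup)."""
    progress = step / total_steps
    return base_lr * 0.5 * (1.0 + math.cos(math.pi * progress))

total = 30 * 93  # 30 epochs x 93 optimizer steps per epoch
print(cosine_lr(0, total))           # full base rate at the start
print(cosine_lr(total // 2, total))  # roughly half the base rate mid-run
print(cosine_lr(total, total))       # decayed to zero at the end
```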

### Training results

| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Background | Mar 100 Background | Map Score | Mar 100 Score |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| No log | 1.0 | 93 | 1.3889 | 0.0097 | 0.0437 | 0.0002 | 0.0124 | 0.0108 | -1.0 | 0.0259 | 0.1444 | 0.1852 | 0.19 | 0.1824 | -1.0 | -1.0 | -1.0 | 0.0097 | 0.1852 |
| No log | 2.0 | 186 | 1.3371 | 0.0379 | 0.1162 | 0.0014 | 0.0348 | 0.0454 | -1.0 | 0.0407 | 0.2037 | 0.2185 | 0.2 | 0.2294 | -1.0 | -1.0 | -1.0 | 0.0379 | 0.2185 |
| No log | 3.0 | 279 | 1.6085 | 0.0058 | 0.0317 | 0.0014 | 0.0017 | 0.0102 | -1.0 | 0.0222 | 0.1037 | 0.1037 | 0.02 | 0.1529 | -1.0 | -1.0 | -1.0 | 0.0058 | 0.1037 |
| No log | 4.0 | 372 | 1.2737 | 0.0119 | 0.0385 | 0.0021 | 0.0285 | 0.0056 | -1.0 | 0.0185 | 0.2074 | 0.2519 | 0.37 | 0.1824 | -1.0 | -1.0 | -1.0 | 0.0119 | 0.2519 |
| No log | 5.0 | 465 | 1.0391 | 0.0218 | 0.0505 | 0.0203 | 0.0556 | 0.0103 | -1.0 | 0.0296 | 0.3148 | 0.437 | 0.73 | 0.2647 | -1.0 | -1.0 | -1.0 | 0.0218 | 0.437 |
| 1.0243 | 6.0 | 558 | 1.2415 | 0.0105 | 0.0342 | 0.0022 | 0.0126 | 0.0123 | -1.0 | 0.0185 | 0.1444 | 0.2926 | 0.27 | 0.3059 | -1.0 | -1.0 | -1.0 | 0.0105 | 0.2926 |
| 1.0243 | 7.0 | 651 | 1.0791 | 0.0244 | 0.0722 | 0.0063 | 0.0412 | 0.0226 | -1.0 | 0.0481 | 0.3111 | 0.3741 | 0.48 | 0.3118 | -1.0 | -1.0 | -1.0 | 0.0244 | 0.3741 |
| 1.0243 | 8.0 | 744 | 1.1443 | 0.0388 | 0.1407 | 0.0012 | 0.0482 | 0.0422 | -1.0 | 0.0667 | 0.2815 | 0.3259 | 0.34 | 0.3176 | -1.0 | -1.0 | -1.0 | 0.0388 | 0.3259 |
| 1.0243 | 9.0 | 837 | 1.1221 | 0.0366 | 0.1054 | 0.0035 | 0.0414 | 0.0372 | -1.0 | 0.0407 | 0.2963 | 0.3815 | 0.44 | 0.3471 | -1.0 | -1.0 | -1.0 | 0.0366 | 0.3815 |
| 1.0243 | 10.0 | 930 | 1.1264 | 0.0445 | 0.1488 | 0.009 | 0.0899 | 0.0362 | -1.0 | 0.0963 | 0.3259 | 0.4074 | 0.48 | 0.3647 | -1.0 | -1.0 | -1.0 | 0.0445 | 0.4074 |
| 0.6487 | 11.0 | 1023 | 1.1333 | 0.0471 | 0.1532 | 0.0118 | 0.0442 | 0.0676 | -1.0 | 0.0593 | 0.2778 | 0.2889 | 0.24 | 0.3176 | -1.0 | -1.0 | -1.0 | 0.0471 | 0.2889 |
| 0.6487 | 12.0 | 1116 | 1.2594 | 0.0448 | 0.1664 | 0.0006 | 0.0413 | 0.0583 | -1.0 | 0.0667 | 0.2444 | 0.263 | 0.26 | 0.2647 | -1.0 | -1.0 | -1.0 | 0.0448 | 0.263 |
| 0.6487 | 13.0 | 1209 | 1.1627 | 0.033 | 0.1218 | 0.003 | 0.0487 | 0.0377 | -1.0 | 0.0333 | 0.3 | 0.3148 | 0.38 | 0.2765 | -1.0 | -1.0 | -1.0 | 0.033 | 0.3148 |
| 0.6487 | 14.0 | 1302 | 1.2219 | 0.0669 | 0.2662 | 0.014 | 0.0835 | 0.0711 | -1.0 | 0.1148 | 0.2889 | 0.3148 | 0.3 | 0.3235 | -1.0 | -1.0 | -1.0 | 0.0669 | 0.3148 |
| 0.6487 | 15.0 | 1395 | 1.1355 | 0.0994 | 0.2971 | 0.0248 | 0.1038 | 0.1244 | -1.0 | 0.1519 | 0.337 | 0.3741 | 0.35 | 0.3882 | -1.0 | -1.0 | -1.0 | 0.0994 | 0.3741 |
| 0.6487 | 16.0 | 1488 | 1.1675 | 0.1154 | 0.3476 | 0.026 | 0.1273 | 0.1285 | -1.0 | 0.1667 | 0.3 | 0.3444 | 0.34 | 0.3471 | -1.0 | -1.0 | -1.0 | 0.1154 | 0.3444 |
| 0.4946 | 17.0 | 1581 | 1.2258 | 0.0976 | 0.2925 | 0.0282 | 0.0668 | 0.1497 | -1.0 | 0.1444 | 0.3148 | 0.3704 | 0.33 | 0.3941 | -1.0 | -1.0 | -1.0 | 0.0976 | 0.3704 |
| 0.4946 | 18.0 | 1674 | 1.2367 | 0.1138 | 0.3025 | 0.045 | 0.0959 | 0.1453 | -1.0 | 0.1778 | 0.3481 | 0.4 | 0.43 | 0.3824 | -1.0 | -1.0 | -1.0 | 0.1138 | 0.4 |
| 0.4946 | 19.0 | 1767 | 1.2897 | 0.1208 | 0.3045 | 0.0764 | 0.0784 | 0.1481 | -1.0 | 0.1778 | 0.3296 | 0.3704 | 0.39 | 0.3588 | -1.0 | -1.0 | -1.0 | 0.1208 | 0.3704 |
| 0.4946 | 20.0 | 1860 | 1.2484 | 0.1321 | 0.3143 | 0.0969 | 0.092 | 0.1805 | -1.0 | 0.1926 | 0.3444 | 0.3889 | 0.38 | 0.3941 | -1.0 | -1.0 | -1.0 | 0.1321 | 0.3889 |
| 0.4946 | 21.0 | 1953 | 1.2827 | 0.1272 | 0.3247 | 0.1094 | 0.0998 | 0.1661 | -1.0 | 0.1926 | 0.3259 | 0.363 | 0.34 | 0.3765 | -1.0 | -1.0 | -1.0 | 0.1272 | 0.363 |
| 0.3004 | 22.0 | 2046 | 1.3022 | 0.1311 | 0.2813 | 0.1268 | 0.0728 | 0.1852 | -1.0 | 0.1852 | 0.3296 | 0.3778 | 0.37 | 0.3824 | -1.0 | -1.0 | -1.0 | 0.1311 | 0.3778 |
| 0.3004 | 23.0 | 2139 | 1.3196 | 0.1319 | 0.2916 | 0.127 | 0.0644 | 0.1938 | -1.0 | 0.1852 | 0.3222 | 0.3667 | 0.34 | 0.3824 | -1.0 | -1.0 | -1.0 | 0.1319 | 0.3667 |
| 0.3004 | 24.0 | 2232 | 1.3351 | 0.137 | 0.2932 | 0.1095 | 0.0663 | 0.1992 | -1.0 | 0.1852 | 0.3407 | 0.3889 | 0.37 | 0.4 | -1.0 | -1.0 | -1.0 | 0.137 | 0.3889 |
| 0.3004 | 25.0 | 2325 | 1.2901 | 0.1343 | 0.279 | 0.1256 | 0.0743 | 0.1898 | -1.0 | 0.1963 | 0.3444 | 0.3852 | 0.4 | 0.3765 | -1.0 | -1.0 | -1.0 | 0.1343 | 0.3852 |
| 0.3004 | 26.0 | 2418 | 1.3259 | 0.1361 | 0.2762 | 0.1092 | 0.0648 | 0.2014 | -1.0 | 0.1889 | 0.337 | 0.3889 | 0.39 | 0.3882 | -1.0 | -1.0 | -1.0 | 0.1361 | 0.3889 |
| 0.1909 | 27.0 | 2511 | 1.3338 | 0.1297 | 0.2823 | 0.1118 | 0.0651 | 0.1881 | -1.0 | 0.1852 | 0.3296 | 0.3778 | 0.38 | 0.3765 | -1.0 | -1.0 | -1.0 | 0.1297 | 0.3778 |
| 0.1909 | 28.0 | 2604 | 1.3253 | 0.1346 | 0.2831 | 0.1119 | 0.0676 | 0.1961 | -1.0 | 0.1889 | 0.337 | 0.3852 | 0.39 | 0.3824 | -1.0 | -1.0 | -1.0 | 0.1346 | 0.3852 |
| 0.1909 | 29.0 | 2697 | 1.3243 | 0.1338 | 0.2829 | 0.1119 | 0.0653 | 0.1961 | -1.0 | 0.1889 | 0.3333 | 0.3815 | 0.38 | 0.3824 | -1.0 | -1.0 | -1.0 | 0.1338 | 0.3815 |
| 0.1909 | 30.0 | 2790 | 1.3290 | 0.1338 | 0.2829 | 0.1119 | 0.0653 | 0.1961 | -1.0 | 0.1889 | 0.3333 | 0.3815 | 0.38 | 0.3824 | -1.0 | -1.0 | -1.0 | 0.1338 | 0.3815 |

### Framework versions

- Transformers 4.42.4
- Pytorch 2.3.1+cu121
- Datasets 2.20.0
- Tokenizers 0.19.1