---
license: apache-2.0
base_model: hustvl/yolos-tiny
tags:
- generated_from_trainer
model-index:
- name: Yolo_test
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# Yolo_test

This model is a fine-tuned version of [hustvl/yolos-tiny](https://huggingface.co/hustvl/yolos-tiny) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.9194
- Map: 0.1092
- Map 50: 0.2705
- Map 75: 0.0385
- Map Small: 0.2009
- Map Medium: 0.0867
- Map Large: -1.0
- Mar 1: 0.1037
- Mar 10: 0.4296
- Mar 100: 0.5222
- Mar Small: 0.61
- Mar Medium: 0.4706
- Mar Large: -1.0
- Map Background: -1.0
- Mar 100 Background: -1.0
- Map Score: 0.1092
- Mar 100 Score: 0.5222

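The checkpoint can be loaded with the standard `transformers` object-detection classes (image processor plus model). The snippet below is a minimal sketch only: the repository id and the image path are placeholders, not values taken from this card.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

repo_id = "your-username/Yolo_test"  # placeholder: replace with the actual Hub repo id
image_processor = AutoImageProcessor.from_pretrained(repo_id)
model = AutoModelForObjectDetection.from_pretrained(repo_id)

image = Image.open("example.jpg")  # placeholder image path
inputs = image_processor(images=image, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Convert raw logits and boxes into (label, score, box) detections in pixel coordinates.
target_sizes = torch.tensor([image.size[::-1]])  # (height, width)
detections = image_processor.post_process_object_detection(
    outputs, threshold=0.3, target_sizes=target_sizes
)[0]

for score, label, box in zip(detections["scores"], detections["labels"], detections["boxes"]):
    print(model.config.id2label[label.item()], round(score.item(), 3), box.tolist())
```
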
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- num_epochs: 30
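
These values map onto `transformers` `TrainingArguments` roughly as sketched below; the output directory, evaluation strategy, and column handling are assumptions, not settings recorded in this card.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="Yolo_test",        # assumed output directory
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine",
    num_train_epochs=30,
    eval_strategy="epoch",         # assumption: the results table reports one evaluation per epoch
    remove_unused_columns=False,   # commonly needed so pixel_values/labels reach an object-detection model
)
```
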
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Background | Mar 100 Background | Map Score | Mar 100 Score |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:----------------:|:--------------------:|:---------:|:-------------:|
| No log | 1.0 | 93 | 1.2349 | 0.0104 | 0.0497 | 0.0004 | 0.0266 | 0.0053 | -1.0 | 0.0333 | 0.1741 | 0.2556 | 0.44 | 0.1471 | -1.0 | -1.0 | -1.0 | 0.0104 | 0.2556 |
| No log | 2.0 | 186 | 1.3555 | 0.0071 | 0.038 | 0.001 | 0.0205 | 0.0032 | -1.0 | 0.0037 | 0.1444 | 0.1704 | 0.24 | 0.1294 | -1.0 | -1.0 | -1.0 | 0.0071 | 0.1704 |
| No log | 3.0 | 279 | 1.3345 | 0.0224 | 0.0669 | 0.0048 | 0.0557 | 0.0124 | -1.0 | 0.063 | 0.1926 | 0.2148 | 0.29 | 0.1706 | -1.0 | -1.0 | -1.0 | 0.0224 | 0.2148 |
| No log | 4.0 | 372 | 1.1534 | 0.0709 | 0.2465 | 0.0068 | 0.1562 | 0.0233 | -1.0 | 0.1 | 0.3111 | 0.3259 | 0.46 | 0.2471 | -1.0 | -1.0 | -1.0 | 0.0709 | 0.3259 |
| No log | 5.0 | 465 | 1.0287 | 0.0536 | 0.1696 | 0.0025 | 0.1018 | 0.0286 | -1.0 | 0.063 | 0.3259 | 0.3444 | 0.37 | 0.3294 | -1.0 | -1.0 | -1.0 | 0.0536 | 0.3444 |
| 1.2502 | 6.0 | 558 | 1.0684 | 0.0381 | 0.1233 | 0.0015 | 0.0979 | 0.0184 | -1.0 | 0.0593 | 0.2889 | 0.3556 | 0.44 | 0.3059 | -1.0 | -1.0 | -1.0 | 0.0381 | 0.3556 |
| 1.2502 | 7.0 | 651 | 1.0613 | 0.0252 | 0.0884 | 0.0024 | 0.0409 | 0.0208 | -1.0 | 0.0407 | 0.2593 | 0.3407 | 0.35 | 0.3353 | -1.0 | -1.0 | -1.0 | 0.0252 | 0.3407 |
| 1.2502 | 8.0 | 744 | 0.9724 | 0.0602 | 0.2019 | 0.008 | 0.1258 | 0.034 | -1.0 | 0.1407 | 0.3704 | 0.4296 | 0.5 | 0.3882 | -1.0 | -1.0 | -1.0 | 0.0602 | 0.4296 |
| 1.2502 | 9.0 | 837 | 1.0170 | 0.055 | 0.1548 | 0.0077 | 0.0627 | 0.0556 | -1.0 | 0.0667 | 0.3889 | 0.3889 | 0.41 | 0.3765 | -1.0 | -1.0 | -1.0 | 0.055 | 0.3889 |
| 1.2502 | 10.0 | 930 | 1.1431 | 0.0324 | 0.1301 | 0.0025 | 0.0532 | 0.0281 | -1.0 | 0.063 | 0.2741 | 0.3037 | 0.32 | 0.2941 | -1.0 | -1.0 | -1.0 | 0.0324 | 0.3037 |
| 0.8688 | 11.0 | 1023 | 0.9432 | 0.0511 | 0.1614 | 0.0152 | 0.1059 | 0.0394 | -1.0 | 0.0333 | 0.3407 | 0.4481 | 0.49 | 0.4235 | -1.0 | -1.0 | -1.0 | 0.0511 | 0.4481 |
| 0.8688 | 12.0 | 1116 | 0.9395 | 0.0658 | 0.1824 | 0.0115 | 0.1559 | 0.0409 | -1.0 | 0.0815 | 0.3741 | 0.4333 | 0.49 | 0.4 | -1.0 | -1.0 | -1.0 | 0.0658 | 0.4333 |
| 0.8688 | 13.0 | 1209 | 1.0367 | 0.0554 | 0.1926 | 0.0138 | 0.123 | 0.0377 | -1.0 | 0.1074 | 0.3 | 0.3667 | 0.47 | 0.3059 | -1.0 | -1.0 | -1.0 | 0.0554 | 0.3667 |
| 0.8688 | 14.0 | 1302 | 1.0167 | 0.0459 | 0.1409 | 0.0077 | 0.0799 | 0.0452 | -1.0 | 0.0444 | 0.3 | 0.4 | 0.46 | 0.3647 | -1.0 | -1.0 | -1.0 | 0.0459 | 0.4 |
| 0.8688 | 15.0 | 1395 | 1.0687 | 0.0403 | 0.126 | 0.005 | 0.0494 | 0.0472 | -1.0 | 0.0778 | 0.2741 | 0.3444 | 0.48 | 0.2647 | -1.0 | -1.0 | -1.0 | 0.0403 | 0.3444 |
| 0.8688 | 16.0 | 1488 | 0.9219 | 0.1075 | 0.2652 | 0.0661 | 0.233 | 0.0563 | -1.0 | 0.1185 | 0.3741 | 0.4889 | 0.58 | 0.4353 | -1.0 | -1.0 | -1.0 | 0.1075 | 0.4889 |
| 0.7025 | 17.0 | 1581 | 0.9246 | 0.0706 | 0.2286 | 0.0293 | 0.2273 | 0.0339 | -1.0 | 0.1111 | 0.3778 | 0.4444 | 0.58 | 0.3647 | -1.0 | -1.0 | -1.0 | 0.0706 | 0.4444 |
| 0.7025 | 18.0 | 1674 | 0.9737 | 0.0638 | 0.1656 | 0.0298 | 0.1258 | 0.0524 | -1.0 | 0.0778 | 0.4111 | 0.463 | 0.55 | 0.4118 | -1.0 | -1.0 | -1.0 | 0.0638 | 0.463 |
| 0.7025 | 19.0 | 1767 | 0.9821 | 0.0687 | 0.2266 | 0.0189 | 0.1069 | 0.0594 | -1.0 | 0.0444 | 0.3704 | 0.4259 | 0.46 | 0.4059 | -1.0 | -1.0 | -1.0 | 0.0687 | 0.4259 |
| 0.7025 | 20.0 | 1860 | 0.9198 | 0.0966 | 0.3077 | 0.0152 | 0.212 | 0.058 | -1.0 | 0.1519 | 0.3704 | 0.4778 | 0.59 | 0.4118 | -1.0 | -1.0 | -1.0 | 0.0966 | 0.4778 |
| 0.7025 | 21.0 | 1953 | 0.9468 | 0.087 | 0.2415 | 0.0305 | 0.1982 | 0.0708 | -1.0 | 0.163 | 0.3778 | 0.4667 | 0.51 | 0.4412 | -1.0 | -1.0 | -1.0 | 0.087 | 0.4667 |
| 0.5994 | 22.0 | 2046 | 0.8969 | 0.1079 | 0.2414 | 0.0639 | 0.2735 | 0.0765 | -1.0 | 0.1444 | 0.4296 | 0.5407 | 0.64 | 0.4824 | -1.0 | -1.0 | -1.0 | 0.1079 | 0.5407 |
| 0.5994 | 23.0 | 2139 | 0.9532 | 0.0974 | 0.233 | 0.0462 | 0.249 | 0.0641 | -1.0 | 0.163 | 0.3963 | 0.5111 | 0.62 | 0.4471 | -1.0 | -1.0 | -1.0 | 0.0974 | 0.5111 |
| 0.5994 | 24.0 | 2232 | 0.9191 | 0.104 | 0.2925 | 0.0506 | 0.2068 | 0.0671 | -1.0 | 0.1148 | 0.4 | 0.4926 | 0.62 | 0.4176 | -1.0 | -1.0 | -1.0 | 0.104 | 0.4926 |
| 0.5994 | 25.0 | 2325 | 0.8977 | 0.1095 | 0.3031 | 0.0364 | 0.1874 | 0.0843 | -1.0 | 0.1148 | 0.437 | 0.5296 | 0.63 | 0.4706 | -1.0 | -1.0 | -1.0 | 0.1095 | 0.5296 |
| 0.5994 | 26.0 | 2418 | 0.9240 | 0.0999 | 0.2688 | 0.0423 | 0.1689 | 0.0872 | -1.0 | 0.0963 | 0.4222 | 0.5185 | 0.59 | 0.4765 | -1.0 | -1.0 | -1.0 | 0.0999 | 0.5185 |
| 0.526 | 27.0 | 2511 | 0.9139 | 0.1091 | 0.2691 | 0.038 | 0.2088 | 0.0872 | -1.0 | 0.1259 | 0.4222 | 0.5185 | 0.6 | 0.4706 | -1.0 | -1.0 | -1.0 | 0.1091 | 0.5185 |
| 0.526 | 28.0 | 2604 | 0.9143 | 0.1058 | 0.2494 | 0.0547 | 0.2091 | 0.0827 | -1.0 | 0.1037 | 0.4296 | 0.5296 | 0.62 | 0.4765 | -1.0 | -1.0 | -1.0 | 0.1058 | 0.5296 |
| 0.526 | 29.0 | 2697 | 0.9166 | 0.1103 | 0.2704 | 0.037 | 0.2046 | 0.0867 | -1.0 | 0.1074 | 0.4259 | 0.5259 | 0.61 | 0.4765 | -1.0 | -1.0 | -1.0 | 0.1103 | 0.5259 |
| 0.526 | 30.0 | 2790 | 0.9194 | 0.1092 | 0.2705 | 0.0385 | 0.2009 | 0.0867 | -1.0 | 0.1037 | 0.4296 | 0.5222 | 0.61 | 0.4706 | -1.0 | -1.0 | -1.0 | 0.1092 | 0.5222 |
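
The metric names in the tables above (Map, Map 50, Map 75, Mar 1/10/100, and the Small/Medium/Large breakdowns) follow the COCO-style mean-average-precision convention, as reported for example by `torchmetrics`' `MeanAveragePrecision`; a value of -1.0 conventionally means no ground-truth instances fell into that bucket (here: no large objects and no background boxes). A minimal sketch with dummy boxes, purely to illustrate the input format and the matching result keys:

```python
import torch
from torchmetrics.detection import MeanAveragePrecision

# Dummy prediction/target pair; boxes are [xmin, ymin, xmax, ymax] in pixels.
preds = [{
    "boxes": torch.tensor([[10.0, 10.0, 60.0, 60.0]]),
    "scores": torch.tensor([0.8]),
    "labels": torch.tensor([0]),
}]
targets = [{
    "boxes": torch.tensor([[12.0, 8.0, 58.0, 62.0]]),
    "labels": torch.tensor([0]),
}]

metric = MeanAveragePrecision()
metric.update(preds, targets)
scores = metric.compute()
print(scores["map"], scores["map_50"], scores["mar_100"])  # keys mirror the table columns
```
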
### Framework versions

- Transformers 4.42.4
- PyTorch 2.3.1+cu121
- Datasets 2.20.0
- Tokenizers 0.19.1