# swinv2-tiny-patch4-window8-256-dmae-humeda-DAV17
This model is a fine-tuned version of [microsoft/swinv2-tiny-patch4-window8-256](https://huggingface.co/microsoft/swinv2-tiny-patch4-window8-256) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.8543
- Accuracy: 0.7885
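Since this card does not include usage code, here is a minimal inference sketch, assuming the checkpoint is publicly available on the Hugging Face Hub; the image path is a placeholder.

```python
from transformers import pipeline

# Minimal sketch: an image-classification pipeline over this checkpoint.
# "example.jpg" is a placeholder; the processor handles resizing to 256x256.
classifier = pipeline(
    "image-classification",
    model="RobertoSonic/swinv2-tiny-patch4-window8-256-dmae-humeda-DAV17",
)
print(classifier("example.jpg"))
```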
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a reproduction sketch in code follows the list):
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08 (no additional optimizer arguments)
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 42
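The training script itself is not published; the sketch below simply maps the hyperparameters above onto `transformers.TrainingArguments`. The `output_dir` is an assumption, and the dataset and model wiring into `Trainer` is omitted.

```python
from transformers import TrainingArguments

# Sketch only: reproduces the listed hyperparameters; output_dir is assumed.
training_args = TrainingArguments(
    output_dir="swinv2-tiny-patch4-window8-256-dmae-humeda-DAV17",
    learning_rate=5e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    gradient_accumulation_steps=4,  # 32 x 4 = total train batch size 128
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=42,
)
```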
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|---|---|---|---|---|
| No log | 0.9302 | 10 | 1.5864 | 0.2885 |
| 6.7245 | 1.9302 | 20 | 1.4466 | 0.4615 |
| 5.7484 | 2.9302 | 30 | 1.0303 | 0.5962 |
| 3.9879 | 3.9302 | 40 | 0.9820 | 0.5577 |
| 2.769 | 4.9302 | 50 | 0.8608 | 0.6538 |
| 2.3766 | 5.9302 | 60 | 0.8945 | 0.6923 |
| 2.3766 | 6.9302 | 70 | 0.7773 | 0.6346 |
| 1.9183 | 7.9302 | 80 | 0.8082 | 0.6346 |
| 1.4993 | 8.9302 | 90 | 0.9407 | 0.6923 |
| 1.3461 | 9.9302 | 100 | 0.9281 | 0.7500 |
| 1.2085 | 10.9302 | 110 | 0.7563 | 0.7692 |
| 0.8413 | 11.9302 | 120 | 0.9108 | 0.7308 |
| 0.8413 | 12.9302 | 130 | 0.8543 | 0.7885 |
| 0.9607 | 13.9302 | 140 | 1.2058 | 0.6731 |
| 0.837 | 14.9302 | 150 | 0.9733 | 0.7115 |
| 0.7641 | 15.9302 | 160 | 1.0169 | 0.6538 |
| 0.7997 | 16.9302 | 170 | 0.8486 | 0.7308 |
| 0.6171 | 17.9302 | 180 | 0.9551 | 0.7885 |
| 0.6171 | 18.9302 | 190 | 1.0267 | 0.7308 |
| 0.6755 | 19.9302 | 200 | 1.1810 | 0.6923 |
| 0.6393 | 20.9302 | 210 | 1.0516 | 0.7308 |
| 0.573 | 21.9302 | 220 | 1.1029 | 0.7115 |
| 0.4657 | 22.9302 | 230 | 1.0257 | 0.7885 |
| 0.4626 | 23.9302 | 240 | 1.2266 | 0.6923 |
| 0.4626 | 24.9302 | 250 | 1.3491 | 0.6538 |
| 0.4899 | 25.9302 | 260 | 1.2055 | 0.7692 |
| 0.3991 | 26.9302 | 270 | 1.1633 | 0.6923 |
| 0.3778 | 27.9302 | 280 | 1.1751 | 0.7308 |
| 0.443 | 28.9302 | 290 | 1.1727 | 0.7500 |
| 0.43 | 29.9302 | 300 | 1.3292 | 0.7115 |
| 0.43 | 30.9302 | 310 | 1.1873 | 0.7115 |
| 0.4425 | 31.9302 | 320 | 1.2326 | 0.6538 |
| 0.3098 | 32.9302 | 330 | 1.2379 | 0.7115 |
| 0.4086 | 33.9302 | 340 | 1.3020 | 0.6731 |
| 0.3046 | 34.9302 | 350 | 1.2686 | 0.7115 |
| 0.3503 | 35.9302 | 360 | 1.3006 | 0.6923 |
| 0.3503 | 36.9302 | 370 | 1.3207 | 0.6923 |
| 0.2985 | 37.9302 | 380 | 1.3626 | 0.7115 |
| 0.3445 | 38.9302 | 390 | 1.3689 | 0.7115 |
| 0.3017 | 39.9302 | 400 | 1.3523 | 0.7115 |
| 0.3446 | 40.9302 | 410 | 1.3447 | 0.7115 |
| 0.2799 | 41.9302 | 420 | 1.3395 | 0.7115 |
### Framework versions
- Transformers 4.47.1
- PyTorch 2.5.1+cu121
- Datasets 3.2.0
- Tokenizers 0.21.0