swinv2-tiny-patch4-window8-256-dmae-humeda-DAV9

This model is a fine-tuned version of microsoft/swinv2-tiny-patch4-window8-256 on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 1.2286
  • Accuracy: 0.5192

Model description

More information needed

Intended uses & limitations

More information needed
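A minimal inference sketch for this checkpoint, assuming it is used as a standard image-classification model via the Transformers auto classes (`example.jpg` is a placeholder input path, not part of this card):

```python
# Sketch: load the fine-tuned checkpoint and classify one image.
# Assumptions: the model is hosted under this repo id on the Hub,
# and "example.jpg" is a placeholder for your own input image.
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

model_id = "RobertoSonic/swinv2-tiny-patch4-window8-256-dmae-humeda-DAV9"
processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)
model.eval()

image = Image.open("example.jpg").convert("RGB")  # placeholder image
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
predicted = logits.argmax(-1).item()
print(model.config.id2label[predicted])
```

The processor resizes and normalizes the image to the 256x256 input the SwinV2 backbone expects.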

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 3e-05
  • train_batch_size: 64
  • eval_batch_size: 32
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 256
  • optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: cosine
  • lr_scheduler_warmup_ratio: 0.2
  • num_epochs: 30
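The hyperparameters above imply an effective batch size of 64 x 4 = 256 and, with 90 optimizer steps total (30 epochs x 3 steps/epoch, per the table below), a warmup of 18 steps. A small sketch of the resulting schedule, assuming the standard linear-warmup + cosine-decay shape used by the Trainer's cosine scheduler:

```python
# Sketch of the learning-rate schedule implied by the card's hyperparameters.
# Assumption: linear warmup followed by cosine decay to zero, as in
# transformers' cosine schedule; step counts are taken from this card.
import math

learning_rate = 3e-5
total_steps = 90                        # 30 epochs x 3 optimizer steps/epoch
warmup_steps = int(0.2 * total_steps)   # warmup_ratio 0.2 -> 18 steps
effective_batch = 64 * 4                # per-device batch x grad accumulation = 256

def lr_at(step: int) -> float:
    """Learning rate at a given optimizer step."""
    if step < warmup_steps:
        return learning_rate * step / max(1, warmup_steps)
    progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    return learning_rate * 0.5 * (1.0 + math.cos(math.pi * progress))
```

The peak rate of 3e-05 is reached at step 18 and decays to zero by step 90.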

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log        | 1.0   | 3    | 1.5944          | 0.2308   |
| No log        | 2.0   | 6    | 1.5511          | 0.2692   |
| No log        | 3.0   | 9    | 1.4915          | 0.3462   |
| 6.2974        | 4.0   | 12   | 1.4388          | 0.4615   |
| 6.2974        | 5.0   | 15   | 1.3927          | 0.4615   |
| 6.2974        | 6.0   | 18   | 1.3394          | 0.4423   |
| 5.3611        | 7.0   | 21   | 1.3108          | 0.4423   |
| 5.3611        | 8.0   | 24   | 1.3680          | 0.3462   |
| 5.3611        | 9.0   | 27   | 1.2718          | 0.4038   |
| 3.7205        | 10.0  | 30   | 1.2679          | 0.4231   |
| 3.7205        | 11.0  | 33   | 1.3010          | 0.4038   |
| 3.7205        | 12.0  | 36   | 1.2598          | 0.4231   |
| 3.7205        | 13.0  | 39   | 1.2016          | 0.4231   |
| 2.8178        | 14.0  | 42   | 1.1934          | 0.4423   |
| 2.8178        | 15.0  | 45   | 1.1842          | 0.4808   |
| 2.8178        | 16.0  | 48   | 1.1539          | 0.5      |
| 2.4001        | 17.0  | 51   | 1.1308          | 0.4808   |
| 2.4001        | 18.0  | 54   | 1.2173          | 0.4615   |
| 2.4001        | 19.0  | 57   | 1.1670          | 0.5      |
| 2.081         | 20.0  | 60   | 1.1792          | 0.5      |
| 2.081         | 21.0  | 63   | 1.2286          | 0.5192   |
| 2.081         | 22.0  | 66   | 1.2633          | 0.5      |
| 2.081         | 23.0  | 69   | 1.2380          | 0.5      |
| 1.8588        | 24.0  | 72   | 1.2498          | 0.4808   |
| 1.8588        | 25.0  | 75   | 1.2591          | 0.5      |
| 1.8588        | 26.0  | 78   | 1.2653          | 0.5      |
| 1.7634        | 27.0  | 81   | 1.2599          | 0.5      |
| 1.7634        | 28.0  | 84   | 1.2549          | 0.5      |
| 1.7634        | 29.0  | 87   | 1.2545          | 0.5192   |
| 1.8177        | 30.0  | 90   | 1.2547          | 0.5192   |
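The headline metrics (loss 1.2286, accuracy 0.5192) correspond to the epoch-21 row: three epochs tie at the peak accuracy, and epoch 21 has the lowest validation loss among them. A quick check against the logged values:

```python
# Sketch: pick the best checkpoint from the validation log above
# (tuples are (epoch, validation_loss, accuracy), copied from the table).
results = [
    (1, 1.5944, 0.2308), (2, 1.5511, 0.2692), (3, 1.4915, 0.3462),
    (4, 1.4388, 0.4615), (5, 1.3927, 0.4615), (6, 1.3394, 0.4423),
    (7, 1.3108, 0.4423), (8, 1.3680, 0.3462), (9, 1.2718, 0.4038),
    (10, 1.2679, 0.4231), (11, 1.3010, 0.4038), (12, 1.2598, 0.4231),
    (13, 1.2016, 0.4231), (14, 1.1934, 0.4423), (15, 1.1842, 0.4808),
    (16, 1.1539, 0.5), (17, 1.1308, 0.4808), (18, 1.2173, 0.4615),
    (19, 1.1670, 0.5), (20, 1.1792, 0.5), (21, 1.2286, 0.5192),
    (22, 1.2633, 0.5), (23, 1.2380, 0.5), (24, 1.2498, 0.4808),
    (25, 1.2591, 0.5), (26, 1.2653, 0.5), (27, 1.2599, 0.5),
    (28, 1.2549, 0.5), (29, 1.2545, 0.5192), (30, 1.2547, 0.5192),
]

# Highest accuracy first; break ties by lower validation loss.
best = max(results, key=lambda r: (r[2], -r[1]))
print(best)  # -> (21, 1.2286, 0.5192)
```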

Framework versions

  • Transformers 4.47.1
  • Pytorch 2.5.1+cu121
  • Datasets 3.2.0
  • Tokenizers 0.21.0