---
library_name: transformers
license: apache-2.0
base_model: microsoft/swinv2-tiny-patch4-window8-256
tags:
  - generated_from_trainer
datasets:
  - imagefolder
metrics:
  - accuracy
model-index:
  - name: swinv2-tiny-patch4-window8-256-finetuned-lf-invalidation
    results:
      - task:
          name: Image Classification
          type: image-classification
        dataset:
          name: imagefolder
          type: imagefolder
          config: default
          split: test
          args: default
        metrics:
          - name: Accuracy
            type: accuracy
            value: 0.676595744680851
---

# swinv2-tiny-patch4-window8-256-finetuned-lf-invalidation

This model is a fine-tuned version of [microsoft/swinv2-tiny-patch4-window8-256](https://huggingface.co/microsoft/swinv2-tiny-patch4-window8-256) on the imagefolder dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7236
  • Accuracy: 0.6766
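For reference, the inference path can be sketched offline as below. This builds a randomly initialized SwinV2 with the same tiny/patch-4/window-8/256-px geometry rather than downloading the fine-tuned weights, and it assumes a binary label space (`num_labels=2`), which this card does not document; for real predictions, load the published checkpoint with `Swinv2ForImageClassification.from_pretrained(...)` or the `image-classification` pipeline.

```python
# Offline sketch: same architecture family as swinv2-tiny-patch4-window8-256,
# but randomly initialized (no download). num_labels=2 is an assumption.
import torch
from transformers import Swinv2Config, Swinv2ForImageClassification

config = Swinv2Config(
    image_size=256,
    patch_size=4,
    window_size=8,
    embed_dim=96,
    depths=[2, 2, 6, 2],
    num_heads=[3, 6, 12, 24],
    num_labels=2,  # assumption: the "lf-invalidation" task is binary
)
model = Swinv2ForImageClassification(config)

# One dummy 256x256 RGB image; real inputs would come from AutoImageProcessor.
pixel_values = torch.zeros(1, 3, 256, 256)
with torch.no_grad():
    logits = model(pixel_values=pixel_values).logits
print(logits.shape)  # torch.Size([1, 2])
```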

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 64
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 50
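These values were presumably passed through the 🤗 `Trainer`; the optimization schedule they imply can be sketched in plain PyTorch. The 600 total steps below are taken from the results table (final step 600, i.e. about 12 optimizer steps per epoch over 50 epochs), and the `nn.Linear` model is only a stand-in for the SwinV2 backbone.

```python
# Plain-PyTorch sketch of the schedule above (the original run used the
# HF Trainer; nn.Linear is a placeholder for the SwinV2 backbone).
import torch
from transformers import get_linear_schedule_with_warmup

model = torch.nn.Linear(4, 2)
optimizer = torch.optim.Adam(
    model.parameters(), lr=5e-05, betas=(0.9, 0.999), eps=1e-08
)

num_training_steps = 600  # final step in the results table
num_warmup_steps = int(0.1 * num_training_steps)  # warmup_ratio 0.1 -> 60 steps
scheduler = get_linear_schedule_with_warmup(
    optimizer,
    num_warmup_steps=num_warmup_steps,
    num_training_steps=num_training_steps,
)

# The learning rate ramps linearly to 5e-05 over the 60 warmup steps,
# then decays linearly to 0 by step 600.
for _ in range(num_warmup_steps):
    optimizer.step()
    scheduler.step()
print(optimizer.param_groups[0]["lr"])  # 5e-05
```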

### Training results

| Training Loss | Epoch   | Step | Validation Loss | Accuracy |
|:-------------:|:-------:|:----:|:---------------:|:--------:|
| 0.5608        | 0.9796  | 12   | 0.6779          | 0.5532   |
| 0.5249        | 1.9592  | 24   | 0.5234          | 0.7617   |
| 0.4404        | 2.9388  | 36   | 0.5121          | 0.7766   |
| 0.3893        | 4.0     | 49   | 0.3981          | 0.8128   |
| 0.4083        | 4.9796  | 61   | 0.5870          | 0.6447   |
| 0.3725        | 5.9592  | 73   | 0.4991          | 0.7553   |
| 0.3909        | 6.9388  | 85   | 0.4062          | 0.8426   |
| 0.3799        | 8.0     | 98   | 0.5115          | 0.7574   |
| 0.3332        | 8.9796  | 110  | 0.4470          | 0.8277   |
| 0.3108        | 9.9592  | 122  | 0.3451          | 0.8681   |
| 0.308         | 10.9388 | 134  | 0.5822          | 0.7511   |
| 0.3699        | 12.0    | 147  | 0.4653          | 0.8106   |
| 0.2945        | 12.9796 | 159  | 0.4171          | 0.8426   |
| 0.2934        | 13.9592 | 171  | 0.4366          | 0.8234   |
| 0.2719        | 14.9388 | 183  | 0.5905          | 0.7638   |
| 0.3287        | 16.0    | 196  | 0.6654          | 0.7234   |
| 0.271         | 16.9796 | 208  | 0.6328          | 0.7447   |
| 0.3018        | 17.9592 | 220  | 0.4671          | 0.8255   |
| 0.2763        | 18.9388 | 232  | 0.6032          | 0.7468   |
| 0.2834        | 20.0    | 245  | 0.7016          | 0.7      |
| 0.2653        | 20.9796 | 257  | 0.4089          | 0.8468   |
| 0.2666        | 21.9592 | 269  | 0.7905          | 0.6447   |
| 0.2941        | 22.9388 | 281  | 0.6064          | 0.7553   |
| 0.2792        | 24.0    | 294  | 0.7444          | 0.7085   |
| 0.2019        | 24.9796 | 306  | 0.7595          | 0.7170   |
| 0.2552        | 25.9592 | 318  | 1.0296          | 0.5660   |
| 0.2451        | 26.9388 | 330  | 0.5999          | 0.7340   |
| 0.2126        | 28.0    | 343  | 0.5730          | 0.7660   |
| 0.2214        | 28.9796 | 355  | 0.9756          | 0.5809   |
| 0.2633        | 29.9592 | 367  | 0.4134          | 0.8404   |
| 0.2427        | 30.9388 | 379  | 0.8228          | 0.6362   |
| 0.2405        | 32.0    | 392  | 0.5279          | 0.7723   |
| 0.2078        | 32.9796 | 404  | 0.6581          | 0.6979   |
| 0.2201        | 33.9592 | 416  | 0.9132          | 0.5745   |
| 0.2481        | 34.9388 | 428  | 0.9526          | 0.5617   |
| 0.248         | 36.0    | 441  | 0.8979          | 0.5553   |
| 0.2209        | 36.9796 | 453  | 0.8351          | 0.5915   |
| 0.2253        | 37.9592 | 465  | 0.6744          | 0.6851   |
| 0.2447        | 38.9388 | 477  | 0.7794          | 0.6404   |
| 0.2049        | 40.0    | 490  | 0.6136          | 0.7468   |
| 0.1965        | 40.9796 | 502  | 0.6582          | 0.7340   |
| 0.2638        | 41.9592 | 514  | 0.7487          | 0.6766   |
| 0.2297        | 42.9388 | 526  | 0.7282          | 0.6702   |
| 0.2163        | 44.0    | 539  | 0.5713          | 0.7511   |
| 0.2016        | 44.9796 | 551  | 0.5994          | 0.7319   |
| 0.1739        | 45.9592 | 563  | 0.6865          | 0.6915   |
| 0.2497        | 46.9388 | 575  | 0.6901          | 0.6957   |
| 0.2293        | 48.0    | 588  | 0.7150          | 0.6851   |
| 0.2237        | 48.9796 | 600  | 0.7236          | 0.6766   |
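Validation accuracy peaks at 0.8681 around epoch 10 and then degrades, with the final checkpoint landing at 0.6766. Nothing in this card indicates checkpoint selection was used; the sketch below shows the bare best-checkpoint logic on a few rows taken from the table (the HF `Trainer` can do this automatically via `load_best_model_at_end=True` with `metric_for_best_model="accuracy"`). The `checkpoint-<step>` directory names are hypothetical.

```python
# Bare "keep the best checkpoint" logic, illustrated on three
# (accuracy, checkpoint) pairs taken from the results table above.
rows = [
    (0.5532, "checkpoint-12"),
    (0.8681, "checkpoint-122"),  # the peak, at epoch ~10
    (0.6766, "checkpoint-600"),  # the final (and reported) checkpoint
]

best_acc = float("-inf")
best_ckpt = None
for acc, ckpt in rows:
    if acc > best_acc:
        best_acc, best_ckpt = acc, ckpt

print(best_ckpt)  # checkpoint-122
```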

### Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.1+cu121
  • Datasets 3.2.0
  • Tokenizers 0.19.1