# vit-base-patch16-224-in21k-finetuned-breast-composition
This model is a fine-tuned version of google/vit-base-patch16-224-in21k on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.4735
- Accuracy: 0.8057
- F1: 0.8046
- Precision: 0.8053
- Recall: 0.8057
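Note that Accuracy and Recall coincide (0.8057) here and in every row of the training table below. This is expected when recall is computed as a support-weighted average over classes, since the per-class true positives then sum back to the overall accuracy. A minimal pure-Python sketch (toy 4-class labels, chosen for illustration only) demonstrates the identity:

```python
from collections import Counter

def accuracy(y_true, y_pred):
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

def weighted_recall(y_true, y_pred):
    # Per-class recall weighted by class support, as in
    # recall_score(average="weighted") from scikit-learn.
    support = Counter(y_true)
    n_total = len(y_true)
    total = 0.0
    for cls, n in support.items():
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == cls and p == cls)
        total += (n / n_total) * (tp / n)
    return total

# Toy labels with 4 classes (breast composition is typically 4 categories)
y_true = [0, 0, 1, 1, 2, 2, 3, 3, 3]
y_pred = [0, 1, 1, 1, 2, 0, 3, 3, 2]
assert abs(accuracy(y_true, y_pred) - weighted_recall(y_true, y_pred)) < 1e-12
```

Algebraically, the weighted recall is Σ_c (n_c/N)(TP_c/n_c) = Σ_c TP_c / N, which is exactly the accuracy, so the two columns are redundant under this averaging scheme.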
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine_with_restarts
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 12
- mixed_precision_training: Native AMP
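The listed `total_train_batch_size` of 64 is not set directly; it follows from the per-device batch size times the gradient-accumulation steps. Likewise, `lr_scheduler_warmup_ratio` is converted to a step count only once the total number of optimizer steps is known. A small sketch of that arithmetic, with the steps-per-epoch figure inferred from the training table below (step 100 ≈ epoch 0.1062, i.e. ~941 steps/epoch):

```python
train_batch_size = 16
gradient_accumulation_steps = 4

# Effective batch size per optimizer step
total_train_batch_size = train_batch_size * gradient_accumulation_steps
assert total_train_batch_size == 64  # matches the value listed above

# Warmup length derived from the ratio; steps/epoch inferred from the
# training table (step 100 corresponds to epoch 0.1062)
num_epochs = 12
warmup_ratio = 0.1
steps_per_epoch = 941
total_steps = num_epochs * steps_per_epoch
warmup_steps = int(warmup_ratio * total_steps)
```

With these figures the scheduler warms up for roughly the first 1,100 of ~11,300 optimizer steps, which is consistent with the loss curve flattening out after epoch 1 in the table below.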
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | Precision | Recall |
|---|---|---|---|---|---|---|---|
| 1.3733 | 0.1062 | 100 | 1.3716 | 0.3608 | 0.2980 | 0.3783 | 0.3608 |
| 1.3575 | 0.2125 | 200 | 1.3537 | 0.4026 | 0.3205 | 0.3988 | 0.4026 |
| 1.3262 | 0.3187 | 300 | 1.3186 | 0.4256 | 0.3125 | 0.4184 | 0.4256 |
| 1.2714 | 0.4250 | 400 | 1.2666 | 0.4562 | 0.3528 | 0.6333 | 0.4562 |
| 1.212 | 0.5312 | 500 | 1.2172 | 0.5161 | 0.4304 | 0.6528 | 0.5161 |
| 1.1533 | 0.6375 | 600 | 1.1610 | 0.5655 | 0.4831 | 0.6747 | 0.5655 |
| 1.0462 | 0.7437 | 700 | 1.0505 | 0.6136 | 0.5278 | 0.7080 | 0.6136 |
| 0.9731 | 0.8499 | 800 | 0.9478 | 0.6327 | 0.5451 | 0.7219 | 0.6327 |
| 0.9101 | 0.9562 | 900 | 0.8763 | 0.6438 | 0.5551 | 0.7303 | 0.6438 |
| 0.8524 | 1.0624 | 1000 | 0.8397 | 0.6431 | 0.5549 | 0.7130 | 0.6431 |
| 0.8274 | 1.1687 | 1100 | 0.7888 | 0.6571 | 0.5795 | 0.7088 | 0.6571 |
| 0.7976 | 1.2749 | 1200 | 0.7741 | 0.6578 | 0.5901 | 0.7111 | 0.6578 |
| 0.7717 | 1.3811 | 1300 | 0.7198 | 0.6857 | 0.6385 | 0.7203 | 0.6857 |
| 0.7298 | 1.4874 | 1400 | 0.6948 | 0.7015 | 0.6731 | 0.7208 | 0.7015 |
| 0.7192 | 1.5936 | 1500 | 0.6792 | 0.7124 | 0.6873 | 0.7297 | 0.7124 |
| 0.7003 | 1.6999 | 1600 | 0.6539 | 0.7266 | 0.7109 | 0.7354 | 0.7266 |
| 0.69 | 1.8061 | 1700 | 0.6349 | 0.7422 | 0.7326 | 0.7444 | 0.7422 |
| 0.6759 | 1.9124 | 1800 | 0.6228 | 0.7478 | 0.7404 | 0.7505 | 0.7478 |
| 0.6689 | 2.0186 | 1900 | 0.6154 | 0.7474 | 0.7429 | 0.7473 | 0.7474 |
| 0.6505 | 2.1248 | 2000 | 0.6064 | 0.7557 | 0.7542 | 0.7573 | 0.7557 |
| 0.6602 | 2.2311 | 2100 | 0.5888 | 0.7646 | 0.7615 | 0.7657 | 0.7646 |
| 0.6498 | 2.3373 | 2200 | 0.5815 | 0.7653 | 0.7622 | 0.7643 | 0.7653 |
| 0.6484 | 2.4436 | 2300 | 0.5692 | 0.7716 | 0.7684 | 0.7715 | 0.7716 |
| 0.6366 | 2.5498 | 2400 | 0.5698 | 0.7695 | 0.7670 | 0.7703 | 0.7695 |
| 0.6304 | 2.6560 | 2500 | 0.5651 | 0.7705 | 0.7668 | 0.7702 | 0.7705 |
| 0.5906 | 2.7623 | 2600 | 0.5663 | 0.7691 | 0.7628 | 0.7719 | 0.7691 |
| 0.607 | 2.8685 | 2700 | 0.5563 | 0.7733 | 0.7699 | 0.7737 | 0.7733 |
| 0.6211 | 2.9748 | 2800 | 0.5513 | 0.7745 | 0.7709 | 0.7753 | 0.7745 |
| 0.6043 | 3.0810 | 2900 | 0.5478 | 0.7760 | 0.7735 | 0.7761 | 0.7760 |
| 0.5905 | 3.1873 | 3000 | 0.5439 | 0.7778 | 0.7765 | 0.7771 | 0.7778 |
| 0.6309 | 3.2935 | 3100 | 0.5399 | 0.7774 | 0.7749 | 0.7770 | 0.7774 |
| 0.5975 | 3.3997 | 3200 | 0.5431 | 0.7792 | 0.7778 | 0.7786 | 0.7792 |
| 0.6175 | 3.5060 | 3300 | 0.5340 | 0.7813 | 0.7779 | 0.7829 | 0.7813 |
| 0.588 | 3.6122 | 3400 | 0.5293 | 0.7844 | 0.7817 | 0.7850 | 0.7844 |
| 0.5981 | 3.7185 | 3500 | 0.5265 | 0.7849 | 0.7832 | 0.7844 | 0.7849 |
| 0.5926 | 3.8247 | 3600 | 0.5335 | 0.7784 | 0.7732 | 0.7807 | 0.7784 |
| 0.5945 | 3.9309 | 3700 | 0.5249 | 0.7853 | 0.7821 | 0.7866 | 0.7853 |
| 0.5956 | 4.0372 | 3800 | 0.5325 | 0.7829 | 0.7788 | 0.7845 | 0.7829 |
| 0.5943 | 4.1434 | 3900 | 0.5295 | 0.7860 | 0.7837 | 0.7860 | 0.7860 |
| 0.5842 | 4.2497 | 4000 | 0.5227 | 0.7848 | 0.7813 | 0.7857 | 0.7848 |
| 0.5666 | 4.3559 | 4100 | 0.5187 | 0.7864 | 0.7833 | 0.7881 | 0.7864 |
| 0.5762 | 4.4622 | 4200 | 0.5179 | 0.7876 | 0.7859 | 0.7889 | 0.7876 |
| 0.595 | 4.5684 | 4300 | 0.5111 | 0.7909 | 0.7898 | 0.7902 | 0.7909 |
| 0.5641 | 4.6746 | 4400 | 0.5151 | 0.7888 | 0.7874 | 0.7890 | 0.7888 |
| 0.5743 | 4.7809 | 4500 | 0.5113 | 0.7894 | 0.7883 | 0.7907 | 0.7894 |
| 0.564 | 4.8871 | 4600 | 0.5075 | 0.7919 | 0.7902 | 0.7933 | 0.7919 |
| 0.578 | 4.9934 | 4700 | 0.5029 | 0.7921 | 0.7905 | 0.7918 | 0.7921 |
| 0.5643 | 5.0996 | 4800 | 0.5042 | 0.7931 | 0.7909 | 0.7945 | 0.7931 |
| 0.5611 | 5.2058 | 4900 | 0.5012 | 0.7940 | 0.7909 | 0.7959 | 0.7940 |
| 0.5736 | 5.3121 | 5000 | 0.5133 | 0.7864 | 0.7812 | 0.7919 | 0.7864 |
| 0.5635 | 5.4183 | 5100 | 0.5034 | 0.7947 | 0.7939 | 0.7950 | 0.7947 |
| 0.5653 | 5.5246 | 5200 | 0.4981 | 0.7966 | 0.7944 | 0.7982 | 0.7966 |
| 0.5664 | 5.6308 | 5300 | 0.4959 | 0.7951 | 0.7932 | 0.7950 | 0.7951 |
| 0.5689 | 5.7371 | 5400 | 0.4946 | 0.7972 | 0.7956 | 0.7969 | 0.7972 |
| 0.5394 | 5.8433 | 5500 | 0.5022 | 0.7928 | 0.7877 | 0.7979 | 0.7928 |
| 0.5645 | 5.9495 | 5600 | 0.4931 | 0.7965 | 0.7944 | 0.7974 | 0.7965 |
| 0.5588 | 6.0558 | 5700 | 0.4895 | 0.7990 | 0.7975 | 0.7990 | 0.7990 |
| 0.5539 | 6.1620 | 5800 | 0.4874 | 0.8008 | 0.7992 | 0.8010 | 0.8008 |
| 0.5504 | 6.2683 | 5900 | 0.4945 | 0.7970 | 0.7941 | 0.7996 | 0.7970 |
| 0.5683 | 6.3745 | 6000 | 0.4883 | 0.7985 | 0.7969 | 0.7983 | 0.7985 |
| 0.5594 | 6.4807 | 6100 | 0.4883 | 0.7985 | 0.7976 | 0.7980 | 0.7985 |
| 0.5709 | 6.5870 | 6200 | 0.4883 | 0.7976 | 0.7959 | 0.7983 | 0.7976 |
| 0.553 | 6.6932 | 6300 | 0.4907 | 0.7954 | 0.7931 | 0.7964 | 0.7954 |
| 0.5515 | 6.7995 | 6400 | 0.4893 | 0.7971 | 0.7945 | 0.7982 | 0.7971 |
| 0.5501 | 6.9057 | 6500 | 0.4821 | 0.7982 | 0.7969 | 0.7978 | 0.7982 |
| 0.5567 | 7.0120 | 6600 | 0.4851 | 0.7985 | 0.7955 | 0.8005 | 0.7985 |
| 0.5387 | 7.1182 | 6700 | 0.4808 | 0.8012 | 0.7995 | 0.8014 | 0.8012 |
| 0.5257 | 7.2244 | 6800 | 0.4795 | 0.8020 | 0.8008 | 0.8018 | 0.8020 |
| 0.5591 | 7.3307 | 6900 | 0.4809 | 0.8010 | 0.7980 | 0.8030 | 0.8010 |
| 0.5513 | 7.4369 | 7000 | 0.4747 | 0.8047 | 0.8033 | 0.8048 | 0.8047 |
| 0.5636 | 7.5432 | 7100 | 0.4760 | 0.8031 | 0.8018 | 0.8031 | 0.8031 |
| 0.5491 | 7.6494 | 7200 | 0.4787 | 0.8004 | 0.7981 | 0.8014 | 0.8004 |
| 0.5504 | 7.7556 | 7300 | 0.4745 | 0.8041 | 0.8026 | 0.8043 | 0.8041 |
| 0.5393 | 7.8619 | 7400 | 0.4829 | 0.7983 | 0.7950 | 0.8006 | 0.7983 |
| 0.543 | 7.9681 | 7500 | 0.4744 | 0.8047 | 0.8036 | 0.8046 | 0.8047 |
| 0.5308 | 8.0744 | 7600 | 0.4786 | 0.8018 | 0.7999 | 0.8022 | 0.8018 |
| 0.5341 | 8.1806 | 7700 | 0.4740 | 0.8037 | 0.8024 | 0.8036 | 0.8037 |
| 0.5351 | 8.2869 | 7800 | 0.4762 | 0.8035 | 0.8015 | 0.8043 | 0.8035 |
| 0.5309 | 8.3931 | 7900 | 0.4754 | 0.8053 | 0.8042 | 0.8051 | 0.8053 |
| 0.5409 | 8.4993 | 8000 | 0.4761 | 0.8036 | 0.8014 | 0.8049 | 0.8036 |
| 0.5368 | 8.6056 | 8100 | 0.4722 | 0.8052 | 0.8038 | 0.8051 | 0.8052 |
| 0.5276 | 8.7118 | 8200 | 0.4735 | 0.8057 | 0.8046 | 0.8053 | 0.8057 |
| 0.5388 | 8.8181 | 8300 | 0.4766 | 0.8020 | 0.8002 | 0.8022 | 0.8020 |
| 0.5368 | 8.9243 | 8400 | 0.4742 | 0.8037 | 0.8021 | 0.8036 | 0.8037 |
| 0.5574 | 9.0305 | 8500 | 0.4762 | 0.8034 | 0.8011 | 0.8047 | 0.8034 |
| 0.5562 | 9.1368 | 8600 | 0.4735 | 0.8034 | 0.8011 | 0.8046 | 0.8034 |
| 0.5577 | 9.2430 | 8700 | 0.4735 | 0.8048 | 0.8042 | 0.8048 | 0.8048 |
| 0.5398 | 9.3493 | 8800 | 0.4698 | 0.8056 | 0.8041 | 0.8058 | 0.8056 |
| 0.5224 | 9.4555 | 8900 | 0.4734 | 0.8040 | 0.8025 | 0.8041 | 0.8040 |
| 0.5392 | 9.5618 | 9000 | 0.4713 | 0.8051 | 0.8038 | 0.8051 | 0.8051 |
| 0.5346 | 9.6680 | 9100 | 0.4706 | 0.8048 | 0.8037 | 0.8044 | 0.8048 |
| 0.5295 | 9.7742 | 9200 | 0.4713 | 0.8041 | 0.8026 | 0.8040 | 0.8041 |
| 0.5607 | 9.8805 | 9300 | 0.4689 | 0.8051 | 0.8039 | 0.8050 | 0.8051 |
| 0.5354 | 9.9867 | 9400 | 0.4692 | 0.8057 | 0.8042 | 0.8058 | 0.8057 |
| 0.5427 | 10.0930 | 9500 | 0.4678 | 0.8056 | 0.8043 | 0.8053 | 0.8056 |
| 0.5216 | 10.1992 | 9600 | 0.4711 | 0.8036 | 0.8020 | 0.8036 | 0.8036 |
| 0.5348 | 10.3054 | 9700 | 0.4696 | 0.8054 | 0.8040 | 0.8053 | 0.8054 |
| 0.5319 | 10.4117 | 9800 | 0.4710 | 0.8047 | 0.8029 | 0.8049 | 0.8047 |
| 0.5465 | 10.5179 | 9900 | 0.4691 | 0.8049 | 0.8035 | 0.8048 | 0.8049 |
| 0.5387 | 10.6242 | 10000 | 0.4699 | 0.8042 | 0.8023 | 0.8046 | 0.8042 |
| 0.5431 | 10.7304 | 10100 | 0.4697 | 0.8045 | 0.8030 | 0.8044 | 0.8045 |
| 0.5358 | 10.8367 | 10200 | 0.4698 | 0.8043 | 0.8025 | 0.8045 | 0.8043 |
| 0.5506 | 10.9429 | 10300 | 0.4693 | 0.8047 | 0.8032 | 0.8045 | 0.8047 |
| 0.5049 | 11.0491 | 10400 | 0.4686 | 0.8051 | 0.8037 | 0.8050 | 0.8051 |
| 0.5338 | 11.1554 | 10500 | 0.4685 | 0.8058 | 0.8044 | 0.8056 | 0.8058 |
| 0.5411 | 11.2616 | 10600 | 0.4681 | 0.8056 | 0.8043 | 0.8054 | 0.8056 |
| 0.5414 | 11.3679 | 10700 | 0.4684 | 0.8053 | 0.8039 | 0.8050 | 0.8053 |
| 0.5438 | 11.4741 | 10800 | 0.4689 | 0.8053 | 0.8038 | 0.8052 | 0.8053 |
| 0.5271 | 11.5803 | 10900 | 0.4688 | 0.8050 | 0.8035 | 0.8049 | 0.8050 |
| 0.5292 | 11.6866 | 11000 | 0.4688 | 0.8049 | 0.8034 | 0.8049 | 0.8049 |
| 0.5318 | 11.7928 | 11100 | 0.4688 | 0.8046 | 0.8031 | 0.8045 | 0.8046 |
| 0.54 | 11.8991 | 11200 | 0.4688 | 0.8047 | 0.8032 | 0.8046 | 0.8047 |
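The `cosine_with_restarts` scheduler used above applies a step-dependent multiplier to the base `learning_rate` of 2e-05: linear warmup to the peak, then cosine decay, restarting once per cycle. The sketch below mirrors the shape of the Transformers cosine-with-hard-restarts schedule (the step counts are approximations read off the table above, where training ends near step 11,300):

```python
import math

def cosine_with_restarts(step, warmup_steps, total_steps, num_cycles=1):
    """LR multiplier: linear warmup, then cosine decay with hard restarts
    (approximates the shape of the cosine_with_restarts schedule)."""
    if step < warmup_steps:
        return step / max(1, warmup_steps)
    progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    if progress >= 1.0:
        return 0.0
    return max(0.0, 0.5 * (1.0 + math.cos(math.pi * ((num_cycles * progress) % 1.0))))

base_lr = 2e-5
# ~11,300 optimizer steps over 12 epochs; warmup_ratio 0.1 -> ~1,130 warmup steps
total_steps, warmup_steps = 11_300, 1_130
assert cosine_with_restarts(0, warmup_steps, total_steps) == 0.0
assert cosine_with_restarts(warmup_steps, warmup_steps, total_steps) == 1.0  # peak lr
assert cosine_with_restarts(total_steps, warmup_steps, total_steps) == 0.0
```

The slow, monotone improvement in the last two epochs of the table is consistent with the multiplier approaching zero near the end of the single decay cycle.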
### Framework versions
- PEFT 0.13.3.dev0
- Transformers 4.44.2
- Pytorch 2.5.0+cu121
- Datasets 3.0.2
- Tokenizers 0.19.1