# my_awesome_wnut_model
This model is a fine-tuned version of distilbert/distilbert-base-cased on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.4684
- Precision: 0.4865
- Recall: 0.3494
- F1: 0.4067
- Accuracy: 0.9440
## Model description
More information needed
## Intended uses & limitations
More information needed
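Pending fuller documentation, a minimal inference sketch for this token-classification model. The hub id `ZONGHAOLI/my_awesome_wnut_model` is taken from the model tree; the example sentence is illustrative, not from the training data:

```python
from transformers import pipeline

# Assumes the fine-tuned checkpoint is reachable under this Hub id
# (swap in a local path to the saved model directory otherwise).
classifier = pipeline(
    "token-classification",
    model="ZONGHAOLI/my_awesome_wnut_model",
    aggregation_strategy="simple",  # merge wordpiece sub-tokens into entity spans
)
result = classifier("The Golden State Warriors are an American basketball team.")
print(result)  # list of dicts with entity_group, score, word, start, end
```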
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: adamw_torch_fused with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 20
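With no warmup listed, a linear scheduler decays the learning rate from 2e-05 to zero over the full 4,260 training steps (20 epochs × 213 steps per epoch, per the table below). A small sketch of that decay; the zero-warmup default is an assumption:

```python
def linear_lr(step: int, base_lr: float = 2e-5,
              total_steps: int = 4260, warmup_steps: int = 0) -> float:
    """Linear warmup then linear decay to zero, mirroring the shape of
    transformers' linear schedule (zero warmup assumed here)."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    return base_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))

print(linear_lr(0))      # 2e-05 at the start of training
print(linear_lr(2130))   # 1e-05 halfway through (end of epoch 10)
print(linear_lr(4260))   # 0.0 at the final step
```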
### Training results
| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|---|---|---|---|---|---|---|---|
| No log | 1.0 | 213 | 0.2863 | 0.5189 | 0.3179 | 0.3943 | 0.9427 |
| No log | 2.0 | 426 | 0.3160 | 0.4378 | 0.3133 | 0.3652 | 0.9418 |
| 0.0602 | 3.0 | 639 | 0.3395 | 0.5007 | 0.3160 | 0.3875 | 0.9422 |
| 0.0602 | 4.0 | 852 | 0.3650 | 0.5213 | 0.3624 | 0.4276 | 0.9443 |
| 0.0231 | 5.0 | 1065 | 0.3833 | 0.5000 | 0.3466 | 0.4094 | 0.9434 |
| 0.0231 | 6.0 | 1278 | 0.3958 | 0.5056 | 0.3327 | 0.4013 | 0.9438 |
| 0.0231 | 7.0 | 1491 | 0.3766 | 0.4452 | 0.3800 | 0.4100 | 0.9433 |
| 0.0098 | 8.0 | 1704 | 0.3944 | 0.4573 | 0.3624 | 0.4043 | 0.9434 |
| 0.0098 | 9.0 | 1917 | 0.4326 | 0.4897 | 0.3318 | 0.3956 | 0.9434 |
| 0.0047 | 10.0 | 2130 | 0.4203 | 0.4764 | 0.3457 | 0.4006 | 0.9437 |
| 0.0047 | 11.0 | 2343 | 0.4555 | 0.5243 | 0.3401 | 0.4126 | 0.9444 |
| 0.0031 | 12.0 | 2556 | 0.4200 | 0.4840 | 0.3642 | 0.4157 | 0.9441 |
| 0.0031 | 13.0 | 2769 | 0.4481 | 0.5014 | 0.3401 | 0.4053 | 0.9442 |
| 0.0031 | 14.0 | 2982 | 0.4410 | 0.4806 | 0.3550 | 0.4083 | 0.9438 |
| 0.0025 | 15.0 | 3195 | 0.4548 | 0.4862 | 0.3429 | 0.4022 | 0.9435 |
| 0.0025 | 16.0 | 3408 | 0.5038 | 0.5223 | 0.3151 | 0.3931 | 0.9432 |
| 0.0017 | 17.0 | 3621 | 0.4635 | 0.4668 | 0.3448 | 0.3966 | 0.9431 |
| 0.0017 | 18.0 | 3834 | 0.4699 | 0.4823 | 0.3401 | 0.3989 | 0.9439 |
| 0.0015 | 19.0 | 4047 | 0.4646 | 0.4815 | 0.3503 | 0.4056 | 0.9437 |
| 0.0015 | 20.0 | 4260 | 0.4684 | 0.4865 | 0.3494 | 0.4067 | 0.9440 |
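As a quick sanity check, the reported F1 is the harmonic mean of the final-epoch precision and recall:

```python
# Recompute F1 from the epoch-20 precision and recall in the table above.
precision, recall = 0.4865, 0.3494
f1 = 2 * precision * recall / (precision + recall)
print(round(f1, 4))  # 0.4067, matching the reported F1
```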
### Framework versions
- Transformers 4.56.1
- PyTorch 2.8.0+cu126
- Datasets 4.0.0
- Tokenizers 0.22.0