---
license: apache-2.0
base_model: distilbert/distilbert-base-uncased
tags:
- generated_from_trainer
model-index:
- name: my_awesome_qa_model
  results: []
---
# my_awesome_qa_model
This model is a fine-tuned version of [distilbert/distilbert-base-uncased](https://huggingface.co/distilbert/distilbert-base-uncased) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 5.7197
## Model description
More information needed
## Intended uses & limitations
More information needed
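Until this section is filled in, a minimal inference sketch is shown below. The identifier `my_awesome_qa_model` is just the name from this card; replace it with the actual Hub repo id or a local checkpoint path.

```python
from transformers import pipeline

# Load the fine-tuned checkpoint; swap in the real Hub id or a local path.
qa = pipeline("question-answering", model="my_awesome_qa_model")

result = qa(
    question="What architecture is this model based on?",
    context="my_awesome_qa_model is a fine-tuned version of distilbert-base-uncased.",
)
print(result["answer"], result["score"])  # extracted answer span and its confidence
```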
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
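For reference, these settings correspond roughly to the following `TrainingArguments` (a sketch, not the exact training script; `output_dir` and `evaluation_strategy="epoch"` are assumptions, the latter inferred from the per-epoch results table below):

```python
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="my_awesome_qa_model",  # assumed; matches the card name
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    num_train_epochs=50,
    seed=42,
    lr_scheduler_type="linear",
    adam_beta1=0.9,                    # Adam defaults, as listed above
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    evaluation_strategy="epoch",       # assumed from the per-epoch eval table
)
```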
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| No log        | 1.0   | 9    | 5.6727          |
| No log        | 2.0   | 18   | 4.7499          |
| No log        | 3.0   | 27   | 4.0555          |
| No log        | 4.0   | 36   | 3.6859          |
| No log        | 5.0   | 45   | 3.7884          |
| No log        | 6.0   | 54   | 3.8482          |
| No log        | 7.0   | 63   | 3.9977          |
| No log        | 8.0   | 72   | 4.3411          |
| No log        | 9.0   | 81   | 4.6886          |
| No log        | 10.0  | 90   | 4.6630          |
| No log        | 11.0  | 99   | 4.7296          |
| No log        | 12.0  | 108  | 4.6999          |
| No log        | 13.0  | 117  | 4.8416          |
| No log        | 14.0  | 126  | 4.9140          |
| No log        | 15.0  | 135  | 4.7755          |
| No log        | 16.0  | 144  | 4.9881          |
| No log        | 17.0  | 153  | 5.0003          |
| No log        | 18.0  | 162  | 5.0493          |
| No log        | 19.0  | 171  | 5.2228          |
| No log        | 20.0  | 180  | 5.2716          |
| No log        | 21.0  | 189  | 5.1830          |
| No log        | 22.0  | 198  | 5.1855          |
| No log        | 23.0  | 207  | 5.1763          |
| No log        | 24.0  | 216  | 5.4629          |
| No log        | 25.0  | 225  | 5.4778          |
| No log        | 26.0  | 234  | 5.4658          |
| No log        | 27.0  | 243  | 5.4926          |
| No log        | 28.0  | 252  | 5.5078          |
| No log        | 29.0  | 261  | 5.7624          |
| No log        | 30.0  | 270  | 5.5743          |
| No log        | 31.0  | 279  | 5.6203          |
| No log        | 32.0  | 288  | 5.5513          |
| No log        | 33.0  | 297  | 5.5641          |
| No log        | 34.0  | 306  | 5.6943          |
| No log        | 35.0  | 315  | 5.6849          |
| No log        | 36.0  | 324  | 5.6235          |
| No log        | 37.0  | 333  | 5.6985          |
| No log        | 38.0  | 342  | 5.6973          |
| No log        | 39.0  | 351  | 5.6863          |
| No log        | 40.0  | 360  | 5.6592          |
| No log        | 41.0  | 369  | 5.7277          |
| No log        | 42.0  | 378  | 5.7641          |
| No log        | 43.0  | 387  | 5.7648          |
| No log        | 44.0  | 396  | 5.7564          |
| No log        | 45.0  | 405  | 5.7573          |
| No log        | 46.0  | 414  | 5.7402          |
| No log        | 47.0  | 423  | 5.7205          |
| No log        | 48.0  | 432  | 5.7104          |
| No log        | 49.0  | 441  | 5.7145          |
| No log        | 50.0  | 450  | 5.7197          |
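Note: the training-loss column reads "No log" presumably because the run totals only 450 optimization steps (9 per epoch), fewer than the Trainer's default `logging_steps` of 500, so training loss was never recorded. Validation loss bottoms out at 3.6859 at epoch 4 and rises steadily afterwards, which suggests the model overfits long before epoch 50.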
### Framework versions
- Transformers 4.38.2
- Pytorch 2.1.0+cu121
- Datasets 2.18.0
- Tokenizers 0.15.2
