# b81db5edc1723338fe5e69ea9599c43f
This model is a fine-tuned version of facebook/opt-125m on the nyu-mll/glue dataset (cola configuration). It achieves the following results on the evaluation set:
- Loss: 0.8617
- Data Size: 1.0
- Epoch Runtime: 20.0130
- Accuracy: 0.7607
- F1 Macro: 0.7223
- Rouge1: 0.7607
- Rouge2: 0.0
- Rougel: 0.7607
- Rougelsum: 0.7617
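
The checkpoint can be loaded as a standard sequence-classification model. The sketch below is a minimal usage example, assuming the repository id from the model tree (contemmcm/b81db5edc1723338fe5e69ea9599c43f) and the usual CoLA label convention (1 = acceptable, 0 = unacceptable), neither of which is documented in this card:

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Repository id taken from the model tree; adjust if the checkpoint lives elsewhere.
model_id = "contemmcm/b81db5edc1723338fe5e69ea9599c43f"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

# CoLA is a binary acceptability task; label semantics assumed to follow the standard mapping.
sentence = "The book was written by the author."
inputs = tokenizer(sentence, return_tensors="pt", truncation=True)

with torch.no_grad():
    logits = model(**inputs).logits

predicted_label = logits.argmax(dim=-1).item()
print(predicted_label)
```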
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
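
Although the data preparation is not documented, the dataset identifier above points at the CoLA configuration of GLUE. A minimal loading sketch with the `datasets` library, assuming the public nyu-mll/glue hub dataset:

```python
from datasets import load_dataset

# Load the CoLA subset of GLUE, the dataset named in this card's metadata.
dataset = load_dataset("nyu-mll/glue", "cola")

print(dataset)               # train / validation / test splits
print(dataset["train"][0])   # {'sentence': ..., 'label': ..., 'idx': ...}
```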
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- distributed_type: multi-GPU
- num_devices: 4
- total_train_batch_size: 32
- total_eval_batch_size: 32
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08 (no additional optimizer arguments)
- lr_scheduler_type: constant
- num_epochs: 50
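
The list above maps naturally onto `transformers.TrainingArguments`. The sketch below is an approximation only; the original training script is not published, and the output directory (and any distributed-launch settings) are assumptions:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="opt-125m-cola",       # hypothetical output path
    learning_rate=5e-5,
    per_device_train_batch_size=8,    # 4 GPUs -> total train batch size 32
    per_device_eval_batch_size=8,     # 4 GPUs -> total eval batch size 32
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="constant",
    num_train_epochs=50,
)
```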
### Training results
| Training Loss | Epoch | Step | Validation Loss | Data Size | Epoch Runtime | Accuracy | F1 Macro | Rouge1 | Rouge2 | Rougel | Rougelsum |
|---|---|---|---|---|---|---|---|---|---|---|---|
| No log | 0 | 0 | 0.7581 | 0 | 1.6190 | 0.6885 | 0.4137 | 0.6890 | 0.0 | 0.6885 | 0.6885 |
| No log | 1 | 267 | 0.6682 | 0.0078 | 2.4625 | 0.6201 | 0.5079 | 0.6211 | 0.0 | 0.6211 | 0.6201 |
| No log | 2 | 534 | 0.7250 | 0.0156 | 1.8983 | 0.6885 | 0.4137 | 0.6895 | 0.0 | 0.6895 | 0.6885 |
| No log | 3 | 801 | 0.6312 | 0.0312 | 2.2904 | 0.6689 | 0.5196 | 0.6699 | 0.0 | 0.6689 | 0.6689 |
| No log | 4 | 1068 | 0.6133 | 0.0625 | 3.0742 | 0.6963 | 0.4605 | 0.6963 | 0.0 | 0.6963 | 0.6963 |
| 0.0368 | 5 | 1335 | 0.7332 | 0.125 | 4.2057 | 0.6895 | 0.4111 | 0.6904 | 0.0 | 0.6899 | 0.6895 |
| 0.5281 | 6 | 1602 | 0.5788 | 0.25 | 6.3516 | 0.7236 | 0.5645 | 0.7246 | 0.0 | 0.7227 | 0.7236 |
| 0.4720 | 7 | 1869 | 0.5995 | 0.5 | 10.9350 | 0.7500 | 0.6258 | 0.7500 | 0.0 | 0.7500 | 0.7500 |
| 0.3882 | 8 | 2136 | 0.5349 | 1.0 | 20.0279 | 0.7578 | 0.6946 | 0.7578 | 0.0 | 0.7588 | 0.7578 |
| 0.2541 | 9 | 2403 | 0.5746 | 1.0 | 19.8151 | 0.7598 | 0.7041 | 0.7598 | 0.0 | 0.7598 | 0.7607 |
| 0.1935 | 10 | 2670 | 0.8062 | 1.0 | 19.7531 | 0.7568 | 0.6930 | 0.7568 | 0.0 | 0.7568 | 0.7578 |
| 0.1755 | 11 | 2937 | 0.7620 | 1.0 | 20.0056 | 0.7529 | 0.7015 | 0.7520 | 0.0 | 0.7529 | 0.7529 |
| 0.1829 | 12 | 3204 | 0.8617 | 1.0 | 20.0130 | 0.7607 | 0.7223 | 0.7607 | 0.0 | 0.7607 | 0.7617 |
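
The accuracy and macro-F1 columns can be reproduced from `Trainer` predictions with the `evaluate` library. This is a sketch of a plausible `compute_metrics` function, not the card's original metric code (which is not included):

```python
import numpy as np
import evaluate

# Load the standard accuracy and F1 metrics from the evaluate hub.
accuracy_metric = evaluate.load("accuracy")
f1_metric = evaluate.load("f1")

def compute_metrics(eval_pred):
    """Compute accuracy and macro-F1 from a Trainer-style (logits, labels) tuple."""
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    accuracy = accuracy_metric.compute(predictions=predictions, references=labels)["accuracy"]
    f1_macro = f1_metric.compute(predictions=predictions, references=labels, average="macro")["f1"]
    return {"accuracy": accuracy, "f1_macro": f1_macro}
```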
### Framework versions
- Transformers 4.57.0
- PyTorch 2.8.0+cu128
- Datasets 4.3.0
- Tokenizers 0.22.1