# wav2vec2-xls-r-1b-distant-from-faroese-251h-30-epochs_20250120_v2
This model is a fine-tuned version of [facebook/wav2vec2-xls-r-1b](https://huggingface.co/facebook/wav2vec2-xls-r-1b) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.0947
- WER: 24.1682
- CER: 3.8304
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

Language: Hungarian, 44 h
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 3000
- num_epochs: 30
- mixed_precision_training: Native AMP
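The effective batch size and the shape of the learning-rate schedule follow directly from these settings. The sketch below reproduces that arithmetic for illustration only; it is not the actual training code, and the exact scheduler internals in the training framework may differ slightly:

```python
import math

# Values copied from the hyperparameter list above.
train_batch_size = 16            # per-device train batch size
gradient_accumulation_steps = 2
peak_lr = 1e-05
warmup_steps = 3000

# Effective (total) train batch size = per-device batch * accumulation steps.
total_train_batch_size = train_batch_size * gradient_accumulation_steps  # 32

def cosine_lr(step, total_steps, warmup=warmup_steps, peak=peak_lr):
    """Linear warmup followed by cosine decay — the shape of a
    `cosine` schedule with warmup."""
    if step < warmup:
        return peak * step / warmup
    progress = (step - warmup) / max(1, total_steps - warmup)
    return peak * 0.5 * (1.0 + math.cos(math.pi * progress))
```

The learning rate ramps linearly to 1e-05 over the first 3000 steps, then decays along a half cosine toward zero at the final step.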
### Training results
| Training Loss | Epoch | Step | Validation Loss | WER | CER |
|:---:|:---:|:---:|:---:|:---:|:---:|
| 2.0058 | 1.4451 | 1000 | 1.3562 | 99.5531 | 42.6542 |
| 0.5946 | 2.8902 | 2000 | 0.2567 | 42.1243 | 7.7199 |
| 0.2822 | 4.3353 | 3000 | 0.1662 | 33.2307 | 5.6799 |
| 0.2782 | 5.7803 | 4000 | 0.1404 | 30.0924 | 5.0783 |
| 0.2994 | 7.2254 | 5000 | 0.1233 | 28.8956 | 4.7534 |
| 0.2259 | 8.6705 | 6000 | 0.1251 | 28.1458 | 4.5812 |
| 0.1798 | 10.1156 | 7000 | 0.1161 | 27.0186 | 4.3967 |
| 0.1734 | 11.5607 | 8000 | 0.1081 | 26.2886 | 4.2150 |
| 0.2183 | 13.0058 | 9000 | 0.1044 | 26.0652 | 4.1661 |
| 0.1639 | 14.4509 | 10000 | 0.1001 | 25.6977 | 4.0875 |
| 0.1519 | 15.8960 | 11000 | 0.0972 | 25.2061 | 4.0101 |
| 0.1255 | 17.3410 | 12000 | 0.0993 | 25.3153 | 4.0461 |
| 0.1181 | 18.7861 | 13000 | 0.0957 | 24.8237 | 3.9891 |
| 0.1274 | 20.2312 | 14000 | 0.0952 | 24.6102 | 3.9003 |
| 0.1302 | 21.6763 | 15000 | 0.0961 | 24.5506 | 3.9261 |
| 0.1073 | 23.1214 | 16000 | 0.0952 | 24.4116 | 3.8732 |
| 0.0998 | 24.5665 | 17000 | 0.0944 | 24.3073 | 3.8386 |
| 0.1149 | 26.0116 | 18000 | 0.0952 | 24.2576 | 3.8338 |
| 0.115 | 27.4566 | 19000 | 0.0946 | 24.1732 | 3.8243 |
| 0.0911 | 28.9017 | 20000 | 0.0947 | 24.1682 | 3.8304 |
### Framework versions
- Transformers 4.48.0
- PyTorch 2.5.1+cu124
- Datasets 3.2.0
- Tokenizers 0.21.0