Whisper Small FR - Radiologie

This model is a fine-tuned version of openai/whisper-small for French radiology transcription (the training dataset is not specified). It achieves the following results on the evaluation set:

  • Loss: 1.2151
  • WER: 50.5180

Model description

More information needed

Intended uses & limitations

More information needed
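
A quick way to try the model is the Transformers automatic-speech-recognition pipeline. The snippet below is a minimal sketch, assuming the repository ID shown for this checkpoint (StephaneBah/whisper-small-rad-FR3) and a local French audio file whose name is purely illustrative; decoding audio from a file path requires ffmpeg.

```python
# Minimal inference sketch; the audio file name is illustrative.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="StephaneBah/whisper-small-rad-FR3",  # repository ID of this checkpoint
)

# Force French transcription so Whisper does not auto-detect the language or translate.
result = asr(
    "rapport_radio.wav",  # replace with your own recording
    generate_kwargs={"language": "french", "task": "transcribe"},
)
print(result["text"])
```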

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 3e-05
  • train_batch_size: 8
  • eval_batch_size: 6
  • seed: 3407
  • optimizer: AdamW (8-bit, via bitsandbytes) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • training_steps: 1000
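
For orientation, the list above corresponds roughly to a Seq2SeqTrainingArguments configuration like the sketch below. This is a reconstruction, not the original training script: output_dir, the evaluation/save cadence (inferred from the 100-step intervals in the results table), logging_steps, and fp16 are assumptions.

```python
# Rough reconstruction of the training configuration from the hyperparameter list.
# Values marked "assumed" are not stated in the list above.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="whisper-small-rad-FR3",  # assumed
    learning_rate=3e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=6,
    seed=3407,
    optim="adamw_8bit",                  # 8-bit AdamW (requires bitsandbytes)
    lr_scheduler_type="linear",
    max_steps=1000,
    eval_strategy="steps",               # assumed: the results table reports eval every 100 steps
    eval_steps=100,
    save_steps=100,                      # assumed
    logging_steps=500,                   # assumed: "No log" rows suggest sparse train-loss logging
    fp16=True,                           # assumed from the F16 tensor type
    predict_with_generate=True,          # needed to compute WER during evaluation
    report_to="none",
)
```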

Training results

| Training Loss | Epoch | Step | Validation Loss | WER      |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log        | 6.25  | 100  | 1.3961          | 111.2127 |
| No log        | 12.5  | 200  | 1.2598          | 53.5040  |
| No log        | 18.75 | 300  | 1.2146          | 67.8854  |
| No log        | 25.0  | 400  | 1.2144          | 50.5180  |
| 0.262         | 31.25 | 500  | 1.2287          | 57.8915  |
| 0.262         | 37.5  | 600  | 1.2258          | 62.8275  |
| 0.262         | 43.75 | 700  | 1.2176          | 53.8087  |
| 0.262         | 50.0  | 800  | 1.2204          | 52.5289  |
| 0.262         | 56.25 | 900  | 1.2156          | 53.1383  |
| 0.0079        | 62.5  | 1000 | 1.2151          | 50.5180  |
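
The WER column is the word error rate in percent. A hedged sketch of how such a score is typically computed with the evaluate library is shown below; the reference and prediction strings are invented examples, not taken from the evaluation set.

```python
# Illustrative WER computation; the sentences are invented examples.
import evaluate

wer_metric = evaluate.load("wer")

references = ["opacité du lobe supérieur droit sans épanchement pleural"]
predictions = ["opacité du lobe supérieur droit sans épanchement"]

# evaluate returns a fraction; multiply by 100 to match the table's scale.
wer = 100 * wer_metric.compute(references=references, predictions=predictions)
print(f"WER: {wer:.2f}")
```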

Framework versions

  • Transformers 4.51.3
  • Pytorch 2.6.0+cu124
  • Datasets 3.6.0
  • Tokenizers 0.21.2
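
To check a local environment against these pins, a small sketch:

```python
# Print installed versions to compare with the pinned versions listed above.
import datasets
import tokenizers
import torch
import transformers

print("Transformers:", transformers.__version__)  # card lists 4.51.3
print("PyTorch:", torch.__version__)              # card lists 2.6.0+cu124
print("Datasets:", datasets.__version__)          # card lists 3.6.0
print("Tokenizers:", tokenizers.__version__)      # card lists 0.21.2
```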

Model size

  • 0.2B parameters (Safetensors, F16)