dgo-tts-training-data-a-speecht5

This model is a fine-tuned version of microsoft/speecht5_tts on a dataset not specified in the card metadata. It achieves the following results on the evaluation set:

  • Loss: 0.4943

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0001
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 32
  • optimizer: adamw_torch_fused with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 4000
  • training_steps: 40000
  • mixed_precision_training: Native AMP
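
The effective batch size and learning-rate curve implied by these settings can be sketched in plain Python. This is an illustrative sketch (the function name is ours); Transformers computes the same linear warmup/decay schedule internally for lr_scheduler_type "linear":

```python
def linear_schedule_lr(step, base_lr=1e-4, warmup_steps=4000, total_steps=40000):
    """Linear warmup to base_lr, then linear decay to zero.

    Mirrors the "linear" scheduler with the hyperparameters above:
    learning_rate=0.0001, lr_scheduler_warmup_steps=4000, training_steps=40000.
    """
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    return base_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))

# Effective batch size: per-device batch * gradient accumulation steps.
effective_batch = 8 * 4  # -> 32, matching total_train_batch_size

print(linear_schedule_lr(0))      # 0.0   (start of warmup)
print(linear_schedule_lr(4000))   # 1e-4  (peak, end of warmup)
print(linear_schedule_lr(22000))  # 5e-5  (halfway through decay)
print(linear_schedule_lr(40000))  # 0.0   (end of training)
```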

Training results

Training Loss   Epoch       Step    Validation Loss
0.6491            5.3763     1000   0.6131
0.5683           10.7527     2000   0.5488
0.5430           16.1290     3000   0.5120
0.5203           21.5054     4000   0.5201
0.5149           26.8817     5000   0.5136
0.4962           32.2581     6000   0.4971
0.4880           37.6344     7000   0.4969
0.4742           43.0108     8000   0.4880
0.4746           48.3871     9000   0.4941
0.4634           53.7634    10000   0.4886
0.4491           59.1398    11000   0.4912
0.4416           64.5161    12000   0.4854
0.4388           69.8925    13000   0.4894
0.4263           75.2688    14000   0.4911
0.4250           80.6452    15000   0.4853
0.4179           86.0215    16000   0.4862
0.4205           91.3978    17000   0.4882
0.4087           96.7742    18000   0.4869
0.4079          102.1505    19000   0.4898
0.4142          107.5269    20000   0.4892
0.4132          112.9032    21000   0.4937
0.4126          118.2796    22000   0.4908
0.4091          123.6559    23000   0.4901
0.4078          129.0323    24000   0.4886
0.4167          134.4086    25000   0.4926
0.4052          139.7849    26000   0.4906
0.4057          145.1613    27000   0.4905
0.4080          150.5376    28000   0.4919
0.4054          155.9140    29000   0.4896
0.4096          161.2903    30000   0.4920
0.4087          166.6667    31000   0.4922
0.3987          172.0430    32000   0.4911
0.4006          177.4194    33000   0.4934
0.4017          182.7957    34000   0.4936
0.4000          188.1720    35000   0.4923
0.4010          193.5484    36000   0.4941
0.3984          198.9247    37000   0.4946
0.4010          204.3011    38000   0.4923
0.4012          209.6774    39000   0.4945
0.3960          215.0538    40000   0.4943
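
Validation loss bottoms out near step 8,000 (0.4880) and plateaus around 0.49 thereafter, while training loss keeps falling, which suggests the later steps add little. The epoch column also permits a back-of-the-envelope estimate of dataset size (an approximation we derive here, not a figure stated in the card):

```python
# Rough estimate from the first table row: 1000 optimizer steps cover
# ~5.3763 epochs, so one epoch is about 186 optimizer steps.
steps_per_epoch = round(1000 / 5.3763)  # ~186

# With a total train batch size of 32, that suggests roughly
# 186 * 32 ≈ 5952 training examples (approximate; the last batch
# of each epoch may be smaller, and dropped remainders skew this).
approx_examples = steps_per_epoch * 32
print(steps_per_epoch, approx_examples)
```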

Framework versions

  • Transformers 4.57.0
  • PyTorch 2.8.0+cu128
  • Datasets 4.1.1
  • Tokenizers 0.22.1
