Malaysian-Dia-1.6B

Full parameter finetuning of nari-labs/Dia-1.6B on mesolitica/Malaysian-Emilia.

A complete tutorial on how to use it is available at mesolitica/malaya-speech/Dia-TTS.
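For quick local inference, the sketch below is a minimal, hedged example based on the upstream nari-labs Dia Python package (dia.model.Dia); the exact API and speaker-tag format may differ between versions, so follow the linked tutorial for the supported flow.

```python
# Minimal inference sketch; assumes the nari-labs `dia` package API
# (dia.model.Dia.from_pretrained / generate). This is not the official
# usage for this checkpoint; see the linked tutorial for that.
import soundfile as sf
from dia.model import Dia

# Load the finetuned checkpoint from the Hugging Face Hub (or a local path).
model = Dia.from_pretrained("mesolitica/Malaysian-Dia-1.6B")

# Dia scripts use [S1]/[S2] speaker tags.
text = "[S1] Selamat datang ke Malaysia. [S2] Terima kasih, gembira dapat berada di sini."

audio = model.generate(text)
sf.write("output.wav", audio, 44100)  # Dia generates 44.1 kHz audio
```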

How we trained it

  1. The finetuning was done with FP32-BF16 mixed precision training.
  2. Multipacking for the encoder-decoder (a rough sketch of both techniques is shown after this list).
  3. WandB run at https://wandb.ai/huseinzol05/dia-tts-malaysian-emilia-full-mixed-precision-multipacking-v2
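
The following is a minimal, hypothetical sketch of the two ideas above, not the actual training script (that lives in the repository linked below). It shows how several short tokenised examples could be packed into one sequence with a block-diagonal attention mask, and how a training step can run the forward/backward pass under BF16 autocast while the optimizer keeps master weights in FP32.

```python
# Hypothetical illustration of multipacking + FP32-BF16 mixed precision.
import torch

def pack_examples(examples, pad_id, max_len):
    """Concatenate tokenised examples into one packed sequence and build a
    block-diagonal attention mask so packed examples cannot attend to each other."""
    input_ids = []
    mask = torch.zeros(max_len, max_len, dtype=torch.bool)
    for ex in examples:
        start, end = len(input_ids), len(input_ids) + len(ex)
        if end > max_len:
            break
        input_ids.extend(ex)
        mask[start:end, start:end] = True  # each segment only attends within itself
    input_ids += [pad_id] * (max_len - len(input_ids))
    return torch.tensor(input_ids), mask

def train_step(model, optimizer, batch):
    """One optimisation step: compute in BF16 under autocast, keep FP32 master weights."""
    optimizer.zero_grad(set_to_none=True)
    with torch.autocast(device_type="cuda", dtype=torch.bfloat16):
        loss = model(**batch).loss  # assumes a model that returns an object with .loss
    loss.backward()                 # gradients flow back into the FP32 parameters
    optimizer.step()
    return loss.detach()
```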

Source code

Source code at https://github.com/mesolitica/malaya-speech/tree/master/session/dia-tts

Acknowledgement

Special thanks to https://www.sns.com.my and NVIDIA for the 8x H100 node!
