iamgroot42/smollm3-Custom-DPO
Tags: Text Generation, PEFT, Safetensors, Transformers, dpo, lora, trl, conversational
arxiv: 1910.09700
smollm3-Custom-DPO / adapter_model.safetensors
Commit History
Upload folder using huggingface_hub
135193d (verified) by iamgroot42, committed on Oct 13