jekim028/my-dpo-model
Tags: PEFT · Safetensors · arxiv:1910.09700
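The PEFT and Safetensors tags suggest this repository most likely holds a parameter-efficient adapter (e.g. a LoRA) produced by DPO training rather than full model weights. Below is a minimal loading sketch, assuming a causal-LM adapter whose base model is recorded in adapter_config.json; only the repo id is taken from this page, everything else is an illustrative assumption.

```python
# Minimal sketch of loading this repo as a PEFT adapter. The repo id comes
# from the page; base-model resolution, dtype, and generation settings are
# assumptions about a typical LoRA/DPO adapter checkpoint.
from peft import AutoPeftModelForCausalLM
from transformers import AutoTokenizer

adapter_id = "jekim028/my-dpo-model"

# AutoPeftModelForCausalLM reads adapter_config.json, downloads the base
# model named there, and attaches the adapter weights (stored as safetensors).
model = AutoPeftModelForCausalLM.from_pretrained(adapter_id)

# The tokenizer may live in this repo or only in the base-model repo;
# falling back to the base model id is a reasonable assumption if this fails.
tokenizer = AutoTokenizer.from_pretrained(adapter_id)

prompt = "Hello!"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```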
Commit History (main branch)
Upload folder using huggingface_hub
19e5f47 · verified · jekim028 committed on Oct 15

initial commit
75070c5 · verified · jekim028 committed on Oct 15