Hugging Face
naiweizi/dpo-harmless_helpful-rc_armo_mistral
Branch: main · 29.6 MB · 1 contributor · History: 2 commits
naiweizi — Initial upload (commit c297cdf, verified) · 7 months ago
final_checkpoint/ — Initial upload · 7 months ago
.gitattributes — 1.52 kB · initial commit · 7 months ago