fzengin18/multrenizer
Languages: Turkish, English
Tags: tokenizers, tokenizer, unigram, turkish, english, bilingual, sentencepiece
arXiv: 2508.08424
License: apache-2.0
Repository size: 1.84 MB
1 contributor · History: 9 commits · Latest: "Update model card" by fzengin18 (85d2709, verified, about 11 hours ago)
File                       Size      Last commit message                                   Updated
.gitattributes             1.52 kB   initial commit                                        12 days ago
LICENSE                    10.8 kB   Upload LICENSE with huggingface_hub                   12 days ago
README.md                  17.1 kB   Update model card                                     about 11 hours ago
special_tokens_map.json    12.8 kB   Rebuild: case-preserving + correct utility taxonomy   about 11 hours ago
tokenizer.json             1.79 MB   Rebuild: case-preserving + correct utility taxonomy   about 11 hours ago
tokenizer_config.json      13.6 kB   Rebuild: case-preserving + correct utility taxonomy   about 11 hours ago
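The tags above describe a Unigram (SentencePiece-style) tokenizer. As an illustrative sketch only — not this repo's actual implementation or vocabulary — the core of Unigram decoding is a Viterbi search that picks the segmentation of a string into vocabulary pieces with the highest total log-probability. The toy vocabulary below (with made-up scores mixing Turkish and English pieces) is entirely hypothetical:

```python
import math

def unigram_segment(text, logprobs):
    """Viterbi search for the highest-scoring segmentation of `text`
    into vocabulary pieces, where each piece has a log-probability.
    Returns the list of pieces, or None if no segmentation exists."""
    n = len(text)
    best = [-math.inf] * (n + 1)  # best[i]: best score for text[:i]
    best[0] = 0.0
    back = [0] * (n + 1)          # back[i]: start index of the last piece
    for end in range(1, n + 1):
        for start in range(end):
            piece = text[start:end]
            if piece in logprobs and best[start] + logprobs[piece] > best[end]:
                best[end] = best[start] + logprobs[piece]
                back[end] = start
    if best[n] == -math.inf:
        return None
    pieces, pos = [], n
    while pos > 0:
        pieces.append(text[back[pos]:pos])
        pos = back[pos]
    return pieces[::-1]

# Hypothetical toy vocabulary (log-probabilities are illustrative)
vocab = {"merhaba": -2.0, "mer": -3.0, "haba": -3.5,
         "hello": -2.0, "he": -4.0, "llo": -4.0}

print(unigram_segment("merhaba", vocab))  # ['merhaba'] beats ['mer', 'haba']
print(unigram_segment("hello", vocab))    # ['hello'] beats ['he', 'llo']
```

In a trained Unigram tokenizer the piece log-probabilities come from the training corpus, and out-of-vocabulary characters fall back to an unknown token rather than returning None.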