How Far Can 100 Samples Go? Unlocking Overall Zero-Shot Multilingual Translation via Tiny Multi-Parallel Data Paper • 2401.12413 • Published Jan 22, 2024 • 1
Make Pre-trained Model Reversible: From Parameter to Memory Efficient Fine-Tuning Paper • 2306.00477 • Published Jun 1, 2023 • 2
Towards a Better Understanding of Variations in Zero-Shot Neural Machine Translation Performance Paper • 2310.10385 • Published Oct 16, 2023 • 1
Neuron Specialization: Leveraging intrinsic task modularity for multilingual machine translation Paper • 2404.11201 • Published Apr 17, 2024 • 1
ReMedy: Learning Machine Translation Evaluation from Human Preferences with Reward Modeling Paper • 2504.13630 • Published Apr 18, 2025 • 1
Investigating Test-Time Scaling with Reranking for Machine Translation Paper • 2509.19020 • Published Sep 23, 2025 • 1
ReMedy-R: Generative Reasoning for Machine Translation Evaluation without Error Annotations Paper • 2512.18906 • Published Dec 21, 2025 • 1
ReMedy Collection 🚀 ReMedy: Machine Translation Evaluation via Reward Modeling • 5 items • Updated Nov 22, 2025 • 1
Qwen2.5 Collection Qwen2.5 language models, including pretrained and instruction-tuned models in seven sizes: 0.5B, 1.5B, 3B, 7B, 14B, 32B, and 72B. • 43 items • Updated Mar 2 • 710