
Llama3.2-1B-eu-continual

Llama-3.2 1B continually pretrained for Basque on the ZelaHandi-v1 corpus for 5 epochs. The weights are released in Safetensors format with BF16 tensors.

📝 Paper: Sub-1B Language Models for Low-Resource Languages: Training Strategies and Insights for Basque, accepted at the 5th Workshop on Multilingual Representation Learning (MRL 2025), co-located with EMNLP 2025.
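
Usage

A minimal text-generation sketch using the Hugging Face transformers library. The repository id is taken from this card; the Basque example prompt and the sampling settings are illustrative assumptions, not from the paper:

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "orai-nlp/Llama3.2-1B-eu-continual"
tokenizer = AutoTokenizer.from_pretrained(model_id)
# Load in BF16, matching the released tensor type
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16)

# Illustrative Basque prompt ("Donostia is a city, and")
prompt = "Donostia hiri bat da, eta"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50, do_sample=True, top_p=0.9)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))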

Acknowledgments

The creation of this model has been partially funded by the Basque Government (ICL4LANG project, grant no. KK-2023/00094) and the European Union (EFA 104/01-LINGUATEC IA project, INTERREG POCTEFA 2021-2027 program). Pre-training and fine-tuning of SLMs were conducted using the Hyperion system at the Donostia International Physics Center (DIPC). Finally, we thank Idoia Davila Uzkudun for her contributions to manual data curation and evaluation.

Citation

If you use this model, please cite the following paper:

@inproceedings{urbizu2025sub,
  title={Sub-1B Language Models for Low-Resource Languages: Training Strategies and Insights for {B}asque},
  author={Urbizu, Gorka and Corral, Ander and Saralegi, Xabier and San Vicente, I{\~n}aki},
  booktitle={Proceedings of the 5th Workshop on Multilingual Representation Learning (MRL 2025)},
  pages={519--530},
  year={2025}
}

Contact

