Trouble exporting AI4Bharat IndicTrans2 model to ONNX using Optimum
#14
by harshhh17 - opened
I'm working on a project to create an offline, browser-based English-to-Hindi translation app. For this, I'm trying to use the ai4bharat/indictrans2-en-indic-1B model.
My goal is to convert the model from its Hugging Face PyTorch format to ONNX, which I can then run in a web browser using WebAssembly.
I've been trying to use the optimum library to perform this conversion, but I'm running into a series of errors, which seem to be related to the model's custom architecture and the optimum library's API.
What I have tried so far:
- Using optimum-cli: The command-line tool failed with "unrecognized arguments" errors and ValueErrors.
- Changing arguments: I have tried various combinations of arguments, such as using output-dir instead of output, and changing fp16=True to dtype="fp16". The TypeErrors persist regardless.
- Manual Conversion: I have tried using torch.onnx.export directly, but this also caused errors with the model's custom tokenizer.