babybabellm-multi-all

This repository contains checkpoints for the multilingual (all) variant of BabyBabeLLM.

Files

  • *_15_16.bin – main model weights
  • *_15_16_ema.bin – EMA-smoothed (exponential moving average) weights
  • *_15_16_state_dict.bin – PyTorch state dict
  • pytorch_model.bin – extracted EMA weights (loadable via AutoModel)
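The state-dict file holds the raw tensors rather than a ready-to-use AutoModel. A minimal sketch of the save/load mechanics behind such files, using a tiny stand-in module (the real checkpoint follows the *_15_16_state_dict.bin naming above; the filename here is illustrative only):

```python
import torch
import torch.nn as nn

# Stand-in module; the real checkpoint holds BabyBabeLLM weights.
model = nn.Linear(4, 2)

# Saving mirrors how a *_state_dict.bin file is produced.
torch.save(model.state_dict(), "demo_state_dict.bin")

# Load the raw tensors back; map_location avoids requiring a GPU.
state = torch.load("demo_state_dict.bin", map_location="cpu")

# Restore them into a freshly constructed module of the same shape.
fresh = nn.Linear(4, 2)
fresh.load_state_dict(state)
```
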

Usage

from transformers import AutoModel, AutoTokenizer

repo = "suchirsalhan/babybabellm-multi-all"

# Download the tokenizer and model weights from the Hugging Face Hub
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModel.from_pretrained(repo)

# Tokenize a sample sentence and run a forward pass
inputs = tokenizer("Hello world!", return_tensors="pt")
outputs = model(**inputs)
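AutoModel returns hidden states rather than text, so a common next step is pooling them into a fixed-size sentence embedding. A minimal sketch of attention-mask-aware mean pooling; the tensors below are dummy stand-ins for the tokenizer and model outputs, with an assumed hidden size of 8:

```python
import torch

# Dummy stand-ins: batch of 1, sequence length 5, hidden size 8.
last_hidden_state = torch.randn(1, 5, 8)
attention_mask = torch.tensor([[1, 1, 1, 0, 0]])  # last two positions are padding

# Zero out padding positions, then average over the real tokens only.
mask = attention_mask.unsqueeze(-1).float()     # (1, 5, 1)
summed = (last_hidden_state * mask).sum(dim=1)  # (1, 8)
counts = mask.sum(dim=1).clamp(min=1)           # (1, 1), avoids division by zero
embedding = summed / counts                     # (1, 8) sentence embedding
```

With real outputs, `outputs.last_hidden_state` and `inputs["attention_mask"]` would take the place of the dummy tensors.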

Notes

  • These are research checkpoints trained on BabyLM-style data.
  • Model naming: the multi-all suffix indicates the language/config variant.