---
tags:
  - babylm
  - language-model
  - gpt-bert
  - multilingual
license: mit
---

# babybabellm-multi-all

This repository contains checkpoints for the multilingual (all) variant of BabyBabeLLM.

## Files

- `*_15_16.bin` – main model weights
- `*_15_16_ema.bin` – EMA-smoothed weights
- `*_15_16_state_dict.bin` – PyTorch state dict
- `pytorch_model.bin` – extracted EMA weights (for `AutoModel`); a sketch for loading the raw checkpoints directly follows this list
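
If you want to work with the raw checkpoint files rather than `AutoModel`, the following is a minimal sketch of direct loading. The filename passed to `hf_hub_download` is a placeholder; substitute the actual `*_15_16_ema.bin` (or `*_15_16_state_dict.bin`) file listed in this repository.

```python
from huggingface_hub import hf_hub_download
import torch

repo = "suchirsalhan/babybabellm-multi-all"

# Download one checkpoint file from the Hub and load it on CPU.
# "model_15_16_ema.bin" is illustrative -- use the real filename from the repo.
ckpt_path = hf_hub_download(repo_id=repo, filename="model_15_16_ema.bin")
state_dict = torch.load(ckpt_path, map_location="cpu")

print(list(state_dict.keys())[:5])  # peek at the first few parameter names
```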

## Usage

```python
from transformers import AutoModel, AutoTokenizer

repo = "suchirsalhan/babybabellm-multi-all"

# Load the tokenizer and the extracted EMA weights (pytorch_model.bin)
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModel.from_pretrained(repo)

# Encode a sample sentence and run a forward pass
inputs = tokenizer("Hello world!", return_tensors="pt")
outputs = model(**inputs)
```
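
A small follow-up sketch for inspecting the encoder output, assuming the model returns a standard Transformers output with a `last_hidden_state` tensor (exact fields may differ for the GPT-BERT architecture):

```python
import torch

# Run without gradient tracking and inspect the hidden states.
with torch.no_grad():
    outputs = model(**inputs)

print(outputs.last_hidden_state.shape)  # e.g. (1, seq_len, hidden_size)
```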

## Notes

- These are research checkpoints trained on BabyLM-style data.
- Model naming: `multi-all` indicates the language/config variant (multilingual, all languages).