Griffin: Pretrained Checkpoints
This repository contains pretrained checkpoints for the Griffin model. The paper is available at Link.
Checkpoints
The checkpoints are organized as follows:
./checkpoints/
├── single-completion   # Pretrained single-table completion model.
├── single-sft          # Pretrained single-table SFT model. Used in the main experiments.
└── transfer            # Pretrained transfer models. Used in the transfer experiments.
    ├── commerce-1      # Split name.
    │   ├── FULL        # RDB-SFT setting. Used in the main transfer experiments.
    │   ├── MIXED       # RDB-SFT setting. Used in the RDB-SFT ablation.
    │   └── LIMITED     # RDB-SFT setting. Used in the RDB-SFT ablation.
    ├── commerce-2      # Same settings as above.
    │   ├── FULL
    │   ├── MIXED
    │   └── LIMITED
    ├── others-1
    │   ├── FULL
    │   ├── MIXED
    │   └── LIMITED
    └── others-2
        ├── FULL
        ├── MIXED
        └── LIMITED
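If you want to inspect this layout programmatically before downloading anything, you can enumerate the files in the Hub repository with the huggingface_hub list_repo_files API. The sketch below uses the repository ID yamboo/Griffin_models from the example further down; the filter prefix transfer/commerce-1/FULL/ is just one illustration and can be swapped for any split and setting shown above.
from huggingface_hub import list_repo_files
# List every file in the checkpoint repository, then keep only the
# transfer checkpoints for the commerce-1 split under the FULL setting.
repo_id = "yamboo/Griffin_models"
all_files = list_repo_files(repo_id)
full_commerce_1 = [f for f in all_files if f.startswith("transfer/commerce-1/FULL/")]
print(full_commerce_1)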
How to use
To get started, you will need the model's architecture defined in your code; it is provided in the GitHub repo. You can then use the huggingface_hub library to download a specific checkpoint and load its weights.
import json
import torch
from huggingface_hub import hf_hub_download
import accelerate
# Assume 'GriffinMod' is your model's class definition
# from your_project_position.hmodel import GriffinMod
# 1. Define the repository ID and the specific file you want to load
repo_id = "yamboo/Griffin_models"
# Example: Loading the main single-table SFT model
checkpoint_path = "single-sft/model.safetensors"
config_path = "single-sft/config.json"
# 2. Download the checkpoint file from the Hub
model_weights_path = hf_hub_download(repo_id=repo_id, filename=checkpoint_path)
model_config_path = hf_hub_download(repo_id=repo_id, filename=config_path)
with open(model_config_path, "r") as f:
    config = json.load(f)
# 3. Instantiate your model and load the weights. We use accelerate to align with the GitHub repo's experiment pipeline.
model = GriffinMod(**config) # Make sure to pass any required config
accelerate.load_checkpoint_in_model(model, model_weights_path)
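The transfer checkpoints can be loaded with the same pattern. The continuation below is a sketch that reuses the imports and the GriffinMod class from the snippet above; it assumes each setting folder (for example transfer/commerce-1/FULL/) contains the same model.safetensors and config.json pair as the single-sft folder.
# Hypothetical continuation: load the transfer model for the commerce-1 split
# under the FULL RDB-SFT setting. The file names inside the setting folder are
# assumed to mirror the single-sft layout above.
transfer_weights_path = hf_hub_download(repo_id=repo_id, filename="transfer/commerce-1/FULL/model.safetensors")
transfer_config_path = hf_hub_download(repo_id=repo_id, filename="transfer/commerce-1/FULL/config.json")
with open(transfer_config_path, "r") as f:
    transfer_config = json.load(f)
transfer_model = GriffinMod(**transfer_config)
accelerate.load_checkpoint_in_model(transfer_model, transfer_weights_path)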