CLAMP HMM checkpoints

Neural-HMM priors trained on Llama-3.1 continuation samples for two EAI datasets (BEHAVIOR and VirtualHome). Used by eai_ctrlg as the semantic prior for constrained decoding with γ+β DFAs.

Files

  • behavior/hmm-h128-lr0.01/checkpoint.eqx — hidden=128, lr=0.01
  • virtualhome/hmm-h128-lr0.01/checkpoint.eqx — hidden=128, lr=0.01

Loading

import equinox as eqx
from ctrlg.hmm.model import ConditionalHMM

# The template model passed to tree_deserialise_leaves must be constructed
# with the same architecture as the checkpoint (here hidden=128);
# constructor arguments are elided.
model = eqx.tree_deserialise_leaves(
    "behavior/hmm-h128-lr0.01/checkpoint.eqx", ConditionalHMM(...))

Training pipeline: see eai_train/cond_hmm/ in the companion repo.
