# CLAMP HMM checkpoints
Neural-HMM priors trained on Llama-3.1 continuation samples for two EAI
datasets (BEHAVIOR and VirtualHome). They are used by eai_ctrlg as the
semantic prior for constrained decoding with γ+β DFAs.
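To make the role of the HMM prior concrete, here is a minimal sketch of how an HMM assigns a sequence-level log-probability, which is the quantity a semantic prior contributes during constrained decoding. This is a toy discrete HMM in NumPy, not the `ConditionalHMM` from this repo: the hidden size (3), vocabulary size (4), and the function names are made up for illustration; the real checkpoints use hidden=128 over the LM's token vocabulary.

```python
import numpy as np

def _log_matvec(alpha, log_A):
    # log of (exp(alpha) @ A), computed stably in log space:
    # entry j is log sum_i exp(alpha_i) * A_ij
    return np.logaddexp.reduce(alpha[:, None] + log_A, axis=0)

def hmm_log_likelihood(obs, pi, A, B):
    """Forward algorithm in log space: returns log p(obs) under the HMM.

    pi: (hidden,) initial state distribution
    A:  (hidden, hidden) transition matrix
    B:  (hidden, vocab) emission matrix
    """
    log_pi, log_A, log_B = np.log(pi), np.log(A), np.log(B)
    alpha = log_pi + log_B[:, obs[0]]          # joint of state and first token
    for t in obs[1:]:
        alpha = _log_matvec(alpha, log_A) + log_B[:, t]
    return np.logaddexp.reduce(alpha)          # marginalise the final state

# Toy parameters (hypothetical, for illustration only)
rng = np.random.default_rng(0)
pi = np.full(3, 1 / 3)
A = rng.dirichlet(np.ones(3), size=3)
B = rng.dirichlet(np.ones(4), size=3)
print(hmm_log_likelihood([0, 2, 1, 3], pi, A, B))
```

During constrained decoding this prior score is combined with the DFA constraints so that only sequences accepted by the γ+β automata are scored.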
## Files
- `behavior/hmm-h128-lr0.01/checkpoint.eqx` (hidden=128, lr=0.01)
- `virtualhome/hmm-h128-lr0.01/checkpoint.eqx` (hidden=128, lr=0.01)
## Loading
```python
import equinox as eqx

from ctrlg.hmm.model import ConditionalHMM

# Build a skeleton model with the same hyperparameters as training
# (hidden=128), then load the saved weights into it.
model = eqx.tree_deserialise_leaves(
    "behavior/hmm-h128-lr0.01/checkpoint.eqx",
    ConditionalHMM(...),  # constructor args must match the training config
)
```
Training pipeline: see `eai_train/cond_hmm/` in the companion repo.