# KSimplex Geometric Attention Prior

Geometric cross-attention prior for SD1.5 using pentachoron (4-simplex) structures.

## Architecture

| Component | Params |
|---|---|
| SD1.5 UNet (frozen) | 859,520,964 |
| Geo prior (trained) | 4,845,725 |
| Geo conditioner (trained) | 1,613,847 |
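The parameter counts above imply the trained components are a small fraction of the frozen backbone; the arithmetic below just works that out from the table (no model code involved):

```python
# Parameter counts from the architecture table above.
frozen_unet = 859_520_964
geo_prior = 4_845_725
geo_conditioner = 1_613_847

trainable = geo_prior + geo_conditioner   # 6,459,572 trained params
ratio = trainable / frozen_unet           # ~0.0075, i.e. under 1% of the UNet
```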

## Simplex Configuration

| Parameter | Value |
|---|---|
| k (simplex dim) | 4 |
| Embedding dim | 32 |
| Feature dim | 768 |
| Stacked layers | 4 |
| Attention heads | 8 |
| Base deformation | 0.25 |
| Residual blend | learnable |
| Timestep conditioned | True |
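The model card does not spell out how the pentachoron is parameterized internally, but the geometry itself is simple to sketch: a 4-simplex has 5 vertices, and a canonical regular one can be built from standard basis vectors and then lifted into the 32-dim embedding space. The projection below is a hypothetical random map standing in for whatever embedding the trained prior learns:

```python
from itertools import combinations

import numpy as np

K = 4            # simplex dimension (pentachoron: K + 1 = 5 vertices)
EMBED_DIM = 32   # embedding dim from the configuration table

# Canonical regular 4-simplex: the 5 standard basis vectors of R^5,
# recentred on their centroid; every pairwise distance is sqrt(2).
verts = np.eye(K + 1)
verts -= verts.mean(axis=0)

# Hypothetical linear lift into the embedding space; the trained model
# would learn (and deform) this embedding rather than sample it randomly.
rng = np.random.default_rng(0)
proj = rng.standard_normal((K + 1, EMBED_DIM))
embedded = verts @ proj   # shape (5, 32): one embedding per vertex

# All 10 vertex pairs of the canonical simplex are equidistant.
dists = [np.linalg.norm(verts[i] - verts[j])
         for i, j in combinations(range(K + 1), 2)]
```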

## GeoVocab Conditioning

| Parameter | Value |
|---|---|
| Gate dim | 17 |
| Patch feat dim | 256 |
| Num patches | 64 |
| Cross-attention | enabled |
| Cross-attn heads | 8 |
| Blend mode | learnable |
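To make the table concrete, here is a minimal NumPy sketch of 8-head cross-attention from 768-dim features onto 64 patch tokens of dim 256, with a learnable scalar blend back into the residual stream. All projection weights, the head split, and the sigmoid gating are assumptions for illustration; the actual conditioner's layout is not documented here:

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

HEADS, HEAD_DIM = 8, 32                  # assumed 8 heads over a 256-dim attention space
FEAT_DIM, PATCH_DIM, N_PATCH = 768, 256, 64

rng = np.random.default_rng(0)
# Hypothetical projection weights; the real module learns these.
Wq = rng.standard_normal((FEAT_DIM, HEADS * HEAD_DIM)) * 0.02
Wk = rng.standard_normal((PATCH_DIM, HEADS * HEAD_DIM)) * 0.02
Wv = rng.standard_normal((PATCH_DIM, HEADS * HEAD_DIM)) * 0.02
Wo = rng.standard_normal((HEADS * HEAD_DIM, FEAT_DIM)) * 0.02
blend_logit = 0.0                        # "learnable" blend; sigmoid(0) = 0.5

def cross_attend(tokens, patches):
    # tokens: (T, 768) UNet-side features; patches: (64, 256) GeoVocab patch feats
    T = tokens.shape[0]
    q = (tokens @ Wq).reshape(T, HEADS, HEAD_DIM).transpose(1, 0, 2)
    k = (patches @ Wk).reshape(N_PATCH, HEADS, HEAD_DIM).transpose(1, 0, 2)
    v = (patches @ Wv).reshape(N_PATCH, HEADS, HEAD_DIM).transpose(1, 0, 2)
    attn = softmax(q @ k.transpose(0, 2, 1) / np.sqrt(HEAD_DIM))   # (H, T, 64)
    out = (attn @ v).transpose(1, 0, 2).reshape(T, HEADS * HEAD_DIM) @ Wo
    g = 1.0 / (1.0 + np.exp(-blend_logit))   # learnable residual blend
    return (1 - g) * tokens + g * out

out = cross_attend(rng.standard_normal((77, FEAT_DIM)),
                   rng.standard_normal((N_PATCH, PATCH_DIM)))
```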

## Usage

```python
from sd15_trainer_geo.pipeline import load_pipeline

pipe = load_pipeline(geo_repo_id="AbstractPhil/sd15-geovocab-lora-prototype")
```

## Training Info

- dataset: AbstractPhil/synthetic-characters (schnell_full_1_512)
- subdir: schnell_full_1_5e-5
- samples: 50000
- epochs: 10
- steps: 83330
- shift: 2.0
- base_lr: 5e-05
- min_snr_gamma: 5.0
- cfg_dropout: 0.1
- batch_size: 6
- geo_loss_weight: 0.01
- geovocab_lr_mult: 2.0
- clip_vae: AbstractPhil/geovae-proto/clip_vae/best_model.pt
- patch_maker: AbstractPhil/geovocab-patch-maker
- loss_final: 0.3035515168607235
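The `min_snr_gamma: 5.0` entry refers to Min-SNR loss weighting, which clips the per-timestep signal-to-noise ratio at gamma so low-noise timesteps (huge SNR) stop dominating the loss. A sketch under an illustrative linear beta schedule (not SD1.5's exact scaled-linear schedule):

```python
import numpy as np

GAMMA = 5.0   # min_snr_gamma from the training config above

def min_snr_weight(alphas_cumprod, t, gamma=GAMMA):
    """Min-SNR weight for epsilon-prediction: min(SNR, gamma) / SNR.

    SNR(t) = alpha_bar_t / (1 - alpha_bar_t); clipping at gamma down-weights
    low-noise timesteps where SNR is very large, and leaves high-noise
    timesteps (SNR < gamma) at weight 1.
    """
    snr = alphas_cumprod[t] / (1.0 - alphas_cumprod[t])
    return np.minimum(snr, gamma) / snr

# Illustrative 1000-step linear beta schedule for demonstration only.
betas = np.linspace(1e-4, 0.02, 1000)
alphas_cumprod = np.cumprod(1.0 - betas)

w_early = min_snr_weight(alphas_cumprod, 10)    # high SNR -> weight well below 1
w_late = min_snr_weight(alphas_cumprod, 990)    # low SNR  -> weight stays 1
```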

## License

MIT — AbstractPhil
