UNet with Sliding Window Attention

  • 8-channel latent space, obtained by moving modules from WF-VAE into the NoobAI XL VAE
  • supports recent long-context CLIP text encoders
  • variable number of attention heads (num_head) in MHA across layers (see the sketch after this list)
  • both the UNet and the autoencoder are written in vanilla PyTorch

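The snippet below is a minimal sketch of what a sliding-window self-attention block with a per-layer head count could look like in plain PyTorch. It is illustrative only: the class, parameter names (`window_size`, the example `dim`/`num_heads` values), and the band-mask scheme are assumptions, not this model's actual implementation.

```python
# Illustrative sketch: local (sliding-window) self-attention with a
# configurable number of heads, as a UNet attention layer might use it.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SlidingWindowSelfAttention(nn.Module):
    def __init__(self, dim: int, num_heads: int, window_size: int = 64):
        super().__init__()
        assert dim % num_heads == 0
        self.num_heads = num_heads
        self.head_dim = dim // num_heads
        self.window_size = window_size
        self.qkv = nn.Linear(dim, dim * 3)
        self.proj = nn.Linear(dim, dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, dim); for a UNet, seq_len = H * W of the feature map
        b, n, d = x.shape
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        # reshape to (batch, heads, seq_len, head_dim)
        q, k, v = (t.view(b, n, self.num_heads, self.head_dim).transpose(1, 2)
                   for t in (q, k, v))

        # band mask: each token attends only to tokens within +/- window_size
        idx = torch.arange(n, device=x.device)
        mask = (idx[None, :] - idx[:, None]).abs() <= self.window_size  # (n, n), True = attend

        out = F.scaled_dot_product_attention(q, k, v, attn_mask=mask)
        out = out.transpose(1, 2).reshape(b, n, d)
        return self.proj(out)


# "Variable num_head across layers" might look like this (values are made up):
blocks = nn.ModuleList([
    SlidingWindowSelfAttention(dim=320,  num_heads=5,  window_size=64),
    SlidingWindowSelfAttention(dim=640,  num_heads=10, window_size=32),
    SlidingWindowSelfAttention(dim=1280, num_heads=20, window_size=16),
])
```
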
The result is similar to what Mitsua accomplished previously.

References

  • arXiv:2411.17459