Spaces:
Running on Zero

Commit History

All commits are by Ruurd and are verified. (The relative commit dates were not captured in this listing.)

ec35c53  Change to mask model
a2125f4  Remove placeholder
3b599f4  Enable direct prompting
8ee3ad1  Remove answer_start check
4d3e9fe  Set answer_start to 0
32b1a1f  Remove conditional generation
12ed3ea  Remove BidirectionalLlamaAttentionLayer input
2798cf6  Remove EOT from noise visualization for ease of viewing
b57b92e  Removed custom bidirectional layer as it is not needed when using the Llama attention_masks
093a557  Reimplement EOT weighting
150f6e1  Change red highlighting for debugging
a86f3af  Add MASK slider
7d7b6d7  Changed to bidirectional
09a7f62  Visualize noising
7287d81  Update app.py
a73a13e  Fix MASK token noising
ea86b58  Remove unnecessary print statements - Add MASK noising
4cd194e  Changed model from tini_bi_m to tini_bi
b6cb410  Turn autocast back on
932e0b0  Try different monkey-patch
b36c7a9  Monkey-patch (temporarily) for LoRA layers
526493a  Remove autocast
bd6e2e6  Change model
7141e39  New masking implementation
7ec3bd7  Clamp logits BEFORE softmax
04f0876  Set to bidirectional for debugging
57e6bce  Set to unidirectional for debugging
b3de773  Remove single ]
dc427d9  Safe sampling
a721355  Deal with float values
8851563  Make attention mask float
238c8f8  Update llama_diffusion_model.py
f2ca6a6  Create safe fallback for models not yet initialized with masking_type
b43e862  Overhaul code for appropriate masking for full model instead of just attention layers
22370b2  Fix attention_weights referenced before assigned bug
1723639  Implement improved attention masking for bidirectional_masked
5213031  Fix noise_start to always start at 1.0
02eb393  Remove erroneous indent
fc90b53  Fix pause at start
9cf2e5c  Fix initial noise display
19c039a  Fix initial noise display
93d53bc  Fix initial noise display
77e752d  Show initial noise
74479ff  Fix input_ids instead of current_tokens for first noise iteration
02f6e21  Use updated settings for initial and clustered noise
034cffe  Add noise start
1394a1e  Add comma
8cb5f7a  Add pause length
0ffa4b5  Update app.py
939752c  Update requirements.txt