---
license: apache-2.0
---

Catastrophic forgetting test results:

Initial evaluation loss on a 1k subset of the HuggingFaceTB/cosmopedia-100k dataset was 1.102; 100 steps of LISA training reduced this to 1.049.
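The evaluation loss reported above is the standard mean per-token negative log-likelihood over the eval subset. A minimal pure-Python sketch of that metric (the `model_logprob` interface is a hypothetical stand-in for a real LM's scoring call, not the actual evaluation harness used here):

```python
import math

def eval_loss(model_logprob, subset):
    """Mean negative log-likelihood per predicted token over an eval subset.

    model_logprob(context, token) is a hypothetical stand-in returning the
    model's log-probability of `token` given the preceding `context` tokens.
    """
    total_nll, total_tokens = 0.0, 0
    for tokens in subset:
        # Score every token after the first, conditioned on its prefix.
        for i in range(1, len(tokens)):
            total_nll -= model_logprob(tokens[:i], tokens[i])
            total_tokens += 1
    return total_nll / total_tokens

# Toy check: a uniform model over a 4-token vocabulary should give
# a loss of ln(4) per token regardless of the data.
uniform = lambda ctx, tok: math.log(1 / 4)
subset = [[1, 2, 3], [0, 1]]
print(round(eval_loss(uniform, subset), 3))  # → 1.386
```

Lower values mean the model assigns higher probability to the held-out text, which is why a drop from 1.102 to 1.049 indicates the LISA run improved (rather than forgot) this distribution.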

Comparison to control: cosmo-1b started at 1.003 loss on (a different subset of) the same dataset, which increased to 1.024 after 100 steps.

Axolotl config: identical to the qdora version, but with DoRA disabled.