---
base_model:
- zerofata/L3.3-GeneticLemonade-Unleashed-v2.1-70B
- Nohobby/L3.3-Prikol-70B-v0.5
- allura-org/Bigger-Body-70b
library_name: transformers
tags:
- mergekit
- merge

---
# Smog

Tried out a somewhat obscure merge method, and it turned out decent enough: the result averages the three source models without any one of them dominating.

Distinct prose and (imo) the best ERP of the L3.3 models.

**Settings:**

Temp 1.05, minP 0.01
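
For reference, a minimal sketch of wiring those settings into `transformers` generation (the repo id below is a guess inferred from the quant links, and `min_p` requires a reasonably recent `transformers` release):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Repo id is an assumption inferred from the quant links; substitute the real one.
model_id = "Nohobby/L3.3-Smog-70B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, device_map="auto", torch_dtype=torch.bfloat16
)

inputs = tokenizer("Hello there.", return_tensors="pt").to(model.device)
out = model.generate(
    **inputs,
    do_sample=True,
    temperature=1.05,  # Temp 1.05
    min_p=0.01,        # minP 0.01
    max_new_tokens=128,
)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```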

**Quants:**

https://huggingface.co/mradermacher/L3.3-Smog-70B-GGUF

https://huggingface.co/mradermacher/L3.3-Smog-70B-i1-GGUF

## Merge Details
### Merge Method

This model was merged using the [Karcher Mean](https://en.wikipedia.org/wiki/Karcher_mean) merge method.
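
For intuition, here is a minimal NumPy sketch of the iteration behind a Karcher mean on the unit hypersphere (illustrative only, not mergekit's actual code; the defaults mirror the `max_iter` and `tol` values in the config below):

```python
import numpy as np

def karcher_mean_sphere(points, max_iter=18, tol=1e-8):
    """Karcher (Riemannian) mean of unit vectors on the hypersphere.

    Illustrative sketch: repeatedly average the points' log-map images
    in the tangent space at the current estimate, then step back onto
    the sphere via the exp map, until the update is negligible.
    """
    # Start from the normalized Euclidean average.
    mu = np.mean(points, axis=0)
    mu /= np.linalg.norm(mu)
    for _ in range(max_iter):
        # Log map: project each point into the tangent space at mu.
        tangent = np.zeros_like(mu)
        for p in points:
            cos_theta = np.clip(np.dot(mu, p), -1.0, 1.0)
            theta = np.arccos(cos_theta)
            if theta > 1e-12:  # points coinciding with mu contribute nothing
                tangent += (theta / np.sin(theta)) * (p - cos_theta * mu)
        tangent /= len(points)
        step = np.linalg.norm(tangent)
        if step < tol:
            break  # converged
        # Exp map: walk along the geodesic in the tangent direction.
        mu = np.cos(step) * mu + np.sin(step) * (tangent / step)
        mu /= np.linalg.norm(mu)  # guard against numerical drift
    return mu
```

Roughly speaking, applying this per weight tensor (with appropriate normalization and rescaling) yields a blend in which no single model dominates, which matches the behavior described above.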

### Models Merged

The following models were included in the merge:
* [zerofata/L3.3-GeneticLemonade-Unleashed-v2.1-70B](https://huggingface.co/zerofata/L3.3-GeneticLemonade-Unleashed-v2.1-70B)
* [Nohobby/L3.3-Prikol-70B-v0.5](https://huggingface.co/Nohobby/L3.3-Prikol-70B-v0.5)
* [allura-org/Bigger-Body-70b](https://huggingface.co/allura-org/Bigger-Body-70b)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: zerofata/L3.3-GeneticLemonade-Unleashed-v2.1-70B
  - model: allura-org/Bigger-Body-70b
  - model: Nohobby/L3.3-Prikol-70B-v0.5
merge_method: karcher
parameters:
  max_iter: 18
  tol: 1e-8
normalize: true
int8_mask: true
dtype: bfloat16
```
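
Reproducing the merge should just be a matter of saving this config to a file and running it through mergekit's CLI, e.g. `mergekit-yaml config.yaml ./output-dir` (assuming a mergekit version that includes the `karcher` method).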