---
base_model:
  - zerofata/L3.3-GeneticLemonade-Unleashed-v2.1-70B
  - Nohobby/L3.3-Prikol-70B-v0.5
  - allura-org/Bigger-Body-70b
library_name: transformers
tags:
  - mergekit
  - merge
---

# Smog

Tried out a somewhat obscure merge method, and it turned out decent enough. The result is an average of the three selected models, with none of them dominating the mix.

Distinct prose and (imo) the best ERP among the L3.3 models.

**Settings:** Temp 1.05, minP 0.01

**Quants:**

- https://huggingface.co/mradermacher/L3.3-Smog-70B-GGUF
- https://huggingface.co/mradermacher/L3.3-Smog-70B-i1-GGUF

## Merge Details

### Merge Method

This model was merged using the Karcher Mean merge method.
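For intuition, the Karcher mean (also called the Fréchet or Riemannian mean) generalizes averaging to curved spaces: mergekit applies it to model weight tensors, iterating until a tolerance is met (the `max_iter` and `tol` parameters in the config below). The sketch here is purely illustrative, not mergekit's implementation: a hypothetical `karcher_mean_sphere` that averages unit vectors on the hypersphere by repeatedly mapping points into the tangent space at the current estimate, averaging there, and stepping along the resulting geodesic.

```python
import numpy as np

def karcher_mean_sphere(points, max_iter=18, tol=1e-8):
    """Illustrative Karcher mean of unit vectors on the hypersphere.

    Hypothetical sketch only -- mergekit's `karcher` method operates on
    full weight tensors; names and details here are assumptions.
    """
    pts = np.stack([p / np.linalg.norm(p) for p in points])
    # Initialize with the normalized Euclidean mean.
    x = pts.mean(axis=0)
    x /= np.linalg.norm(x)
    for _ in range(max_iter):
        # Log map: lift each point into the tangent space at x.
        dots = np.clip(pts @ x, -1.0, 1.0)
        thetas = np.arccos(dots)
        scale = np.where(thetas > 1e-12,
                         thetas / np.sin(np.maximum(thetas, 1e-12)),
                         1.0)
        tangents = scale[:, None] * (pts - dots[:, None] * x)
        # Average in the (flat) tangent space.
        v = tangents.mean(axis=0)
        norm_v = np.linalg.norm(v)
        if norm_v < tol:  # update is below tolerance: converged
            break
        # Exp map: walk along the geodesic in direction v.
        x = np.cos(norm_v) * x + np.sin(norm_v) * (v / norm_v)
        x /= np.linalg.norm(x)
    return x
```

Unlike a plain weighted average, the fixed point of this iteration minimizes the sum of squared geodesic distances to the inputs, which is why no single input model dominates the result.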

### Models Merged

The following models were included in the merge:

- zerofata/L3.3-GeneticLemonade-Unleashed-v2.1-70B
- Nohobby/L3.3-Prikol-70B-v0.5
- allura-org/Bigger-Body-70b

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: zerofata/L3.3-GeneticLemonade-Unleashed-v2.1-70B
  - model: allura-org/Bigger-Body-70b
  - model: Nohobby/L3.3-Prikol-70B-v0.5
merge_method: karcher
parameters:
  max_iter: 18
  tol: 1e-8
normalize: true
int8_mask: true
dtype: bfloat16
```