---
base_model: Alsebay/RainyMotip-2x7B
license: apache-2.0
library_name: transformers
tags:
- 4-bit
- AWQ
- text-generation
- autotrain_compatible
- endpoints_compatible
- moe
- merge
pipeline_tag: text-generation
inference: false
quantized_by: Suparious
---
# Alsebay/RainyMotip-2x7B AWQ

- Model creator: [Alsebay](https://huggingface.co/Alsebay)
- Original model: [RainyMotip-2x7B](https://huggingface.co/Alsebay/RainyMotip-2x7B)

## Model Summary

What is it? A 2x7B MoE model intended mainly for roleplay.

You may occasionally get GPT-like responses; just skip them and reroll (gacha time). Overall, I think it is good enough for roleplaying.

You may also want to see [My_LLMs_Leaderboard](https://huggingface.co/Alsebay/My_LLMs_Leaderboard).

This model is a Mixture of Experts (MoE) made with the following models:

- udkai/Turdus
- Kquant03/Samlagast-7B-laser-bf16

If you use it, please let me know whether it is good or not. Thank you :)
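
## Usage

Below is a minimal inference sketch using the AutoAWQ library (`pip install autoawq`), which is the usual way to load 4-bit AWQ checkpoints like this one. The repo id, prompt, and sampling settings are placeholders I have chosen for illustration, not values supplied by the model author.

```python
from awq import AutoAWQForCausalLM
from transformers import AutoTokenizer

# Hypothetical repo id -- replace with the actual id of this quantized repo.
model_id = "RainyMotip-2x7B-AWQ"

tokenizer = AutoTokenizer.from_pretrained(model_id)
# fuse_layers=True enables AutoAWQ's fused kernels for faster generation.
model = AutoAWQForCausalLM.from_quantized(model_id, fuse_layers=True)

# Example roleplay-style prompt; adjust to your preferred chat template.
prompt = "Write a short scene where a traveler enters a rainy village."
input_ids = tokenizer(prompt, return_tensors="pt").input_ids.cuda()

output = model.generate(
    input_ids,
    max_new_tokens=256,
    do_sample=True,
    temperature=0.7,
)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```

Since the card notes occasional GPT-like responses, sampling with `do_sample=True` and rerolling on a bad generation fits the author's suggested workflow.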