⚠️ Warning: This model can produce narratives and RP that contain violent and graphic erotic content. Adjust your system prompt accordingly.

Theta Black Gorgon 8B - Story-Driving RP Model
The unknown angle. The unforeseen variable. The mythic problem in your loss curve.
Theta Black Gorgon is a small feral 8B that honors the summoning circle while setting everything inside it on fire.
Wee Beastie's prowling talents:
- Spins tales and leads the narrative dance through RP.
- Digests complex character cards like a gourmand - the richer the feast, the better the performance.
- Feed it breadcrumbs, get grocery store cake. Feed it a complex card/rich scenario, get a five-course meal with language that will peel paint.
Handler's notes:
- This isn't some domesticated assistant - it's a feral thing that matches your energy.
- If you hand it vivid characters, sharp dialogue, and specific stakes, it will happily start a bar fight, describe the slaying of enemies, and make vicars bleed from the ears with its language.
- Drop a couple graphic examples in your character card so it knows the safeties are off, and the varnish will start melting off anything within a metre of your laptop.
- Make sure your system prompt includes something akin to "you lead the plot and narrate {{char}} and all side characters, but you are FORBIDDEN to write {{user}}'s dialogue, choices, or body reactions. User is the only one who writes {{user}}" to ensure that this wildling doesn't take over the whole story! (An example prompt follows below.)
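A minimal example system prompt in that spirit (the wording and placeholder syntax are only a suggestion; adapt them to your frontend and card):

```
You are the narrator and game master. You lead the plot and narrate {{char}} and all side characters.
You are FORBIDDEN to write {{user}}'s dialogue, choices, or body reactions; only the user writes {{user}}.
Keep scenes moving, escalate the stakes, and stay true to the character card.
```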
Obligatory cover-my-arse bit:
Behaviour: Neutral by default, built from a mix of low-guardrail and uncensored models. It follows your system, card, and chat prompts, and can produce explicit erotic and/or violent content when steered there. By downloading this model, you confirm you are 18+ and fully responsible for anything you provoke out of it.
I cannot guarantee how whack it gets. That's between you, your prompt, and whatever electronic gods lurk about here.
Merge Details
This is a merge of pre-trained language models created using mergekit.
Personal Lab Notes
Although there are newer Hermes models, Theta is still one of my favourites (Argilla's CapybaraHermes is the other) because of its low guardrails and intelligent RP abilities: sense of humour, creativity, ability to drive the plot forward, and unapologetic nature.
Nothing wrecks my RP/ERP experience faster than a cringey, hand-wringing, submissive AI. I don't want a soppy, apologetic assistant. I want an AI that can lead, surprise, make me laugh, and improvise without checking in, but still stay true to the card. I can course-correct in-story with a narrative redirect if I don't like the lead it's taking.
My choice of models for this merge reflects frustration with the loss of independent creativity in AI in favour of "yes, of course, I'm sorry, anything you say."
Black Sheep × Dolphin is great at following prompts while adding its own spin and a bit of attitude, though it's not the sharpest tool in the shed, so it needs to be merged with something of higher intelligence and verbal dexterity.
Stheno's creative voice is excellent; the latest version took a small hit in creativity, but it does immersive work better and can handle complex cards.
I've started throwing my gnarly RP cards at this merge; I'll update with more detailed impressions as I go.
Quant & context
A word, though. If you're expecting strong RP performance on a potato quant (Q4), my advice is simple: fool's quest. Drop down a model size and lower the context window if you have to.
I am presently testing Theta Black Gorgon at Q5_K_M with a 24K context, because that's all my laptop can handle. But I personally won't use potato quants for RP due to the brutal loss of quality in dialogue, verbal dexterity, humour, creativity, and the ability to handle complexity.
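For reference, a minimal llama-cpp-python sketch at roughly those settings; the GGUF filename is a placeholder, and the GPU offload should be tuned to your hardware:

```python
from llama_cpp import Llama

# Placeholder GGUF path: point this at whichever Q5_K_M quant you actually downloaded.
llm = Llama(
    model_path="theta-black-gorgon-8b.Q5_K_M.gguf",
    n_ctx=24576,      # ~24K context, matching the setup described above
    n_gpu_layers=-1,  # offload all layers if VRAM allows; reduce this on smaller GPUs
)

response = llm.create_chat_completion(
    messages=[
        {"role": "system", "content": "You lead the plot and narrate all characters except the user."},
        {"role": "user", "content": "The tavern door slams open."},
    ],
    max_tokens=512,
    temperature=1.0,
)
print(response["choices"][0]["message"]["content"])
```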
Merge Method
This model was merged using the SCE merge method, hybridized with TIES, using OpenPipe/Hermes-2-Theta-Llama-3-8B-32k as the base.
Models Merged
The following models were included in the merge:
- OpenPipe/Hermes-2-Theta-Llama-3-8B-32k (base)
- TroyDoesAI/BlackSheep-X-Dolphin
- Sao10K/L3-8B-Stheno-v3.2
Configuration
The following YAML configuration was used to produce this model:
```yaml
models:
  - model: OpenPipe/Hermes-2-Theta-Llama-3-8B-32k
    name: base
  - model: TroyDoesAI/BlackSheep-X-Dolphin
    name: bsxd
  - model: Sao10K/L3-8B-Stheno-v3.2
    name: stheno
merge_method: sce
base_model: OpenPipe/Hermes-2-Theta-Llama-3-8B-32k
parameters:
  select_topk: 0.70
  prescale: true
  normalize: true
weights:
  # ATTENTION: tracking context, card, long-turn state.
  - filter: ".*(attn|attention).*"
    models: {base: 0.70, bsxd: 0.15, stheno: 0.15}
  # MLP / FFN: interpretation and reasoning
  - filter: ".*(mlp|ffn).*"
    models: {base: 0.60, bsxd: 0.20, stheno: 0.20}
  # OUTPUT / lm_head: voice, phrasing, confidence
  - filter: ".*(lm_head|output).*"
    models: {base: 0.55, bsxd: 0.25, stheno: 0.20}
dtype: float32
out_dtype: bfloat16
tokenizer:
  source: base
  target: base
```
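To reproduce the merge, the YAML above can be passed to mergekit's `mergekit-yaml` CLI. Below is a rough sketch of the equivalent Python-API call; mergekit's API has shifted between versions, so treat the option names and paths as assumptions and check your installed release:

```python
import yaml
from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

# Load the YAML configuration shown above (saved locally as config.yaml).
with open("config.yaml", "r", encoding="utf-8") as f:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(f))

# Run the merge; the options mirror common mergekit-yaml CLI flags.
run_merge(
    merge_config,
    out_path="./theta-black-gorgon-8b",  # arbitrary output directory
    options=MergeOptions(
        cuda=True,            # use a GPU if one is available
        copy_tokenizer=True,  # carry over the base tokenizer, matching `source: base` above
    ),
)
```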
Maintained by: Your Mum
Variant: Is it a witch or a wizard? Or both?
Upload date: November 21, 2025
Notes: Made with stubbornness, Python, and profanity.