Sometimes I fine-tune models specifically to take on expert roles in a Mixture of Experts (MoE) configuration; other times I find interesting models that others have fine-tuned.
Rasmus Rasmussen
theprint
AI & ML interests
Small model experiments and homespun datasets.
Recent Activity
updated a collection about 24 hours ago
Mixture of Experts (MoE)