Sometimes I fine-tune models specifically to take on expert roles in a MoE configuration; sometimes I find interesting models that others have fine-tuned.
Rasmus Rasmussen
theprint
AI & ML interests
Small model experiments and homespun datasets.
Recent Activity
updated a collection 1 day ago
Mixture of Experts (MoE)