---
base_model:
- microsoft/Phi-4-mini-reasoning
- microsoft/Phi-4-mini-instruct
license: mit
tags:
- merge
pipeline_tag: text-generation
library_name: transformers
---
This is an upscaled/merged version of microsoft/Phi-4-mini-instruct with roughly 6B parameters and 56 layers. The extra layers give the model additional learning capacity, so it should respond well to further fine-tuning.

Layers 18-22 are merged in from microsoft/Phi-4-mini-reasoning; all other layers come from microsoft/Phi-4-mini-instruct.
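
Layer-splicing merges like this are commonly expressed as a passthrough merge config (e.g. in mergekit's slice format). The sketch below is illustrative only: the slice boundaries and repetition pattern that produce the full 56-layer stack are assumptions, not the exact recipe used for this model.

```yaml
# Hypothetical mergekit-style passthrough config (illustrative sketch).
# Layers 18-22 are taken from the reasoning model; surrounding layers
# come from the instruct model. The actual slice layout used to reach
# 56 layers is not documented here.
slices:
  - sources:
      - model: microsoft/Phi-4-mini-instruct
        layer_range: [0, 18]
  - sources:
      - model: microsoft/Phi-4-mini-reasoning
        layer_range: [18, 23]
  - sources:
      - model: microsoft/Phi-4-mini-instruct
        layer_range: [23, 32]
merge_method: passthrough
dtype: bfloat16
```

With a passthrough merge, slices are concatenated in order without weight averaging, which is how layer duplication increases the total depth (and parameter count) beyond the 32 layers of the base model.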