---
license: mit
language:
- en
pipeline_tag: text-generation
tags:
- phi
- nlp
- math
- code
- chat
- conversational
inference:
parameters:
temperature: 0
widget:
- messages:
- role: user
content: How should I explain the Internet?
library_name: transformers
---
# Phi-4 Model Card
[Phi-4 Technical Report](https://arxiv.org/pdf/2412.08905)
## Model Summary
| | |
|-------------------------|-------------------------------------------------------------------------------|
| **Developed by**        | Microsoft Research |
| **Description**         | `phi-4` is a state-of-the-art open model built upon a blend of synthetic datasets, data from filtered public domain websites, and acquired academic books and Q&A datasets. The goal of this approach was to ensure that small, capable models were trained with data focused on high quality and advanced reasoning.<br><br>`phi-4` underwent a rigorous enhancement and alignment process, incorporating both supervised fine-tuning and direct preference optimization to ensure precise instruction adherence and robust safety measures. |
| **Architecture** | 14B parameters, dense decoder-only Transformer model |
| **Inputs** | Text, best suited for prompts in the chat format |
| **Context length** | 16K tokens |
| **GPUs** | 1920 H100-80G |
| **Training time** | 21 days |
| **Training data** | 9.8T tokens |
| **Outputs** | Generated text in response to input |
| **Dates** | October 2024 – November 2024 |
| **Status** | Static model trained on an offline dataset with cutoff dates of June 2024 and earlier for publicly available data |
| **Release date** | March 17, 2025 |
| **License** | MIT |
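
## Usage

Since the model is served through the `transformers` library and is best suited to chat-format prompts, below is a minimal inference sketch. It assumes the model is published on the Hugging Face Hub under the id `microsoft/phi-4` (the repo id is an assumption here), and it uses greedy decoding (`do_sample=False`) to mirror the `temperature: 0` default declared in the metadata above.

```python
import torch
from transformers import pipeline

# Load the model in bfloat16 and let accelerate shard it across available GPUs;
# a 14B dense decoder fits comfortably on a single 80 GB card at this precision.
pipe = pipeline(
    "text-generation",
    model="microsoft/phi-4",  # assumed Hub repo id
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

# phi-4 expects chat-format input: pass a list of messages and the pipeline
# applies the model's chat template automatically.
messages = [
    {"role": "user", "content": "How should I explain the Internet?"},
]

# do_sample=False is greedy decoding, equivalent to temperature 0.
outputs = pipe(messages, max_new_tokens=256, do_sample=False)

# generated_text holds the full conversation; the last message is the reply.
print(outputs[0]["generated_text"][-1]["content"])
```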