---
title: README
emoji: πŸ“‰
colorFrom: red
colorTo: red
sdk: static
pinned: false
---

<UltrascaleBookForm />

<p align="right"><i>(*If you experience issues downloading the PDF with Chrome, try restarting or updating the browser, or use a different one)</i></p>

The Nanotron team focuses on sharing open knowledge and developing open-source libraries for efficient distributed training of large-scale AI models.

Some of its contributions are:

- the [Nanotron library](https://github.com/huggingface/nanotron)
- the [Picotron library](https://github.com/huggingface/picotron)
- the [Ultrascale-Playbook](https://huggingface.co/spaces/nanotron/ultrascale-playbook), a comprehensive book covering the distributed-training, parallelization, and low-level techniques used to efficiently train models at the largest scales.