---
license: mit
datasets:
- MuzzammilShah/people-names
language:
- en
model_name: Manual Backpropagation through BatchNorm
library_name: pytorch
tags:
- makemore
- backpropagation
- batchnorm
- neural-networks
- andrej-karpathy
---
|
|
|
# Manual Backpropagation through BatchNorm: Makemore (Part 4)
|
|
|
This repository explores manual backpropagation through a 2-layer MLP (with BatchNorm), without relying on PyTorch autograd's `loss.backward()`. The gradients are propagated by hand through the cross-entropy loss, the second linear layer, the tanh non-linearity, BatchNorm, the first linear layer, and the embedding table.
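
As an illustration, here is a minimal sketch of the BatchNorm step, typically the trickiest part of the chain. It is not the repository's exact code: the shapes are placeholders, the variable names (`hprebn`, `bngain`, `bnraw`, etc.) follow the conventions used in the video, and the compact one-line gradient assumes a Bessel-corrected (1/(n-1)) variance in the forward pass.

```python
import torch

torch.manual_seed(42)
n, d = 32, 64                                    # batch size, hidden width (placeholders)

# Inputs and BatchNorm parameters (random stand-ins)
hprebn = torch.randn(n, d, requires_grad=True)   # pre-BatchNorm activations
bngain = torch.randn(1, d)                       # learnable scale (gamma)
bnbias = torch.randn(1, d)                       # learnable shift (beta)

# Forward pass, written out step by step (Bessel-corrected variance)
bnmean = hprebn.mean(0, keepdim=True)
bndiff = hprebn - bnmean
bnvar = (bndiff ** 2).sum(0, keepdim=True) / (n - 1)
bnvar_inv = (bnvar + 1e-5) ** -0.5
bnraw = bndiff * bnvar_inv                       # normalized activations
hpreact = bngain * bnraw + bnbias

# Pretend gradient flowing in from the rest of the network
dhpreact = torch.randn_like(hpreact)
hpreact.backward(dhpreact)                       # autograd reference -> hprebn.grad

# Manual backward through the whole BatchNorm in one compact expression
dhprebn = bngain * bnvar_inv / n * (
    n * dhpreact
    - dhpreact.sum(0, keepdim=True)
    - n / (n - 1) * bnraw * (dhpreact * bnraw).sum(0, keepdim=True)
)

print('max abs diff vs autograd:', (dhprebn - hprebn.grad).abs().max().item())
```

The printed difference should be on the order of floating-point error, confirming that the hand-derived gradient matches what autograd computes.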
|
|
|
## Documentation

For a better reading experience and detailed notes, visit my **[Road to GPT Documentation Site](https://muzzammilshah.github.io/Road-to-GPT/Makemore-part4/)**.
|
|
|
## Acknowledgments

Notes and implementations inspired by the **Makemore - Part 4** video by [Andrej Karpathy](https://karpathy.ai/).


For more of my projects, visit my [Portfolio Site](https://muhammedshah.com).