---
license: mit
datasets:
- MuzzammilShah/people-names
language:
- en
model_name: Manual Backpropagation through BatchNorm
library_name: pytorch
tags:
- makemore
- backpropagation
- batchnorm
- neural-networks
- andrej-karpathy
---
# Manual Backpropagation through BatchNorm: Makemore (Part 4)
This repository explores manual backpropagation through a 2-layer MLP (with BatchNorm), without relying on PyTorch's autograd engine (`loss.backward()`). The gradients are derived and computed by hand, step by step, through the cross-entropy loss, the second linear layer, the tanh non-linearity, the batchnorm layer, the first linear layer, and the embedding table; a sketch of the batchnorm step follows below.
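As a minimal sketch of the batchnorm step (the trickiest link in the chain), the snippet below checks a hand-derived gradient against autograd. Variable names such as `hprebn`, `bngain`, and `bnbias` follow the conventions of Karpathy's makemore notebooks; the shapes, seed, and random upstream gradient are illustrative assumptions, not values taken from this repository.

```python
import torch

# Hypothetical shapes for illustration: a batch of 32 pre-activations of width 64.
torch.manual_seed(42)
n, d = 32, 64
hprebn = torch.randn(n, d, requires_grad=True)  # input to batchnorm (pre-activations)
bngain = torch.randn(1, d, requires_grad=True)  # batchnorm scale (gamma)
bnbias = torch.randn(1, d, requires_grad=True)  # batchnorm shift (beta)
eps = 1e-5

# Forward pass, broken into small steps so each can be backpropagated by hand.
bnmean = hprebn.mean(0, keepdim=True)
bndiff = hprebn - bnmean
bnvar = bndiff.pow(2).sum(0, keepdim=True) / (n - 1)  # Bessel-corrected variance
bnvar_inv = (bnvar + eps) ** -0.5
bnraw = bndiff * bnvar_inv
hpreact = bngain * bnraw + bnbias

# Reference gradients via autograd. The dot product with a random tensor makes
# dL/dhpreact equal to that tensor, standing in for the real upstream gradient.
dhpreact = torch.randn_like(hpreact)
loss = (hpreact * dhpreact).sum()
loss.backward()

# Manual backward through batchnorm.
dbngain = (bnraw * dhpreact).sum(0, keepdim=True)
dbnbias = dhpreact.sum(0, keepdim=True)
# Condensed expression for dL/dhprebn: the direct path and the paths through
# the batch mean and variance, folded into a single formula.
dhprebn = bngain * bnvar_inv / n * (
    n * dhpreact
    - dhpreact.sum(0, keepdim=True)
    - n / (n - 1) * bnraw * (dhpreact * bnraw).sum(0, keepdim=True)
)

print(torch.allclose(dhprebn, hprebn.grad, atol=1e-5))  # expect True
print(torch.allclose(dbngain, bngain.grad, atol=1e-5))  # expect True
print(torch.allclose(dbnbias, bnbias.grad, atol=1e-5))  # expect True
```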
## Documentation
For a better reading experience and detailed notes, visit my **[Road to GPT Documentation Site](https://muzzammilshah.github.io/Road-to-GPT/Makemore-part4/)**.
## Acknowledgments
Notes and implementations inspired by the **Makemore - Part 4** video by [Andrej Karpathy](https://karpathy.ai/).
For more of my projects, visit my [Portfolio Site](https://muhammedshah.com).