Backpropagation
The chain rule applied backwards through a neural network: computing gradients for every weight and verifying them with numerical finite differences.
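The gradient check described above can be sketched in a few lines of NumPy. This is a minimal, hypothetical setup (a single linear layer with MSE loss; all names and shapes are illustrative, not from the lesson itself): compute the analytic gradient via the chain rule, then compare it against central finite differences.

```python
import numpy as np

# Hypothetical toy problem: loss(W) = mean((x @ W - y)**2)
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 3))
W = rng.normal(size=(3, 2))
y = rng.normal(size=(4, 2))

def loss(W):
    return np.mean((x @ W - y) ** 2)

# Analytic gradient via the chain rule: dL/dW = (2/N) * x^T (xW - y)
grad_analytic = 2.0 / y.size * x.T @ (x @ W - y)

# Numerical gradient via central finite differences, one entry at a time
eps = 1e-6
grad_numeric = np.zeros_like(W)
for i in range(W.shape[0]):
    for j in range(W.shape[1]):
        Wp, Wm = W.copy(), W.copy()
        Wp[i, j] += eps
        Wm[i, j] -= eps
        grad_numeric[i, j] = (loss(Wp) - loss(Wm)) / (2 * eps)

max_err = np.max(np.abs(grad_analytic - grad_numeric))
print(max_err)
```

If the backprop implementation is correct, the two gradients should agree to within roughly the square of the step size, so `max_err` stays far below `1e-4`.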
Build a full training loop in NumPy: batches, epochs, forward pass, backprop, and weight updates.
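A training loop of that shape might look like the following sketch: a one-hidden-layer network with ReLU and MSE loss, trained with mini-batches and plain SGD. Everything here (layer sizes, learning rate, data) is an illustrative assumption, not the lesson's reference implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(64, 5))          # toy dataset (illustrative)
Y = rng.normal(size=(64, 1))
W1 = rng.normal(size=(5, 8)) * 0.1    # hidden layer weights
b1 = np.zeros(8)
W2 = rng.normal(size=(8, 1)) * 0.1    # output layer weights
b2 = np.zeros(1)
lr, batch_size = 0.05, 16

def mse(X, Y):
    pred = np.maximum(X @ W1 + b1, 0) @ W2 + b2
    return np.mean((pred - Y) ** 2)

init_loss = mse(X, Y)

for epoch in range(20):
    perm = rng.permutation(len(X))                 # reshuffle each epoch
    for start in range(0, len(X), batch_size):
        idx = perm[start:start + batch_size]
        xb, yb = X[idx], Y[idx]
        # forward pass
        h = xb @ W1 + b1
        a = np.maximum(h, 0)                       # ReLU
        pred = a @ W2 + b2
        # backward pass (chain rule through MSE, linear, ReLU)
        dpred = 2 * (pred - yb) / yb.size
        dW2 = a.T @ dpred
        db2 = dpred.sum(axis=0)
        da = dpred @ W2.T
        dh = da * (h > 0)                          # ReLU derivative
        dW1 = xb.T @ dh
        db1 = dh.sum(axis=0)
        # SGD weight updates
        W1 -= lr * dW1; b1 -= lr * db1
        W2 -= lr * dW2; b2 -= lr * db2

final_loss = mse(X, Y)
print(init_loss, final_loss)
```

The loop shows all the pieces named above in one place: batching via a shuffled index permutation, the forward pass, the backward pass mirroring it layer by layer, and the weight update.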
15 drill problems covering neural networks, forward pass, backpropagation, optimizers, and training.