ReLU
ReLU, sigmoid, tanh, and softmax: what they compute, when to use each, and why non-linearity is essential for deep networks.
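As a rough illustration of what each of these activations computes, here is a minimal NumPy sketch. The function names and example values are illustrative, not from the lesson itself:

```python
import numpy as np

def relu(x):
    # Zeroes out negatives, passes positives through unchanged.
    return np.maximum(0.0, x)

def sigmoid(x):
    # Squashes any real input into (0, 1); common for binary probabilities.
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    # Zero-centered squashing function with outputs in (-1, 1).
    return np.tanh(x)

def softmax(x):
    # Turns a vector of scores into a probability distribution.
    # Subtracting the max first keeps exp() numerically stable.
    shifted = x - np.max(x)
    exps = np.exp(shifted)
    return exps / np.sum(exps)

x = np.array([-2.0, 0.0, 3.0])
print(relu(x))     # [0. 0. 3.]
print(sigmoid(x))  # each value in (0, 1)
print(tanh(x))     # each value in (-1, 1)
print(softmax(x))  # non-negative, sums to 1
```

Note that all four are non-linear: stacking layers without a non-linearity like these would collapse into a single linear map, which is why deep networks need them.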