In this video, we learn about Activation Functions in Neural Networks.
We go over:
- The definition of activation functions
- Why they are used
- Different activation functions
- How to use them in code
The activation functions we go over are the Step Function, Sigmoid, TanH, ReLU, Leaky ReLU, and Softmax.
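As a rough sketch (not necessarily the exact code shown in the video), each of these activation functions can be written in a few lines of plain NumPy:

```python
import numpy as np

def step(x):
    # Binary step: 1 if x >= 0, else 0
    return np.where(x >= 0, 1.0, 0.0)

def sigmoid(x):
    # Squashes inputs into the range (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    # Squashes inputs into the range (-1, 1)
    return np.tanh(x)

def relu(x):
    # Passes positive inputs through, zeroes out negatives
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    # Like ReLU, but keeps a small slope (alpha) for negative inputs
    return np.where(x > 0, x, alpha * x)

def softmax(x):
    # Turns a vector of scores into probabilities that sum to 1
    e = np.exp(x - np.max(x))  # subtract max for numerical stability
    return e / e.sum()

# Quick check on a small input vector
x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(x))     # [0.  0.  0.  1.5 3. ]
print(softmax(x))  # probabilities summing to 1
```

In a deep learning framework you would normally call the built-in versions of these functions instead, but writing them out by hand makes their behavior easy to see.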
Watch here: