Neural network activation functions are a crucial component of deep learning.
Activation functions determine the output of a deep learning model, its accuracy, and the computational efficiency of training, which can make or break a large-scale neural network. They also have a major effect on whether a network converges and how quickly; a poorly chosen activation function can prevent a network from converging at all.
In a neural network, the activation function transforms each node's weighted sum of inputs into its output, and non-linear activations are what allow the network to learn non-linear decision boundaries. ReLU is currently the most commonly used activation function, although the hyperbolic tangent (tanh) and sigmoid (logistic) functions are also widely used.
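As a rough illustration, here is a minimal NumPy sketch of these three functions; the function names and test values are chosen for this example, not taken from any particular library:

```python
import numpy as np

def relu(x):
    # ReLU: max(0, x), applied element-wise
    return np.maximum(0.0, x)

def sigmoid(x):
    # Sigmoid (logistic): squashes inputs into the range (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    # Hyperbolic tangent: squashes inputs into the range (-1, 1)
    return np.tanh(x)

x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print("relu:   ", relu(x))
print("sigmoid:", sigmoid(x))
print("tanh:   ", tanh(x))
```

Running this on a few sample inputs makes the differences concrete: ReLU zeroes out negative values and passes positives through unchanged, while sigmoid and tanh smoothly saturate at their respective bounds.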