jdhao's digital space

Recent content on jdhao's digital space

Subscribe to the RSS feed of jdhao's digital space: https://jdhao.github.io/index.xml

Nonlinear Activations for Neural Networks

March 27, 2022, 17:25

Non-linear activations are important in deep neural networks: without non-linear activation functions, even if you stack many linear layers, the end result is equivalent to a single linear layer, and the approximation ability of the network is very limited1. Some of the most commonly used nonlinear activation functions are Sigmoid, ReLU, and Tanh.
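The collapse of stacked linear layers can be verified numerically. Below is a minimal sketch (assuming NumPy; the layer shapes and random seed are arbitrary) showing that two linear layers compose into a single linear map, while inserting a Tanh nonlinearity between them breaks that equivalence:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two "linear layers" without activation: x -> W1 @ x -> W2 @ (W1 @ x)
W1 = rng.standard_normal((4, 3))
W2 = rng.standard_normal((2, 4))
x = rng.standard_normal(3)

# Composing the two layers is the same as one linear layer with W = W2 @ W1
W = W2 @ W1
assert np.allclose(W2 @ (W1 @ x), W @ x)

# With a nonlinearity (here Tanh) between the layers,
# the composition is no longer a single linear map
assert not np.allclose(W2 @ np.tanh(W1 @ x), W @ x)
```

The same argument extends by induction to any number of stacked linear layers, which is why a nonlinearity between layers is what gives the network its expressive power.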