Gaussian Error Linear Unit Activates Neural Networks Beyond ReLU (a comparison of GELU, ELU, and ReLU)
https://medium.com/syncedreview/gaussian-error-linear-unit-activates-neural-networks-beyond-relu-121d1938a1f7

The linked Medium article reports that, across its experiments, GELU consistently gives the best performance compared with ReLU and ELU. Let's take a quick look at GELU.

## Google BERT
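GELU is defined as x·Φ(x), where Φ is the standard Gaussian CDF; Google's BERT code uses a fast tanh-based approximation instead of evaluating Φ exactly. The snippet quoted in the original post is cut off, so here is a minimal sketch of that tanh-approximation `gelu`, assuming TensorFlow and NumPy are available:

```python
import numpy as np
import tensorflow as tf


def gelu(x):
    """Gaussian Error Linear Unit (tanh approximation).

    This is a smoother version of the ReLU.
    Original paper: https://arxiv.org/abs/1606.08415

    Args:
        x: float Tensor to apply the activation to.

    Returns:
        `x` with the GELU activation applied.
    """
    # GELU(x) ≈ 0.5 * x * (1 + tanh(sqrt(2/pi) * (x + 0.044715 * x^3)))
    cdf = 0.5 * (1.0 + tf.tanh(
        np.sqrt(2.0 / np.pi) * (x + 0.044715 * tf.pow(x, 3))))
    return x * cdf


# Example usage (TF2 eager mode): negative inputs are damped smoothly
# rather than clipped to zero as with ReLU.
print(gelu(tf.constant([-2.0, -1.0, 0.0, 1.0, 2.0])))
```

Unlike ReLU, which hard-gates the input at zero, GELU weights the input by how likely it is to be "kept" under a standard normal distribution, which is why it behaves like a smoother ReLU.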
2020.01.05