周池春
513 Seminar: A Survey of Activation Functions (凌震)
2021-09-26 21:34

Topic: Activation Functions

Speaker: 凌震

Venue: Room 513, School of Engineering

Time: 2021-09-27, 10:40 AM

Outline: 1) Neural networks

2) Activation functions, a key component of neural networks (see the code sketch below)
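As a concrete illustration of outline item 2), here is a minimal NumPy sketch of several activation functions surveyed in the references below: ReLU [3], GELU [5], Swish [18], and Mish [17]. The formulas follow the cited papers; the function names and the NumPy implementation itself are illustrative assumptions, not code from the talk.

```python
# Minimal sketches of activation functions from the referenced papers.
import numpy as np

def relu(x):
    # ReLU [3]: max(0, x)
    return np.maximum(0.0, x)

def gelu(x):
    # GELU [5], tanh approximation:
    # 0.5 * x * (1 + tanh(sqrt(2/pi) * (x + 0.044715 * x^3)))
    return 0.5 * x * (1.0 + np.tanh(np.sqrt(2.0 / np.pi) * (x + 0.044715 * x**3)))

def swish(x, beta=1.0):
    # Swish [18]: x * sigmoid(beta * x); beta = 1 gives the SiLU variant
    return x / (1.0 + np.exp(-beta * x))

def mish(x):
    # Mish [17]: x * tanh(softplus(x)), with softplus(x) = ln(1 + e^x)
    return x * np.tanh(np.log1p(np.exp(x)))

if __name__ == "__main__":
    x = np.linspace(-3.0, 3.0, 7)
    for f in (relu, gelu, swish, mish):
        print(f.__name__, np.round(f(x), 3))
```

All four are smooth or piecewise-linear maps applied element-wise to a layer's pre-activations; the nonlinearity they introduce is what lets a stacked network represent non-linear functions.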

References:

[1] Nwankpa C, Ijomah W, Gachagan A, et al. Activation functions: Comparison of trends in practice and research for deep learning[J]. arXiv preprint arXiv:1811.03378, 2018.

[2] Sharma S, Sharma S. Activation functions in neural networks[J]. Towards Data Science, 2017, 6(12): 310-316.

[3] Agarap A F. Deep learning using rectified linear units (relu)[J]. arXiv preprint arXiv:1803.08375, 2018.

[4] Mercioni M A, Tiron A, Holban S. Dynamic modification of activation function using the backpropagation algorithm in the artificial neural networks[J]. International Journal of Advanced Computer Science and Applications, 2019, 10(4).

[5] Hendrycks D, Gimpel K. Gaussian error linear units (gelus)[J]. arXiv preprint arXiv:1606.08415, 2016.

[6] Qiumei Z, Dan T, Fenghua W. Improved convolutional neural network based on fast exponentially linear unit activation function[J]. IEEE Access, 2019, 7: 151359-151367.

[7] Xu B, Wang N, Chen T, et al. Empirical evaluation of rectified activations in convolutional network[J]. arXiv preprint arXiv:1505.00853, 2015.

[8] Hayou S, Doucet A, Rousseau J. On the impact of the activation function on deep neural networks training[C]//International conference on machine learning. PMLR, 2019: 2672-2680.

[9] Liang M, Hu X. Recurrent convolutional neural network for object recognition[C]//Proceedings of the IEEE conference on computer vision and pattern recognition. 2015: 3367-3375.

[10] Hubara I, Courbariaux M, Soudry D, et al. Binarized neural networks: Training neural networks with weights and activations constrained to +1 or -1[J]. arXiv preprint arXiv:1602.02830, 2016.

[11] Yamashita R, Nishio M, Do R K G, et al. Convolutional neural networks: an overview and application in radiology[J]. Insights into Imaging, 2018, 9(4): 611-629.

[12] Mou L, Ghamisi P, Zhu X X. Deep recurrent neural networks for hyperspectral image classification[J]. IEEE Transactions on Geoscience and Remote Sensing, 2017, 55(7): 3639-3655.

[13] Zhang H, Weng T W, Chen P Y, et al. Efficient neural network robustness certification with general activation functions[J]. arXiv preprint arXiv:1811.00866, 2018.

[14] Salamon J, Bello J P. Deep convolutional neural networks and data augmentation for environmental sound classification[J]. IEEE Signal Processing Letters, 2017, 24(3): 279-283.

[15] Katz G, Barrett C, Dill D L, et al. Reluplex: An efficient SMT solver for verifying deep neural networks[C]//International Conference on Computer Aided Verification. Springer, Cham, 2017: 97-117.

[16] Lai S, Xu L, Liu K, et al. Recurrent convolutional neural networks for text classification[C]//Twenty-Ninth AAAI Conference on Artificial Intelligence. 2015.

[17] Misra D. Mish: A self regularized non-monotonic neural activation function[J]. arXiv preprint arXiv:1908.08681, 2019.

[18] Ramachandran P, Zoph B, Le Q V. Searching for activation functions[J]. arXiv preprint arXiv:1710.05941, 2017.

[19] Wang Y, Li Y, Song Y, et al. The influence of the activation function in a convolution neural network model of facial expression recognition[J]. Applied Sciences, 2020, 10(5): 1897.

[20] Basirat M, Roth P M. The quest for the golden activation function[J]. arXiv preprint arXiv:1808.00783, 2018.

[21] Jacob B, Kligys S, Chen B, et al. Quantization and training of neural networks for efficient integer-arithmetic-only inference[C]//Proceedings of the IEEE conference on computer vision and pattern recognition. 2018: 2704-2713.

[22] Lin G, Shen W. Research on convolutional neural network based on improved ReLU piecewise activation function[J]. Procedia Computer Science, 2018, 131: 977-984.


To repost this article, please contact the original author for authorization, and please note that it comes from 周池春's ScienceNet blog.

Link: https://wap.sciencenet.cn/blog-3453120-1305789.html?mobile=1
