Performance Analysis of Fully Connected Parametric Activation Functions

Authors

  • Rashmi Awasthi  Research Scholar, Department of Computer Application, Mandsaur University, Mandsaur, Madhya Pradesh, India
  • Dr. B.K. Sharma  Associate Professor, Department of Computer Application, Mandsaur University, Mandsaur, Madhya Pradesh, India

DOI:

https://doi.org/10.32628/CSEIT2390287

Keywords:

Combined parametric functions, Artificial Intelligence, Artificial Neural Networks (ANN), Machine Learning, Deep Learning.

Abstract

Recent studies have shown that the choice of activation function can significantly affect the performance of many learning methods and deep networks. The activation function plays an important role in modeling nonlinear problems, and various nonlinear activation functions for fully connected networks have been studied. In this paper, we propose a combined parametric activation function that can improve the performance of fully connected artificial neural networks. Combined parametric activation functions can be created simply by summing parametric activation functions. A parametric activation function is a function that can be optimized in the direction of minimizing the loss function by applying appropriate parameters that adjust the scale and location of the activation function according to the input data. The development of Artificial Neural Networks (ANNs) has produced many fruitful results, and the activation function is known to be one of the principal factors affecting network performance. In this paper, we discuss the impact of varying model width and depth on robustness, and the impact of using learnable parametric activation functions (PAFs).
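
To make the proposal concrete, below is a minimal sketch of a parametric activation function with learnable scale and location parameters, and a combined variant formed by summing several of them, assuming PyTorch. The class names (ParametricActivation, CombinedParametricActivation) and the tanh/sigmoid base functions are illustrative assumptions, not the authors' exact formulation.

import torch
import torch.nn as nn

class ParametricActivation(nn.Module):
    """Base activation g with learnable parameters:
    f(x) = a * g(w * x + b), where the output scale a, input scale w,
    and location b are trained jointly with the network weights."""

    def __init__(self, base=torch.tanh):
        super().__init__()
        self.base = base
        self.a = nn.Parameter(torch.ones(1))   # output scale
        self.w = nn.Parameter(torch.ones(1))   # input scale
        self.b = nn.Parameter(torch.zeros(1))  # location shift

    def forward(self, x):
        return self.a * self.base(self.w * x + self.b)

class CombinedParametricActivation(nn.Module):
    """Combined PAF formed by summing parametric activations,
    i.e. f(x) = sum_i a_i * g_i(w_i * x + b_i)."""

    def __init__(self, bases=(torch.tanh, torch.sigmoid)):
        super().__init__()
        self.terms = nn.ModuleList(ParametricActivation(g) for g in bases)

    def forward(self, x):
        return sum(term(x) for term in self.terms)

# Usage: drop the combined PAF into a fully connected network. Because
# a, w, and b are registered as nn.Parameter, the optimizer updates them
# while minimizing the loss, letting the activation's scale and location
# adapt to the input data.
net = nn.Sequential(
    nn.Linear(784, 128),
    CombinedParametricActivation(),
    nn.Linear(128, 10),
)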

References

  1. C. Nwankpa, W. Ijomah, A. Gachagan, and S. Marshall, "Activation functions: Comparison of trends in practice and research for deep learning," arXiv:1811.03378, 2018. [Online]. Available: http://arxiv.org/abs/1811.03378
  2. S. Hayou, A. Doucet, and J. Rousseau, "On the impact of the activation function on deep neural networks training," arXiv:1902.06853, 2019. [Online]. Available: http://arxiv.org/abs/1902.06853
  3. G. Bingham, W. Macke, and R. Miikkulainen, "Evolutionary optimization of deep learning activation functions," in Proc. Genetic and Evolutionary Computation Conference (GECCO '20), Cancún, Mexico, Jul. 2020.
  4. S. Elfwing, E. Uchibe, and K. Doya, "Sigmoid-weighted linear units for neural network function approximation in reinforcement learning," Neural Networks, vol. 107, pp. 3–11, 2018.
  5. N. Y. Kong and S. W. Ko, "Performance improvement method of deep neural network using parametric activation functions," Journal of the Korea Contents Association, vol. 21, no. 3, pp. 616–625, 2021.
  6. Keras Team, "Keras documentation: Reuters newswire classification dataset," accessed Sep. 2, 2020. [Online]. Available: https://keras.io/api/datasets/reuters/
  7. T. Carneiro, R. V. Medeiros Da Nobrega, T. Nepomuceno, G.-B. Bian, V. H. C. De Albuquerque, and P. P. R. Filho, "Performance analysis of Google Colaboratory as a tool for accelerating deep learning applications," IEEE Access, vol. 6, pp. 61677–61685, 2018.
  8. B. Karlik and A. V. Olgac, "Performance analysis of various activation functions in generalized MLP architectures of neural networks," International Journal of Artificial Intelligence and Expert Systems, vol. 1, no. 4, pp. 111–122, 2011.
  9. B. Xu, N. Wang, T. Chen, and M. Li, "Empirical evaluation of rectified activations in convolutional network," arXiv:1505.00853, 2015.

Published

2023-04-30

Issue

Volume 9, Issue 2 (March-April 2023)

Section

Research Articles

How to Cite

[1]
Rashmi Awasthi and Dr. B.K. Sharma, "Performance Analysis of Fully Connected Parametric Activation Functions," International Journal of Scientific Research in Computer Science, Engineering and Information Technology (IJSRCSEIT), ISSN: 2456-3307, Volume 9, Issue 2, pp. 613-617, March-April 2023. Available at doi: https://doi.org/10.32628/CSEIT2390287