New activation functions for single layer feedforward neural network


KOÇAK Y., ÜSTÜNDAĞ ŞİRAY G.

EXPERT SYSTEMS WITH APPLICATIONS, vol. 164, 2021 (SCI-Expanded)

  • Publication Type: Article / Full Article
  • Volume: 164
  • Publication Date: 2021
  • DOI: 10.1016/j.eswa.2020.113977
  • Journal Name: EXPERT SYSTEMS WITH APPLICATIONS
  • Indexed In: Science Citation Index Expanded (SCI-EXPANDED), Scopus, Academic Search Premier, PASCAL, Aerospace Database, Applied Science & Technology Source, Communication Abstracts, Computer & Applied Sciences, INSPEC, Metadex, Public Affairs Index, Civil Engineering Abstracts
  • Keywords: Artificial Neural Network, Activation function, Generalized swish, ReLU-swish, Triple-state swish
  • Çukurova University Affiliated: Yes

Abstract

Artificial Neural Network (ANN) is a subfield of machine learning that has been widely used by researchers. The attractiveness of ANNs comes from their remarkable information-processing capability. A key parameter of an ANN is the activation function (AF). Different AFs can significantly affect the performance of an ANN, so choosing a good AF is important. This study aims to introduce new AFs that combine the advantages of predefined AFs and perform better than them. For this purpose, we propose several new AFs, which we call the generalized swish, mean-swish, ReLU-swish, triple-state swish, sigmoid-algebraic, triple-state sigmoid, exponential swish, sinc-sigmoid, and derivative of sigmoid AFs. We then compare the proposed AFs with some well-known and recently proposed AFs. To investigate the performance of these AFs, we use four different data sets: simulated data, optical interconnection network data, specifications of cars, and average house costs data.
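The abstract does not reproduce the definitions of the proposed AFs; those appear in the full paper. For reference only, the minimal Python/NumPy sketch below implements the standard sigmoid, ReLU, and swish activations that the proposed variants' names build on. It is an illustrative assumption of the baseline functions, not the authors' new AFs.

import numpy as np

def sigmoid(x):
    # Logistic sigmoid: 1 / (1 + e^(-x))
    return 1.0 / (1.0 + np.exp(-x))

def relu(x):
    # Rectified Linear Unit: max(0, x)
    return np.maximum(0.0, x)

def swish(x, beta=1.0):
    # Swish: x * sigmoid(beta * x); beta = 1 gives the common SiLU form
    return x * sigmoid(beta * x)

if __name__ == "__main__":
    x = np.linspace(-5.0, 5.0, 11)
    print("sigmoid:", sigmoid(x))
    print("relu:   ", relu(x))
    print("swish:  ", swish(x))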