New activation functions for single layer feedforward neural network


KOÇAK Y., ÜSTÜNDAĞ ŞİRAY G.

EXPERT SYSTEMS WITH APPLICATIONS, vol.164, 2021 (SCI-Expanded)

  • Publication Type: Article
  • Volume: 164
  • Publication Date: 2021
  • DOI Number: 10.1016/j.eswa.2020.113977
  • Journal Name: EXPERT SYSTEMS WITH APPLICATIONS
  • Journal Indexes: Science Citation Index Expanded (SCI-EXPANDED), Scopus, Academic Search Premier, PASCAL, Aerospace Database, Applied Science & Technology Source, Communication Abstracts, Computer & Applied Sciences, INSPEC, Metadex, Public Affairs Index, Civil Engineering Abstracts
  • Keywords: Artificial Neural Network, Activation function, Generalized swish, ReLU-swish, Triple-state swish
  • Çukurova University Affiliated: Yes

Abstract

The Artificial Neural Network (ANN) is a subfield of machine learning that has been widely used by researchers. The attractiveness of ANNs comes from their remarkable information processing capability. A key parameter of an ANN is the activation function (AF). Different AFs can significantly affect the performance of an ANN, so choosing a good AF is important. This study aims to introduce new AFs that combine the advantages of predefined AFs and outperform them. For this purpose, we propose several new AFs, which we call the generalized swish, mean-swish, ReLU-swish, triple-state swish, sigmoid-algebraic, triple-state sigmoid, exponential swish, sinc-sigmoid, and derivative of sigmoid AFs. We then compare the proposed AFs with several well-known and recently proposed AFs. To investigate the performance of these AFs, we use four different data sets: simulated data, optical interconnection network data, car specifications, and average house costs.
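
As background to the abstract, the following is a minimal sketch of the well-known baseline activation functions (sigmoid, ReLU, and swish) that the proposed AFs build on. The exact formulas of the proposed variants (generalized swish, mean-swish, ReLU-swish, etc.) are defined in the full article and are not reproduced here; the `relu_swish_sketch` combination below is purely an illustrative assumption of how two predefined AFs might be pieced together, not the authors' definition.

```python
import numpy as np

def sigmoid(x):
    # Standard logistic sigmoid: 1 / (1 + exp(-x))
    return 1.0 / (1.0 + np.exp(-x))

def relu(x):
    # Rectified Linear Unit: max(0, x)
    return np.maximum(0.0, x)

def swish(x, beta=1.0):
    # Swish: x * sigmoid(beta * x)
    return x * sigmoid(beta * x)

def relu_swish_sketch(x, beta=1.0):
    # Illustrative combination only (NOT the paper's ReLU-swish definition):
    # behaves like swish for negative inputs and like ReLU (identity) for
    # non-negative inputs.
    return np.where(x >= 0.0, x, swish(x, beta))

if __name__ == "__main__":
    x = np.linspace(-5.0, 5.0, 5)
    print("x:        ", x)
    print("sigmoid:  ", sigmoid(x))
    print("relu:     ", relu(x))
    print("swish:    ", swish(x))
    print("combined: ", relu_swish_sketch(x))
```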