Artificial Neural Networks (ANNs) form a subfield of machine learning that has been widely adopted by researchers. The attractiveness of ANNs stems from their remarkable information-processing capability. A key parameter of an ANN is its activation function (AF). Different AFs can significantly affect the performance of an ANN, so choosing a good AF is important. This study aims to introduce new AFs that combine the advantages of predefined AFs and perform better than them. For this purpose, we propose several new AFs, which we call generalized swish, mean-swish, ReLU-swish, triple-state swish, sigmoid-algebraic, triple-state sigmoid, exponential swish, sinc-sigmoid, and derivative-of-sigmoid AFs. We then compare the proposed AFs with some well-known and recently proposed AFs. To investigate their performance, we use four different data sets: simulated data, optical interconnection network data, car specifications, and average house costs.
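As background for the AF names listed above, a minimal sketch of the standard building blocks they combine (sigmoid, ReLU, and swish) is given below; the definitions of the proposed variants themselves appear later in the paper, so only the well-known baseline formulas are shown here.

```python
import math

def sigmoid(x):
    # Standard logistic sigmoid: 1 / (1 + e^{-x})
    return 1.0 / (1.0 + math.exp(-x))

def relu(x):
    # Rectified Linear Unit: max(0, x)
    return max(0.0, x)

def swish(x, beta=1.0):
    # Swish: x * sigmoid(beta * x); beta = 1 gives the common SiLU form.
    return x * sigmoid(beta * x)

# Evaluate the three baseline AFs at a few sample points.
for x in (-2.0, 0.0, 2.0):
    print(f"x={x:+.1f}  sigmoid={sigmoid(x):.4f}  "
          f"relu={relu(x):.1f}  swish={swish(x):.4f}")
```

Note that swish is bounded below and unbounded above, while sigmoid saturates on both sides; the proposed combinations in this study are motivated by trading off such properties.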