Neural Networks, vol. 172, 2024 (SCI-Expanded)
Convolutional Neural Networks (CNNs) have revolutionized image classification in computer vision through their innovative design and training methodologies. The differential convolutional neural network, with simultaneous multidimensional filter realization, improved the performance of the convolutional neural network at the cost of increased computation. This paper introduces logarithmic learning into the differential convolutional neural network to overcome this drawback by providing faster error minimization and convergence. This is achieved by incorporating a LogRelu activation function, a Logarithmic Cost Function, and a unique logarithmic learning method. The effectiveness of the proposed approaches is evaluated on various datasets with SGD and Adam optimizers. The first step is the adaptation of the LogRelu activation function to convolutional and differential convolutional neural networks. Experimental results show that integrating LogRelu into the convolutional and differential convolutional neural networks yields performance improvements ranging from 1.61% to 5.44%, while the same integration on ResNet-18, ResNet-34, and ResNet-50 enhances top-1 accuracy by 3.07% to 9.96%. In addition to the LogRelu activation function, a Logarithmic Cost Function with a logarithmic learning method is proposed and adapted to the differential convolutional neural network. These improvements lead to a new architecture, the Logarithmic Differential Convolutional Neural Network (LDiffCNN), which consistently outperforms the standard CNN, increasing accuracy by up to 3.02%. Notably, LDiffCNN reduces training iterations by up to 38% with faster convergence. The experimental results confirm the efficiency of the proposed approach.
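The abstract names a LogRelu activation but does not define it. A minimal sketch of one plausible form, a ReLU whose positive branch is compressed logarithmically, is shown below; the function name `log_relu` and the assumed form `log(1 + x)` for positive inputs are illustrative assumptions, not the paper's actual definition.

```python
import math

def log_relu(x: float) -> float:
    """Hypothetical LogReLU activation (illustrative sketch).

    Assumed form: log(1 + x) for x > 0, and 0 otherwise, so large
    positive activations are damped logarithmically while negative
    inputs are clipped as in a standard ReLU. The paper's exact
    definition may differ.
    """
    return math.log1p(x) if x > 0.0 else 0.0

# Negative inputs are clipped to zero, as in ReLU.
print(log_relu(-2.0))  # 0.0
# Positive inputs grow only logarithmically.
print(log_relu(100.0) < 100.0)  # True
```

The logarithmic compression of the positive branch is one way a bounded-growth activation could speed error minimization, which is the motivation the abstract attributes to logarithmic learning.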