PATTERN LAYER REDUCTION FOR A GENERALIZED REGRESSION NEURAL NETWORK BY USING A SELF-ORGANIZING MAP



KARTAL S. , ORAL M. , ÖZYILDIRIM B. M.

INTERNATIONAL JOURNAL OF APPLIED MATHEMATICS AND COMPUTER SCIENCE, vol. 28, pp. 411-424, 2018 (journal indexed in SCI)

  • Volume: 28, Issue: 2
  • Publication Date: 2018
  • DOI: 10.2478/amcs-2018-0031
  • Journal Name: INTERNATIONAL JOURNAL OF APPLIED MATHEMATICS AND COMPUTER SCIENCE
  • Pages: pp. 411-424

Abstract

In a general regression neural network (GRNN), the number of neurons in the pattern layer is proportional to the number of training samples in the dataset. Using a GRNN in applications with relatively large datasets therefore becomes troublesome because of the architecture and speed required: the large number of neurons in the pattern layer substantially increases memory usage and substantially decreases calculation speed. Hence, there is a strong need for pattern layer size reduction. In this study, a self-organizing map (SOM) structure is introduced as a pre-processor for the GRNN. First, an SOM is generated for the training dataset. Second, each training record is labelled with the most similar map unit. Lastly, when a new test record is applied to the network, the most similar map units are detected, and only the training data that have the same labels as the detected units are fed into the network, instead of the entire training dataset. This scheme enables a considerable reduction in the pattern layer size. The proposed hybrid model was evaluated on fifteen benchmark test functions and eight different UCI datasets. According to the simulation results, the proposed model significantly simplifies the GRNN's structure without any performance loss.
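The three-step scheme described in the abstract (train an SOM, label each training record with its best-matching unit, then run the GRNN only on the records whose labels match the map units closest to the query) can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the SOM training loop, the grid size, the number of selected units `k`, and the GRNN bandwidth `sigma` are all assumptions chosen for the example.

```python
import numpy as np

def train_som(X, grid=(4, 4), iters=1000, lr0=0.5, sigma0=2.0, seed=0):
    """Train a small rectangular SOM with exponentially decaying
    learning rate and neighborhood width (illustrative settings)."""
    rng = np.random.default_rng(seed)
    n_units = grid[0] * grid[1]
    W = rng.normal(size=(n_units, X.shape[1]))          # unit weight vectors
    coords = np.array([(i, j) for i in range((grid[0]))
                       for j in range(grid[1])], dtype=float)
    for t in range(iters):
        x = X[rng.integers(len(X))]
        bmu = np.argmin(((W - x) ** 2).sum(axis=1))     # best-matching unit
        lr = lr0 * np.exp(-t / iters)
        sig = sigma0 * np.exp(-t / iters)
        d2 = ((coords - coords[bmu]) ** 2).sum(axis=1)  # grid distances
        h = np.exp(-d2 / (2 * sig ** 2))                # neighborhood function
        W += lr * h[:, None] * (x - W)
    return W

def bmu_labels(X, W):
    """Step 2: label every training record with its most similar map unit."""
    return np.argmin(((X[:, None, :] - W[None]) ** 2).sum(axis=-1), axis=1)

def grnn_predict(x, Xp, yp, sigma=0.3):
    """Plain GRNN output: kernel-weighted average over the pattern layer."""
    d2 = ((Xp - x) ** 2).sum(axis=1)
    w = np.exp(-d2 / (2 * sigma ** 2))
    return (w @ yp) / (w.sum() + 1e-12)

def som_grnn_predict(x, X, y, W, labels, k=3, sigma=0.3):
    """Step 3: detect the k map units most similar to the query and feed
    only the training records carrying those labels into the GRNN."""
    d2 = ((W - x) ** 2).sum(axis=1)
    units = np.argsort(d2)[:k]
    mask = np.isin(labels, units)            # reduced pattern layer
    return grnn_predict(x, X[mask], y[mask], sigma)
```

A usage example on a 1-D toy regression task: the pattern layer shrinks from all 300 training records to only those mapped to the few units nearest the query, while the prediction remains a standard GRNN estimate over that subset.

```python
rng = np.random.default_rng(1)
X = rng.uniform(0.0, 1.0, size=(300, 1))
y = np.sin(2 * np.pi * X[:, 0])
W = train_som(X)
labels = bmu_labels(X, W)
pred = som_grnn_predict(np.array([0.5]), X, y, W, labels)
```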