15th Iranian Statistics Conference, Yazd, Iran, 9-11 September 2020, p. 1
The increased availability of high-dimensional data and the presence of multicollinearity have made regularized methods widely used. Regularized methods produce biased coefficient estimates with low variance, thereby improving the prediction accuracy of the model. Although L2 regularization is the most popular of the regularized methods, it shrinks the coefficients without performing variable selection. L1 regularization, on the other hand, simultaneously shrinks the coefficients and selects variables. Although L1 and L2 regularization are powerful methods in their own right, there are cases in which each of them is weak. Hence, improvements on the L1 and L2 regularization methods have been developed. This talk presents novel regularization strategies that address multicollinearity and high dimensionality in linear regression and in the extreme learning machine through the adoption of new regularization functions. Applications of all these developments, implemented in MATLAB, will be discussed.
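The contrast drawn above, that L2 shrinks all coefficients while L1 both shrinks and sets some exactly to zero, can be illustrated with a minimal sketch. The talk's applications use MATLAB; the snippet below is a Python analogue, not the authors' code, using the standard closed-form ridge solution and a textbook coordinate-descent lasso with soft-thresholding. The helper names `ridge` and `lasso`, the simulated data, and the penalty values are all illustrative assumptions.

```python
import numpy as np

def ridge(X, y, lam):
    # L2 (ridge) closed form: (X'X + lam*I)^{-1} X'y -- shrinks but never zeroes
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

def lasso(X, y, lam, n_iter=200):
    # L1 (lasso) via cyclic coordinate descent with soft-thresholding;
    # the threshold sets small coefficients exactly to zero (variable selection)
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(n_iter):
        for j in range(p):
            r = y - X @ beta + X[:, j] * beta[j]   # partial residual without predictor j
            rho = X[:, j] @ r
            z = X[:, j] @ X[:, j]
            beta[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / z
    return beta

rng = np.random.default_rng(0)
n, p = 100, 10
X = rng.standard_normal((n, p))
# Only the first three predictors carry signal; the rest are pure noise
beta_true = np.array([3.0, -2.0, 1.5] + [0.0] * 7)
y = X @ beta_true + 0.1 * rng.standard_normal(n)

b_ridge = ridge(X, y, lam=10.0)
b_lasso = lasso(X, y, lam=50.0)
print("ridge nonzero coefficients:", int(np.sum(np.abs(b_ridge) > 1e-6)))
print("lasso nonzero coefficients:", int(np.sum(np.abs(b_lasso) > 1e-6)))
```

On data like this, ridge keeps all ten coefficients nonzero (merely shrunk), whereas the lasso's soft-threshold drives the noise coefficients to exactly zero, which is the variable-selection behavior the abstract refers to.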