Investigators who seek to employ regression analysis usually encounter the problem of multicollinearity, that is, near-linear dependence among two or more explanatory variables. Multicollinearity is associated with unstable estimated coefficients and inflates the variances of the least squares estimators in linear regression models (LRMs). The detection of collinearity is therefore a compulsory first step in regression analysis. Multicollinearity also arises in generalized linear models (GLMs) and has equally serious effects on the maximum likelihood estimates. The purposes of this paper are to propose new collinearity diagnostic criteria for GLMs in the context of both the maximum likelihood and ridge estimators, to examine the properties of the new diagnostics via the ridge constant, and to illustrate the theoretical results with numerical examples on Poisson, Binomial and Gamma responses. The effects of centering and scaling the information matrix on the sensitivity of the diagnostics in the presence of collinearity are also investigated.
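As illustrative background (this is the classical condition-number diagnostic, not the new criteria proposed in the paper), the sketch below shows how collinearity can be detected from the eigenvalues of a cross-product matrix, and how centering and column-scaling, as discussed above for the information matrix, enter the computation. All function names here are hypothetical.

```python
import numpy as np

def condition_number(X, center=True, scale=True):
    """Condition number of X'X after optional centering and unit-length
    scaling of the columns; a large value signals near-collinearity."""
    X = np.asarray(X, dtype=float)
    if center:
        X = X - X.mean(axis=0)          # remove column means
    if scale:
        X = X / np.linalg.norm(X, axis=0)  # scale columns to unit length
    eigvals = np.linalg.eigvalsh(X.T @ X)  # eigenvalues of the cross-product matrix
    return eigvals.max() / eigvals.min()

# Example: two nearly collinear columns produce a large condition number.
rng = np.random.default_rng(0)
x1 = rng.normal(size=100)
x2 = x1 + rng.normal(scale=0.01, size=100)  # almost a copy of x1
x3 = rng.normal(size=100)
X = np.column_stack([x1, x2, x3])
print(condition_number(X))
```

Whether the matrix is centered and scaled before the eigenvalue computation can change the diagnostic markedly, which is the sensitivity question examined in the paper.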