About the limits of raise regression to reduce condition number when three explanatory variables are involved.

  1. Antonio Francisco Roldán López de Hierro 1
  2. Román Salmerón Gómez 2
  3. Catalina García García 2
  1. Departamento de Didáctica de la Matemática, Universidad de Granada
  2. Departamento de Métodos Cuantitativos para la Economía y la Empresa, Universidad de Granada
Journal:
Rect@: Revista Electrónica de Comunicaciones y Trabajos de ASEPUMA

ISSN: 1575-605X

Year of publication: 2018

Volume: 19

Issue: 1

Pages: 45-62

Type: Article

DOI: 10.24309/RECTA.2018.19.1.04


Abstract

This manuscript shows that raise regression is an appropriate methodology for reducing the approximate multicollinearity that naturally appears in linear regression problems. When three explanatory variables are involved, its application reduces the condition number of the matrix associated with the data set. Nevertheless, the procedure has a threshold: although the columns of X can be separated, it is proved that the condition number will never fall below a constant that can easily be computed from the elements of the initial matrix. Finally, the contribution is illustrated through an empirical example.
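The behaviour described in the abstract can be sketched numerically. The following is a minimal illustration (not the paper's own code): it builds synthetic data with three explanatory variables, two of them nearly collinear, raises one column by adding a multiple of its auxiliary-regression residuals, and tracks the condition number of the column-equilibrated matrix. The data, the `raise_column` helper, and the chosen lambda values are all assumptions made for illustration; the qualitative pattern — the condition number drops as lambda grows but approaches a positive limit rather than 1 — mirrors the threshold result the abstract states.

```python
import numpy as np

# Synthetic example (illustrative only): three explanatory variables,
# two of them nearly collinear.
rng = np.random.default_rng(0)
n = 100
x1 = rng.normal(size=n)
x2 = x1 + 0.05 * rng.normal(size=n)   # x2 is almost a copy of x1
x3 = rng.normal(size=n)
X = np.column_stack([x1, x2, x3])

def condition_number(X):
    """Condition number of the column-equilibrated matrix
    (columns scaled to unit length), as in Belsley et al. (1980)."""
    Xs = X / np.linalg.norm(X, axis=0)
    s = np.linalg.svd(Xs, compute_uv=False)
    return s[0] / s[-1]

def raise_column(X, j, lam):
    """Raise column j: replace x_j with x_j + lam * e_j, where e_j are
    the residuals of the auxiliary regression of x_j on the other columns."""
    others = np.delete(X, j, axis=1)
    coef, *_ = np.linalg.lstsq(others, X[:, j], rcond=None)
    e = X[:, j] - others @ coef
    X_raised = X.copy()
    X_raised[:, j] = X[:, j] + lam * e
    return X_raised

k0 = condition_number(X)                           # original (ill-conditioned)
k1 = condition_number(raise_column(X, 1, 5.0))     # moderate raising
k_inf = condition_number(raise_column(X, 1, 1e6))  # lambda -> infinity
print(f"original: {k0:.2f}, lambda=5: {k1:.2f}, lambda->inf: {k_inf:.2f}")
```

Because the residuals `e` are orthogonal to the remaining columns, raising with ever larger lambda replaces the raised column by its orthogonal component, so the limiting condition number depends only on the residual collinearity among the untouched columns and stays strictly above 1.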

Bibliographic References

  • A.E. Hoerl and R.W. Kennard, Ridge regression: Biased estimation for nonorthogonal problems, Technometrics, 12 (1970), 55–67.
  • A.E. Hoerl and R.W. Kennard, Ridge regression: Applications to nonorthogonal problems, Technometrics, 12 (1970), 69–82.
  • G.C. McDonald, Ridge regression, Wiley Interdisciplinary Reviews: Computational Statistics, 1 (2009), 93–100.
  • C.B. García, J. García and J. Soto, The raise method: An alternative procedure to estimate the parameters in presence of collinearity, Quality and Quantity, 45 (2010), 403–423.
  • R. Salmerón, C.B. García, J. García and M.M. López, The raise estimators. Estimation, inference and properties, Communications in Statistics – Theory and Methods, 46 (13) (2017), 6446–6462.
  • D.W. Marquardt, Generalized inverses, ridge regression, biased linear estimation and nonlinear estimation, Technometrics, 12 (3) (1970), 591–612.
  • H. Theil, Principles of Econometrics (Wiley, New York, 1971).
  • J. Fox and G. Monette, Generalized collinearity diagnostics, Journal of the American Statistical Association, 87 (1992), 178–183.
  • R.M. O'Brien, A caution regarding rules of thumb for variance inflation factors, Quality and Quantity, 41 (2007), 673–690.
  • C.B. García, R. Salmerón, J. García and M.M. López, The condition number in the raise regression, The 4th Advanced Research in Scientific Areas, (2015), 100–103.
  • D.A. Belsley, E. Kuh and R.E. Welsch, Regression Diagnostics: Identifying Influential Observations and Sources of Collinearity (John Wiley, New York, 1980).
  • D.A. Belsley, Demeaning conditioning diagnostics through centering, The American Statistician, 38 (2) (1984), 73–77.
  • C. Hurvich, Multicollinearity, in Handouts about regression (chapter 19), available at http://pages.stern.nyu.edu/~churvich/Regress/Handouts/Chapt19.pdf.