Comprehensive Review of Orthogonal Regression and Its Applications in Different Domains
- Abstract
- Orthogonal regression is one of the prominent approaches to linear regression, used to account for errors in the predictors. It can be viewed as least-squares regression with orthogonality constraints. It preserves more discriminative structure in the projection subspace than ordinary least-squares regression and avoids trivial solutions. In contrast to simple linear regression, orthogonal regression assumes measurement error in both the response and the predictor, whereas in simple regression only the response variable carries the estimated error. Orthogonal regression has therefore been used when errors-in-variables arise. Depending on the properties of the data, a specific orthogonal regression model can be selected according to whether measurement errors and/or equation errors are present. This article presents a comprehensive review of the variants of orthogonal regression. The variants are compared with respect to several characteristics, and the use of orthogonal regression across various domains is examined. Finally, future research directions are presented.
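As a minimal sketch of the idea described in the abstract (not taken from the article itself): when both x and y carry noise, orthogonal regression, also known as total least squares, fits a line by minimizing perpendicular distances rather than vertical ones. One standard way to compute it is via the SVD of the centered data matrix, where the right singular vector with the smallest singular value gives the line's normal direction. The data and noise levels below are hypothetical.

```python
import numpy as np

# Hypothetical illustration of orthogonal (total least squares) regression:
# fit a line to noisy (x, y) data by minimizing perpendicular distances.
rng = np.random.default_rng(0)
x_true = np.linspace(0.0, 10.0, 200)
y_true = 2.0 * x_true + 1.0
x = x_true + rng.normal(scale=0.3, size=x_true.shape)  # error in the predictor
y = y_true + rng.normal(scale=0.3, size=y_true.shape)  # error in the response

# Center the data; the orthogonal-fit line passes through the centroid.
X = np.column_stack([x - x.mean(), y - y.mean()])
_, _, Vt = np.linalg.svd(X, full_matrices=False)
normal = Vt[-1]                    # singular vector of the smallest singular value
slope = -normal[0] / normal[1]     # the fitted line is orthogonal to `normal`
intercept = y.mean() - slope * x.mean()
print(slope, intercept)            # close to the true slope 2.0 and intercept 1.0
```

Unlike ordinary least squares, which would bias the slope toward zero when x is noisy (attenuation), this estimate treats errors in both coordinates symmetrically.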
- Author(s)
- Pallavi; Joshi, Sandeep; Singh, Dilbag; Kaur, Manjit; Lee, Heung-No
- Issued Date
- 2022-10
- Type
- Article
- DOI
- 10.1007/s11831-022-09728-5
- URI
- https://scholar.gist.ac.kr/handle/local/10610
- Access and License
-
- File List
-
Items in Repository are protected by copyright, with all rights reserved, unless otherwise indicated.