
Comprehensive Review of Orthogonal Regression and Its Applications in Different Domains

Abstract
Orthogonal regression is one of the prominent approaches to linear regression, used to account for measurement errors in the predictors. It can be viewed as least squares regression with orthogonality constraints: it preserves more discriminative information in the projection subspace than ordinary least squares regression and avoids trivial solutions. In contrast to simple linear regression, where estimation error is attributed to the response variable alone, orthogonal regression assumes measurement error in both the response and the predictors. It is therefore also applied when errors-in-variables problems arise. Depending on the properties of the data, in particular whether measurement errors and/or equation errors are present, a suitable orthogonal regression model can be selected. This article presents a comprehensive review of the variants of orthogonal regression, compares them across various characteristics, surveys their use in different domains, and presents future research directions.
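To make the contrast with ordinary least squares concrete, here is a minimal sketch of orthogonal regression for a single predictor, implemented as a total least squares line fit via the SVD. This is an illustrative example with made-up data, not code from the reviewed article; the function name `orthogonal_fit` is our own.

```python
import numpy as np

def orthogonal_fit(x, y):
    """Fit y = slope * x + intercept by minimizing perpendicular
    (orthogonal) distances, rather than vertical residuals in y only,
    so measurement error in both x and y is accounted for."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    # Center the data; the orthogonal-regression line passes through the mean.
    xm, ym = x.mean(), y.mean()
    A = np.column_stack([x - xm, y - ym])
    # The leading right singular vector of the centered data matrix
    # gives the direction that minimizes the sum of squared
    # perpendicular distances.
    _, _, vt = np.linalg.svd(A, full_matrices=False)
    direction = vt[0]  # unit vector along the fitted line
    slope = direction[1] / direction[0]
    intercept = ym - slope * xm
    return slope, intercept

# Usage: noisy samples around y = 2x + 1, with noise in BOTH coordinates
# (the errors-in-variables setting that motivates orthogonal regression).
rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 50)
xn = x + rng.normal(scale=0.1, size=x.size)
yn = 2.0 * x + 1.0 + rng.normal(scale=0.1, size=x.size)
slope, intercept = orthogonal_fit(xn, yn)
```

For multiple predictors the same idea generalizes: the total least squares solution is read off from the smallest singular value of the augmented data matrix, which is one of the standard formulations surveyed under orthogonal regression.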
Author(s)
Pallavi; Joshi, Sandeep; Singh, Dilbag; Kaur, Manjit; Lee, Heung-No
Issued Date
2022-10
Type
Article
DOI
10.1007/s11831-022-09728-5
URI
https://scholar.gist.ac.kr/handle/local/10610
Publisher
International Center for Numerical Methods in Engineering
Citation
Archives of Computational Methods in Engineering, v.29, no.6, pp.4027 - 4047
ISSN
1134-3060
Appears in Collections:
Department of Electrical Engineering and Computer Science > 1. Journal Articles
Access & License
  • Access: Open
File List
  • No files associated with this item.

Items in Repository are protected by copyright, with all rights reserved, unless otherwise indicated.