Accession Number : ADA184374

Title :   A Computational Examination of Orthogonal Distance Regression.

Descriptive Note : Technical rept.

Corporate Author : COLORADO UNIV AT BOULDER DEPT OF COMPUTER SCIENCE

Personal Author(s) : Boggs,Paul T ; Donaldson,Janet R ; Schnabel,Robert B ; Spiegelman,Clifford H

PDF Url : ADA184374

Report Date : 08 May 1987

Pagination or Media Count : 49

Abstract : Classical or ordinary least squares (OLS) is one of the most commonly used criteria for fitting data to models and for estimating parameters. This is true even when a key assumption for its use, namely that the independent variables are known exactly, is violated. Orthogonal distance regression (ODR) extends least squares data fitting to problems in which the independent variables are not known exactly. Theoretical analysis, however, shows that OLS is preferable to ODR for straight-line functions under certain conditions, even when there are measurement errors in the independent variable. This has led some to conjecture that under similar conditions OLS will also be preferable to ODR for nonlinear functions, even though there are errors in the independent variable. This paper presents the results of an empirical study designed to examine whether ODR provides better results than OLS when there are errors in the independent variable. Examined are a variety of functions, both linear and nonlinear, under a variety of experimental conditions. The results indicate that, for the data and performance criteria considered, ODR never performs appreciably worse than OLS and sometimes performs considerably better. This leads to the conclusion that ODR is appropriate for a wide variety of practical problems. (Author)
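To illustrate the two criteria the abstract compares, the sketch below fits a straight line both ways: OLS minimizes vertical residuals, while ODR (in the straight-line, equal-error-variance case, where it reduces to total least squares with a closed-form slope) minimizes perpendicular distances. This is only an illustrative special case, not the general nonlinear ODR procedure studied in the report; the function names and the equal-variance assumption are mine.

```python
import math

def ols_line(x, y):
    """OLS fit of y = a + b*x, minimizing vertical residuals.

    Assumes the independent variable x is known exactly.
    """
    n = len(x)
    xbar, ybar = sum(x) / n, sum(y) / n
    sxx = sum((xi - xbar) ** 2 for xi in x)
    sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
    b = sxy / sxx
    return ybar - b * xbar, b  # intercept a, slope b

def odr_line(x, y):
    """ODR fit of y = a + b*x, minimizing perpendicular distances.

    For a straight line with equal error variances in x and y, the
    orthogonal-distance minimizer has this closed-form slope (total
    least squares); general ODR requires iterative numerical solution.
    """
    n = len(x)
    xbar, ybar = sum(x) / n, sum(y) / n
    sxx = sum((xi - xbar) ** 2 for xi in x)
    syy = sum((yi - ybar) ** 2 for yi in y)
    sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
    b = (syy - sxx + math.sqrt((syy - sxx) ** 2 + 4 * sxy ** 2)) / (2 * sxy)
    return ybar - b * xbar, b  # intercept a, slope b
```

On noise-free data the two estimators agree; their behavior diverges once the x-values themselves carry measurement error, which is the situation the study's Monte Carlo experiments probe.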

Descriptors :   *REGRESSION ANALYSIS, *FITTING FUNCTIONS(MATHEMATICS), ORTHOGONALITY, LEAST SQUARES METHOD, MONTE CARLO METHOD, VARIABLES, ERRORS, PARAMETERS, ESTIMATES, CHARTS

Subject Categories : Statistics and Probability

Distribution Statement : APPROVED FOR PUBLIC RELEASE