Accession Number : ADA313896

Title :   Information Theoretic Regression Methods.

Descriptive Note : Technical rept.

Corporate Author : GEORGE MASON UNIV FAIRFAX VA CENTER FOR COMPUTATIONAL STATISTICS

Personal Author(s) : Soofi, Ehsan

PDF Url : ADA313896

Report Date : APR 1996

Pagination or Media Count : 72

Abstract : Since the publication of the seminal paper of Kullback and Leibler (1951), there has been a continual endeavor in statistics and related fields to explicate existing statistical methods and to develop new methods based on the logarithmic information of Shannon (1948). During the last four decades, numerous information theoretic regression methods have been developed. Kullback and Rosenblatt (1957) pioneered the information theoretic approach to regression by explicating the usual regression quantities, such as sums of squares and F ratios, in terms of information functions. We now have information theoretic methods for model and predictive density derivation, parameter estimation and testing, model selection, collinearity analysis, and influential observation detection, which can be used in both sampling theory and Bayesian regression analyses. The purpose of this paper is to integrate the existing entropy-based methods in a single work, to explore their interrelationships, to elaborate on information theoretic interpretations of the existing entropy-based diagnostics, and to present information theoretic interpretations for traditional diagnostics.
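For reference, the measure underlying the entropy-based methods described above is the Kullback-Leibler (1951) information between a density f and a reference density g (a standard definition, stated here for context rather than quoted from the report; the notation is illustrative):

    K(f : g) = \int f(x) \, \log \frac{f(x)}{g(x)} \, dx \;\ge\; 0,

with equality if and only if f = g almost everywhere. Shannon's (1948) logarithmic information, to which the abstract refers, is the entropy H(f) = -\int f(x) \log f(x) \, dx.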

Descriptors :   *REGRESSION ANALYSIS, *INFORMATION THEORY, *STATISTICAL PROCESSES, MATHEMATICAL MODELS, METHODOLOGY, DETECTION, PARAMETERS, OBSERVATION, ESTIMATES, SAMPLING, LOGARITHM FUNCTIONS, BAYES THEOREM, SELECTION.

Subject Categories : Statistics and Probability
      Cybernetics

Distribution Statement : APPROVED FOR PUBLIC RELEASE