We propose a new method, in two variations, for identifying the most
relevant covariates in linear models with homoscedastic errors. In contrast to AIC, BIC and
other information criteria, our method is based on an interpretable scaled quantity. This
quantity measures the maximal relative error incurred by selecting covariates from a given
set of all available covariates. The proposed model selection procedures rely on asymptotic
normality of test statistics, and therefore normality of the errors in the regression model
is not required. In a simulation study the performance of the suggested methods along
with the performance of the standard model selection criteria AIC and BIC is examined.
The simulation study illustrates the evident superiority of the proposed method over the AIC and the BIC, and especially when regression effects possess influence of several orders in magnitude. The accuracy of the normal approximation to the test statistics is also
investigated. The normal approximation is already satisfactory for sample sizes 50 and
100. As an illustration we analyze US college spending data from 1994.
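For a concrete picture of the baselines mentioned above, the following is a minimal sketch, in Python, of exhaustive best-subset selection by AIC and BIC in a simulated linear model with homoscedastic but non-normal errors and regression effects of several orders of magnitude. It illustrates only the standard criteria the paper compares against, not the proposed procedure; the helper name `criterion` and all numerical choices are hypothetical.

```python
# Hypothetical illustration (not the proposed procedure): exhaustive
# best-subset selection by AIC and BIC in a linear model with
# homoscedastic, non-normal (uniform) errors.
import itertools
import numpy as np

rng = np.random.default_rng(0)
n, p = 100, 6                                      # sample size, candidate covariates
beta = np.array([2.0, 0.5, 0.0, 0.0, 0.05, 0.0])   # effects of several orders of magnitude
X = rng.normal(size=(n, p))
eps = rng.uniform(-1, 1, size=n)                   # homoscedastic but non-normal errors
y = X @ beta + eps

def criterion(subset, kind="aic"):
    """Gaussian-likelihood AIC/BIC (up to an additive constant) for an OLS fit."""
    Xs = np.column_stack([np.ones(n)] + [X[:, j] for j in subset])
    coef, *_ = np.linalg.lstsq(Xs, y, rcond=None)
    rss = np.sum((y - Xs @ coef) ** 2)
    k = Xs.shape[1]
    penalty = 2 * k if kind == "aic" else np.log(n) * k
    return n * np.log(rss / n) + penalty

# Enumerate all 2^p covariate subsets and pick the minimizer of each criterion.
subsets = [s for r in range(p + 1) for s in itertools.combinations(range(p), r)]
best_aic = min(subsets, key=lambda s: criterion(s, "aic"))
best_bic = min(subsets, key=lambda s: criterion(s, "bic"))
print("AIC selects covariates:", best_aic)
print("BIC selects covariates:", best_bic)
```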