Estimation of Parameters in a Linear Regression Model under the Kullback-Leibler Loss
Abstract
This paper concerns the simultaneous estimation of the regression coefficients and the error variance in a linear regression model. Motivated by the Akaike information criterion, the expected Kullback-Leibler distance is employed as the risk function for comparing estimators in a decision-theoretic framework. A difficulty of this setup is that an estimator of the variance enters the loss for estimating the regression coefficients. In this situation, several estimators of the variance and the regression coefficients are proposed and shown to improve on the usual estimators, which serve as a benchmark. Simulation studies of the risk behavior of the estimators show numerically that a truncated estimator has more favorable risk than the usual estimators.
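The abstract does not reproduce the loss itself, but for the normal linear model y ~ N(Xβ, σ²Iₙ) the Kullback-Leibler distance from the true density to a fitted density N(Xβ̂, σ̂²Iₙ) has a standard closed form. The sketch below is illustrative only (the function name and notation are ours, not necessarily the paper's): the loss vanishes when both β and σ² are estimated exactly, and it couples the variance estimate with the coefficient estimate, since σ̂² appears in the denominator of the mean term.

```python
import numpy as np

def kl_loss(beta, sigma2, beta_hat, sigma2_hat, X):
    """Kullback-Leibler divergence from N(X beta, sigma2 * I_n) to
    N(X beta_hat, sigma2_hat * I_n) -- a standard closed form, used
    here as the loss for jointly estimating (beta, sigma2)."""
    n = X.shape[0]
    ratio = sigma2 / sigma2_hat
    # Variance part: n/2 * (ratio - 1 - log ratio), zero iff sigma2_hat == sigma2.
    var_term = 0.5 * n * (ratio - 1.0 - np.log(ratio))
    # Mean part: ||X(beta - beta_hat)||^2 / (2 sigma2_hat); note the
    # variance estimator scales the coefficient-estimation error.
    mean_term = np.sum((X @ (beta - beta_hat)) ** 2) / (2.0 * sigma2_hat)
    return var_term + mean_term
```

The risk of an estimator pair (β̂, σ̂²) in this framework is the expectation of this quantity over the sampling distribution, which can be approximated by Monte Carlo.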
Bibliographic Info
Paper provided by CIRJE, Faculty of Economics, University of Tokyo, in its series CIRJE F-Series, number CIRJE-F-389.
Length: 28 pages
Date of creation: Nov 2005
Contact details of provider:
Postal: Hongo 7-3-1, Bunkyo-ku, Tokyo 113-0033
Web page: http://www.cirje.e.u-tokyo.ac.jp/index.html