Estimation of Parameters in a Linear Regression Model under the Kullback-Leibler Loss
This paper is concerned with the simultaneous estimation of the regression coefficients and the error variance in a linear regression model. Motivated by the Akaike information criterion, the expected Kullback-Leibler distance is employed as a risk function for comparing estimators in a decision-theoretic framework. This setup makes the risk difficult to handle, because an estimator of the variance is incorporated into the loss used for estimating the regression coefficients. In this situation, several estimators of the variance and the regression coefficients are proposed and shown to improve on the usual estimators taken as benchmarks. Simulation studies of the risk behavior of the estimators show numerically that a truncated estimator has more favorable risk than the usual estimators.
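To make the loss concrete: for a linear model y ~ N(Xβ, σ²Iₙ), the Kullback-Leibler divergence from the true model to a fitted model with estimates (β̂, σ̂²) has a closed form, and the risk is its expectation over repeated samples. The sketch below (an illustration, not the paper's own derivation or estimators; the direction of the divergence and the specific variance estimator are assumptions) estimates this expected KL loss by Monte Carlo for OLS with the usual unbiased variance estimator:

```python
import numpy as np

rng = np.random.default_rng(0)

def kl_loss(beta, sigma2, beta_hat, sigma2_hat, X):
    """KL divergence KL( N(X beta, sigma2 I) || N(X beta_hat, sigma2_hat I) ).

    Standard formula for two multivariate normals sharing the design X.
    """
    n = X.shape[0]
    r = sigma2 / sigma2_hat
    shift = np.sum((X @ (beta_hat - beta)) ** 2) / sigma2_hat
    return 0.5 * (n * (r - np.log(r) - 1.0) + shift)

# Hypothetical setup for illustration only.
n, p = 30, 4
X = rng.normal(size=(n, p))
beta, sigma2 = np.ones(p), 2.0

losses = []
for _ in range(2000):
    y = X @ beta + rng.normal(scale=np.sqrt(sigma2), size=n)
    beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)   # OLS estimate
    resid = y - X @ beta_hat
    sigma2_hat = resid @ resid / (n - p)               # usual unbiased variance estimator
    losses.append(kl_loss(beta, sigma2, beta_hat, sigma2_hat, X))

risk = np.mean(losses)  # Monte Carlo estimate of the expected KL loss
print(risk)
```

Comparing such Monte Carlo risks across candidate variance estimators (e.g. a truncated one) is the kind of numerical comparison the paper's simulation studies report.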
Date of creation: Nov 2005
Provider web page: http://www.cirje.e.u-tokyo.ac.jp/index.html
Handle: RePEc:tky:fseres:2005cf389