Asymptotics for Least Absolute Deviation Regression Estimators
The LAD estimator of the vector parameter in a linear regression is defined by minimizing the sum of the absolute values of the residuals. This paper provides a direct proof of asymptotic normality for the LAD estimator. The main theorem assumes deterministic carriers. The extension to random carriers includes the case of autoregressions whose error terms have finite second moments. For a first-order autoregression with Cauchy errors the LAD estimator is shown to converge at a 1/n rate.
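The LAD criterion described in the abstract — minimize the sum of absolute residuals — can be computed exactly as a linear program, by splitting each residual into nonnegative positive and negative parts. The sketch below is illustrative only and is not taken from the paper; the function name `lad_fit` and the toy data are assumptions for the example.

```python
# Minimal sketch of the LAD estimator (not the paper's derivation):
# minimize sum_i |y_i - x_i'b| by writing each residual as u_i - v_i
# with u_i, v_i >= 0, then solving the resulting linear program.
import numpy as np
from scipy.optimize import linprog

def lad_fit(X, y):
    """Least absolute deviation fit via linear programming.

    Minimize sum(u + v) subject to X b + u - v = y, u >= 0, v >= 0;
    at the optimum u_i + v_i = |y_i - x_i'b|.
    """
    n, p = X.shape
    c = np.concatenate([np.zeros(p), np.ones(2 * n)])   # cost on u, v only
    A_eq = np.hstack([X, np.eye(n), -np.eye(n)])        # X b + u - v = y
    bounds = [(None, None)] * p + [(0, None)] * (2 * n) # b free, u,v >= 0
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=bounds, method="highs")
    return res.x[:p]

# Toy data: four points on the line y = 2x plus one gross outlier.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
X = np.column_stack([np.ones_like(x), x])   # intercept + slope
y = np.array([2.0, 4.0, 6.0, 8.0, 100.0])
beta = lad_fit(X, y)                        # close to [0, 2]: outlier ignored
```

Unlike least squares, the LAD fit here passes through the four uncontaminated points, illustrating the robustness that motivates studying this estimator's asymptotics.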
Volume (Year): 7 (1991)
Issue (Month): 02 (June)
Contact details of provider: Cambridge University Press, UPH, Shaftesbury Road, Cambridge CB2 8BS, UK
Web page: http://journals.cambridge.org/jid_ECT
Handle: RePEc:cup:etheor:v:7:y:1991:i:02:p:186-199_00