Searching for Additive Outliers in Nonstationary Time Series
Recently, Vogelsang (1999) proposed a method to detect outliers which explicitly imposes the null hypothesis of a unit root. It works in an iterative fashion to select multiple outliers in a given series. We show, via simulations, that under the null hypothesis of no outliers, the test has the right size in finite samples for detecting a single outlier. When applied iteratively to select multiple outliers, however, it exhibits severe size distortions towards finding an excessive number of outliers. We show that this iterative method is incorrect and derive the appropriate limiting distribution of the test at each step of the search.
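The single-outlier test the abstract refers to can be illustrated with a short sketch. This is not Vogelsang's (1999) exact procedure, only an assumed simplification: under the unit root null, an additive outlier of size delta at date tau shows up in the first differences as +delta at tau-1 and -delta at tau, so each candidate date can be tested by regressing the differences on that (+1, -1) impulse pattern and the detection statistic is the maximum absolute t-ratio over all dates. The function name `ao_scan` and all constants below are illustrative.

```python
import numpy as np

def ao_scan(y):
    """Max-|t| scan for a single additive outlier in a unit-root series.

    Hedged sketch: for each candidate date tau, regress the first
    differences of y on an impulse pattern that is +1 at tau-1 and -1
    at tau (the footprint an additive outlier leaves after differencing),
    and return the date with the largest absolute t-statistic.
    """
    dy = np.diff(y)
    n = len(dy)
    best_t, best_tau = 0.0, None
    for tau in range(1, n):                 # candidate outlier date in y
        x = np.zeros(n)
        x[tau - 1], x[tau] = 1.0, -1.0      # differenced-outlier pattern
        delta_hat = (x @ dy) / (x @ x)      # OLS estimate of outlier size
        resid = dy - delta_hat * x
        s2 = (resid @ resid) / (n - 1)      # residual variance
        t = abs(delta_hat) / np.sqrt(s2 / (x @ x))
        if t > best_t:
            best_t, best_tau = t, tau
    return best_tau, best_t

# Illustrative use: a random walk with one large outlier injected.
rng = np.random.default_rng(0)
y = np.cumsum(rng.standard_normal(200))     # random walk (unit root null)
y[100] += 15.0                              # additive outlier at t = 100
tau_hat, t_stat = ao_scan(y)
```

The abstract's point is that repeating this scan after removing each detected outlier, while reusing the single-outlier critical values, over-rejects; the correct critical values change at each step of the search.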
To our knowledge, this item is not available for download. To find out whether it is available, there are three options:
1. Check below under "Related research" whether another version of this item is available online.
2. Check on the provider's web page whether it is in fact available.
3. Perform a search for a similarly titled item that would be available.
Date of creation: 2000
Date of revision:
Contact details of provider:
Postal:
Phone: (613) 562-5753
Fax: (613) 562-5999
Web page: http://www.socialsciences.uottawa.ca/eco/eng/index.asp
When requesting a correction, please mention this item's handle: RePEc:ott:wpaper:0005e.
For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Diane Ritchot.