
On Plug-In Estimation Of Long Memory Models

Author

Listed:
  • Lieberman, Offer

Abstract

We consider the Gaussian ARFIMA(p,d,q) model, with spectral density f_θ(λ) and an unknown mean μ. For this class of models, the n^(-1)-normalized information matrix of the full parameter vector, (μ,θ), is asymptotically degenerate. To estimate θ, Dahlhaus (1989, Annals of Statistics 17, 1749-1766) suggested maximizing the plug-in log-likelihood obtained by replacing μ with an estimate μ̂, where μ̂ is any n^((1-2d)/2)-consistent estimator of μ. The resulting estimator is a plug-in maximum likelihood estimator (PMLE). This estimator is asymptotically normal, efficient, and consistent, but in finite samples it has some serious drawbacks. Primarily, none of the Bartlett identities associated with the plug-in log-likelihood are satisfied for fixed n. Cheung and Diebold (1994, Journal of Econometrics 62, 301-316) conducted a Monte Carlo simulation study and reported that the bias of the PMLE is about three to four times the bias of the regular maximum likelihood estimator (MLE). In this paper, we derive asymptotic expansions for the PMLE and show that its second-order bias is contaminated by an additional term, which does not exist in regular cases. This term arises from the failure of the first Bartlett identity to hold and appears to explain Cheung and Diebold's simulation results. We derive similar expansions for the Whittle MLE, which is another estimator that tacitly uses the plug-in principle. An application to the ARFIMA(0,d,0) model shows that the additional bias terms are considerable.

Research on this topic commenced during 2000-2002, while the author was visiting the Cowles Foundation for Research in Economics at Yale University. The author is most grateful to the Cowles Foundation for its generous hospitality and to Donald Andrews and Peter Phillips for numerous helpful comments.
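The plug-in principle discussed in the abstract can be illustrated with a short sketch. The Python snippet below is not taken from the paper; the function name whittle_pmle_d and all implementation choices are assumptions made for illustration. It estimates the memory parameter d of an ARFIMA(0,d,0) series by maximizing the Whittle likelihood after the sample mean has been plugged in for the unknown μ, which is the plug-in step whose finite-sample consequences the paper analyzes.

    # Minimal illustrative sketch (assumed, not the paper's code): plug-in Whittle
    # estimation of d for a Gaussian ARFIMA(0,d,0) series.
    import numpy as np
    from scipy.optimize import minimize_scalar

    def whittle_pmle_d(x):
        """Plug-in Whittle estimate of d for an ARFIMA(0,d,0) series (illustrative)."""
        x = np.asarray(x, dtype=float)
        n = x.size
        x_dm = x - x.mean()                   # plug-in step: sample mean replaces the unknown mu
        j = np.arange(1, (n - 1) // 2 + 1)    # Fourier frequencies lambda_j = 2*pi*j/n
        lam = 2.0 * np.pi * j / n
        periodogram = np.abs(np.fft.fft(x_dm)[j]) ** 2 / (2.0 * np.pi * n)

        def neg_profile_whittle(d):
            # ARFIMA(0,d,0) spectral shape |2 sin(lambda/2)|^(-2d); the innovation
            # variance is profiled out, so only d remains to be optimized.
            g = np.abs(2.0 * np.sin(lam / 2.0)) ** (-2.0 * d)
            return np.log(np.mean(periodogram / g)) + np.mean(np.log(g))

        res = minimize_scalar(neg_profile_whittle, bounds=(-0.49, 0.49), method="bounded")
        return res.x

    # Example: white noise has true d = 0, so the estimate should be near zero.
    rng = np.random.default_rng(0)
    print(whittle_pmle_d(rng.standard_normal(2000)))

In repeated simulations on long-memory data, the finite-sample bias that the paper attributes to the failure of the first Bartlett identity would appear in the distribution of such plug-in estimates.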

Suggested Citation

  • Lieberman, Offer, 2005. "On Plug-In Estimation Of Long Memory Models," Econometric Theory, Cambridge University Press, vol. 21(2), pages 431-454, April.
  • Handle: RePEc:cup:etheor:v:21:y:2005:i:02:p:431-454_05

    Download full text from publisher

    File URL: https://www.cambridge.org/core/product/identifier/S0266466605050231/type/journal_article
    File Function: link to article abstract page
    Download Restriction: no

    Citations

    Cited by:

    1. Shuping Shi & Jun Yu, 2023. "Volatility Puzzle: Long Memory or Antipersistency," Management Science, INFORMS, vol. 69(7), pages 3861-3883, July.
