Optimal Dimension of Transition Probability Matrices for Markov Chain Bootstrapping
Abstract: While a large portion of the literature on Markov chain bootstrap methods (possibly of order higher than one) has focused on the correct estimation of the transition probabilities, little or no attention has been devoted to the problem of estimating the dimension of the transition probability matrix. Indeed, it is usual to assume that the Markov chain has a one-step memory property and that the state space cannot be clustered and coincides with the distinct observed values. In this paper we question such a standard approach. In particular, we advance a method to jointly estimate the order of the Markov chain and identify a suitable clustering of the states. Indeed, in several real-life applications the "memory" of many processes extends well beyond the last observation; in those cases a correct representation of past trajectories requires a significantly richer set than the state space. On the contrary, it can sometimes happen that some distinct values do not correspond to really "different" states of a process; this is a common conclusion whenever, for example, a process assuming two distinct values at time t is not affected in its distribution at time t+1. Such a situation would suggest reducing the dimension of the transition probability matrix. Our methods are based on solving two optimization problems. More specifically, we consider two competing objectives that a researcher will in general pursue when dealing with bootstrapping: preserving the similarity between the observed and the bootstrap series, and reducing the probability of obtaining a perfect replication of the original sample. A brief axiomatic discussion is developed to define the desirable properties of such optimal criteria. Two numerical examples are presented to illustrate the method.
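For context, the standard setup the paper questions can be sketched as follows: a first-order Markov chain bootstrap in which the state space is taken to coincide with the distinct observed values. This is an illustrative sketch only, not the authors' joint order/clustering estimation method; the function name and interface are hypothetical.

```python
import random
from collections import defaultdict

def markov_bootstrap(series, length, seed=None):
    """Resample a series via a first-order Markov chain whose state
    space equals the distinct observed values (the standard setup)."""
    rng = random.Random(seed)
    # Empirical transition lists: for each observed state, the values
    # that followed it in the original series.
    transitions = defaultdict(list)
    for prev, nxt in zip(series, series[1:]):
        transitions[prev].append(nxt)
    # Start from a randomly chosen observed state (not the last one,
    # which may have no recorded successor).
    state = rng.choice(series[:-1])
    out = [state]
    for _ in range(length - 1):
        successors = transitions.get(state)
        if not successors:
            # State only appeared at the end of the sample: restart
            # from the unconditional empirical distribution.
            successors = series
        state = rng.choice(successors)
        out.append(state)
    return out
```

Drawing each successor from the empirical conditional distribution preserves one-step dependence; the paper's point is that this fixed choice of order one and of the unclustered state space is itself worth optimizing.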
Bibliographic Info: Paper provided by Macerata University, Department of Finance and Economic Sciences in its series Working Papers with number 53-2009.
Date of creation: Apr 2009
Date of revision: Apr 2009
Keywords: order of Markov chains; similarity of time series; transition probability matrices; multiplicity of time series; partition of states of Markov chains; Markov chains; bootstrap methods
Find related papers by JEL classification:
- C14 - Mathematical and Quantitative Methods - - Econometric and Statistical Methods and Methodology: General - - - Semiparametric and Nonparametric Methods: General
- C15 - Mathematical and Quantitative Methods - - Econometric and Statistical Methods and Methodology: General - - - Statistical Simulation Methods: General
- C61 - Mathematical and Quantitative Methods - - Mathematical Methods; Programming Models; Mathematical and Simulation Modeling - - - Optimization Techniques; Programming Models; Dynamic Analysis
This paper has been announced in the following NEP Reports:
- NEP-ALL-2009-05-02 (All new papers)
- NEP-ECM-2009-05-02 (Econometrics)
- NEP-ETS-2009-05-02 (Econometric Time Series)
- NEP-ORE-2009-05-02 (Operations Research)
Citations (tracked by the CitEc Project):
- Cerqueti, Roy & Falbo, Paolo & Guastaroba, Gianfranco & Pelizzari, Cristian, 2013. "A Tabu Search heuristic procedure in Markov chain bootstrapping," European Journal of Operational Research, Elsevier, vol. 227(2), pages 367-384.