DFA: MATLAB function to compute the Hurst exponent using Detrended Fluctuation Analysis (DFA)
H = DFA(X) calculates the Hurst exponent of the time series X using Detrended Fluctuation Analysis (DFA).

If a vector of increasing natural numbers is given as the second input parameter, i.e. DFA(X,D), it defines the box sizes into which the sample is divided (the values in D must be divisors of the length of X). If D is a scalar (default D = 10), it is treated as the smallest box size into which the sample can be divided; in this case the optimal sample size OptN and the vector of divisors for this size are computed automatically. OptN is defined as the length possessing the most divisors among series shorter than X by no more than 1%. The input series X is truncated at the OptN-th value.

[H,PV95] = DFA(X) additionally returns the empirical confidence interval PV95 at the 95% level. [H,PV95,P] = DFA(X) also returns the average standard deviations P of the detrended walk for each divisor.
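The DFA procedure summarized above can be sketched as follows. This is a minimal, illustrative Python re-implementation of the standard DFA algorithm, not the author's MATLAB code; the function name dfa_hurst and the choice of box sizes are assumptions for the example, and box sizes are taken to be divisors of the series length, as the description requires.

```python
import numpy as np

def dfa_hurst(x, box_sizes):
    """Estimate the Hurst exponent via Detrended Fluctuation Analysis.

    Illustrative sketch only; `box_sizes` should contain divisors of len(x).
    """
    x = np.asarray(x, dtype=float)
    y = np.cumsum(x - x.mean())          # integrated "profile" of the series
    flucts = []
    for n in box_sizes:
        n_boxes = len(y) // n
        t = np.arange(n)
        f2 = 0.0
        for b in range(n_boxes):
            seg = y[b * n:(b + 1) * n]
            # least-squares linear detrend within each box
            coef = np.polyfit(t, seg, 1)
            resid = seg - np.polyval(coef, t)
            f2 += np.mean(resid ** 2)
        flucts.append(np.sqrt(f2 / n_boxes))
    # H is the slope of log F(n) versus log n
    slope, _ = np.polyfit(np.log(box_sizes), np.log(flucts), 1)
    return slope
```

For uncorrelated white noise the estimate should be close to H = 0.5, while persistent (long-memory) series yield H > 0.5 and anti-persistent series H < 0.5.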
Requires: MATLAB (tested on MATLAB ver. 7.9).
Date of creation: 30 Sep 2011
Contact details of provider:
  Postal: Wybrzeze Wyspianskiego 27, 50-370 Wroclaw
  Web page: http://prac.im.pwr.wroc.pl/~hugo
When requesting a correction, please mention this item's handle: RePEc:wuu:hscode:m11002.
For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact Rafal Weron.