Finite Mixture Distributions, Sequential Likelihood and the EM Algorithm
A popular way to account for unobserved heterogeneity is to assume that the data are drawn from a finite mixture distribution. A barrier to using finite mixture models is that parameters that could previously be estimated in stages must now be estimated jointly: using mixture distributions destroys any additive separability of the log-likelihood function. We show, however, that an extension of the EM algorithm reintroduces additive separability, thus allowing one to estimate parameters sequentially during each maximization step. In establishing this result, we develop a broad class of estimators for mixture models. Returning to the likelihood problem, we show that, relative to full information maximum likelihood, our sequential estimator can generate large computational savings with little loss of efficiency. Copyright Econometric Society, 2002.
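The separability the abstract describes can be seen in a standard EM iteration for a finite mixture. The sketch below (a generic two-component Gaussian mixture on simulated data, not the paper's sequential estimator) shows that once the E-step fixes the posterior component weights, the M-step objective separates across components, so each component's parameters are updated by its own weighted average:

```python
# Illustrative EM for a two-component Gaussian mixture: after the E-step,
# the expected complete-data log-likelihood is additively separable across
# components, so each component's parameters can be maximized independently.
# This is a generic textbook sketch, not the estimator developed in the paper.
import numpy as np

rng = np.random.default_rng(0)
# Simulated data: mixture of N(-2, 1) and N(3, 1) with weights 0.4 / 0.6.
x = np.concatenate([rng.normal(-2.0, 1.0, 400), rng.normal(3.0, 1.0, 600)])

def em_gaussian_mixture(x, n_iter=100):
    pi = 0.5                             # weight on component 1
    mu = np.array([-1.0, 1.0])           # component means
    sigma = np.array([1.0, 1.0])         # component std. deviations
    for _ in range(n_iter):
        # E-step: posterior probability each observation came from component 1.
        d0 = np.exp(-0.5 * ((x - mu[0]) / sigma[0]) ** 2) / sigma[0]
        d1 = np.exp(-0.5 * ((x - mu[1]) / sigma[1]) ** 2) / sigma[1]
        w = pi * d1 / ((1 - pi) * d0 + pi * d1)
        # M-step: with the weights w held fixed, the objective separates,
        # so each component is updated from its own weighted moments.
        pi = w.mean()
        mu = np.array([np.average(x, weights=1 - w),
                       np.average(x, weights=w)])
        sigma = np.sqrt(np.array([
            np.average((x - mu[0]) ** 2, weights=1 - w),
            np.average((x - mu[1]) ** 2, weights=w),
        ]))
    return pi, mu, sigma

pi_hat, mu_hat, sigma_hat = em_gaussian_mixture(x)
print(pi_hat, mu_hat, sigma_hat)
```

Because the two weighted-average updates do not interact within an iteration, they could be computed in stages, which is the kind of sequential maximization the paper exploits for computational savings.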
Econometrica, Volume 71, Issue 3 (May 2003), pages 933-946.