Accelerating the quadratic lower-bound algorithm via optimizing the shrinkage parameter
When the Newton–Raphson algorithm or the Fisher scoring algorithm does not work and EM-type algorithms are not available, the quadratic lower-bound (QLB) algorithm may be a useful optimization tool. However, like all EM-type algorithms, the QLB algorithm may suffer from slow convergence, which can be viewed as the cost of having the ascent property. This paper proposes a novel 'shrinkage parameter' approach to accelerate the QLB algorithm while maintaining its simplicity and stability (i.e., monotonic increase in log-likelihood). The strategy is first to construct a class of quadratic surrogate functions Q_r(θ | θ^(t)) that induces a class of QLB algorithms indexed by a 'shrinkage parameter' r (r ∈ R), and then to optimize r over R under a given convergence criterion. For three commonly used criteria (the smallest eigenvalue, the trace, and the determinant), we derive a uniformly optimal shrinkage parameter and obtain an optimal QLB algorithm. Some theoretical justifications are also presented. Next, we generalize the optimal QLB algorithm to problems with a penalty function and investigate the associated convergence properties. The optimal QLB algorithm is applied to fit a logistic regression model and a Cox proportional hazards model. Two real datasets are analyzed to illustrate the proposed methods.
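To give a concrete sense of the kind of iteration being accelerated, the sketch below shows the classical QLB step for logistic regression, where the Hessian -X'WX (with W = diag(π(1-π))) is dominated by the fixed matrix -X'X/4, so every iteration reuses one constant surrogate curvature matrix. The factor r in the code is a purely illustrative stand-in for the paper's shrinkage parameter (r = 1 recovers the standard QLB step); the paper's surrogate family Q_r and its optimal choice of r are not reproduced here.

```python
import numpy as np

def qlb_logistic(X, y, r=1.0, max_iter=500, tol=1e-8):
    """Illustrative QLB iteration for logistic regression (a sketch, not the paper's exact algorithm)."""
    n, p = X.shape
    theta = np.zeros(p)
    # Fixed surrogate curvature: X'X/4 dominates X'WX because pi*(1-pi) <= 1/4.
    # The divisor r is a hypothetical shrinkage factor; r = 1 gives the classical
    # Bohning (1992) bound, which guarantees the ascent (monotone) property.
    B = X.T @ X / (4.0 * r)
    B_inv = np.linalg.inv(B)
    for _ in range(max_iter):
        pi = 1.0 / (1.0 + np.exp(-X @ theta))
        grad = X.T @ (y - pi)          # score vector of the logistic log-likelihood
        step = B_inv @ grad            # QLB update solves the quadratic surrogate exactly
        theta = theta + step
        if np.max(np.abs(step)) < tol:
            break
    return theta
```

With r = 1 this reproduces the monotone lower-bound iteration; the contribution described in the abstract is, roughly, to choose the shrinkage parameter optimally under the eigenvalue, trace, or determinant criterion so that the fixed surrogate tracks the true curvature as closely as possible while preserving monotonicity.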
- Ravi Varadhan & Christophe Roland, 2008. "Simple and Globally Convergent Methods for Accelerating the Convergence of Any EM Algorithm," Scandinavian Journal of Statistics, Danish Society for Theoretical Statistics;Finnish Statistical Society;Norwegian Statistical Association;Swedish Statistical Association, vol. 35(2), pages 335-353.
- Dankmar Böhning, 1992. "Multinomial logistic regression algorithm," Annals of the Institute of Statistical Mathematics, Springer, vol. 44(1), pages 197-200, March.
- Mingfeng Wang & Masahiro Kuroda & Michio Sakakihara & Zhi Geng, 2008. "Acceleration of the EM algorithm using the vector epsilon algorithm," Computational Statistics, Springer, vol. 23(3), pages 469-486, July.
- Kuroda, Masahiro & Sakakihara, Michio, 2006. "Accelerating the convergence of the EM algorithm using the vector [epsilon] algorithm," Computational Statistics & Data Analysis, Elsevier, vol. 51(3), pages 1549-1561, December.
- Dankmar Böhning & Bruce Lindsay, 1988. "Monotonicity of quadratic-approximation algorithms," Annals of the Institute of Statistical Mathematics, Springer, vol. 40(4), pages 641-663, December.