Parallel algorithms for downdating the least-squares estimator of the regression model
Computationally efficient parallel algorithms for downdating the least-squares estimator of the ordinary linear model (OLM) are proposed. The algorithms are block versions of sequential Givens strategies and efficiently exploit the triangular structure of the matrices. The first strategy utilizes the orthogonal matrix derived from the QR decomposition of the initial data matrix; this orthogonal matrix is updated and explicitly computed. The second approach is based on hyperbolic transformations, which is equivalent to updating the model with imaginary data corresponding to the observations to be deleted. An efficient distribution of the matrices over the processors is proposed. Furthermore, the new algorithms do not require any inter-processor communication. The theoretical complexities are derived, and experimental results are presented and analyzed. The parallel strategies are scalable and highly efficient for large-scale downdating least-squares problems.
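As a minimal serial sketch of the second (hyperbolic) strategy described above: removing an observation row a from a model whose triangular factor is R amounts to finding R' with R'^T R' = R^T R - a a^T, which a sequence of hyperbolic rotations achieves column by column. The function name, the NumPy implementation, and the synthetic data are our own illustrative assumptions; the paper's block partitioning and parallel distribution are omitted here.

```python
import numpy as np

def downdate_r(R, a):
    """Downdate the triangular factor R by one row a using hyperbolic
    rotations, so that the result R' satisfies R'^T R' = R^T R - a a^T.
    Serial sketch only; the paper's block/parallel structure is omitted."""
    R = R.astype(float).copy()
    a = a.astype(float).copy()
    n = R.shape[0]
    for k in range(n):
        # Hyperbolic rotation zeroing a[k] against the pivot R[k, k].
        t = a[k] / R[k, k]
        if abs(t) >= 1.0:
            raise ValueError("downdated Gram matrix would not be positive definite")
        f = np.sqrt(1.0 - t * t)
        Rk = R[k, k:].copy()
        R[k, k:] = (Rk - t * a[k:]) / f
        a[k:] = (a[k:] - t * Rk) / f
    return R

# Demo on synthetic data (illustrative only, not the authors' experiments):
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 4))
_, R = np.linalg.qr(A)          # triangular factor of the full data matrix
R_new = downdate_r(R, A[-1])    # delete the last observation
```

Because hyperbolic rotations preserve the indefinite form R^T R - a a^T, the downdated factor agrees with what a fresh QR decomposition of the reduced data matrix would produce (up to column signs), without touching the deleted row's contribution explicitly.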
Handle: RePEc:sce:scecfa:288