BFGS-like updates of constraint preconditioners for sequences of KKT linear systems in quadratic programming

De Simone, Valentina; di Serafino, Daniela
2018

Abstract

We focus on efficient preconditioning techniques for sequences of Karush-Kuhn-Tucker (KKT) linear systems arising from the interior point (IP) solution of large convex quadratic programming problems. Constraint preconditioners (CPs), although very effective in accelerating Krylov methods for the solution of KKT systems, can have a very high computational cost in some instances, because their factorization may be the most time-consuming task at each IP iteration. We overcome this problem by computing the CP from scratch only at selected IP iterations and by updating the last computed CP at the remaining iterations, via suitable low-rank modifications based on a BFGS-like formula. This work extends the limited-memory preconditioners (LMPs) for symmetric positive definite matrices proposed by Gratton, Sartenaer and Tshimanga in 2011, by exploiting specific features of KKT systems and CPs. We prove that the updated preconditioners still belong to the class of exact CPs, thus allowing the use of the conjugate gradient method. Furthermore, they have the property of increasing the number of unit eigenvalues of the preconditioned matrix compared with commonly used CPs. Numerical experiments show the effectiveness of our updating technique when the cost of factorizing the CP is high.
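
For readers outside the area, the block below recalls, in generic notation, the standard structure of the KKT systems and constraint preconditioners referred to in the abstract. It is an illustrative LaTeX sketch using symbols (H, A, Theta, G) common in the QP/IP literature, which need not coincide with the notation of the paper, and it does not reproduce the authors' specific BFGS-like update.

% KKT system solved at each IP iteration for the convex QP
%   min (1/2) x^T H x + c^T x   s.t.  A x = b,  x >= 0,
% with Theta the positive diagonal scaling induced by the barrier term:
\[
\begin{bmatrix}
  H + \Theta^{-1} & A^{T} \\
  A               & 0
\end{bmatrix}
\begin{bmatrix} \Delta x \\ \Delta y \end{bmatrix}
=
\begin{bmatrix} r_{1} \\ r_{2} \end{bmatrix}.
\]
% A constraint preconditioner keeps the constraint blocks exact and
% replaces the (1,1) block by a cheaper approximation G (often diagonal),
% which is what allows conjugate gradient iterations, as noted in the abstract:
\[
P =
\begin{bmatrix}
  G & A^{T} \\
  A & 0
\end{bmatrix},
\qquad G \approx H + \Theta^{-1}.
\]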
Files in this record:
No files are associated with this record.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11591/384299
Citations
  • PMC: ND
  • Scopus: 9
  • Web of Science (ISI): 6