Publications (9 of 9)
Peng, S., Canessa, G., Ek, D. & Forsgren, A. (2025). Finding search directions in quasi-Newton methods for minimizing a quadratic function subject to uncertainty. Computational optimization and applications, 91(1), 145-171
2025 (English) In: Computational optimization and applications, ISSN 0926-0003, E-ISSN 1573-2894, Vol. 91, no 1, p. 145-171. Article in journal (Refereed) Published
Abstract [en]

We investigate quasi-Newton methods for minimizing a strongly convex quadratic function which is subject to errors in the evaluation of the gradients. In particular, we focus on computing search directions for quasi-Newton methods that all give identical behavior in exact arithmetic, generating minimizers of Krylov subspaces of increasing dimensions, thereby having finite termination. The BFGS quasi-Newton method may be seen as an ideal method in exact arithmetic and is empirically known to behave very well on a quadratic problem subject to small errors. We investigate large-error scenarios, in which the expected behavior is not so clear. We consider memoryless methods that are less expensive than the BFGS method, in that they generate low-rank quasi-Newton matrices that differ from the identity by a symmetric matrix of rank two. In addition, a more advanced model for generating the search directions is proposed, based on solving a chance-constrained optimization problem. Our numerical results indicate that for large errors, such a low-rank memoryless quasi-Newton method may perform better than a BFGS method. In addition, the results indicate a potential edge by including the chance-constrained model in the memoryless quasi-Newton method.
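To make the rank-two structure concrete, here is a minimal sketch of a generic memoryless BFGS direction with identity base matrix — an illustration under assumptions, not necessarily the paper's exact variant, and all variable names are hypothetical. It computes the search direction from the latest step `s` and gradient difference `y` using inner products only, and checks it against the explicit quasi-Newton matrix, which differs from the identity by a symmetric matrix of rank two.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 6
g = rng.standard_normal(n)   # current (possibly noisy) gradient
s = rng.standard_normal(n)   # last step x_k - x_{k-1}
y = rng.standard_normal(n)   # last gradient difference
if y @ s < 0:                # enforce the curvature condition y's > 0
    y = -y
rho = 1.0 / (y @ s)

# Explicit memoryless BFGS matrix: identity plus a symmetric rank-two term.
V = np.eye(n) - rho * np.outer(y, s)
H = V.T @ V + rho * np.outer(s, s)
p_mat = -H @ g

# Same direction from inner products only: O(n) work and storage,
# no quasi-Newton matrix is ever formed.
sg, yg, yy = s @ g, y @ g, y @ y
p_vec = -g + rho * sg * y + (rho * yg - rho**2 * yy * sg - rho * sg) * s
```

The vector-only form is what makes such memoryless methods less expensive than full BFGS: only the latest pair (s, y) is stored.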

Place, publisher, year, edition, pages
Springer Nature, 2025
Keywords
Quadratic programming, Quasi-Newton method, Stochastic quasi-Newton method, Chance constrained model
National Category
Computational Mathematics
Identifiers
urn:nbn:se:kth:diva-360747 (URN) 10.1007/s10589-025-00661-4 (DOI) 001426490300001 () 2-s2.0-105001073066 (Scopus ID)
Note

QC 20250303

Available from: 2025-03-03 Created: 2025-03-03 Last updated: 2026-01-15. Bibliographically approved
Ek, D. & Forsgren, A. (2023). A structured modified Newton approach for solving systems of nonlinear equations arising in interior-point methods for quadratic programming. Computational optimization and applications, 86(1), 1-48
2023 (English) In: Computational optimization and applications, ISSN 0926-0003, E-ISSN 1573-2894, Vol. 86, no 1, p. 1-48. Article in journal (Refereed) Published
Abstract [en]

The focus in this work is on interior-point methods for inequality-constrained quadratic programs, and particularly on the system of nonlinear equations to be solved for each value of the barrier parameter. Newton iterations give high-quality solutions, but we are interested in modified Newton systems that are computationally less expensive at the expense of lower-quality solutions. We propose a structured modified Newton approach where each modified Jacobian is composed of a previous Jacobian, plus one low-rank update matrix per succeeding iteration. Each update matrix is, for a given rank, chosen such that the distance to the Jacobian at the current iterate is minimized, in both 2-norm and Frobenius norm. The approach is structured in the sense that it preserves the nonzero pattern of the Jacobian. The choice of update matrix is supported by results in an ideal theoretical setting. We also produce numerical results with a basic interior-point implementation to investigate the practical performance within and beyond the theoretical framework. In order to improve performance beyond the theoretical framework, we also motivate and construct two heuristics to be added to the method.
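The distance-minimizing choice of update matrix can be illustrated with the classical Eckart-Young result: the best rank-one correction of a previous Jacobian toward the current one, in both 2-norm and Frobenius norm, is the leading singular triplet of their difference. The sketch below is an unstructured illustration only — the paper additionally constrains the update to preserve the Jacobian's nonzero pattern, which this sketch ignores — and all names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 6
J_prev = rng.standard_normal((n, n))           # Jacobian kept from an earlier iterate
J_curr = J_prev + rng.standard_normal((n, n))  # Jacobian at the current iterate

# Best rank-one update of J_prev toward J_curr in 2-norm and Frobenius
# norm (Eckart-Young): the leading singular triplet of the difference.
D = J_curr - J_prev
U, s, Vt = np.linalg.svd(D)
update = s[0] * np.outer(U[:, 0], Vt[0])
J_mod = J_prev + update

# Remaining Frobenius distance: the norm of the discarded singular values.
err = np.linalg.norm(J_curr - J_mod)
```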

Place, publisher, year, edition, pages
Springer Nature, 2023
Keywords
Interior-point methods, Low-rank updates, Modified Newton methods, Quasi-Newton methods
National Category
Computational Mathematics
Identifiers
urn:nbn:se:kth:diva-338567 (URN) 10.1007/s10589-023-00486-z (DOI) 001009193200001 () 2-s2.0-85161996855 (Scopus ID)
Note

QC 20231107

Available from: 2023-11-07 Created: 2023-11-07 Last updated: 2023-11-07. Bibliographically approved
Ek, D. & Forsgren, A. (2021). Approximate solution of system of equations arising in interior-point methods for bound-constrained optimization. Computational optimization and applications, 79(1), 155-191
2021 (English) In: Computational optimization and applications, ISSN 0926-0003, E-ISSN 1573-2894, Vol. 79, no 1, p. 155-191. Article in journal (Refereed) Published
Abstract [en]

The focus in this paper is interior-point methods for bound-constrained nonlinear optimization, where the systems of nonlinear equations that arise are solved with Newton's method. There is a trade-off between solving Newton systems directly, which gives high-quality solutions, and solving many approximate Newton systems, which are computationally less expensive but give lower-quality solutions. We propose partial and full approximate solutions to the Newton systems. The specific approximate solution depends on estimates of the active and inactive constraints at the solution. These sets are estimated at each iteration by basic heuristics. The partial approximate solutions are computationally inexpensive, whereas a system of linear equations needs to be solved for the full approximate solution. The size of this system is determined by the estimate of the inactive constraints at the solution. In addition, we motivate and suggest two Newton-like approaches that are based on an intermediate step consisting of the partial approximate solutions. The theoretical setting is introduced and asymptotic error bounds are given. We also give numerical results to investigate the performance of the approximate solutions within and beyond the theoretical framework.
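The split into partial and full approximate solutions can be sketched as follows, for a single Newton system with bounds x ≥ 0. The activity heuristic and all names here are illustrative assumptions, not necessarily the paper's exact rules: estimated-active components are stepped to their bound at negligible cost (the partial solution), while a Newton system reduced to the estimated-inactive components is solved for the remainder (the full solution).

```python
import numpy as np

rng = np.random.default_rng(0)
n = 8
M = rng.standard_normal((n, n))
H = M @ M.T + n * np.eye(n)     # Hessian block of a Newton system H dx = -g
g = rng.standard_normal(n)
x = rng.uniform(0.2, 1.0, n)    # iterate for a problem with bounds x >= 0
x[:3] = 1e-8                    # three components nearly at their bound...
g[:3] = np.abs(g[:3]) + 0.1     # ...with gradients pushing toward it

# Basic heuristic: estimate a component as active if it is near its bound
# and the gradient pushes toward the bound.
active = (x < 1e-6) & (g > 0)
inactive = ~active

# Partial approximate solution: step estimated-active components to the bound.
dx = np.zeros(n)
dx[active] = -x[active]

# Full approximate solution: solve the Newton system reduced to the
# estimated-inactive components only.
H_ii = H[np.ix_(inactive, inactive)]
rhs = -(g[inactive] + H[np.ix_(inactive, active)] @ dx[active])
dx[inactive] = np.linalg.solve(H_ii, rhs)
```

The reduced system here is 5-by-5 instead of 8-by-8; the saving grows with the number of constraints estimated active.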

Place, publisher, year, edition, pages
Springer Nature, 2021
Keywords
Approximate solution of system of linear equations, Bound-constrained optimization, Interior-point methods, Newton-like approaches, Constrained optimization, Economic and social effects, Error analysis, Iterative methods, Linear programming, Nonlinear programming, Asymptotic error bound, Bound constrained optimization, Constrained non-linear optimizations, High-quality solutions, Interior-point method, System of linear equations, System of nonlinear equations, Theoretical framework, Nonlinear equations
National Category
Computational Mathematics Control Engineering
Identifiers
urn:nbn:se:kth:diva-305487 (URN) 10.1007/s10589-021-00265-8 (DOI) 000618583600001 () 2-s2.0-85100926817 (Scopus ID)
Note

QC 20211130

Available from: 2021-11-30 Created: 2021-11-30 Last updated: 2022-06-25. Bibliographically approved
Ek, D. & Forsgren, A. (2021). Exact linesearch limited-memory quasi-Newton methods for minimizing a quadratic function. Computational optimization and applications, 79(3), 789-816
2021 (English) In: Computational optimization and applications, ISSN 0926-0003, E-ISSN 1573-2894, Vol. 79, no 3, p. 789-816. Article in journal (Refereed) Published
Abstract [en]

The main focus in this paper is exact linesearch methods for minimizing a quadratic function whose Hessian is positive definite. We give a class of limited-memory quasi-Newton Hessian approximations which generate search directions parallel to those of the BFGS method, or equivalently, to those of the method of preconditioned conjugate gradients. In the setting of reduced Hessians, the class provides a dynamical framework for the construction of limited-memory quasi-Newton methods. These methods attain finite termination on quadratic optimization problems in exact arithmetic. We show performance of the methods within this framework in finite precision arithmetic by numerical simulations on sequences of related systems of linear equations, which originate from the CUTEst test collection. In addition, we give a compact representation of the Hessian approximations in the full Broyden class for the general unconstrained optimization problem. This representation consists of explicit matrices and gradients only as vector components.
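The parallelism of search directions claimed above can be checked numerically. The sketch below compares plain BFGS (inverse update, identity initial matrix) with the method of conjugate gradients on a random convex quadratic, both with exact linesearch — a generic illustration of the known equivalence on quadratics, not the paper's limited-memory class, and all names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 6
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)      # SPD Hessian of 0.5 x'Ax - b'x
b = rng.standard_normal(n)
x0 = np.zeros(n)
grad = lambda x: A @ x - b

def exact_step(x, p):
    # Exact linesearch along p for the quadratic.
    return -(grad(x) @ p) / (p @ A @ p)

# Directions of the method of conjugate gradients.
x, g = x0.copy(), grad(x0)
p = -g
cg_dirs = []
for _ in range(4):
    cg_dirs.append(p.copy())
    x = x + exact_step(x, p) * p
    g_new = grad(x)
    p = -g_new + (g_new @ g_new) / (g @ g) * p
    g = g_new

# BFGS with H0 = I, inverse update, exact linesearch.
x, H = x0.copy(), np.eye(n)
bfgs_dirs = []
for _ in range(4):
    g = grad(x)
    p = -H @ g
    bfgs_dirs.append(p.copy())
    x_new = x + exact_step(x, p) * p
    s, y = x_new - x, grad(x_new) - g
    rho = 1.0 / (y @ s)
    V = np.eye(n) - rho * np.outer(y, s)
    H = V.T @ H @ V + rho * np.outer(s, s)
    x = x_new

# Directions should be parallel (cosine close to 1) at every iteration.
cosines = [pc @ pb / (np.linalg.norm(pc) * np.linalg.norm(pb))
           for pc, pb in zip(cg_dirs, bfgs_dirs)]
```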

Place, publisher, year, edition, pages
Springer Nature, 2021
Keywords
Exact linesearch method, Limited-memory method, Method of conjugate gradients, Quasi-Newton method, Unconstrained quadratic program
National Category
Computational Mathematics
Identifiers
urn:nbn:se:kth:diva-309861 (URN) 10.1007/s10589-021-00277-4 (DOI) 000645176600002 () 2-s2.0-85105371182 (Scopus ID)
Note

Not a duplicate of DiVA 1501899, which is a manuscript and part of a thesis

QC 20220314

Available from: 2022-03-14 Created: 2022-03-14 Last updated: 2022-06-25. Bibliographically approved
Mele, G., Ringh, E., Ek, D., Izzo, F., Upadhyaya, P. & Jarlebring, E. (2020). Preconditioning for linear systems. KD Publishing
2020 (English) Book (Other academic)
Place, publisher, year, edition, pages
KD Publishing, 2020
National Category
Computational Mathematics
Identifiers
urn:nbn:se:kth:diva-273402 (URN) 9798646306334 (ISBN)
Note

QC 20200523

Available from: 2020-05-17 Created: 2020-05-17 Last updated: 2024-03-18. Bibliographically approved
Ek, D. & Forsgren, A. A structured modified Newton approach for solving systems of nonlinear equations arising in interior-point methods for quadratic programming.
(English) Manuscript (preprint) (Other academic)
Abstract [en]

The focus in this work is interior-point methods for quadratic optimization problems with linear inequality constraints, where the systems of nonlinear equations that arise are solved with Newton-like methods. In particular, the concern is the system of linear equations to be solved at each iteration. Newton systems give high-quality solutions, but there is an interest in designing modified Newton systems that are computationally less expensive, normally at the expense of lower-quality solutions. We propose a structured modified Newton approach where the Jacobian of each Newton system is approximated by a Jacobian at a previous point plus a sequence of low-rank update matrices. The updates are such that the Jacobian approximation preserves its sparsity structure and may in consequence be viewed as a Jacobian evaluated at a different point. We introduce the theoretical setting, motivate the approach by theoretical results applicable in the asymptotic region, and give numerical results for a set of convex quadratic programs. In an attempt to improve performance, we also motivate and construct two heuristics to be added to the method.

Keywords
interior-point methods, modified Newton methods, quasi-Newton methods, low-rank updates
National Category
Computational Mathematics
Research subject
Applied and Computational Mathematics, Optimization and Systems Theory
Identifiers
urn:nbn:se:kth:diva-286034 (URN)
Note

QC 20201201

Available from: 2020-11-18 Created: 2020-11-18 Last updated: 2022-06-25. Bibliographically approved
Ek, D. & Forsgren, A. An optimization derivation of the method of conjugate gradients.
(English) Manuscript (preprint) (Other academic)
Abstract [en]

We give a derivation of the method of conjugate gradients based on the requirement that each iterate minimizes a strictly convex quadratic on the space spanned by the previously observed gradients. Rather than verifying that the search direction has the correct properties, we show that generation of such iterates is equivalent to generation of orthogonal gradients, which gives the description of the direction and the step length. Our approach gives a straightforward way to see that the search direction of the method of conjugate gradients is a negative scalar times the gradient of minimum Euclidean norm on the subspace generated so far.
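The minimum-norm-gradient characterization can be verified numerically: at each iteration, the CG direction should be a negative scalar times the minimum-Euclidean-norm point of the affine hull of the gradients observed so far. The sketch below is an illustrative check under that reading of the statement, with hypothetical names; the minimum-norm point is found by least squares over the affine hull.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 6
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)      # SPD Hessian of the quadratic
b = rng.standard_normal(n)

x = np.zeros(n)
g = A @ x - b                    # gradient of 0.5 x'Ax - b'x
p = -g.copy()
grads = [g.copy()]
cosines = []
for _ in range(4):
    # Minimum-norm point of the affine hull of the observed gradients.
    G = np.column_stack(grads)
    if G.shape[1] > 1:
        D = G[:, 1:] - G[:, [0]]                  # basis of the affine hull
        c, *_ = np.linalg.lstsq(D, -G[:, 0], rcond=None)
        g_min = G[:, 0] + D @ c
    else:
        g_min = G[:, 0]
    # The CG direction should be a negative scalar times g_min.
    cosines.append((p @ -g_min) /
                   (np.linalg.norm(p) * np.linalg.norm(g_min)))
    # Standard CG step with exact linesearch.
    alpha = (g @ g) / (p @ A @ p)
    x = x + alpha * p
    g_new = A @ x - b
    p = -g_new + (g_new @ g_new) / (g @ g) * p
    g = g_new
    grads.append(g.copy())
```

Because the observed gradients are mutually orthogonal, the same minimum-norm point can also be written in closed form as the affine combination with weights proportional to 1/||g_i||^2.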

Keywords
method of conjugate gradients, orthogonal gradients, gradient of minimum Euclidean norm.
National Category
Computational Mathematics
Research subject
Applied and Computational Mathematics, Optimization and Systems Theory
Identifiers
urn:nbn:se:kth:diva-286036 (URN)
Funder
Swedish Research Council, 621-2014-4772
Note

QC 20201201

Available from: 2020-11-18 Created: 2020-11-18 Last updated: 2022-06-25. Bibliographically approved
Ek, D. & Forsgren, A. Approximate solution of system of equations arising in interior-point methods for bound-constrained optimization.
(English) Manuscript (preprint) (Other academic)
Abstract [en]

The focus in this paper is interior-point methods for bound-constrained nonlinear optimization, where the systems of nonlinear equations that arise are solved with Newton's method. There is a trade-off between solving Newton systems directly, which gives high-quality solutions, and solving many approximate Newton systems, which are computationally less expensive but give lower-quality solutions. We propose partial and full approximate solutions to the Newton systems. The specific approximate solution depends on estimates of the active and inactive constraints at the solution. These sets are estimated at each iteration by basic heuristics. The partial approximate solutions are computationally inexpensive, whereas a system of linear equations needs to be solved for the full approximate solution. The size of this system is determined by the estimate of the inactive constraints at the solution. In addition, we motivate and suggest two Newton-like approaches that are based on an intermediate step consisting of the partial approximate solutions. The theoretical setting is introduced and asymptotic error bounds are given. We also give numerical results to investigate the performance of the approximate solutions within and beyond the theoretical framework.

Keywords
interior-point methods, bound-constrained optimization, approximate solution of system of linear equations, Newton-like approaches
National Category
Computational Mathematics
Research subject
Applied and Computational Mathematics, Optimization and Systems Theory
Identifiers
urn:nbn:se:kth:diva-286032 (URN)
Funder
Swedish Research Council, 621-2014-4772
Note

QC 20201201

Available from: 2020-11-18 Created: 2020-11-18 Last updated: 2022-06-25. Bibliographically approved
Ek, D. & Forsgren, A. Exact linesearch limited-memory quasi-Newton methods for minimizing a quadratic function.
(English) Manuscript (preprint) (Other academic)
Abstract [en]

The main focus in this paper is exact linesearch methods for minimizing a quadratic function whose Hessian is positive definite. We give a class of limited-memory quasi-Newton Hessian approximations which generate search directions parallel to those of the method of preconditioned conjugate gradients, and hence give finite termination on quadratic optimization problems in exact arithmetic. In the framework of reduced Hessians, this class provides a dynamical framework for the construction of limited-memory quasi-Newton methods. We give an indication of the performance of the methods within this framework by showing numerical simulations on sequences of related systems of linear equations, which originate from the CUTEst test collection. In addition, we give a compact representation of the Hessian approximations in the full Broyden class for the general unconstrained optimization problem. This representation consists of explicit matrices and gradients only as vector components.

Keywords
method of conjugate gradients, quasi-Newton method, unconstrained quadratic program, limited-memory method, exact linesearch method
National Category
Computational Mathematics
Research subject
Applied and Computational Mathematics, Optimization and Systems Theory
Identifiers
urn:nbn:se:kth:diva-286030 (URN)
Funder
Swedish Research Council, 621-2014-4772
Note

QC 20201201

Available from: 2020-11-18 Created: 2020-11-18 Last updated: 2022-06-25. Bibliographically approved
Identifiers
ORCID iD: orcid.org/0000-0003-1764-5449
