Soft quasi-Newton: guaranteed positive definiteness by relaxing the secant constraint
KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent systems, Decision and Control Systems (Automatic Control). ORCID iD: 0000-0002-8872-8885
KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent systems, Decision and Control Systems (Automatic Control).
KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent systems, Decision and Control Systems (Automatic Control). ORCID iD: 0000-0002-2237-2580
2025 (English). In: Optimization Methods and Software, ISSN 1055-6788, E-ISSN 1029-4937, p. 1-30. Article in journal (Refereed). Epub ahead of print.
Abstract [en]

We propose a novel algorithm, termed soft quasi-Newton (soft QN), for optimization in the presence of bounded noise. Traditional quasi-Newton algorithms are vulnerable to such noise-induced perturbations. To develop a more robust quasi-Newton method, we replace the secant condition in the matrix optimization problem for the Hessian update with a penalty term in its objective and derive a closed-form update formula. A key feature of our approach is its ability to maintain positive definiteness of the Hessian inverse approximation throughout the iterations. Furthermore, we establish the following properties of soft QN: it recovers the BFGS method under specific limits, it treats positive and negative curvature equally, and it is scale invariant. Collectively, these features enhance the efficacy of soft QN in noisy environments. For strongly convex objective functions and Hessian approximations obtained using soft QN, we develop an algorithm that exhibits linear convergence toward a neighborhood of the optimal solution even when gradient and function evaluations are subject to bounded perturbations. Through numerical experiments, we demonstrate that soft QN consistently outperforms state-of-the-art methods across a range of scenarios.
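
For orientation, the classical BFGS inverse-Hessian update that soft QN is said to recover in a limit has a well-known variational characterization (with s_k = x_{k+1} - x_k and y_k = \nabla f(x_{k+1}) - \nabla f(x_k)):

  H_{k+1} = \arg\min_{H} \|H - H_k\|_W \quad \text{subject to} \quad H = H^\top,\; H y_k = s_k.

A schematic reading of the abstract's relaxation, offered here only as an illustration (the weighted norm \|\cdot\|_W and the penalty weight \mu are assumptions, not the paper's exact formulation), moves the secant condition into the objective:

  H_{k+1} = \arg\min_{H = H^\top} \|H - H_k\|_W^2 + \mu\, \|H y_k - s_k\|^2,

so that the secant equation is only penalized rather than enforced, which, per the abstract, is what allows positive definiteness to be maintained even when noisy gradient differences make the exact secant condition inconsistent; as \mu \to \infty, such a penalized update would tend back toward the hard-constrained, BFGS-like solution.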

Place, publisher, year, edition, pages
Informa UK Limited, 2025. p. 1-30.
Keywords [en]
quasi-Newton methods, general bounded noise, secant condition, penalty
National Category
Computational Mathematics
Identifiers
URN: urn:nbn:se:kth:diva-362428
DOI: 10.1080/10556788.2025.2475406
ISI: 001449014500001
Scopus ID: 2-s2.0-105000489741
OAI: oai:DiVA.org:kth-362428
DiVA id: diva2:1952336
Note

QC 20250425

Available from: 2025-04-15. Created: 2025-04-15. Last updated: 2025-04-25. Bibliographically approved.

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text
Scopus

Authority records

Berglund, Erik; Zhang, Jiaojiao; Johansson, Mikael
