Compressed gradient tracking algorithms for distributed nonconvex optimization
State Key Laboratory of Synthetical Automation for Process Industries, Northeastern University, 110819, Shenyang, China; Department of Mechanical Engineering, University of Victoria, Victoria, BC V8W 2Y2, Canada.
College of Electronics and Information Engineering, Tongji University, Shanghai, 201804, China; Lab for Information & Decision Systems, Massachusetts Institute of Technology, Cambridge, MA 02139, USA.
Department of Systems Science, School of Mathematics, Southeast University, Nanjing 210096, China.
Department of Mechanical Engineering, University of Victoria, Victoria, BC V8W 2Y2, Canada.
2025 (English) In: Automatica, ISSN 0005-1098, E-ISSN 1873-2836, Vol. 177, article id 112286. Article in journal (Refereed). Published.
Abstract [en]

In this paper, we study the distributed nonconvex optimization problem, aiming to minimize the average of the local nonconvex cost functions using only local information exchange. To reduce communication overhead, we introduce three general classes of compressors: compressors with bounded relative compression error, compressors with globally bounded absolute compression error, and compressors with locally bounded absolute compression error. By integrating each of them with the distributed gradient tracking algorithm, we propose three corresponding compressed distributed nonconvex optimization algorithms. Motivated by the state-of-the-art BEER algorithm proposed in Zhao et al. (2022), an efficient compressed algorithm integrating gradient tracking with biased and contractive compressors, our first proposed algorithm extends BEER to accommodate both biased and non-contractive compressors. For each algorithm, we design a novel Lyapunov function to establish sublinear convergence to a stationary point when the local cost functions are smooth. Furthermore, when the global cost function satisfies the Polyak–Łojasiewicz (P–Ł) condition, we show that our proposed algorithms converge linearly to a global optimal point. Notably, for compressors with bounded relative compression error or globally bounded absolute compression error, the parameters of our proposed algorithms do not require prior knowledge of the P–Ł constant.
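The abstract describes combining distributed gradient tracking with communication compressors. As a rough illustrative sketch only (not the paper's algorithms), the snippet below runs gradient tracking over a small network where every transmitted vector passes through a uniform quantizer, i.e., a compressor with globally bounded absolute compression error. The quadratic local costs, the mixing matrix, the step size, and all function names are assumptions made for illustration.

```python
import numpy as np

def quantize(v, step=1e-6):
    # Uniform quantizer: a compressor with globally bounded
    # absolute error (|Q(v) - v| <= step/2 entrywise).
    return np.round(v / step) * step

def compressed_gradient_tracking(targets, W, eta=0.1, iters=1000):
    """Gradient tracking where exchanged vectors are quantized.

    Agent i minimizes the toy cost f_i(x) = 0.5 * (x - targets[i])^2,
    so the global optimum is the mean of `targets`.
    """
    n = len(targets)
    x = np.zeros(n)
    grad = x - targets            # grad f_i(x_i) = x_i - targets[i]
    y = grad.copy()               # tracker initialized at local gradients
    for _ in range(iters):
        x_hat = quantize(x)       # agents broadcast compressed states
        y_hat = quantize(y)       # and compressed gradient trackers
        x_new = W @ x_hat - eta * y
        grad_new = x_new - targets
        y = W @ y_hat + grad_new - grad   # track the average gradient
        x, grad = x_new, grad_new
    return x

# Doubly stochastic mixing matrix for a 3-agent network (assumption).
W = np.array([[0.50, 0.25, 0.25],
              [0.25, 0.50, 0.25],
              [0.25, 0.25, 0.50]])
targets = np.array([1.0, 2.0, 6.0])
x = compressed_gradient_tracking(targets, W)
print(x)  # each entry close to 3.0, the mean of targets
```

Because the quantizer's error is bounded but does not vanish, the iterates settle into a small neighborhood of the optimum rather than reaching it exactly, which mirrors why the absolute-error compressor classes require the separate analysis described in the abstract.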

Place, publisher, year, edition, pages
Elsevier BV, 2025. Vol. 177, article id 112286
Keywords [en]
Communication compression, Gradient tracking algorithm, Linear convergence, Nonconvex optimization, Polyak–Łojasiewicz condition, Sublinear convergence
National Category
Control Engineering
Identifiers
URN: urn:nbn:se:kth:diva-362549
DOI: 10.1016/j.automatica.2025.112286
Scopus ID: 2-s2.0-105002048469
OAI: oai:DiVA.org:kth-362549
DiVA, id: diva2:1952997
Note

QC 20250424

Available from: 2025-04-16. Created: 2025-04-16. Last updated: 2025-04-24. Bibliographically approved.

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text
Scopus

Authority records

Johansson, Karl H.

By organisation
Decision and Control Systems (Automatic Control)