Distributed Optimization with Gradient Descent and Quantized Communication
KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent Systems, Automatic Control. KTH, School of Electrical Engineering and Computer Science (EECS), Centres, Digital Futures. ORCID iD: 0000-0002-8737-1984
Department of Electrical Engineering and Automation, School of Electrical Engineering, Aalto University, Espoo, Finland.
Department of Electrical and Computer Engineering, School of Engineering, University of Cyprus, Nicosia, Cyprus.
KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent Systems, Automatic Control. KTH, School of Electrical Engineering and Computer Science (EECS), Centres, Digital Futures. ORCID iD: 0000-0001-9940-5929
2023 (English) Conference paper, Published paper (Refereed)
Abstract [en]

In this paper, we consider the unconstrained distributed optimization problem, in which the exchange of information in the network is captured by a directed graph topology; thus, nodes can only communicate with their neighbors. Additionally, in our problem, the communication channels among the nodes have limited bandwidth. To alleviate this limitation, quantized messages should be exchanged among the nodes. To solve this distributed optimization problem, we combine a gradient descent method with a distributed quantized consensus algorithm (which requires the nodes to exchange quantized messages and converges in a finite number of steps). Specifically, at every optimization step, each node (i) performs a gradient descent step (i.e., subtracts the scaled gradient from its current estimate), and (ii) performs a finite-time calculation of the quantized average of every node's estimate in the network. As a consequence, this algorithm approximately mimics the centralized gradient descent algorithm. We show that our algorithm asymptotically converges to a neighborhood of the optimal solution at a linear convergence rate. The performance of the proposed algorithm is demonstrated via simple illustrative examples.
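The two-step iteration the abstract describes (a local gradient step followed by a quantized averaging step) can be sketched numerically. The sketch below is not the paper's algorithm: the finite-time distributed quantized consensus over a directed graph is replaced by a direct quantized mean, and the local costs are hypothetical quadratics chosen so that the global optimum is known in closed form.

```python
import numpy as np

def quantize(x, delta=0.01):
    """Round to a uniform grid of step `delta`, standing in for the
    limited-bandwidth (quantized) messages; the paper's finite-time
    quantized consensus is replaced by an exact mean of this value."""
    return delta * np.round(x / delta)

# Hypothetical setup: node i holds f_i(x) = 0.5 * (x - a_i)^2, so the
# minimizer of sum_i f_i(x) is the mean of the a_i.
rng = np.random.default_rng(0)
a = rng.uniform(-5.0, 5.0, size=8)   # local data at 8 nodes
x = np.zeros_like(a)                 # each node's current estimate
alpha = 0.5                          # step size (assumed, not from the paper)

for _ in range(50):
    x = x - alpha * (x - a)                      # (i) local gradient step
    x = np.full_like(x, quantize(x.mean()))      # (ii) quantized average

print(x[0], a.mean())  # final estimate vs. true optimum
```

Consistent with the abstract, the iterates settle in a neighborhood of the optimum whose size is set by the quantization level `delta`, rather than converging to it exactly.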

Place, publisher, year, edition, pages
Elsevier BV, 2023. pp. 5900-5906
Keywords [en]
directed graphs, distributed optimization, finite-time consensus, quantized communication
HSV category
Identifiers
URN: urn:nbn:se:kth:diva-343701
DOI: 10.1016/j.ifacol.2023.10.100
ISI: 001196709200450
Scopus ID: 2-s2.0-85184958272
OAI: oai:DiVA.org:kth-343701
DiVA, id: diva2:1839896
Conference
22nd IFAC World Congress, Yokohama, Japan, July 9-14, 2023
Note

Part of ISBN 9781713872344

QC 20250923

Available from: 2024-02-22 Created: 2024-02-22 Last updated: 2025-09-23 Bibliographically approved

Open Access in DiVA

Full text is not available in DiVA

Other links

Publisher's full text
Scopus

Authors

Rikos, Apostolos; Johansson, Karl H.
