KTH Publications (kth.se)
Distributed Optimization with Gradient Descent and Quantized Communication
KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent Systems, Automatic Control. KTH, School of Electrical Engineering and Computer Science (EECS), Centres, Digital Futures. ORCID iD: 0000-0002-8737-1984
Department of Electrical Engineering and Automation, School of Electrical Engineering, Aalto University, Espoo, Finland.
Department of Electrical and Computer Engineering, School of Engineering, University of Cyprus, Nicosia, Cyprus.
KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent Systems, Automatic Control. KTH, School of Electrical Engineering and Computer Science (EECS), Centres, Digital Futures. ORCID iD: 0000-0001-9940-5929
2023 (English). Conference paper, published paper (refereed)
Abstract [en]

In this paper, we consider the unconstrained distributed optimization problem in which the exchange of information in the network is captured by a directed graph topology, so that nodes can only communicate with their neighbors. Additionally, in our problem, the communication channels among the nodes have limited bandwidth. To alleviate this limitation, quantized messages should be exchanged among the nodes. To solve this distributed optimization problem, we combine a gradient descent method with a distributed quantized consensus algorithm (which requires the nodes to exchange quantized messages and converges in a finite number of steps). Specifically, at every optimization step, each node (i) performs a gradient descent step (i.e., subtracts the scaled gradient from its current estimate), and (ii) performs a finite-time calculation of the quantized average of every node's estimate in the network. As a consequence, this algorithm approximately mimics the centralized gradient descent algorithm. We show that our algorithm asymptotically converges to a neighborhood of the optimal solution at a linear convergence rate. The performance of the proposed algorithm is demonstrated via simple illustrative examples.
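The two-step structure described in the abstract can be sketched in a few lines. This is a minimal, hypothetical illustration, not the paper's algorithm: scalar decision variable, made-up quadratic local costs, and an exact average followed by a uniform quantizer standing in for the finite-time distributed quantized consensus protocol over a directed graph. All parameter values (`a`, `b`, `alpha`, `delta`) are illustrative assumptions.

```python
import numpy as np

# Hypothetical setup: n nodes jointly minimize sum_i f_i(x) with
# f_i(x) = 0.5 * a_i * (x - b_i)^2. Each optimization step has two parts:
# (i) a local gradient descent step at every node, and
# (ii) a quantized averaging step. Here the averaging is an exact mean
# passed through a uniform quantizer, purely to show the structure; the
# paper uses a finite-time distributed quantized consensus algorithm.

def quantize(v, delta=0.01):
    """Uniform quantizer with step size delta (illustrative choice)."""
    return delta * np.round(v / delta)

a = np.array([1.0, 2.0, 0.5, 1.5])   # local curvatures (made up)
b = np.array([4.0, -1.0, 2.0, 3.0])  # local minimizers (made up)
n = len(a)
alpha = 0.1                          # step size (illustrative)

x = np.zeros(n)                      # each node's current estimate
for _ in range(200):
    grad = a * (x - b)               # (i) each node computes its local gradient
    x = x - alpha * grad             #     and subtracts the scaled gradient
    # (ii) stand-in for the finite-time quantized average consensus step:
    x = np.full(n, quantize(x.mean()))

x_star = (a * b).sum() / a.sum()     # centralized optimum of sum_i f_i
print(x[0], x_star)                  # estimates settle in a neighborhood of x_star
```

Consistent with the abstract's claim, the iterates do not reach the exact optimum: the quantizer leaves the estimates in a small neighborhood of `x_star` whose size is governed by the quantization step `delta`.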

Place, publisher, year, edition, pages
Elsevier BV, 2023, pp. 5900-5906
Keywords [en]
directed graphs, Distributed optimization, finite-time consensus, quantized communication
National subject category
Control Engineering
Identifiers
URN: urn:nbn:se:kth:diva-343701
DOI: 10.1016/j.ifacol.2023.10.100
ISI: 001196709200450
Scopus ID: 2-s2.0-85184958272
OAI: oai:DiVA.org:kth-343701
DiVA id: diva2:1839896
Conference
22nd IFAC World Congress, Yokohama, Japan, July 9-14, 2023
Note

Part of ISBN 9781713872344

QC 20250923

Available from: 2024-02-22. Created: 2024-02-22. Last updated: 2025-09-23. Bibliographically reviewed.

Open Access in DiVA

No full text available in DiVA

Other links

Publisher's full text | Scopus

Persons

Rikos, Apostolos; Johansson, Karl H.
