Distributed Optimization with Gradient Descent and Quantized Communication
2023 (English) Conference paper, Published paper (Refereed)
Abstract [en]
In this paper, we consider the unconstrained distributed optimization problem in which the exchange of information in the network is captured by a directed graph topology; thus, nodes can only communicate with their neighbors. Additionally, the communication channels among the nodes have limited bandwidth, so quantized messages must be exchanged to alleviate this limitation. To solve this distributed optimization problem, we combine a gradient descent method with a distributed quantized consensus algorithm (which requires the nodes to exchange quantized messages and converges in a finite number of steps). Specifically, at every optimization step, each node (i) performs a gradient descent step (i.e., subtracts the scaled gradient from its current estimate), and (ii) performs a finite-time calculation of the quantized average of every node's estimate in the network. As a consequence, the algorithm approximately mimics the centralized gradient descent algorithm. We show that our algorithm asymptotically converges to a neighborhood of the optimal solution at a linear convergence rate. The performance of the proposed algorithm is demonstrated via simple illustrative examples.
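The two-step iteration described in the abstract can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: it assumes scalar quadratic local costs f_i(x) = (x - a_i)^2 (so the global optimum is the mean of the a_i), a simple uniform quantizer, and it replaces the paper's finite-time distributed quantized consensus protocol with a direct average of the quantized estimates, which is the value that protocol would agree on.

```python
import numpy as np

def quantize(x, delta=0.01):
    # Uniform quantizer with step delta, modeling the
    # limited-bandwidth channels: only multiples of delta are sent.
    return delta * np.round(x / delta)

def distributed_gd_quantized(a, alpha=0.1, steps=50, delta=0.01):
    # a[i] is the minimizer of node i's local cost f_i(x) = (x - a_i)^2,
    # so grad f_i(x) = 2 * (x - a_i); the global optimum is mean(a).
    x = a.copy()  # each node starts from its own local data
    for _ in range(steps):
        # (i) local gradient descent step at every node
        x = x - alpha * 2.0 * (x - a)
        # (ii) stand-in for finite-time quantized average consensus:
        # all nodes agree on the average of their quantized estimates
        x = np.full_like(x, quantize(x, delta).mean())
    return x

a = np.array([1.0, 3.0, 5.0, 7.0])
est = distributed_gd_quantized(a)
# estimates cluster in a neighborhood of the optimum mean(a) = 4,
# up to an error governed by the quantization step delta
```

Consistent with the paper's claim, the iterates do not reach the exact optimum but a neighborhood of it whose size shrinks with the quantization step delta.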
Place, publisher, year, edition, pages
Elsevier BV, 2023, pp. 5900-5906
Keywords [en]
directed graphs, distributed optimization, finite-time consensus, quantized communication
National subject category
Control Engineering
Identifiers
URN: urn:nbn:se:kth:diva-343701
DOI: 10.1016/j.ifacol.2023.10.100
ISI: 001196709200450
Scopus ID: 2-s2.0-85184958272
OAI: oai:DiVA.org:kth-343701
DiVA id: diva2:1839896
Conference
22nd IFAC World Congress, Yokohama, Japan, July 9-14, 2023
Note
Part of ISBN 9781713872344
QC 20250923
Available from: 2024-02-22 Created: 2024-02-22 Last updated: 2025-09-23 Bibliographically approved