Distributed zeroth-order optimization: Convergence rates that match centralized counterpart
2024 (English). In: Automatica, ISSN 0005-1098, E-ISSN 1873-2836, Vol. 159, article id 111328. Article in journal (Refereed), Published
Abstract [en]
Zeroth-order optimization has become increasingly important in complex optimization and machine learning when cost functions cannot be described in closed analytical form. The key idea of zeroth-order optimization is that a learner builds gradient estimates from queries sent to the cost function; traditional gradient descent algorithms can then be executed with the gradients replaced by these estimates. For optimization over large-scale multi-agent systems with decentralized data and costs, zeroth-order optimization can likewise be used to develop scalable, distributed algorithms. In this paper, we aim to understand how performance, measured by convergence rate, changes in the transition from centralized to distributed zeroth-order algorithms, focusing on multi-agent systems with time-varying communication networks. We establish a series of convergence rates for distributed zeroth-order subgradient algorithms under both one-point and two-point zeroth-order oracles. Apart from the additional node-to-node communication cost due to the distributed nature of the algorithms, the established convergence rates are shown to match their centralized counterparts. We also propose a multi-stage distributed zeroth-order algorithm that better utilizes the learning rates, reduces computational complexity, and attains even faster convergence rates for compact decision sets.
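The abstract's core idea, building a gradient estimate from function queries and plugging it into an ordinary gradient step, can be illustrated with a generic two-point zeroth-order estimator. This is a minimal sketch of the standard construction, not the paper's exact algorithm; the function names, step size, and smoothing parameter `delta` are illustrative assumptions.

```python
import numpy as np

def two_point_grad_estimate(f, x, delta, rng):
    """Generic two-point zeroth-order gradient estimate of f at x.

    Queries f at x + delta*u and x - delta*u along a random unit
    direction u, and returns a stochastic gradient estimate. All
    parameter choices here are illustrative, not the paper's.
    """
    d = x.size
    u = rng.standard_normal(d)
    u /= np.linalg.norm(u)  # random unit direction
    # Symmetric finite difference along u, rescaled by the dimension
    # so the estimate is (approximately) unbiased for the gradient.
    return d * (f(x + delta * u) - f(x - delta * u)) / (2 * delta) * u

# Zeroth-order gradient descent on a simple quadratic test function,
# using only function-value queries (no analytical gradient).
f = lambda x: np.sum((x - 1.0) ** 2)
rng = np.random.default_rng(0)
x = np.zeros(5)
for t in range(2000):
    g = two_point_grad_estimate(f, x, delta=1e-4, rng=rng)
    x -= 0.05 * g  # plain gradient step with the estimate
```

A one-point oracle would instead use a single query `f(x + delta*u)` per iteration, which yields a noisier estimate and, as the paper's rates reflect, slower convergence than the two-point case.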
Place, publisher, year, edition, pages
Elsevier BV , 2024. Vol. 159, article id 111328
Keywords [en]
Distributed optimization, Multi-stage optimization algorithm, Optimal convergence rate, Zeroth-order optimization
National Category
Computational Mathematics
Identifiers
URN: urn:nbn:se:kth:diva-339481
DOI: 10.1016/j.automatica.2023.111328
ISI: 001101835000001
Scopus ID: 2-s2.0-85175044513
OAI: oai:DiVA.org:kth-339481
DiVA, id: diva2:1811449
Note
QC 20231113
Available from: 2023-11-13 Created: 2023-11-13 Last updated: 2025-12-05 Bibliographically approved