Adaptive Hyperparameter Selection for Differentially Private Gradient Descent
KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent Systems, Automatic Control. Elekta. ORCID iD: 0000-0002-5530-2714
Department of Computer and Systems Sciences, Stockholm University, Stockholm, Sweden.
Department of Information Technology, Uppsala University, Uppsala, Sweden.
KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent Systems, Automatic Control. ORCID iD: 0000-0002-2237-2580
2023 (English) In: Transactions on Machine Learning Research, E-ISSN 2835-8856, Vol. 2023, no. 9. Article in journal (Refereed) Published
Abstract [en]

We present an adaptive mechanism for hyperparameter selection in differentially private optimization that addresses the inherent trade-off between utility and privacy. The mechanism eliminates the often unstructured and time-consuming manual effort of selecting hyperparameters and avoids the additional privacy costs that hyperparameter selection otherwise incurs on top of that of the actual algorithm. We instantiate our mechanism for noisy gradient descent on non-convex, convex and strongly convex loss functions, respectively, to derive schedules for the noise variance and step size. These schedules account for the properties of the loss function and adapt to convergence metrics such as the gradient norm. When using these schedules, we show that noisy gradient descent converges at essentially the same rate as its noise-free counterpart. Numerical experiments show that the schedules consistently perform well across a range of datasets without manual tuning.
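The core ingredients described in the abstract — noisy gradient descent with gradient clipping, plus step-size and noise schedules that adapt to the gradient norm — can be sketched as follows. This is a minimal illustration, not the schedules derived in the paper: the decaying step size `eta0 / sqrt(t)` and the norm-proportional noise scale are assumed stand-in heuristics, and the function and parameter names are hypothetical.

```python
import numpy as np

def noisy_gradient_descent(grad, x0, num_steps, clip=1.0, sigma0=1.0,
                           eta0=0.1, seed=None):
    """Illustrative noisy gradient descent with adaptive schedules.

    The step size decays with the iteration count, and the noise scale
    shrinks as the clipped gradient norm decreases. Both schedules are
    placeholder heuristics standing in for the paper's derived ones.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for t in range(1, num_steps + 1):
        g = grad(x)
        norm = np.linalg.norm(g)
        if norm > clip:                          # clip to bound sensitivity
            g = g * (clip / norm)
        eta = eta0 / np.sqrt(t)                  # assumed decaying step size
        sigma = sigma0 * min(1.0, norm / clip)   # assumed norm-adaptive noise
        noise = sigma * rng.standard_normal(x.shape)
        x = x - eta * (g + noise)
    return x

# Example: minimize the strongly convex quadratic f(x) = 0.5 * ||x||^2,
# whose gradient is simply x.
x_final = noisy_gradient_descent(lambda x: x, x0=np.ones(3),
                                 num_steps=200, seed=0)
```

Because the noise scale contracts along with the gradient norm, the injected perturbation vanishes near the optimum, which is the intuition behind recovering a noise-free-like convergence rate.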

Place, publisher, year, edition, pages
Transactions on Machine Learning Research, 2023. Vol. 2023, no. 9
National subject category
Computer Sciences; Control Engineering
Identifiers
URN: urn:nbn:se:kth:diva-361461
Scopus ID: 2-s2.0-86000063307
OAI: oai:DiVA.org:kth-361461
DiVA id: diva2:1945891
Note

QC 20250325

Available from: 2025-03-19 Created: 2025-03-19 Last updated: 2025-03-25 Bibliographically approved

Open Access in DiVA

No full text in DiVA

Other links

Scopus full text

Person

Fay, Dominik; Johansson, Mikael

Search further in DiVA

By the author/editor
Fay, Dominik; Johansson, Mikael
By the organisation
Automatic Control
In the same journal
Transactions on Machine Learning Research
Computer Sciences; Control Engineering

Search further outside DiVA

Google · Google Scholar
