More PAC-Bayes bounds: From bounded losses, to losses with general tail behaviors, to anytime validity
KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent systems, Information Science and Engineering. ORCID iD: 0000-0002-0862-1333
KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent systems, Information Science and Engineering. ORCID iD: 0000-0001-9307-484X
KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent systems, Information Science and Engineering. ORCID iD: 0000-0002-7926-5081
2024 (English). In: Journal of Machine Learning Research, ISSN 1532-4435, E-ISSN 1533-7928, Vol. 25, p. 1-43. Article in journal (Refereed). Published.
Abstract [en]

In this paper, we present new high-probability PAC-Bayes bounds for different types of losses. First, for losses with a bounded range, we recover a strengthened version of Catoni's bound that holds uniformly over all parameter values. This leads to new fast-rate and mixed-rate bounds that are interpretable and tighter than previous bounds in the literature. In particular, the fast-rate bound is equivalent to the Seeger-Langford bound. Second, for losses with more general tail behaviors, we introduce two new parameter-free bounds: a PAC-Bayes Chernoff analogue when the loss's cumulant generating function is bounded, and a bound when the loss's second moment is bounded. These two bounds are obtained using a new technique based on a discretization of the space of possible events for the "in probability" parameter optimization problem. This technique is both simpler and more general than previous approaches, which optimize over a grid on the parameter space. Finally, using a simple technique that is applicable to any existing bound, we extend all previous results to anytime-valid bounds.
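For context, the Seeger-Langford bound mentioned in the abstract (to which the paper's fast-rate bound is stated to be equivalent) has the following standard form. The notation below (sample size n, prior π, posterior ρ, empirical risk, population risk) is supplied here for illustration and is not taken from this record:

```latex
% Seeger–Langford (PAC-Bayes-kl) bound: for a loss taking values in [0,1],
% a prior $\pi$ fixed before seeing the data, and $\delta \in (0,1)$,
% with probability at least $1-\delta$ over the sample $S$ of size $n$,
% simultaneously for all posteriors $\rho$:
\[
  \mathrm{kl}\!\left( \hat{L}_S(\rho) \,\middle\|\, L(\rho) \right)
  \;\le\; \frac{\mathrm{KL}(\rho \,\|\, \pi) + \ln\frac{2\sqrt{n}}{\delta}}{n},
\]
% where $\hat{L}_S(\rho)$ is the empirical risk, $L(\rho)$ the population risk,
% $\mathrm{kl}(p \,\|\, q) = p \ln\tfrac{p}{q} + (1-p)\ln\tfrac{1-p}{1-q}$ the
% binary relative entropy, and $\mathrm{KL}(\rho \,\|\, \pi)$ the KL divergence
% between posterior and prior.
```

Inverting the binary relative entropy in this inequality yields the fast-rate and mixed-rate behaviors the abstract refers to, depending on how small the empirical risk is.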

Place, publisher, year, edition, pages
Microtome Publishing, 2024. Vol. 25, p. 1-43.
Keywords [en]
Generalization bounds, PAC-Bayes bounds, concentration inequalities, rate of convergence (fast, slow, mixed), tail behavior, parameter optimization.
National Category
Mathematical Analysis; Computer Sciences
Identifiers
URN: urn:nbn:se:kth:diva-345988
ISI: 001203119000001
Scopus ID: 2-s2.0-105018668397
OAI: oai:DiVA.org:kth-345988
DiVA, id: diva2:1855208
Note

Not a duplicate of DiVA 1848241

QC 20240430

Available from: 2024-04-30. Created: 2024-04-30. Last updated: 2025-11-07. Bibliographically approved.

Open Access in DiVA

No full text in DiVA

Scopus

Authority records

Rodríguez Gálvez, Borja; Thobaben, Ragnar; Skoglund, Mikael

