REMEDI: Corrective Transformations for Improved Neural Entropy Estimation
KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematics (Div.). ORCID iD: 0000-0001-5740-5103
Mathematics and Computer Science Division, Argonne National Laboratory, Chicago IL, USA.
Mathematics and Computer Science Division, Argonne National Laboratory, Chicago IL, USA.
KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematics (Div.). Department of Mathematical Sciences, Chalmers University of Technology and University of Gothenburg, Gothenburg, Sweden. ORCID iD: 0000-0001-8702-2293
2024 (English). In: International Conference on Machine Learning, ICML 2024. ML Research Press, 2024, p. 38207-38236. Conference paper, Published paper (Refereed)
Abstract [en]

Information theoretic quantities play a central role in machine learning. The recent surge in the complexity of data and models has increased the demand for accurate estimation of these quantities. However, as the dimension grows, estimation presents significant challenges, with existing methods struggling already in relatively low dimensions. To address this issue, we introduce REMEDI for efficient and accurate estimation of differential entropy, a fundamental information theoretic quantity. The approach combines the minimization of the cross-entropy for simple, adaptive base models with the estimation of their deviation, in terms of relative entropy, from the data density. Our approach demonstrates improvements across a broad spectrum of estimation tasks, encompassing entropy estimation on both synthetic and natural data. Further, we extend important theoretical consistency results to the more general setting required by our approach. We illustrate how the framework can be naturally extended to information theoretic supervised learning models, with a specific focus on the Information Bottleneck approach, and demonstrate that the method delivers better accuracy than existing Information Bottleneck methods. In addition, we explore a natural connection between REMEDI and generative modeling via rejection sampling and Langevin dynamics.
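The decomposition underlying the abstract — differential entropy written as the cross-entropy under a simple base model minus a relative-entropy correction, H(p) = H(p, q) − D_KL(p‖q) — can be illustrated with a minimal numerical sketch. This is not the authors' implementation; the diagonal-Gaussian base model and the toy correlated-Gaussian target are illustrative assumptions chosen so the true entropy is known in closed form:

```python
import numpy as np

rng = np.random.default_rng(0)

# Samples from an "unknown" density p (a correlated 2-D Gaussian, so the
# true differential entropy is available analytically for comparison).
d, n = 2, 100_000
cov = np.array([[1.0, 0.6], [0.6, 2.0]])
x = rng.multivariate_normal(np.zeros(d), cov, size=n)

# Step 1: fit a simple adaptive base model q (a diagonal Gaussian).
mu = x.mean(axis=0)
var = x.var(axis=0)

# Monte Carlo estimate of the cross-entropy H(p, q) = E_p[-log q(X)].
log_q = -0.5 * (((x - mu) ** 2) / var + np.log(2 * np.pi * var)).sum(axis=1)
cross_entropy = -log_q.mean()

# H(p) = H(p, q) - D_KL(p || q): the cross-entropy alone overestimates
# H(p) by exactly the relative entropy D_KL(p || q) -- the correction
# term that a REMEDI-style estimator learns from the data.
true_entropy = 0.5 * np.log((2 * np.pi * np.e) ** d * np.linalg.det(cov))
print(f"H(p,q) estimate : {cross_entropy:.4f}")
print(f"true H(p)       : {true_entropy:.4f}")
print(f"implied D_KL    : {cross_entropy - true_entropy:.4f}")
```

Here the diagonal base model misses the correlation, so the cross-entropy upper bound is loose by roughly 0.1 nats; learning the correction recovers the gap.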

Place, publisher, year, edition, pages
ML Research Press, 2024. p. 38207-38236
National Category
Probability Theory and Statistics
Identifiers
URN: urn:nbn:se:kth:diva-353945
Scopus ID: 2-s2.0-85203821749
OAI: oai:DiVA.org:kth-353945
DiVA, id: diva2:1901021
Conference
41st International Conference on Machine Learning, ICML 2024, Vienna, Austria, Jul 21-27, 2024
Note

QC 20240926

Available from: 2024-09-25. Created: 2024-09-25. Last updated: 2024-09-26. Bibliographically approved

Open Access in DiVA

No full text in DiVA

Scopus

Authority records

Nilsson, Viktor; Nyquist, Pierre

