Banach Wasserstein GAN
KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematics (Div.). ORCID iD: 0000-0001-9928-3407
University of Cambridge, Department of Applied Mathematics and Theoretical Physics, Cambridge, England.
2018 (English). In: Advances in Neural Information Processing Systems 31 (NIPS 2018) / [ed] Bengio, S.; Wallach, H.; Larochelle, H.; Grauman, K.; Cesa-Bianchi, N.; Garnett, R. Neural Information Processing Systems (NIPS), 2018. Conference paper, Published paper (Refereed)
Abstract [en]

Wasserstein Generative Adversarial Networks (WGANs) can be used to generate realistic samples from complicated image distributions. The Wasserstein metric used in WGANs is based on a notion of distance between individual images, which induces a notion of distance between probability distributions of images. So far the community has considered ℓ2 as the underlying distance. We generalize the theory of WGAN with gradient penalty to Banach spaces, allowing practitioners to select the features to emphasize in the generator. We further discuss the effect of some particular choices of underlying norms, focusing on Sobolev norms. Finally, we demonstrate a boost in performance for an appropriate choice of norm on CIFAR-10 and CelebA.
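To make the norm choice concrete, the following is a minimal numerical sketch of a Sobolev-norm gradient penalty in the spirit of the abstract. It is an illustration under stated assumptions, not the paper's implementation: the H^s norm of a discrete image is computed with an FFT multiplier, and the function names, discretization, and two-sided penalty form are choices made here. Standard WGAN-GP corresponds to the plain ℓ2 norm, i.e. s = 0.

```python
import numpy as np

def sobolev_norm(x, s=1.0):
    """H^s Sobolev norm of a 2-D array via the Fourier multiplier
    (1 + |xi|^2)^(s/2). For s = 0 this reduces to the ordinary l2
    norm by Parseval's identity (exact with the orthonormal FFT)."""
    h, w = x.shape
    fy = np.fft.fftfreq(h)[:, None]   # vertical frequency grid
    fx = np.fft.fftfreq(w)[None, :]   # horizontal frequency grid
    mult = (1.0 + fx**2 + fy**2) ** (s / 2.0)
    xh = np.fft.fft2(x, norm="ortho")
    return np.sqrt(np.sum(np.abs(mult * xh) ** 2))

def gradient_penalty(grad, s=1.0, gamma=1.0):
    """Two-sided penalty (||grad||_{H^s} - gamma)^2 on a critic's
    gradient image, in the style of WGAN-GP but with a Sobolev norm
    replacing the l2 norm."""
    return (sobolev_norm(grad, s) - gamma) ** 2
```

Larger s weights high-frequency content (edges, texture) more heavily, which is one way to read the abstract's claim that the norm lets practitioners select which features to emphasize.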

Place, publisher, year, edition, pages
Neural Information Processing Systems (NIPS), 2018.
Series
Advances in Neural Information Processing Systems, ISSN 1049-5258 ; 31
National Category
Probability Theory and Statistics
Identifiers
URN: urn:nbn:se:kth:diva-249915
ISI: 000461852001031
OAI: oai:DiVA.org:kth-249915
DiVA, id: diva2:1307316
Conference
32nd Conference on Neural Information Processing Systems (NIPS), DEC 02-08, 2018, Montreal, Canada
Note

QC 20190426

Available from: 2019-04-26. Created: 2019-04-26. Last updated: 2019-10-18. Bibliographically approved.
In thesis
1. Data-driven Methods in Inverse Problems
2019 (English)Doctoral thesis, comprehensive summary (Other academic)
Abstract [en]

In this thesis on data-driven methods in inverse problems, we introduce several new methods for solving inverse problems using recent advances in machine learning, specifically deep learning. The main goal has been to develop methods that are practically applicable, scale to medical applications, and handle the complexities associated with them.

In total, the thesis contains six papers. Some focus on more theoretical questions, such as characterizing the optimal solutions of reconstruction schemes or extending current methods to new domains, while others focus on practical applicability. A significant portion of the papers also aims to bring knowledge from the machine learning community into the imaging community, with considerable effort spent on translating many of the concepts. The papers have been published in a range of venues: machine learning, medical imaging, and inverse problems.

The first two papers contribute to a class of methods now called learned iterative reconstruction, in which we introduce two ways of combining classical model-driven reconstruction methods with deep neural networks. The next two papers look forward, aiming to address the question of "what do we want?" by proposing two very different but novel loss functions for training neural networks in inverse problems. The final papers delve into the statistical side: one generalizes a class of deep generative models to Banach spaces, while the other introduces two ways in which such methods can be used to perform Bayesian inversion at scale.
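The core idea of learned iterative reconstruction can be sketched with a toy example. This is a deliberately simplified illustration, not any of the thesis papers' methods: the learned component, which in practice is a deep neural network applied at each unrolled iteration, is stood in for here by a single learned scalar step size, and the operator, data, and parameter values are all assumptions made for the sketch.

```python
import numpy as np

def learned_gradient_step(x, A, y, theta):
    """One unrolled iteration of a learned gradient scheme:
    x_{k+1} = x_k - Lambda_theta(grad). Here Lambda_theta is a toy
    placeholder (multiplication by a learned scalar step size); in
    learned iterative reconstruction it would be a neural network."""
    grad = A.T @ (A @ x - y)   # gradient of the data fit 0.5 * ||A x - y||^2
    return x - theta * grad

# Toy usage: recover x_true from noiseless data y = A x_true.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 10))      # forward operator (assumed)
x_true = rng.standard_normal(10)
y = A @ x_true
x = np.zeros(10)
theta = 0.02                           # stands in for learned parameters
for _ in range(300):
    x = learned_gradient_step(x, A, y, theta)
```

The design point is that the classical model-driven part (the forward operator A and its adjoint) stays explicit in each iteration, while only the update rule is learned from data.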

Place, publisher, year, edition, pages
Stockholm: KTH Royal Institute of Technology, 2019. p. 196
Series
TRITA-SCI-FOU ; 2019;49
Keywords
Inverse Problems, Machine Learning, Tomography
National Category
Computational Mathematics
Identifiers
URN: urn:nbn:se:kth:diva-262727
ISBN: 978-91-7873-334-7
Public defence
2019-10-31, F3, Lindstedtsvägen 26, KTH, Stockholm, 14:00 (English)
Funder
Swedish Foundation for Strategic Research
Available from: 2019-10-21. Created: 2019-10-18. Last updated: 2019-10-21. Bibliographically approved.

Open Access in DiVA

No full text in DiVA

Authority records

Adler, Jonas
