Data-driven Methods in Inverse Problems
KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematics (Div.). ORCID iD: 0000-0001-9928-3407
2019 (English). Doctoral thesis, comprehensive summary (Other academic)
Abstract [en]

In this thesis on data-driven methods in inverse problems, we introduce several new methods for solving inverse problems using recent advances in machine learning, and specifically deep learning. The main goal has been to develop practically applicable methods that scale to medical applications and can handle the complexities associated with them.

In total, the thesis contains six papers. Some focus on more theoretical questions, such as characterizing the optimal solutions of reconstruction schemes or extending current methods to new domains, while others focus on practical applicability. A significant portion of the papers also aims to bring knowledge from the machine learning community into the imaging community, with considerable effort spent on translating many of the concepts. The papers have been published in a range of venues: machine learning, medical imaging and inverse problems.

The first two papers contribute to a class of methods now called learned iterative reconstruction, where we introduce two ways of combining classical model-driven reconstruction methods with deep neural networks. The next two papers look forward, aiming to address the question of "what do we want?" by proposing two very different but novel loss functions for training neural networks in inverse problems. The final papers delve into the statistical side: one gives a generalization of a class of deep generative models to Banach spaces, while the next introduces two ways in which such methods can be used to perform Bayesian inversion at scale.

Place, publisher, year, edition, pages
Stockholm: KTH Royal Institute of Technology, 2019, p. 196
Series
TRITA-SCI-FOU ; 2019;49
Keywords [en]
Inverse Problems, Machine Learning, Tomography
Identifiers
URN: urn:nbn:se:kth:diva-262727, ISBN: 978-91-7873-334-7 (printed), OAI: oai:DiVA.org:kth-262727, DiVA id: diva2:1362355
Public defence
2019-10-31, F3, Lindstedtsvägen 26, KTH, Stockholm, 14:00 (English)
Research funder
Swedish Foundation for Strategic Research
Available from: 2019-10-21 Created: 2019-10-18 Last updated: 2019-10-21, bibliographically approved
List of papers
1. Solving ill-posed inverse problems using iterative deep neural networks
2017 (English). In: Inverse Problems, ISSN 0266-5611, E-ISSN 1361-6420, Vol. 33, no. 12, article id 124007. Article in journal (Refereed), published.
Abstract [en]

We propose a partially learned approach for the solution of ill-posed inverse problems with not necessarily linear forward operators. The method builds on ideas from classical regularisation theory and recent advances in deep learning to perform learning while making use of prior information about the inverse problem encoded in the forward operator, the noise model and a regularising functional. The method results in a gradient-like iterative scheme, where the 'gradient' component is learned using a convolutional network that takes the gradients of the data discrepancy and the regulariser as input in each iteration. We present results of such a partially learned gradient scheme on a non-linear tomographic inversion problem with simulated data from both the Shepp-Logan phantom and a head CT. The outcome is compared against filtered backprojection and total variation reconstruction; the proposed method provides a 5.4 dB PSNR improvement over the total variation reconstruction while being significantly faster, giving reconstructions of 512 x 512 pixel images in about 0.4 s using a single graphics processing unit (GPU).
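To make the learned gradient idea concrete, the following is a minimal sketch, not the paper's code, assuming PyTorch and hypothetical callables A, At and grad_reg for the forward operator, its adjoint and the gradient of the regularising functional:

    import torch
    import torch.nn as nn

    class LearnedGradientStep(nn.Module):
        """One unrolled iteration: a small CNN maps the current iterate and the
        gradients of the data discrepancy and the regulariser to an update."""
        def __init__(self, channels=32):
            super().__init__()
            self.net = nn.Sequential(
                nn.Conv2d(3, channels, 3, padding=1), nn.ReLU(),
                nn.Conv2d(channels, channels, 3, padding=1), nn.ReLU(),
                nn.Conv2d(channels, 1, 3, padding=1),
            )

        def forward(self, x, grad_data, grad_regulariser):
            return x + self.net(torch.cat([x, grad_data, grad_regulariser], dim=1))

    def learned_gradient_reconstruction(y, A, At, grad_reg, steps):
        """Run the unrolled scheme; `steps` is a list of LearnedGradientStep
        modules that would be trained end-to-end against ground-truth images."""
        x = At(y)                        # simple initial iterate
        for step in steps:
            grad_data = At(A(x) - y)     # gradient of 0.5 * ||A(x) - y||^2
            x = step(x, grad_data, grad_reg(x))
        return x

The number of unrolled steps and the CNN width are illustrative choices; the essential point is that the learned update sees both gradient terms at every iteration.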

Place, publisher, year, edition, pages
Institute of Physics Publishing (IOPP), 2017
Keywords
tomography, deep learning, gradient descent, regularization
Identifiers
urn:nbn:se:kth:diva-219496 (URN), 10.1088/1361-6420/aa9581 (DOI), 000416015300001 (), 2-s2.0-85038424472 (Scopus ID)
Note

QC 20171207

Available from: 2017-12-07 Created: 2017-12-07 Last updated: 2019-10-18, bibliographically approved
2. Learned Primal-Dual Reconstruction
2018 (English). In: IEEE Transactions on Medical Imaging, ISSN 0278-0062, E-ISSN 1558-254X, Vol. 37, no. 6, pp. 1322-1332. Article in journal (Refereed), published.
Abstract [en]

We propose the Learned Primal-Dual algorithm for tomographic reconstruction. The algorithm accounts for a (possibly non-linear) forward operator in a deep neural network by unrolling a proximal primal-dual optimization method, where the proximal operators have been replaced with convolutional neural networks. The algorithm is trained end-to-end, working directly from raw measured data, and it does not depend on any initial reconstruction such as filtered back-projection (FBP). We compare the performance of the proposed method on low-dose computed tomography reconstruction against FBP, total variation (TV), and deep learning based post-processing of FBP. For the Shepp-Logan phantom we obtain a >6 dB peak signal-to-noise ratio improvement against all compared methods. For human phantoms the corresponding improvement is 6.6 dB over TV and 2.2 dB over learned post-processing, along with a substantial improvement in the structural similarity index. Finally, our algorithm involves only ten forward-back-projection computations, making the method feasible for time-critical clinical applications.
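As an illustration of the unrolled architecture, here is a rough sketch assuming PyTorch, batched inputs, and placeholder callables A and At for the forward operator and its adjoint; channel counts and layer widths are illustrative, not the published configuration:

    import torch
    import torch.nn as nn

    def small_cnn(in_ch, out_ch, width=32):
        # stands in for a learned "proximal" operator
        return nn.Sequential(
            nn.Conv2d(in_ch, width, 3, padding=1), nn.PReLU(),
            nn.Conv2d(width, width, 3, padding=1), nn.PReLU(),
            nn.Conv2d(width, out_ch, 3, padding=1),
        )

    class LearnedPrimalDual(nn.Module):
        def __init__(self, A, At, image_shape, n_iter=10, n_primal=5, n_dual=5):
            super().__init__()
            self.A, self.At = A, At
            self.image_shape, self.n_iter = image_shape, n_iter
            self.n_primal, self.n_dual = n_primal, n_dual
            self.dual_nets = nn.ModuleList(
                small_cnn(n_dual + 2, n_dual) for _ in range(n_iter))
            self.primal_nets = nn.ModuleList(
                small_cnn(n_primal + 1, n_primal) for _ in range(n_iter))

        def forward(self, y):
            b = y.shape[0]
            primal = y.new_zeros(b, self.n_primal, *self.image_shape)
            dual = y.new_zeros(b, self.n_dual, *y.shape[2:])
            for i in range(self.n_iter):
                # dual update: learned operator sees dual, A(primal) and the data
                dual = dual + self.dual_nets[i](
                    torch.cat([dual, self.A(primal[:, :1]), y], dim=1))
                # primal update: learned operator sees primal and At(dual)
                primal = primal + self.primal_nets[i](
                    torch.cat([primal, self.At(dual[:, :1])], dim=1))
            return primal[:, :1]    # first primal channel is the reconstruction

The key design point carried over from the abstract is that the network alternates learned updates in data (dual) space and image (primal) space, with the physics entering only through A and At.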

Place, publisher, year, edition, pages
Institute of Electrical and Electronics Engineers (IEEE), 2018
Keywords
Inverse problems, tomography, deep learning, primal-dual, optimization
Identifiers
urn:nbn:se:kth:diva-231206 (URN), 10.1109/TMI.2018.2799231 (DOI), 000434302700004 (), 29870362 (PubMedID), 2-s2.0-85041342868 (Scopus ID)
Note

QC 20180629

Available from: 2018-06-29 Created: 2018-06-29 Last updated: 2019-10-18, bibliographically approved
3. Task adapted reconstruction for inverse problems
(English) Manuscript (preprint) (Other academic)
Abstract [en]

The paper considers the problem of performing a task defined on a model parameter that is only observed indirectly through noisy data in an ill-posed inverse problem. A key aspect is to formalize the steps of reconstruction and task as appropriate estimators (non-randomized decision rules) in statistical estimation problems. The implementation makes use of (deep) neural networks to provide a differentiable parametrization of the family of estimators for both steps. These networks are combined and jointly trained against suitable supervised training data in order to minimize a joint differentiable loss function, resulting in an end-to-end task adapted reconstruction method. The suggested framework is generic, yet adaptable, with a plug-and-play structure for adjusting both the inverse problem and the task at hand. More precisely, the data model (forward operator and statistical model of the noise) associated with the inverse problem is exchangeable, e.g., by using a neural network architecture given by a learned iterative method. Furthermore, any task that is encodable as a trainable neural network can be used. The approach is demonstrated on joint tomographic image reconstruction and classification, and on joint tomographic image reconstruction and segmentation.
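A minimal sketch of the joint training idea, assuming PyTorch; the names recon_net, task_net and the loss weight C are hypothetical placeholders, not the paper's notation:

    import torch
    import torch.nn as nn

    def joint_training_step(recon_net, task_net, optimizer, y, x_true, task_true, C=0.5):
        """One gradient step on the joint loss C * L_recon + (1 - C) * L_task;
        C = 1 gives pure reconstruction, C = 0 a fully task-driven pipeline."""
        optimizer.zero_grad()
        x_hat = recon_net(y)            # data -> image estimate
        task_hat = task_net(x_hat)      # image estimate -> task output (e.g. segmentation logits)
        loss_recon = nn.functional.mse_loss(x_hat, x_true)
        loss_task = nn.functional.cross_entropy(task_hat, task_true)
        loss = C * loss_recon + (1.0 - C) * loss_task
        loss.backward()
        optimizer.step()
        return loss.item()

Because both networks are differentiable and share one loss, the reconstruction is pulled towards whatever image features the downstream task actually needs.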

Keywords
Inverse problems, image reconstruction, tomography, deep learning, feature reconstruction, segmentation, classification, regularization
Research subject
Applied and Computational Mathematics, Numerical Analysis
Identifiers
urn:nbn:se:kth:diva-262725 (URN)
Note

QC 20191021

Available from: 2019-10-18 Created: 2019-10-18 Last updated: 2019-10-21, bibliographically approved
4. Learning to solve inverse problems using Wasserstein loss
(English) Manuscript (preprint) (Other academic)
Abstract [en]

We propose using the Wasserstein loss for training in inverse problems. In particular, we consider a learned primal-dual reconstruction scheme for ill-posed inverse problems using the Wasserstein distance as the loss function in the learning. This is motivated by misalignments in training data, which, when using a standard mean squared error loss, can severely degrade reconstruction quality. We prove that training with the Wasserstein loss gives a reconstruction operator that correctly compensates for misalignments in certain cases, whereas training with the mean squared error gives a smeared reconstruction. Moreover, we demonstrate these effects by training a reconstruction algorithm using both mean squared error and optimal transport loss for a problem in computerized tomography.
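For intuition, here is a toy-scale illustration, not the paper's implementation, of an entropy-regularised (Sinkhorn) approximation of the Wasserstein distance between two 1-D histograms, the kind of quantity that could replace a mean squared error training loss; all names and parameters are illustrative:

    import torch

    def sinkhorn_loss(mu, nu, cost, eps=0.01, n_iter=100):
        """mu, nu: nonnegative vectors summing to 1; cost: pairwise cost matrix."""
        K = torch.exp(-cost / eps)          # Gibbs kernel
        u = torch.ones_like(mu)
        for _ in range(n_iter):             # Sinkhorn fixed-point iterations
            v = nu / (K.t() @ u)
            u = mu / (K @ v)
        transport_plan = torch.diag(u) @ K @ torch.diag(v)
        return torch.sum(transport_plan * cost)

    # toy check: two slightly shifted point masses
    x = torch.linspace(0, 1, 50)
    cost = (x[:, None] - x[None, :]) ** 2
    mu = torch.zeros(50); mu[10] = 1.0
    nu = torch.zeros(50); nu[12] = 1.0
    print(sinkhorn_loss(mu, nu, cost))      # small, roughly (x[10] - x[12]) ** 2
    print(torch.sum((mu - nu) ** 2))        # squared error is 2 for any nonzero shift

The comparison at the end is the point of the abstract: the optimal transport loss sees a small shift as a small error, whereas a pixelwise squared error does not.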

Research subject
Mathematics; Applied and Computational Mathematics
Identifiers
urn:nbn:se:kth:diva-239723 (URN)
Research funder
Swedish Foundation for Strategic Research, AM13-0049; Swedish Foundation for Strategic Research, ID14-0055; Swedish Research Council, 2014-5870
Note

QC 20181211

Available from: 2018-11-30 Created: 2018-11-30 Last updated: 2019-10-18, bibliographically approved
5. Banach Wasserstein GAN
2018 (English). In: Advances in Neural Information Processing Systems 31 (NIPS 2018) / [ed] Bengio, S., Wallach, H., Larochelle, H., Grauman, K., Cesa-Bianchi, N., Garnett, R., Neural Information Processing Systems (NIPS), 2018. Conference paper, published paper (Refereed).
Abstract [en]

Wasserstein Generative Adversarial Networks (WGANs) can be used to generate realistic samples from complicated image distributions. The Wasserstein metric used in WGANs is based on a notion of distance between individual images, which induces a notion of distance between probability distributions of images. So far, the community has considered ℓ2 as the underlying distance. We generalize the theory of WGAN with gradient penalty to Banach spaces, allowing practitioners to select the features to emphasize in the generator. We further discuss the effect of some particular choices of underlying norms, focusing on Sobolev norms. Finally, we demonstrate a boost in performance for an appropriate choice of norm on CIFAR-10 and CelebA.
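As a rough sketch of how a Banach-space norm can enter the gradient penalty (assumptions: PyTorch, batched images of shape [B, H, W], a Sobolev norm implemented via a Fourier multiplier; constants and normalisation may differ from the published method):

    import torch

    def sobolev_norm(x, s):
        """W^{s,2} norm of each image in a batch via the multiplier (1 + |xi|^2)^(s/2)."""
        fy = torch.fft.fftfreq(x.shape[-2])
        fx = torch.fft.fftfreq(x.shape[-1])
        xi2 = fy[:, None] ** 2 + fx[None, :] ** 2
        multiplier = (1.0 + xi2) ** (s / 2)
        x_hat = torch.fft.fft2(x)
        return torch.sqrt(torch.sum((multiplier * x_hat.abs()) ** 2, dim=(-2, -1)))

    def bwgan_gradient_penalty(critic, real, fake, s=1.0, gamma=1.0):
        """Penalise the dual-norm (W^{-s,2}) size of the critic gradient at
        points interpolated between real and generated samples."""
        t = torch.rand(real.shape[0], 1, 1, device=real.device)
        x = (t * real + (1 - t) * fake).requires_grad_(True)
        grad, = torch.autograd.grad(critic(x).sum(), x, create_graph=True)
        dual_norms = sobolev_norm(grad, -s)     # dual of W^{s,2} is W^{-s,2}
        return torch.mean((dual_norms - gamma) ** 2)

Choosing s > 0 emphasises edges and fine detail, while s < 0 emphasises large-scale structure; s = 0 reduces to the usual ℓ2 gradient penalty.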

Place, publisher, year, edition, pages
Neural Information Processing Systems (NIPS), 2018
Series
Advances in Neural Information Processing Systems, ISSN 1049-5258 ; 31
Identifiers
urn:nbn:se:kth:diva-249915 (URN), 000461852001031 ()
Conference
32nd Conference on Neural Information Processing Systems (NIPS), Dec 2-8, 2018, Montreal, Canada
Note

QC 20190426

Available from: 2019-04-26 Created: 2019-04-26 Last updated: 2019-10-18, bibliographically approved
6. Deep Bayesian Inversion
(English) Manuscript (preprint) (Other academic)
Abstract [en]

Characterizing statistical properties of solutions of inverse problems is essential for decision making. Bayesian inversion offers a tractable framework for this purpose, but current approaches are computationally infeasible for most realistic imaging applications in the clinic. We introduce two novel deep learning based methods for solving large-scale inverse problems using Bayesian inversion: a sampling-based method using a WGAN with a novel mini-discriminator, and a direct approach that trains a neural network using a novel loss function. The performance of both methods is demonstrated on image reconstruction in ultra-low-dose 3D helical CT. We compute the posterior mean and standard deviation of the 3D images, followed by a hypothesis test to assess whether a "dark spot" in the liver of a cancer-stricken patient is present. Both methods are computationally efficient, and our evaluation shows very promising performance that clearly supports the claim that Bayesian inversion is usable for 3D imaging in time-critical applications.
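To illustrate the sampling-based route, here is a minimal sketch, assuming PyTorch and a hypothetical trained conditional generator G(z, y) that maps latent noise and measured data to posterior samples; the hypothesis test described in the abstract would then operate on these statistics over a region of interest:

    import torch

    def posterior_statistics(G, y, n_samples=100, latent_dim=128):
        """Monte Carlo estimate of the posterior mean and pointwise standard
        deviation by repeatedly sampling the generator conditioned on data y."""
        samples = []
        with torch.no_grad():
            for _ in range(n_samples):
                z = torch.randn(y.shape[0], latent_dim, device=y.device)
                samples.append(G(z, y))
        samples = torch.stack(samples)       # [n_samples, batch, ...]
        return samples.mean(dim=0), samples.std(dim=0)

The direct approach mentioned in the abstract would instead train a network to output these statistics in a single forward pass, avoiding the sampling loop.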

Research subject
Applied and Computational Mathematics, Numerical Analysis
Identifiers
urn:nbn:se:kth:diva-262726 (URN)
Note

QC 20191021

Available from: 2019-10-18 Created: 2019-10-18 Last updated: 2019-10-21, bibliographically approved

Open Access in DiVA

fulltext (39814 kB), 186 downloads
File information
File: FULLTEXT02.pdf, File size: 39814 kB, Checksum: SHA-512
91e2fb9dffc63c31626254aa902fed3a24958594684fde7f669795e119c00c8f0df605aa1ff38ed49dce00494d904f6feb5afbab3a8fe2448558d9ce14b62214
Type: fulltext, Mimetype: application/pdf

Person records

Adler, Jonas
