A topological framework for training latent variable models
2014 (English). In: Proceedings - International Conference on Pattern Recognition, 2014, p. 2471-2476. Conference paper, Published paper (Refereed)
Abstract [en]
We discuss the properties of a class of latent variable models in which each labeled sample is associated with a set of different features, with no prior knowledge of which feature is the most relevant. Deformable Part Models (DPMs) are good examples of such models. These models are usually considered expensive to train and very sensitive to initialization. In this paper, we focus on learning such models: we introduce a topological framework and show how it can both reduce the learning complexity and produce more robust decision boundaries. We also argue that our framework can produce robust decision boundaries without exploiting dataset bias or relying on accurate annotations. To evaluate our method experimentally and compare it with previously published frameworks, we focus on the problem of image classification with object localization, where the correct location of the objects is unknown during both training and testing and is treated as a latent variable.
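The latent-variable setup described in the abstract can be illustrated with a minimal sketch: each sample carries one feature vector per candidate object location, the score maximizes over that latent location, and training alternates between picking the best location under the current weights and updating the weights. This is a generic latent-SVM-style illustration, not the paper's topological framework; all function names and the perceptron-style update are assumptions made for the example.

```python
# Illustrative sketch only, NOT the paper's method: object location as a
# latent variable. Each sample is (label, features_per_location), where
# features_per_location[z] is the feature vector extracted at location z.

def dot(w, phi):
    return sum(wi * pi for wi, pi in zip(w, phi))

def score(w, features_per_location):
    """Score a sample by maximizing over the latent location z:
    the best-scoring candidate location is selected."""
    return max((dot(w, phi), z) for z, phi in enumerate(features_per_location))

def latent_update(w, samples, lr=0.1):
    """One pass of the classic alternation: (1) fix w and pick the best
    latent location per sample, (2) take a perceptron-style step on w
    using the feature vector of the selected location."""
    for label, feats in samples:          # label in {-1, +1}
        s, z = score(w, feats)
        if label * s <= 0:                # misclassified under current w
            phi = feats[z]
            w = [wi + lr * label * pi for wi, pi in zip(w, phi)]
    return w
```

Repeating `latent_update` until no sample is misclassified gives the usual alternating optimization; the paper's contribution concerns making this kind of training cheaper and less sensitive to initialization.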
Place, publisher, year, edition, pages
2014. p. 2471-2476
Keywords [en]
Pattern recognition, Topology, Deformable part models, Latent variable models, Learning complexity, Object localization, Prior knowledge, Relevant features, Robust decisions, Training and testing, Image classification
National Category
Electrical Engineering, Electronic Engineering, Information Engineering
Identifiers
URN: urn:nbn:se:kth:diva-167941
DOI: 10.1109/ICPR.2014.427
ISI: 000359818002099
Scopus ID: 2-s2.0-84919941135
ISBN: 9781479952083 (print)
OAI: oai:DiVA.org:kth-167941
DiVA, id: diva2:817407
Conference
22nd International Conference on Pattern Recognition, ICPR 2014, 24 August 2014 through 28 August 2014
Note
QC 20150605
Available from: 2015-06-05 Created: 2015-05-22 Last updated: 2024-03-15 Bibliographically approved