Shared Gaussian Process Latent Variable Models
2009 (English). Doctoral thesis, monograph (Other academic).
A fundamental task in machine learning is modeling the relationship between different observation spaces. Dimensionality reduction is the task of reducing the number of dimensions in a parameterization of a data-set. In this thesis we are interested in the crossroads between these two tasks: shared dimensionality reduction. Shared dimensionality reduction aims to represent multiple observation spaces within the same model. Previously suggested models have been limited to scenarios where the observations have been generated from the same manifold. In this thesis we present a Gaussian Process Latent Variable Model (GP-LVM) for shared dimensionality reduction that makes no assumptions about the relationship between the observations. Further, we suggest an extension to Canonical Correlation Analysis (CCA) called Non Consolidating Component Analysis (NCCA). The proposed algorithm extends classical CCA to represent the full variance of the data, as opposed to only the correlated variance. We compare the suggested GP-LVM model to existing models and show results on real-world problems that exemplify the advantages of our approach.
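As background to the NCCA extension mentioned above, the following is a minimal sketch of classical CCA (not NCCA itself), implemented via per-view whitening and an SVD; all function and variable names here are illustrative, not taken from the thesis:

```python
import numpy as np

def cca(X, Y, k=1):
    """Classical CCA: find projections of two views whose images are
    maximally correlated. Returns projection matrices Wx, Wy and the
    top-k canonical correlations."""
    # Center each view
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    # Whiten each view via its thin SVD (Ux, Uy have orthonormal columns)
    Ux, Sx, Vxt = np.linalg.svd(X, full_matrices=False)
    Uy, Sy, Vyt = np.linalg.svd(Y, full_matrices=False)
    # SVD of the cross-correlation of the whitened views:
    # singular values are the canonical correlations
    U, S, Vt = np.linalg.svd(Ux.T @ Uy)
    # Map the whitened directions back to the original coordinates
    Wx = Vxt.T @ np.diag(1.0 / Sx) @ U[:, :k]
    Wy = Vyt.T @ np.diag(1.0 / Sy) @ Vt.T[:, :k]
    return Wx, Wy, S[:k]

# Two views sharing one latent signal z: the first canonical
# correlation is close to 1, reflecting the shared structure.
rng = np.random.default_rng(0)
z = rng.standard_normal(200)
X = np.column_stack([z + 0.1 * rng.standard_normal(200),
                     rng.standard_normal(200)])
Y = np.column_stack([-z + 0.1 * rng.standard_normal(200),
                     rng.standard_normal(200)])
Wx, Wy, corr = cca(X, Y, k=1)
```

Classical CCA, as the sketch shows, retains only the correlated directions; the variance each view carries independently of the other is discarded, which is the limitation NCCA is proposed to address.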
Place, publisher, year, edition, pages
Oxford University Press, 2009. 123 p.
Identifiers
URN: urn:nbn:se:kth:diva-50660
OAI: oai:DiVA.org:kth-50660
DiVA: diva2:462370