Shared Gaussian Process Latent Variable Models
Oxford Brookes University.
2009 (English) Doctoral thesis, monograph (Other academic)
Abstract [en]

A fundamental task in machine learning is modeling the relationship between different observation spaces. Dimensionality reduction is the task of reducing the number of dimensions in a parameterization of a data-set. In this thesis we are interested in the crossroads between these two tasks: shared dimensionality reduction. Shared dimensionality reduction aims to represent multiple observation spaces within the same model. Previously suggested models have been limited to scenarios where the observations have been generated from the same manifold. In this thesis we present a Gaussian Process Latent Variable Model (GP-LVM) [33] for shared dimensionality reduction without making assumptions about the relationship between the observations. Further, we suggest an extension to Canonical Correlation Analysis (CCA) called Non Consolidating Component Analysis (NCCA). The proposed algorithm extends classical CCA to represent the full variance of the data, as opposed to only the correlated variance. We compare the suggested GP-LVM model to existing models and show results on real-world problems that exemplify the advantages of our approach.
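As background to the abstract's contrast between NCCA and classical CCA: classical CCA finds linear projections of two views that are maximally correlated, and it is the correlations captured by those projections that NCCA goes beyond. A minimal NumPy sketch of classical CCA via whitening and an SVD is given below; the data and variable names are illustrative, not drawn from the thesis.

```python
import numpy as np

def cca_correlations(X, Y):
    """Classical CCA: return canonical correlations between views X and Y."""
    # Center both views
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    n = X.shape[0]
    # Sample covariance blocks
    Sxx = X.T @ X / (n - 1)
    Syy = Y.T @ Y / (n - 1)
    Sxy = X.T @ Y / (n - 1)
    # Inverse matrix square root via eigendecomposition (whitening)
    def inv_sqrt(S):
        vals, vecs = np.linalg.eigh(S)
        return vecs @ np.diag(1.0 / np.sqrt(vals)) @ vecs.T
    # Singular values of the whitened cross-covariance are the
    # canonical correlations
    M = inv_sqrt(Sxx) @ Sxy @ inv_sqrt(Syy)
    return np.linalg.svd(M, compute_uv=False)

# Illustrative example: two 3-D views sharing one latent dimension
rng = np.random.default_rng(0)
z = rng.normal(size=(500, 1))                      # shared latent signal
X = np.hstack([z, rng.normal(size=(500, 2))])      # view 1: z + private noise
Y = np.hstack([-z, rng.normal(size=(500, 2))])     # view 2: -z + private noise
corrs = cca_correlations(X, Y)
```

Only the first canonical correlation is large here, because the views share a single latent dimension; CCA discards the remaining (private) variance in each view, which is exactly the variance a shared/private decomposition such as NCCA additionally aims to represent.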

Place, publisher, year, edition, pages
Oxford University Press, 2009, 123 p.
National Category
Computer Science
URN: urn:nbn:se:kth:diva-50660
OAI: diva2:462370
Available from: 2011-12-07 Created: 2011-12-07 Last updated: 2011-12-09

Open Access in DiVA

No full text

By author/editor
Ek, Carl Henrik
