On Symmetries and Metrics in Geometric Inference
KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent Systems, Robotics, Perception and Learning, RPL.
2024 (English) Doctoral thesis, comprehensive summary (Other academic)
Abstract [en]

Spaces of data naturally carry intrinsic geometry. Statistics and machine learning can leverage this rich structure to achieve efficiency and semantic generalization. Extracting geometry from data is therefore a fundamental challenge which in itself defines a statistical, computational and unsupervised learning problem. To this end, symmetries and metrics are two fundamental objects which are ubiquitous in continuous and discrete geometry. Both are suitable for data-driven approaches, since symmetries arise as interactions and are thus collectable in practice, while metrics can be induced locally from the ambient space. In this thesis, we address the question of extracting geometry from data by leveraging symmetries and metrics. Additionally, we explore methods for statistical inference that exploit the extracted geometric structure. On the metric side, we focus on Voronoi tessellations and Delaunay triangulations, which are classical tools in computational geometry. Based on them, we propose novel non-parametric methods for machine learning and statistics, focusing on theoretical and computational aspects. These methods include an active version of the nearest neighbor regressor as well as two high-dimensional density estimators. All of them possess convergence guarantees due to the adaptiveness of Voronoi cells. On the symmetry side, we focus on representation learning in the context of data acted upon by a group. Specifically, we propose a method for learning equivariant representations which are guaranteed to be isomorphic to the data space, even in the presence of symmetries stabilizing data. We additionally explore applications of such representations in a robotics context, where symmetries correspond to actions performed by an agent. Lastly, we provide a theoretical analysis of invariant neural networks and show how the group-theoretical Fourier transform emerges in their weights.
This addresses the problem of symmetry discovery in a self-supervised manner.  
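Several of the metric-side methods build on the elementary fact that the 1-nearest-neighbor regressor is piecewise constant on the Voronoi cells of the training points, which is what makes Voronoi geometry relevant to the adaptiveness arguments above. A minimal sketch (function and variable names are ours, not from the thesis):

```python
import numpy as np

def one_nn_predict(X_train, y_train, X_query):
    """1-nearest-neighbor regression: constant on each Voronoi cell."""
    # Pairwise squared Euclidean distances, shape (n_query, n_train).
    d2 = ((X_query[:, None, :] - X_train[None, :, :]) ** 2).sum(-1)
    return y_train[d2.argmin(axis=1)]

X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
y = np.array([1.0, 2.0, 3.0])
# Any point inside the Voronoi cell of X[1] receives the value y[1].
print(one_nn_predict(X, y, np.array([[0.9, 0.1]])))  # -> [2.]
```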


Place, publisher, year, edition, pages
KTH Royal Institute of Technology, 2024, p. 61
Series
TRITA-EECS-AVL ; 2024:26
Keywords [en]
Machine Learning, Computational Geometry, Voronoi, Delaunay, Symmetry, Equivariance
National subject category
Computer Sciences
Research subject
Computer Science
Identifiers
URN: urn:nbn:se:kth:diva-344129
ISBN: 978-91-8040-864-6 (print)
OAI: oai:DiVA.org:kth-344129
DiVA, id: diva2:1842047
Public defence
2024-04-09, https://kth-se.zoom.us/j/61437033234?pwd=dnBpMnYyaDVWWC95RHNTakNXWkNRQT09, F3 (Flodis) Lindstedtsvägen 26, Stockholm, 09:00 (English)
Opponent
Supervisors
Note

QC 20240304

Available from: 2024-03-04 Created: 2024-03-02 Last updated: 2025-05-23 Bibliographically approved
List of papers
1. Active Nearest Neighbor Regression Through Delaunay Refinement
2022 (English) In: Proceedings of the 39th International Conference on Machine Learning, ML Research Press, 2022, Vol. 162, pp. 11650-11664. Conference paper, Published paper (Refereed)
Abstract [en]

We introduce an algorithm for active function approximation based on nearest neighbor regression. Our Active Nearest Neighbor Regressor (ANNR) relies on the Voronoi-Delaunay framework from computational geometry to subdivide the space into cells with constant estimated function value and select novel query points in a way that takes the geometry of the function graph into account. We consider the recent state-of-the-art active function approximator called DEFER, which is based on incremental rectangular partitioning of the space, as the main baseline. The ANNR addresses a number of limitations that arise from the space subdivision strategy used in DEFER. We provide a computationally efficient implementation of our method, as well as theoretical halting guarantees. Empirical results show that ANNR outperforms the baseline for both closed-form functions and real-world examples, such as gravitational wave parameter inference and exploration of the latent space of a generative model.
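The full ANNR operates on Delaunay triangulations in arbitrary dimension; as a hedged illustration of the core idea of selecting queries according to the geometry of the function graph, here is a simplified one-dimensional analogue. All names are ours, and midpoint splitting stands in for the paper's Delaunay refinement:

```python
import numpy as np

def refine_queries(f, a, b, n_init=3, n_active=5):
    """1D caricature of geometry-aware active sampling (not the full ANNR).

    Repeatedly split the interval whose graph segment
    sqrt(dx^2 + dy^2) is longest, so steep or long portions of the
    function graph attract more queries.
    """
    xs = list(np.linspace(a, b, n_init))
    ys = [f(x) for x in xs]
    for _ in range(n_active):
        # Score each gap by the Euclidean length of the graph segment.
        scores = [np.hypot(xs[i + 1] - xs[i], ys[i + 1] - ys[i])
                  for i in range(len(xs) - 1)]
        i = int(np.argmax(scores))
        x_new = 0.5 * (xs[i] + xs[i + 1])  # query the midpoint of that gap
        xs.insert(i + 1, x_new)
        ys.insert(i + 1, f(x_new))
    return np.array(xs), np.array(ys)

xs, ys = refine_queries(np.tanh, -4.0, 4.0)
# Queries concentrate near 0, where tanh changes fastest.
```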

Place, publisher, year, edition, pages
ML Research Press, 2022
Series
Proceedings of Machine Learning Research, ISSN 2640-3498 ; 162
National subject category
Computer Sciences; Control Engineering
Identifiers
urn:nbn:se:kth:diva-319194 (URN)
000900064901033 ()
2-s2.0-85163127180 (Scopus ID)
Conference
39th International Conference on Machine Learning, Baltimore, Maryland, USA, PMLR 162, 17-23 July, 2022
Note

QC 20230509

Available from: 2022-09-28 Created: 2022-09-28 Last updated: 2024-03-02 Bibliographically approved
2. Voronoi Density Estimator for High-Dimensional Data: Computation, Compactification and Convergence
2022 (English) In: Proceedings of the Thirty-Eighth Conference on Uncertainty in Artificial Intelligence, PMLR, 2022, Vol. 180, pp. 1644-1653. Conference paper, Published paper (Refereed)
Abstract [en]

The Voronoi Density Estimator (VDE) is an established density estimation technique that adapts to the local geometry of data. However, its applicability has been so far limited to problems in two and three dimensions. This is because Voronoi cells rapidly increase in complexity as dimensions grow, making the necessary explicit computations infeasible. We define a variant of the VDE deemed Compactified Voronoi Density Estimator (CVDE), suitable for higher dimensions. We propose computationally efficient algorithms for numerical approximation of the CVDE and formally prove convergence of the estimated density to the original one. We implement and empirically validate the CVDE through a comparison with the Kernel Density Estimator (KDE). Our results indicate that the CVDE outperforms the KDE on sound and image data.
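As a rough illustration of the Voronoi density estimation idea (not the CVDE algorithm itself), one can assign each data point mass 1/n spread uniformly over its Voronoi cell and estimate cell volumes by Monte Carlo sampling in a bounding box; the box crudely stands in for the paper's compactification. All names and parameters below are ours:

```python
import numpy as np

rng = np.random.default_rng(0)

def voronoi_density(X, queries, box_lo, box_hi, n_mc=20000):
    """Toy Voronoi density estimate via Monte Carlo cell volumes."""
    n, d = X.shape
    # Uniform samples in the bounding box, each assigned to its cell.
    S = rng.uniform(box_lo, box_hi, size=(n_mc, d))
    owner = ((S[:, None, :] - X[None, :, :]) ** 2).sum(-1).argmin(1)
    box_vol = np.prod(np.asarray(box_hi, float) - np.asarray(box_lo, float))
    cell_vol = np.bincount(owner, minlength=n) / n_mc * box_vol
    # Each cell carries mass 1/n; density = mass / cell volume.
    q_owner = ((queries[:, None, :] - X[None, :, :]) ** 2).sum(-1).argmin(1)
    return (1.0 / n) / cell_vol[q_owner]

X = rng.normal(size=(50, 2))
dens = voronoi_density(X, np.array([[0.0, 0.0], [3.0, 3.0]]),
                       box_lo=[-4, -4], box_hi=[4, 4])
# Density is higher near the data mean than far out in the tail.
```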

Place, publisher, year, edition, pages
PMLR, 2022
Series
Proceedings of Machine Learning Research, ISSN 2640-3498
National subject category
Probability Theory and Statistics
Identifiers
urn:nbn:se:kth:diva-319195 (URN)
2-s2.0-85163412377 (Scopus ID)
Conference
The 38th Conference on Uncertainty in Artificial Intelligence, Eindhoven, The Netherlands, Aug 1-5 2022
Note

QC 20221003

Available from: 2022-09-28 Created: 2022-09-28 Last updated: 2024-07-23 Bibliographically approved
3. An Efficient and Continuous Voronoi Density Estimator
2023 (English) In: Proceedings of the 26th International Conference on Artificial Intelligence and Statistics, AISTATS 2023, ML Research Press, 2023, pp. 4732-4744. Conference paper, Published paper (Refereed)
Abstract [en]

We introduce a non-parametric density estimator deemed Radial Voronoi Density Estimator (RVDE). RVDE is grounded in the geometry of Voronoi tessellations and as such benefits from local geometric adaptiveness and broad convergence properties. Due to its radial definition, RVDE is continuous and computable in linear time with respect to the dataset size. This remedies the main shortcomings of previously studied VDEs, which are highly discontinuous and computationally expensive. We provide a theoretical study of the modes of RVDE as well as an empirical investigation of its performance on high-dimensional data. Results show that RVDE outperforms other non-parametric density estimators, including recently introduced VDEs.
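A cartoon of the radial construction, with the per-cell normalization of the actual RVDE deliberately omitted; it shows why a radial definition yields continuity and per-query cost linear in the dataset size. Names are ours:

```python
import numpy as np

def radial_vde(X, queries, h=0.5):
    """Cartoon of a radial Voronoi density estimate (not the exact RVDE).

    The value at a query depends only on the distance to the nearest
    data point, through a radial kernel; computing that distance is
    the only per-query work, hence linear time in the dataset size.
    """
    d = np.sqrt(((queries[:, None, :] - X[None, :, :]) ** 2).sum(-1))
    r = d.min(axis=1)                      # distance to nearest center
    return np.exp(-(r / h) ** 2) / len(X)  # unnormalized radial profile

X = np.array([[0.0, 0.0], [2.0, 0.0]])
vals = radial_vde(X, np.array([[0.0, 0.0], [1.0, 0.0]]))
# Continuous everywhere, including on the boundary between the two cells.
```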

Place, publisher, year, edition, pages
ML Research Press, 2023
Series
Proceedings of Machine Learning Research, ISSN 2640-3498 ; 206
National subject category
Computer Graphics and Computer Vision
Identifiers
urn:nbn:se:kth:diva-334436 (URN)
001222727704044 ()
2-s2.0-85165187458 (Scopus ID)
Conference
26th International Conference on Artificial Intelligence and Statistics, AISTATS 2023, Valencia, Spain, Apr 25 2023 - Apr 27 2023
Note

QC 20241204

Available from: 2023-08-21 Created: 2023-08-21 Last updated: 2025-02-07 Bibliographically approved
4. Equivariant Representation Learning via Class-Pose Decomposition
2023 (English) In: Proceedings of the 26th International Conference on Artificial Intelligence and Statistics, AISTATS 2023, ML Research Press, 2023, Vol. 206, pp. 4745-4756. Conference paper, Published paper (Refereed)
Abstract [en]

We introduce a general method for learning representations that are equivariant to symmetries of data. Our central idea is to decompose the latent space into an invariant factor and the symmetry group itself. The components semantically correspond to intrinsic data classes and poses respectively. The learner is trained on a loss encouraging equivariance based on supervision from relative symmetry information. The approach is motivated by theoretical results from group theory and guarantees representations that are lossless, interpretable and disentangled. We provide an empirical investigation via experiments involving datasets with a variety of symmetries. Results show that our representations capture the geometry of data and outperform other equivariant representation learning frameworks.
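A toy rendering of the class-pose objective for the rotation group SO(2), in our own notation: the encoder outputs a class factor and a pose angle, and the loss penalizes class differences plus deviations from pose equivariance given the supervised relative rotation g.

```python
import numpy as np

def class_pose_loss(enc, x1, x2, g):
    """Sketch of a class-pose equivariance loss (our toy formulation).

    enc maps a data point to (class_vector, pose_angle); supervision
    is the relative rotation g between x1 and x2.  The class factor
    should be invariant and the pose factor equivariant:
    pose(x2) ~ pose(x1) + g, as angles on the circle group SO(2).
    """
    c1, p1 = enc(x1)
    c2, p2 = enc(x2)
    class_term = np.sum((c1 - c2) ** 2)                 # invariance
    ang = (p2 - p1 - g + np.pi) % (2 * np.pi) - np.pi   # angular residual
    return class_term + ang ** 2                        # + equivariance

# A hand-made "ideal" encoder on 2D points: class = radius, pose = angle.
def enc(x):
    return np.array([np.hypot(*x)]), np.arctan2(x[1], x[0])

x1 = np.array([1.0, 0.0])
g = np.pi / 2
x2 = np.array([0.0, 1.0])               # x1 rotated by g
loss = class_pose_loss(enc, x1, x2, g)  # zero for the ideal encoder
```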

Place, publisher, year, edition, pages
ML Research Press, 2023
National subject category
Robotics and Automation
Identifiers
urn:nbn:se:kth:diva-334435 (URN)
001222727704045 ()
2-s2.0-85165155542 (Scopus ID)
Conference
26th International Conference on Artificial Intelligence and Statistics, AISTATS 2023, Valencia, Spain, Apr 25 2023 - Apr 27 2023
Note

QC 20241204

Available from: 2023-08-21 Created: 2023-08-21 Last updated: 2025-02-09 Bibliographically approved
5. Equivariant Representation Learning in the Presence of Stabilizers
2023 (English) In: Machine Learning and Knowledge Discovery in Databases: Research Track - European Conference, ECML PKDD 2023, Proceedings, Springer Nature, 2023, pp. 693-708. Conference paper, Published paper (Refereed)
Abstract [en]

We introduce Equivariant Isomorphic Networks (EquIN) – a method for learning representations that are equivariant with respect to general group actions over data. Unlike existing equivariant representation learners, EquIN is suitable for group actions that are not free, i.e., that stabilize data via nontrivial symmetries. EquIN is theoretically grounded in the orbit-stabilizer theorem from group theory. This guarantees that an ideal learner infers isomorphic representations when trained on equivariance alone and thus fully extracts the geometric structure of data. We provide an empirical investigation on image datasets with rotational symmetries and show that taking stabilizers into account improves the quality of the representations.
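The orbit-stabilizer theorem underlying EquIN can be checked concretely for the cyclic group Z_n acting on sequences by cyclic shifts: a period-2 pattern has a nontrivial stabilizer, so a single group element cannot faithfully represent its pose and a whole coset is needed. A small sketch (names are ours):

```python
import numpy as np

def orbit_and_stabilizer(x):
    """Orbit and stabilizer of a sequence under cyclic shifts (Z_n).

    When the action is not free (stabilizer larger than {e}), a
    faithful equivariant representation must carry a whole coset of
    the stabilizer rather than a single group element.
    """
    n = len(x)
    shifts = [tuple(np.roll(x, k)) for k in range(n)]
    orbit = set(shifts)
    stabilizer = [k for k in range(n) if shifts[k] == tuple(x)]
    return orbit, stabilizer

x = [1, 0, 1, 0]   # period-2 pattern, n = 4
orbit, stab = orbit_and_stabilizer(x)
# Orbit-stabilizer theorem: |orbit| * |stabilizer| = |Z_4| = 4.
```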

Place, publisher, year, edition, pages
Springer Nature, 2023
Keywords
Equivariance, Lie Groups, Representation Learning
National subject category
Geometry; Mathematical Analysis
Identifiers
urn:nbn:se:kth:diva-339298 (URN)
10.1007/978-3-031-43421-1_41 (DOI)
001156141200041 ()
2-s2.0-85174442272 (Scopus ID)
Conference
European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, ECML PKDD 2023, Turin, Italy, Sep 18 2023 - Sep 22 2023
Note

Part of ISBN 9783031434204

QC 20231106

Available from: 2023-11-06 Created: 2023-11-06 Last updated: 2024-03-04 Bibliographically approved
6. Back to the Manifold: Recovering from Out-of-Distribution States
2022 (English) In: 2022 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Institute of Electrical and Electronics Engineers (IEEE), 2022, pp. 8660-8666. Conference paper, Published paper (Refereed)
Abstract [en]

Learning from previously collected datasets of expert data offers the promise of acquiring robotic policies without unsafe and costly online exploration. However, a major challenge is the distributional shift between the states in the training dataset and the ones visited by the learned policy at test time. While prior work has mainly studied the distribution shift caused by the policy during offline training, the problem of recovering from out-of-distribution states at deployment time is not yet well studied. We alleviate the distributional shift at deployment time by introducing a recovery policy that brings the agent back to the training manifold whenever it steps out of the in-distribution states, e.g., due to an external perturbation. The recovery policy relies on an approximation of the training data density and a learned equivariant mapping that maps visual observations into a latent space in which translations correspond to the robot actions. We demonstrate the effectiveness of the proposed method through several manipulation experiments on a real robotic platform. Our results show that the recovery policy enables the agent to complete tasks while behavioral cloning alone fails because of the distributional shift problem.
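A heavily simplified sketch of the recovery rule, under our own toy assumptions: a kernel density proxy flags out-of-distribution latents, and the action is the latent translation back toward the training manifold (the paper's learned equivariant map is what licenses reading latent translations as robot actions).

```python
import numpy as np

def recovery_action(z, Z_train, h=1.0, threshold=0.05):
    """Toy recovery rule in the spirit of the paper (our simplification).

    z is the latent encoding of the current observation; Z_train holds
    encodings of training states.  A crude kernel density proxy decides
    whether z is out-of-distribution; if so, the returned action is the
    latent translation toward the nearest training state.
    """
    d2 = ((Z_train - z) ** 2).sum(axis=1)
    density = np.exp(-d2 / (2 * h ** 2)).mean()  # KDE-style score
    if density >= threshold:
        return np.zeros_like(z)                  # in distribution: no-op
    return Z_train[d2.argmin()] - z              # step back to the manifold

Z_train = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
a = recovery_action(np.array([5.0, 5.0]), Z_train)
# Far from the data: act toward the nearest training state, [1, 0].
```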

Place, publisher, year, edition, pages
Institute of Electrical and Electronics Engineers (IEEE), 2022
Series
IEEE International Conference on Intelligent Robots and Systems, ISSN 2153-0858
National subject category
Computer Sciences
Identifiers
urn:nbn:se:kth:diva-324860 (URN)
10.1109/IROS47612.2022.9981315 (DOI)
000909405301050 ()
2-s2.0-85146319849 (Scopus ID)
Conference
IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Oct 23-27, 2022, Kyoto, Japan
Note

QC 20230322

Available from: 2023-03-22 Created: 2023-03-22 Last updated: 2026-02-16 Bibliographically approved
7. Harmonics of Learning: Universal Fourier Features Emerge in Invariant Networks
(English) Manuscript (preprint) (Other academic)
Abstract [en]

In this work, we formally prove that, under certain conditions, if a neural network is invariant to a finite group then its weights recover the Fourier transform on that group. This provides a mathematical explanation for the emergence of Fourier features -- a ubiquitous phenomenon in both biological and artificial learning systems. The results hold even for non-commutative groups, in which case the Fourier transform encodes all the irreducible unitary group representations. Our findings have consequences for the problem of symmetry discovery. Specifically, we demonstrate that the algebraic structure of an unknown group can be recovered from the weights of a network that is at least approximately invariant within certain bounds. Overall, this work contributes to a foundation for an algebraic learning theory of invariant neural network representations.
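For a finite abelian group such as Z_n, the irreducible unitary representations are the characters, and they assemble into the group-theoretical Fourier transform. The snippet below checks two properties that make this the natural basis for invariant weights: the character matrix is unitary and diagonalizes every cyclic shift. Names are ours:

```python
import numpy as np

def fourier_matrix(n):
    """Characters of the cyclic group Z_n: F[k, j] = exp(2*pi*i*k*j/n)/sqrt(n).

    For abelian groups these characters are exactly the irreducible
    unitary representations, i.e. the group Fourier transform.
    """
    k, j = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
    return np.exp(2j * np.pi * k * j / n) / np.sqrt(n)

n = 8
F = fourier_matrix(n)
shift = np.roll(np.eye(n), 1, axis=0)  # permutation matrix for the shift g = 1
D = F @ shift @ F.conj().T             # diagonal in the Fourier basis
```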

National subject category
Computer Sciences
Identifiers
urn:nbn:se:kth:diva-344128 (URN)
Note

QC 20240304

Available from: 2024-03-02 Created: 2024-03-02 Last updated: 2024-03-04 Bibliographically approved

Open Access in DiVA

fulltext (11206 kB), 963 downloads
File information
File name: FULLTEXT01.pdf. File size: 11206 kB. Checksum: SHA-512
102ce06d44b0cc35b97081c389bf84f1f55b2f98b89a36584d637dcfcadd6765587ffbecdcd69fb7a6f06809e55285c552f91840dfc13347f437abaed136d9fa
Type: fulltext. Mimetype: application/pdf

Person

Marchetti, Giovanni Luca

Total: 963 downloads
The number of downloads is the sum over all full texts. It may include earlier versions that are no longer available.
