Equivariant Representation Learning in the Presence of Stabilizers
Eindhoven University of Technology, Eindhoven, The Netherlands; Eindhoven Artificial Intelligence Systems Institute, Eindhoven, The Netherlands; Prosus, Amsterdam, The Netherlands.
KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent systems, Collaborative Autonomous Systems.
KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent systems, Collaborative Autonomous Systems. KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent systems, Robotics, Perception and Learning, RPL. ORCID iD: 0000-0003-2965-2953
Eindhoven University of Technology, Eindhoven, The Netherlands; Prosus, Amsterdam, The Netherlands.
2023 (English). In: Machine Learning and Knowledge Discovery in Databases: Research Track - European Conference, ECML PKDD 2023, Proceedings, Springer Nature, 2023, p. 693-708. Conference paper, Published paper (Refereed).
Abstract [en]

We introduce Equivariant Isomorphic Networks (EquIN) – a method for learning representations that are equivariant with respect to general group actions over data. Differently from existing equivariant representation learners, EquIN is suitable for group actions that are not free, i.e., that stabilize data via nontrivial symmetries. EquIN is theoretically grounded in the orbit-stabilizer theorem from group theory. This guarantees that an ideal learner infers isomorphic representations while trained on equivariance alone and thus fully extracts the geometric structure of data. We provide an empirical investigation on image datasets with rotational symmetries and show that taking stabilizers into account improves the quality of the representations.
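
As background for readers unfamiliar with the group-theoretic vocabulary, the facts the abstract appeals to can be stated compactly. The notation below (a group G acting on the data space X, the stabilizer subgroup G_x of a point x, and a learned encoder f) is illustrative and not taken from the paper:

\begin{align*}
  G_x &= \{\, g \in G : g \cdot x = x \,\}
      && \text{(stabilizer of $x$; nontrivial when the action is not free)} \\
  G \cdot x &\;\cong\; G / G_x
      && \text{(orbit-stabilizer theorem)} \\
  f(g \cdot x) &= g \cdot f(x) \ \ \text{for all } g \in G
      && \text{(equivariance constraint on the encoder)}
\end{align*}

Under this reading, an encoder trained on the equivariance constraint that also accounts for the stabilizers can represent each orbit faithfully as the coset space G / G_x, which is the sense in which the learned representation can be isomorphic to the data.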

Place, publisher, year, edition, pages
Springer Nature, 2023, p. 693-708
Keywords [en]
Equivariance, Lie Groups, Representation Learning
National Category
Geometry; Mathematical Analysis
Identifiers
URN: urn:nbn:se:kth:diva-339298
DOI: 10.1007/978-3-031-43421-1_41
ISI: 001156141200041
Scopus ID: 2-s2.0-85174442272
OAI: oai:DiVA.org:kth-339298
DiVA, id: diva2:1809782
Conference
European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, ECML PKDD 2023, Turin, Italy, September 18-22, 2023
Note

Part of ISBN 9783031434204

QC 20231106

Available from: 2023-11-06. Created: 2023-11-06. Last updated: 2024-03-04. Bibliographically approved.
In thesis
1. On Symmetries and Metrics in Geometric Inference
2024 (English). Doctoral thesis, comprehensive summary (Other academic).
Abstract [en]

Spaces of data naturally carry intrinsic geometry. Statistics and machine learning can leverage this rich structure in order to achieve efficiency and semantic generalization. Extracting geometry from data is therefore a fundamental challenge which by itself defines a statistical, computational and unsupervised learning problem. To this end, symmetries and metrics are two fundamental objects which are ubiquitous in continuous and discrete geometry. Both are suitable for data-driven approaches, since symmetries arise as interactions and are thus collectable in practice, while metrics can be induced locally from the ambient space. In this thesis, we address the question of extracting geometry from data by leveraging symmetries and metrics. Additionally, we explore methods for statistical inference exploiting the extracted geometric structure. On the metric side, we focus on Voronoi tessellations and Delaunay triangulations, which are classical tools in computational geometry. Based on them, we propose novel non-parametric methods for machine learning and statistics, focusing on theoretical and computational aspects. These methods include an active version of the nearest neighbor regressor as well as two high-dimensional density estimators. All of them possess convergence guarantees due to the adaptiveness of Voronoi cells. On the symmetry side, we focus on representation learning in the context of data acted upon by a group. Specifically, we propose a method for learning equivariant representations which are guaranteed to be isomorphic to the data space, even in the presence of symmetries stabilizing the data. We additionally explore applications of such representations in a robotics context, where symmetries correspond to actions performed by an agent. Lastly, we provide a theoretical analysis of invariant neural networks and show how the group-theoretical Fourier transform emerges in their weights. This addresses the problem of symmetry discovery in a self-supervised manner.
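
As a concrete, hedged illustration of the Voronoi-based non-parametric theme on the metric side (not the thesis's actual estimators), the 1-nearest-neighbor regressor is the simplest method whose predictions are piecewise constant on the Voronoi cells of the training points. A minimal NumPy sketch, with all names hypothetical:

import numpy as np

# Hypothetical illustration: the 1-nearest-neighbor regressor assigns to each
# query the label of the training point whose Voronoi cell contains it.
class VoronoiNNRegressor:
    def fit(self, X, y):
        self.X = np.asarray(X, dtype=float)
        self.y = np.asarray(y, dtype=float)
        return self

    def predict(self, X_new):
        X_new = np.asarray(X_new, dtype=float)
        # Squared Euclidean distances between every query and every training point.
        d2 = ((X_new[:, None, :] - self.X[None, :, :]) ** 2).sum(axis=-1)
        # Nearest training point = Voronoi cell the query falls into.
        return self.y[d2.argmin(axis=1)]

# Usage on toy 2-D data.
rng = np.random.default_rng(0)
X_train = rng.uniform(size=(200, 2))
y_train = np.sin(2 * np.pi * X_train[:, 0])
model = VoronoiNNRegressor().fit(X_train, y_train)
print(model.predict(rng.uniform(size=(5, 2))))

The adaptiveness mentioned in the abstract comes from the geometry of the cells themselves: Voronoi cells shrink where training points are dense and grow where they are sparse, so estimators built on them adapt their resolution to the data.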

Abstract [sv]

Data spaces carry a natural intrinsic geometry. Statistics and machine learning can draw on this rich structure to achieve efficiency and semantic generalization. Extracting geometry from data is therefore a fundamental challenge which in itself defines a statistical, computational and unsupervised learning problem. To this end, symmetries and metrics are two fundamental objects which are ubiquitous in continuous and discrete geometry. Both are suitable for data-driven approaches, since symmetries arise as interactions and can thus be collected in practice, while metrics can be induced locally from the ambient space. In this thesis, we address the question of extracting geometry from data by exploiting symmetries and metrics. In addition, we explore methods for statistical inference that exploit the extracted geometric structure. On the metric side, we focus on Voronoi tessellations and Delaunay triangulations, which are classical tools in computational geometry. Based on them, we propose new non-parametric methods for machine learning and statistics, with a focus on theoretical and computational aspects. These methods include an active version of the nearest neighbor regressor as well as two high-dimensional density estimators. All of these possess convergence guarantees due to the adaptiveness of the Voronoi cells. On the symmetry side, we focus on representation learning in the context of data acted upon by a group. Specifically, we propose a method for learning equivariant representations which are guaranteed to be isomorphic to the data space, even in the presence of symmetries that stabilize the data. We also explore applications of such representations in a robotics context, where symmetries correspond to actions performed by an agent. Finally, we provide a theoretical analysis of invariant neural networks and show how the group-theoretical Fourier transform emerges in their weights. This addresses the problem of symmetry discovery in a self-supervised manner.

Place, publisher, year, edition, pages
KTH Royal Institute of Technology, 2024. p. 61
Series
TRITA-EECS-AVL ; 2024:26
Keywords
Machine Learning, Computational Geometry, Voronoi, Delaunay, Symmetry, Equivariance
National Category
Computer Sciences
Research subject
Computer Science
Identifiers
URN: urn:nbn:se:kth:diva-344129
ISBN: 978-91-8040-864-6
Public defence
2024-04-09, https://kth-se.zoom.us/j/61437033234?pwd=dnBpMnYyaDVWWC95RHNTakNXWkNRQT09, F3 (Flodis) Lindstedtsvägen 26, Stockholm, 09:00 (English)
Note

QC 20240304

Available from: 2024-03-04. Created: 2024-03-02. Last updated: 2024-03-08. Bibliographically approved.

Open Access in DiVA

fulltext (FULLTEXT01.pdf, 2667 kB, application/pdf)
Checksum (SHA-512): 3588794fa37097edd151302d0afedcf61997c37d1b6d0ca9693956122a4ef553c3eed461b032cdef32eea4391b904388fa21217beffdccf08dcd47503534bb7f

Other links

Publisher's full text
Scopus
https://doi.org/10.48550/arXiv.2301.05231

Authority records

Marchetti, Giovanni Luca; Kragic, Danica
