Marchetti, Giovanni Luca (ORCID iD: orcid.org/0009-0004-8248-229X)
Publications (10 of 16)
García-Castellanos, A., Medbouhi, A. A., Marchetti, G. L., Bekkers, E. J. & Kragic, D. (2025). HyperSteiner: Computing Heuristic Hyperbolic Steiner Minimal Trees. In: SIAM Symposium on Algorithm Engineering and Experiments, ALENEX 2025. Paper presented at 2025 SIAM Symposium on Algorithm Engineering and Experiments, ALENEX 2025, New Orleans, United States of America, January 12-13, 2025 (pp. 194-208). Society for Industrial & Applied Mathematics (SIAM)
HyperSteiner: Computing Heuristic Hyperbolic Steiner Minimal Trees
2025 (English). In: SIAM Symposium on Algorithm Engineering and Experiments, ALENEX 2025, Society for Industrial & Applied Mathematics (SIAM), 2025, p. 194-208. Conference paper, Published paper (Refereed)
Abstract [en]

We propose HyperSteiner – an efficient heuristic algorithm for computing Steiner minimal trees in the hyperbolic space. HyperSteiner extends the Euclidean Smith-Lee-Liebman algorithm, which is grounded in a divide-and-conquer approach involving the Delaunay triangulation. The central idea is rephrasing Steiner tree problems with three terminals as a system of equations in the Klein-Beltrami model. Motivated by the fact that hyperbolic geometry is well-suited for representing hierarchies, we explore applications to hierarchy discovery in data. Results show that HyperSteiner infers more realistic hierarchies than the Minimum Spanning Tree and is more scalable to large datasets than Neighbor Joining.
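
The three-terminal subproblem at the core of HyperSteiner can be illustrated directly in the Klein-Beltrami model. The sketch below is an illustration under toy assumptions, not the paper's algorithm: it uses the closed-form Klein distance and a brute-force grid search (coordinates and grid resolution are arbitrary choices) to approximate a Steiner point for three terminals, and checks that the resulting star tree is no longer than the minimum spanning tree.

```python
import numpy as np

def klein_dist(u, v):
    """Hyperbolic distance between points of the open unit ball
    in the Klein-Beltrami model."""
    num = 1.0 - np.dot(u, v)
    den = np.sqrt((1.0 - np.dot(u, u)) * (1.0 - np.dot(v, v)))
    return np.arccosh(max(num / den, 1.0))  # clip guards float round-off

terminals = [np.array([0.3, 0.0]), np.array([-0.2, 0.4]), np.array([-0.1, -0.5])]

# MST over three nodes: the two shortest pairwise edges.
d01 = klein_dist(terminals[0], terminals[1])
d02 = klein_dist(terminals[0], terminals[2])
d12 = klein_dist(terminals[1], terminals[2])
mst_total = sum(sorted([d01, d02, d12])[:2])

# Crude grid search for a Steiner point minimizing the star length.
grid = np.linspace(-0.6, 0.6, 121)
best_total = min(
    sum(klein_dist(np.array([x, y]), t) for t in terminals)
    for x in grid for y in grid if x * x + y * y < 0.99
)
print(best_total, mst_total)  # the Steiner star is never longer than the MST
```

The paper instead solves the three-terminal case as a system of equations and assembles the local solutions via a Delaunay-based divide-and-conquer.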

Place, publisher, year, edition, pages
Society for Industrial & Applied Mathematics (SIAM), 2025
National Category
Computer Sciences
Identifiers
urn:nbn:se:kth:diva-359645 (URN)10.1137/1.9781611978339.16 (DOI)2-s2.0-85216422778 (Scopus ID)
Conference
2025 SIAM Symposium on Algorithm Engineering and Experiments, ALENEX 2025, New Orleans, United States of America, January 12-13, 2025
Note

Part of ISBN 9798331311995

QC 20250207

Available from: 2025-02-06. Created: 2025-02-06. Last updated: 2025-02-07. Bibliographically approved
Marchetti, G. L., Cesa, G., Kumar, P. & Behboodi, A. (2025). Neural Lattice Reduction: A Self-Supervised Geometric Deep Learning Approach. Transactions on Machine Learning Research, 2025-February, 1-22
Neural Lattice Reduction: A Self-Supervised Geometric Deep Learning Approach
2025 (English). In: Transactions on Machine Learning Research, E-ISSN 2835-8856, Vol. 2025-February, p. 1-22. Article in journal (Refereed). Epub ahead of print
Abstract [en]

Lattice reduction is a combinatorial optimization problem aimed at finding the most orthogonal basis in a given lattice. The Lenstra–Lenstra–Lovász (LLL) algorithm is the best algorithm in the literature for solving this problem. In light of recent research on algorithm discovery, in this work we address the following question: is it possible to parametrize the algorithm space for the lattice reduction problem with neural networks and find an algorithm without supervised data? Our strategy is to use equivariant and invariant parametrizations and train in a self-supervised way. We design a deep neural model outputting factorized unimodular matrices and train it in a self-supervised manner by penalizing non-orthogonal lattice bases. We incorporate the symmetries of lattice reduction into the model by making it invariant to isometries and scaling of the ambient space and equivariant with respect to the hyperoctahedral group permuting and flipping the lattice basis elements. We show that this approach yields an algorithm with comparable complexity and performance to the LLL algorithm on a set of benchmarks. Additionally, motivated by certain applications in wireless communication, we extend our method to a convolutional architecture which performs joint reduction of spatially correlated lattices arranged in a grid, thereby amortizing its cost over multiple lattices.
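
The self-supervised penalty on non-orthogonal bases can be grounded in the classical orthogonality defect, prod_i ||b_i|| / |det B|, which equals 1 exactly when the basis is orthogonal. The NumPy sketch below is a toy illustration (the 2x2 bases are arbitrary examples, not the paper's model): penalizing this defect requires no labeled data, and any two bases of the same lattice are related by a unimodular matrix.

```python
import numpy as np

def orthogonality_defect(B):
    """prod_i ||b_i|| / |det B| -- equals 1 iff the basis is orthogonal.
    Penalizing this quantity gives a label-free training signal."""
    norms = np.linalg.norm(B, axis=1)  # rows are basis vectors
    return np.prod(norms) / abs(np.linalg.det(B))

skewed  = np.array([[1.0, 0.0], [5.0, 1.0]])   # a bad basis of Z^2
reduced = np.array([[1.0, 0.0], [0.0, 1.0]])   # same lattice, orthogonal basis

# The unimodular change of basis relating them (det = +-1).
U = reduced @ np.linalg.inv(skewed)
print(orthogonality_defect(skewed), orthogonality_defect(reduced))
```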

Place, publisher, year, edition, pages
Transactions on Machine Learning Research, 2025
National Category
Signal Processing; Computational Mathematics
Identifiers
urn:nbn:se:kth:diva-361196 (URN)2-s2.0-85219517527 (Scopus ID)
Note

QC 20250313

Available from: 2025-03-12. Created: 2025-03-12. Last updated: 2025-03-13. Bibliographically approved
Taleb, F., Medbouhi, A. A., Marchetti, G. L. & Kragic Jensfelt, D. (2025). Towards Discovering the Hierarchy of the Olfactory Perceptual Space via Hyperbolic Embeddings. In: Science Communications Worldwide (Ed.), 22nd annual Computational and Systems Neuroscience (COSYNE) conference, Montreal and Mont Tremblant, Quebec, Canada, March 27 - April 1, 2025. Paper presented at Computational and Systems Neuroscience (COSYNE), March 27-30, 2025, Montreal, Canada.
Towards Discovering the Hierarchy of the Olfactory Perceptual Space via Hyperbolic Embeddings
2025 (English). In: 22nd annual Computational and Systems Neuroscience (COSYNE) conference, Montreal and Mont Tremblant, Quebec, Canada, March 27 - April 1, 2025 / [ed] Science Communications Worldwide, 2025. Conference paper, Poster (with or without abstract) (Refereed)
Abstract [en]

Human olfactory perception is understudied across the whole spectrum of neuroscience, from computational to systems neuroscience. In this study, we explore the hierarchy underlying human olfactory perception by embedding perceptual data in hyperbolic space. Previous research emphasizes the significance of hyperbolic geometry in gaining insights into the neural encoding of natural odorants. This is due to the exponential growth of hyperbolic space, which makes it appropriate for encoding hierarchical data. We employ a contrastive learning approach over the Poincaré ball in order to embed olfactory perceptual data in a hyperbolic space. The results indicate the emergence of a hierarchical representation in the hyperbolic space, which could have implications for understanding the structure of the olfactory perceptual space in the brain. Our findings suggest that the human brain may encode olfactory perceptions in a hierarchical manner, where higher odor perceptual certainty correlates with deeper levels in the hierarchical representation.
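
The claim that hyperbolic space leaves exponential room for hierarchy can be made concrete with the Poincaré-ball distance. The sketch below uses toy 2-D points, not the paper's embeddings: the distance from the origin to a point at radius r is 2 artanh(r), so points approaching the boundary sit exponentially deep, matching the correlation between perceptual certainty and hierarchy depth described above.

```python
import math

def poincare_dist(u, v):
    """Geodesic distance between two points of the Poincare ball."""
    dd = sum((a - b) ** 2 for a, b in zip(u, v))
    nu = 1.0 - sum(a * a for a in u)
    nv = 1.0 - sum(b * b for b in v)
    return math.acosh(1.0 + 2.0 * dd / (nu * nv))

origin  = (0.0, 0.0)
shallow = (0.5, 0.0)   # toy point near the root of a hierarchy
deep    = (0.99, 0.0)  # toy point near the boundary (high "certainty")

# Distance from the origin grows like 2*artanh(r): points near the
# boundary are exponentially far, leaving room for deep hierarchy levels.
print(poincare_dist(origin, shallow), poincare_dist(origin, deep))
```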

Keywords
hyperbolic geometry, olfaction, representation
National Category
Neurosciences
Research subject
Computer Science
Identifiers
urn:nbn:se:kth:diva-363971 (URN)10.57736/0d78-6155 (DOI)
Conference
Computational and Systems Neuroscience (COSYNE), March 27-30, 2025, Montreal, Canada
Note

QC 20250602

Available from: 2025-06-01. Created: 2025-06-01. Last updated: 2025-06-02. Bibliographically approved
Marchetti, G. L., Hillar, C. J., Kragic, D. & Sanborn, S. (2024). Harmonics of Learning: Universal Fourier Features Emerge in Invariant Networks. In: Proceedings of 37th Conference on Learning Theory, COLT 2024. Paper presented at 37th Annual Conference on Learning Theory, COLT 2024, Edmonton, Canada, Jun 30 2024 - Jul 3 2024 (pp. 3775-3797). ML Research Press
Harmonics of Learning: Universal Fourier Features Emerge in Invariant Networks
2024 (English). In: Proceedings of 37th Conference on Learning Theory, COLT 2024, ML Research Press, 2024, p. 3775-3797. Conference paper, Published paper (Refereed)
Abstract [en]

In this work, we formally prove that, under certain conditions, if a neural network is invariant to a finite group then its weights recover the Fourier transform on that group. This provides a mathematical explanation for the emergence of Fourier features - a ubiquitous phenomenon in both biological and artificial learning systems. The results hold even for non-commutative groups, in which case the Fourier transform encodes all the irreducible unitary group representations. Our findings have consequences for the problem of symmetry discovery. Specifically, we demonstrate that the algebraic structure of an unknown group can be recovered from the weights of a network that is at least approximately invariant within certain bounds. Overall, this work contributes to a foundation for an algebraic learning theory of invariant neural network representations.
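
For the commutative case, the statement can be checked in a few lines: the characters of the cyclic group Z_n form the discrete Fourier matrix, and conjugating the cyclic shift (the generator of the regular representation) by it yields a diagonal matrix, exposing the irreducible representations. The NumPy sketch below is a toy check of the abelian case only, with an arbitrary choice of n.

```python
import numpy as np

n = 6
# Characters of the cyclic group Z_n, i.e. its Fourier matrix:
# F[k, x] = exp(2*pi*i*k*x / n).
F = np.exp(2j * np.pi * np.outer(np.arange(n), np.arange(n)) / n)

# Cyclic shift: the generator of the regular representation of Z_n.
S = np.roll(np.eye(n), 1, axis=1)

# Conjugating by F diagonalizes the shift (and hence every group
# element): the Fourier transform encodes all the irreducible
# representations of the commutative group.
D = F @ S @ np.linalg.inv(F)
off_diag_max = np.abs(D - np.diag(np.diag(D))).max()
print(off_diag_max)
```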

Place, publisher, year, edition, pages
ML Research Press, 2024
Keywords
group representations, harmonic analysis, Invariant neural networks
National Category
Mathematical Analysis; Algebra and Logic; Geometry
Identifiers
urn:nbn:se:kth:diva-353960 (URN)2-s2.0-85203678110 (Scopus ID)
Conference
37th Annual Conference on Learning Theory, COLT 2024, Edmonton, Canada, Jun 30 2024 - Jul 3 2024
Note

QC 20240926

Available from: 2024-09-25. Created: 2024-09-25. Last updated: 2024-10-03. Bibliographically approved
Medbouhi, A. A., Marchetti, G. L., Polianskii, V., Kravberg, A., Poklukar, P., Varava, A. & Kragic, D. (2024). Hyperbolic Delaunay Geometric Alignment. In: Bifet, A., Krilavicius, T., Davis, J., Kull, M., Ntoutsi, E. & Zliobaite, I. (Eds.), Machine Learning and Knowledge Discovery in Databases: Research Track, Part III, ECML PKDD 2024. Paper presented at Joint European Conference on Machine Learning and Knowledge Discovery in Databases (ECML PKDD), Sep 9-13, 2024, Vilnius, Lithuania (pp. 111-126). Springer Nature
Hyperbolic Delaunay Geometric Alignment
2024 (English). In: Machine Learning and Knowledge Discovery in Databases: Research Track, Part III, ECML PKDD 2024 / [ed] Bifet, A., Krilavicius, T., Davis, J., Kull, M., Ntoutsi, E. & Zliobaite, I., Springer Nature, 2024, p. 111-126. Conference paper, Published paper (Refereed)
Abstract [en]

Hyperbolic machine learning is an emerging field aimed at representing data with a hierarchical structure. However, there is a lack of tools for evaluation and analysis of the resulting hyperbolic data representations. To this end, we propose Hyperbolic Delaunay Geometric Alignment (HyperDGA) - a similarity score for comparing datasets in a hyperbolic space. The core idea is counting the edges of the hyperbolic Delaunay graph connecting datapoints across the given sets. We provide an empirical investigation on synthetic and real-life biological data and demonstrate that HyperDGA outperforms the hyperbolic version of classical distances between sets. Furthermore, we showcase the potential of HyperDGA for evaluating latent representations inferred by a Hyperbolic Variational Auto-Encoder.
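
The core counting idea can be mimicked with a simplified, purely illustrative proxy. The sketch below is not HyperDGA: it replaces the hyperbolic Delaunay graph with a Euclidean k-nearest-neighbor graph and uses synthetic Gaussian data, but it shows why cross-set edge counts measure alignment — overlapping sets share many cross edges, well-separated sets share none.

```python
import numpy as np

def cross_edge_score(A, B, k=2):
    """Simplified proxy for HyperDGA: fraction of k-nearest-neighbor
    edges that connect the two point sets (Euclidean, not hyperbolic;
    the actual method counts hyperbolic Delaunay edges instead)."""
    X = np.vstack([A, B])
    labels = np.array([0] * len(A) + [1] * len(B))
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)                # no self-edges
    nn = np.argsort(d, axis=1)[:, :k]          # k nearest neighbors per point
    cross = sum(labels[i] != labels[j] for i in range(len(X)) for j in nn[i])
    return cross / (len(X) * k)

rng = np.random.default_rng(0)
mixed_A, mixed_B = rng.normal(size=(20, 2)), rng.normal(size=(20, 2))
far_A,   far_B   = rng.normal(size=(20, 2)), rng.normal(size=(20, 2)) + 10.0

print(cross_edge_score(mixed_A, mixed_B), cross_edge_score(far_A, far_B))
```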

Place, publisher, year, edition, pages
Springer Nature, 2024
Series
Lecture Notes in Artificial Intelligence, ISSN 2945-9133 ; 14943
Keywords
Hyperbolic Geometry, Hierarchical Data, Evaluation
National Category
Computer Sciences
Identifiers
urn:nbn:se:kth:diva-355149 (URN)10.1007/978-3-031-70352-2_7 (DOI)001308375900007 ()
Conference
Joint European Conference on Machine Learning and Knowledge Discovery in Databases (ECML PKDD), SEP 09-13, 2024, Vilnius, Lithuania
Note

Part of ISBN: 978-3-031-70351-5, 978-3-031-70352-2

QC 20241025

Available from: 2024-10-25. Created: 2024-10-25. Last updated: 2024-10-25. Bibliographically approved
Marchetti, G. L. (2024). On Symmetries and Metrics in Geometric Inference. (Doctoral dissertation). KTH Royal Institute of Technology
On Symmetries and Metrics in Geometric Inference
2024 (English). Doctoral thesis, comprehensive summary (Other academic)
Abstract [en]

Spaces of data naturally carry intrinsic geometry. Statistics and machine learning can leverage this rich structure in order to achieve efficiency and semantic generalization. Extracting geometry from data is therefore a fundamental challenge which by itself defines a statistical, computational and unsupervised learning problem. To this end, symmetries and metrics are two fundamental objects which are ubiquitous in continuous and discrete geometry. Both are suitable for data-driven approaches, since symmetries arise as interactions and are thus collectable in practice, while metrics can be induced locally from the ambient space. In this thesis, we address the question of extracting geometry from data by leveraging symmetries and metrics. Additionally, we explore methods for statistical inference exploiting the extracted geometric structure. On the metric side, we focus on Voronoi tessellations and Delaunay triangulations, which are classical tools in computational geometry. Based on them, we propose novel non-parametric methods for machine learning and statistics, focusing on theoretical and computational aspects. These methods include an active version of the nearest neighbor regressor as well as two high-dimensional density estimators. All of them possess convergence guarantees due to the adaptiveness of Voronoi cells. On the symmetry side, we focus on representation learning in the context of data acted upon by a group. Specifically, we propose a method for learning equivariant representations which are guaranteed to be isomorphic to the data space, even in the presence of symmetries stabilizing data. We additionally explore applications of such representations in a robotics context, where symmetries correspond to actions performed by an agent. Lastly, we provide a theoretical analysis of invariant neural networks and show how the group-theoretical Fourier transform emerges in their weights. This addresses the problem of symmetry discovery in a self-supervised manner.

Place, publisher, year, edition, pages
KTH Royal Institute of Technology, 2024. p. 61
Series
TRITA-EECS-AVL ; 2024:26
Keywords
Machine Learning, Computational Geometry, Voronoi, Delaunay, Symmetry, Equivariance
National Category
Computer Sciences
Research subject
Computer Science
Identifiers
urn:nbn:se:kth:diva-344129 (URN)978-91-8040-864-6 (ISBN)
Public defence
2024-04-09, https://kth-se.zoom.us/j/61437033234?pwd=dnBpMnYyaDVWWC95RHNTakNXWkNRQT09, F3 (Flodis) Lindstedtsvägen 26, Stockholm, 09:00 (English)
Note

QC 20240304

Available from: 2024-03-04. Created: 2024-03-02. Last updated: 2025-05-23. Bibliographically approved
Marchetti, G. L., Polianskii, V., Varava, A., Pokorny, F. T. & Kragic, D. (2023). An Efficient and Continuous Voronoi Density Estimator. In: Proceedings of the 26th International Conference on Artificial Intelligence and Statistics, AISTATS 2023. Paper presented at 26th International Conference on Artificial Intelligence and Statistics, AISTATS 2023, Valencia, Spain, Apr 25 2023 - Apr 27 2023 (pp. 4732-4744). ML Research Press
An Efficient and Continuous Voronoi Density Estimator
2023 (English). In: Proceedings of the 26th International Conference on Artificial Intelligence and Statistics, AISTATS 2023, ML Research Press, 2023, p. 4732-4744. Conference paper, Published paper (Refereed)
Abstract [en]

We introduce a non-parametric density estimator termed the Radial Voronoi Density Estimator (RVDE). RVDE is grounded in the geometry of Voronoi tessellations and as such benefits from local geometric adaptiveness and broad convergence properties. Due to its radial definition, RVDE is continuous and computable in linear time with respect to the dataset size. This addresses the main shortcomings of previously studied VDEs, which are highly discontinuous and computationally expensive. We provide a theoretical study of the modes of RVDE as well as an empirical investigation of its performance on high-dimensional data. Results show that RVDE outperforms other non-parametric density estimators, including recently introduced VDEs.
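
The discontinuity that RVDE removes is easiest to see in one dimension, where Voronoi cells are intervals and the classical VDE assigns each cell the constant density 1/(n * cell length). The sketch below (toy 1-D data, not RVDE itself) computes this adaptive but piecewise-constant estimate; it integrates to one yet jumps at every cell boundary, which is precisely what RVDE's radial construction smooths out.

```python
import numpy as np

# In 1-D the Voronoi cell of each sample is the interval between the
# midpoints to its neighbors, so the classical (discontinuous) VDE
# density 1 / (n * cell_length) can be computed exactly.
samples = np.sort(np.array([0.0, 1.0, 1.5, 1.7, 4.0]))
mids = (samples[:-1] + samples[1:]) / 2.0
# Clip the two unbounded outer cells to a finite support for illustration.
edges = np.concatenate([[samples[0] - 0.5], mids, [samples[-1] + 0.5]])
cell_len = np.diff(edges)
density = 1.0 / (len(samples) * cell_len)

# Tightly packed samples get short cells and hence high density:
# the estimator adapts to local geometry with no bandwidth parameter.
print(cell_len, density)
```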

Place, publisher, year, edition, pages
ML Research Press, 2023
Series
Proceedings of Machine Learning Research, ISSN 2640-3498; 206
National Category
Computer graphics and computer vision
Identifiers
urn:nbn:se:kth:diva-334436 (URN)001222727704044 ()2-s2.0-85165187458 (Scopus ID)
Conference
26th International Conference on Artificial Intelligence and Statistics, AISTATS 2023, Valencia, Spain, Apr 25 2023 - Apr 27 2023
Note

QC 20241204

Available from: 2023-08-21. Created: 2023-08-21. Last updated: 2025-02-07. Bibliographically approved
Pérez Rey, L. A., Marchetti, G. L., Kragic, D., Jarnikov, D. & Holenderski, M. (2023). Equivariant Representation Learning in the Presence of Stabilizers. In: Machine Learning and Knowledge Discovery in Databases: Research Track - European Conference, ECML PKDD 2023, Proceedings. Paper presented at European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, ECML PKDD 2023, Turin, Italy, Sep 18 2023 - Sep 22 2023 (pp. 693-708). Springer Nature
Equivariant Representation Learning in the Presence of Stabilizers
2023 (English). In: Machine Learning and Knowledge Discovery in Databases: Research Track - European Conference, ECML PKDD 2023, Proceedings, Springer Nature, 2023, p. 693-708. Conference paper, Published paper (Refereed)
Abstract [en]

We introduce Equivariant Isomorphic Networks (EquIN) – a method for learning representations that are equivariant with respect to general group actions over data. Unlike existing equivariant representation learners, EquIN is suitable for group actions that are not free, i.e., that stabilize data via nontrivial symmetries. EquIN is theoretically grounded in the orbit-stabilizer theorem from group theory. This guarantees that an ideal learner infers isomorphic representations while trained on equivariance alone and thus fully extracts the geometric structure of data. We provide an empirical investigation on image datasets with rotational symmetries and show that taking stabilizers into account improves the quality of the representations.
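
The orbit-stabilizer theorem underlying EquIN states that |orbit(x)| * |stabilizer(x)| = |G| for a finite group G. A minimal sketch with the cyclic group Z_4 acting on grid points by quarter turns (a toy action, not the paper's image data): a generic point is moved freely, while the origin is stabilized by the whole group, which is exactly the non-free situation EquIN handles.

```python
def rot(p, k):
    """Act on an integer grid point by k quarter-turn rotations (Z_4)."""
    x, y = p
    for _ in range(k % 4):
        x, y = -y, x
    return (x, y)

def orbit(p):
    return {rot(p, k) for k in range(4)}

def stabilizer(p):
    return {k for k in range(4) if rot(p, k) == p}

# A generic point is moved freely; the origin is fixed by all of Z_4.
print(orbit((1, 0)), stabilizer((0, 0)))
```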

Place, publisher, year, edition, pages
Springer Nature, 2023
Keywords
Equivariance, Lie Groups, Representation Learning
National Category
Geometry; Mathematical Analysis
Identifiers
urn:nbn:se:kth:diva-339298 (URN)10.1007/978-3-031-43421-1_41 (DOI)001156141200041 ()2-s2.0-85174442272 (Scopus ID)
Conference
European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, ECML PKDD 2023, Turin, Italy, Sep 18 2023 - Sep 22 2023
Note

Part of ISBN 9783031434204

QC 20231106

Available from: 2023-11-06. Created: 2023-11-06. Last updated: 2024-03-04. Bibliographically approved
Marchetti, G. L., Tegner, G., Varava, A. & Kragic, D. (2023). Equivariant Representation Learning via Class-Pose Decomposition. In: Proceedings of the 26th International Conference on Artificial Intelligence and Statistics, AISTATS 2023. Paper presented at 26th International Conference on Artificial Intelligence and Statistics, AISTATS 2023, Valencia, Spain, Apr 25 2023 - Apr 27 2023 (pp. 4745-4756). ML Research Press, Vol. 206
Equivariant Representation Learning via Class-Pose Decomposition
2023 (English). In: Proceedings of the 26th International Conference on Artificial Intelligence and Statistics, AISTATS 2023, ML Research Press, 2023, Vol. 206, p. 4745-4756. Conference paper, Published paper (Refereed)
Abstract [en]

We introduce a general method for learning representations that are equivariant to symmetries of data. Our central idea is to decompose the latent space into an invariant factor and the symmetry group itself. The components semantically correspond to intrinsic data classes and poses, respectively. The learner is trained on a loss encouraging equivariance based on supervision from relative symmetry information. The approach is motivated by theoretical results from group theory and guarantees representations that are lossless, interpretable and disentangled. We provide an empirical investigation via experiments involving datasets with a variety of symmetries. Results show that our representations capture the geometry of data and outperform other equivariant representation learning frameworks.
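
The class-pose decomposition can be sketched with a toy latent space where the symmetry group is planar rotation: a latent is a pair (class, angle), and a symmetry acts on the pose factor alone, leaving the invariant class factor untouched. All values below (the "mug" label, the angles) are hypothetical illustrations, not the paper's datasets.

```python
import math

def act(h, z):
    """A rotation h acts on a class-pose latent z = (c, g) by moving
    only the pose factor: h . (c, g) = (c, g + h mod 2*pi)."""
    c, g = z
    return (c, (g + h) % (2 * math.pi))

z = ("mug", 0.5)  # hypothetical class label and pose angle
h1, h2 = 1.0, 2.0

print(act(h2, act(h1, z)))  # equals act(h1 + h2, z): a genuine group action
```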

Place, publisher, year, edition, pages
ML Research Press, 2023
National Category
Robotics and automation
Identifiers
urn:nbn:se:kth:diva-334435 (URN)001222727704045 ()2-s2.0-85165155542 (Scopus ID)
Conference
26th International Conference on Artificial Intelligence and Statistics, AISTATS 2023, Valencia, Spain, Apr 25 2023 - Apr 27 2023
Note

QC 20241204

Available from: 2023-08-21. Created: 2023-08-21. Last updated: 2025-02-09. Bibliographically approved
Reichlin, A., Marchetti, G. L., Yin, H., Varava, A. & Kragic, D. (2023). Learning Geometric Representations of Objects via Interaction. In: Machine Learning and Knowledge Discovery in Databases: Research Track - European Conference, ECML PKDD 2023, Proceedings. Paper presented at European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, ECML PKDD 2023, Turin, Italy, Sep 18 2023 - Sep 22 2023 (pp. 629-644). Springer Nature
Learning Geometric Representations of Objects via Interaction
2023 (English). In: Machine Learning and Knowledge Discovery in Databases: Research Track - European Conference, ECML PKDD 2023, Proceedings, Springer Nature, 2023, p. 629-644. Conference paper, Published paper (Refereed)
Abstract [en]

We address the problem of learning representations from observations of a scene involving an agent and an external object the agent interacts with. To this end, we propose a representation learning framework extracting the location in physical space of both the agent and the object from unstructured observations of arbitrary nature. Our framework relies on the actions performed by the agent as the only source of supervision, while assuming that the object is displaced by the agent via unknown dynamics. We provide a theoretical foundation and formally prove that an ideal learner is guaranteed to infer an isometric representation, disentangling the agent from the object and correctly extracting their locations. We empirically evaluate our framework on a variety of scenarios, showing that it outperforms vision-based approaches such as a state-of-the-art keypoint extractor. Moreover, we demonstrate how the extracted representations enable the agent to solve downstream tasks via reinforcement learning in an efficient manner.
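
The supervision signal can be sketched in its simplest form: if actions are planar translations, composing them determines the agent's position up to the unknown starting point, which is why position can be recovered up to an isometry from actions alone. The moves and choice of origin below are hypothetical toy values, not the paper's dynamics.

```python
# The only supervision is the sequence of actions performed by the
# agent (here: hypothetical planar translations). Integrating them
# pins down the agent's position up to the arbitrary starting point.
actions = [(1.0, 0.0), (0.0, 2.0), (-1.0, 1.0)]

positions = [(0.0, 0.0)]  # arbitrary choice of origin
for dx, dy in actions:
    x, y = positions[-1]
    positions.append((x + dx, y + dy))

# Pairwise distances between recovered positions are independent of
# the chosen origin: the representation is determined up to translation.
print(positions)
```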

Place, publisher, year, edition, pages
Springer Nature, 2023
Keywords
Equivariance, Interaction, Representation Learning
National Category
Computer graphics and computer vision; Computer Sciences
Identifiers
urn:nbn:se:kth:diva-339271 (URN)10.1007/978-3-031-43421-1_37 (DOI)001156141200037 ()2-s2.0-85174436596 (Scopus ID)
Conference
European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, ECML PKDD 2023, Turin, Italy, Sep 18 2023 - Sep 22 2023
Note

Part of ISBN 9783031434204

QC 20231106

Available from: 2023-11-06. Created: 2023-11-06. Last updated: 2025-02-01. Bibliographically approved