Publications (10 of 10)
Longhini, A. (2025). Adapting to Variations in Textile Properties for Robotic Manipulation. (Doctoral dissertation). KTH Royal Institute of Technology
2025 (English) Doctoral thesis, comprising papers (Other academic)
Abstract [en]

In spite of the rapid advancements in AI, tasks like laundry, tidying, and general household assistance remain challenging for robots due to their limited capacity to generalize manipulation skills across different variations of everyday objects. Manipulation of textiles, in particular, poses unique challenges due to their deformable nature and complex dynamics. In this thesis, we aim to enhance the generalization of robotic manipulation skills for textiles by addressing how robots can adapt their strategies based on the physical properties of deformable objects. We begin by identifying key factors of variation in textiles relevant to manipulation, drawing insights from overlooked taxonomies in the textile industry. The core challenge of adaptation is addressed by leveraging the synergies between interactive perception and cloth dynamics models. These are utilized to tackle two fundamental estimation problems to achieve adaptation: property identification, as these properties define the system's dynamics and how the object responds to external forces, and state estimation, which provides the feedback necessary for closing the action-perception loop. To identify object properties, we investigate how combining exploratory actions, such as pulling and twisting, with sensory feedback can enhance a robot's understanding of textile characteristics. Central to this investigation is the development of an adaptation module designed to encode textile properties from recent observations, enabling data-driven dynamics models to adjust their predictions according to the perceived properties.
To address state estimation challenges arising from cloth self-occlusions, we explore semantic descriptors and 3D tracking methods that integrate geometric observations, such as point clouds, with visual cues from RGB data. Finally, we integrate these modeling and perceptual components into a model-based manipulation framework and evaluate the generalization of the proposed method across a diverse set of real-world textiles. The results, demonstrating enhanced generalization, underscore the potential of adapting the manipulation in response to variations in textiles' properties and highlight the critical role of the action-perception loop in achieving adaptability.


Place, publisher, year, edition, pages
KTH Royal Institute of Technology, 2025. p. 82
Series
TRITA-EECS-AVL ; 2025:1
Keywords
Textile Variations, Robotic Manipulation, Generalization, Adaptation, Textila Variationer, Robotmanipulation, Generalisering, Anpassning
HSV category
Research subject
Computer Science
Identifiers
urn:nbn:se:kth:diva-357508 (URN), 978-91-8106-125-3 (ISBN)
Public defence
2025-01-14, https://kth-se.zoom.us/j/66979575369, F3 (Flodis), Lindstedtsvägen 26 & 28, KTH Campus, Stockholm, 13:00 (English)
Opponent
Supervisors
Note

QC 20241213

Available from: 2024-12-13 Created: 2024-12-12 Last updated: 2025-10-29 Bibliographically approved
Betran, S. B., Longhini, A., Vasco, M., Zhang, Y. & Kragic Jensfelt, D. (2025). FLAME: A Federated Learning Benchmark for Robotic Manipulation. In: IROS 2025 - 2025 IEEE/RSJ International Conference on Intelligent Robots and Systems, Conference Proceedings: . Paper presented at 2025 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2025, Hangzhou, China, Oct 19 2025 - Oct 25 2025 (pp. 2494-2500). Institute of Electrical and Electronics Engineers (IEEE)
2025 (English) In: IROS 2025 - 2025 IEEE/RSJ International Conference on Intelligent Robots and Systems, Conference Proceedings, Institute of Electrical and Electronics Engineers (IEEE), 2025, pp. 2494-2500. Conference paper, Published paper (Refereed)
Abstract [en]

Recent progress in robotic manipulation has been fueled by large-scale datasets collected across diverse environments. Training robotic manipulation policies on these datasets is traditionally performed in a centralized manner, raising concerns regarding scalability, adaptability, and data privacy. While federated learning enables decentralized, privacy-preserving training, its application to robotic manipulation remains largely unexplored. We introduce FLAME (Federated Learning Across Manipulation Environments), the first benchmark designed for federated learning in robotic manipulation. FLAME consists of: (i) a set of large-scale datasets of over 160,000 expert demonstrations of multiple manipulation tasks, collected across a wide range of simulated environments; (ii) a training and evaluation framework for robotic policy learning in a federated setting. We evaluate standard federated learning algorithms in FLAME, showing their potential for distributed policy learning and highlighting key challenges. Our benchmark establishes a foundation for scalable, adaptive, and privacy-aware robotic learning. The code is publicly available at https://github.com/KTH-RPL/ELSA-Robotics-Challenge.
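The federated setting FLAME evaluates can be illustrated with the classic FedAvg aggregation rule, where a server averages client model parameters weighted by local dataset size. This is a generic sketch of the rule, not FLAME's actual training code; the parameter vectors and client sizes below are invented stand-ins:

```python
import numpy as np

def fedavg(client_weights, client_sizes):
    """Aggregate client parameters, weighting each client by the
    number of local samples it trained on (the FedAvg rule)."""
    total = sum(client_sizes)
    return sum(w * (n / total) for w, n in zip(client_weights, client_sizes))

# Three simulated manipulation environments act as clients holding
# different amounts of local demonstration data (toy numbers).
clients = [np.array([1.0, 2.0]), np.array([3.0, 4.0]), np.array([5.0, 6.0])]
sizes = [100, 200, 100]

global_model = fedavg(clients, sizes)
print(global_model)  # → [3. 4.]
```

Because only parameter updates, never raw demonstrations, reach the server, aggregation of this kind is what gives the federated setting its privacy-preserving character.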

Place, publisher, year, edition, pages
Institute of Electrical and Electronics Engineers (IEEE), 2025
HSV category
Identifiers
urn:nbn:se:kth:diva-377806 (URN), 10.1109/IROS60139.2025.11245937 (DOI), 2-s2.0-105029951023 (Scopus ID)
Conference
2025 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2025, Hangzhou, China, Oct 19 2025 - Oct 25 2025
Note

Part of ISBN 9798331543938

QC 20260312

Available from: 2026-03-12 Created: 2026-03-12 Last updated: 2026-03-12 Bibliographically approved
Dominguez, D. C., Iannotta, M., Kashyap, A., Sun, S., Yang, Y., Cella, C., . . . Iovino, M. (2025). The First WARA Robotics Mobile Manipulation Challenge - Lessons Learned. In: Gasteratos, A Bellotto, N Tortora, S (Ed.), Proceedings European Conference on Mobile Robots, ECMR 2025: . Paper presented at 12th European Conference on Mobile Robots-ECMR-Biennial, SEP 02-05, 2025, ITALY. Institute of Electrical and Electronics Engineers (IEEE)
2025 (English) In: Proceedings European Conference on Mobile Robots, ECMR 2025 / [ed] Gasteratos, A., Bellotto, N., Tortora, S., Institute of Electrical and Electronics Engineers (IEEE), 2025. Conference paper, Published paper (Refereed)
Abstract [en]

The first WARA Robotics Mobile Manipulation Challenge, held in December 2024 at ABB Corporate Research in Västerås, Sweden, addressed the automation of task-intensive and repetitive manual labor in laboratory environments, specifically the transport and cleaning of glassware. Designed in collaboration with AstraZeneca, the challenge invited academic teams to develop autonomous robotic systems capable of navigating human-populated lab spaces and performing complex manipulation tasks, such as loading items into industrial dishwashers. This paper presents an overview of the challenge setup, its industrial motivation, and the four distinct approaches proposed by the participating teams. We summarize lessons learned from this edition and propose design improvements to enable a more effective second iteration in 2025. The initiative bridges an important gap in academia-industry collaboration within the domain of autonomous mobile manipulation systems by promoting the development and deployment of applied robotic solutions in real-world laboratory contexts.

Place, publisher, year, edition, pages
Institute of Electrical and Electronics Engineers (IEEE), 2025
Serie
European Conference on Mobile Robots, ISSN 2639-7919
Keywords
Mobile Manipulation, Collaborative Robotics, Lab Automation
HSV category
Identifiers
urn:nbn:se:kth:diva-378307 (URN), 10.1109/ECMR65884.2025.11163319 (DOI), 001592487100076 (), 2-s2.0-105018222663 (Scopus ID)
Conference
12th European Conference on Mobile Robots-ECMR-Biennial, SEP 02-05, 2025, ITALY
Note

Part of ISBN 979-8-3315-2706-8; 979-8-3315-2705-1

QC 20260318

Available from: 2026-03-18 Created: 2026-03-18 Last updated: 2026-03-18 Bibliographically approved
Longhini, A., Wang, Y., Garcia-Camacho, I., Blanco-Mulero, D., Moletta, M., Welle, M. C., . . . Kragic Jensfelt, D. (2025). Unfolding the Literature: A Review of Robotic Cloth Manipulation. Annual Review of Control, Robotics, and Autonomous Systems, 8(1), 295-322
2025 (English) In: Annual Review of Control, Robotics, and Autonomous Systems, E-ISSN 2573-5144, Vol. 8, no. 1, pp. 295-322. Article, review/survey (Refereed) Published
Abstract [en]

The realm of textiles spans clothing, households, healthcare, sports, and industrial applications. The deformable nature of these objects poses unique challenges that prior work on rigid objects cannot fully address. The increasing interest within the community in textile perception and manipulation has led to new methods that aim to address challenges in modeling, perception, and control, resulting in significant progress. However, this progress is often tailored to one specific textile or a subcategory of these textiles. To understand what restricts these methods and hinders current approaches from generalizing to a broader range of real-world textiles, this review provides an overview of the field, focusing specifically on how and to what extent textile variations are addressed in modeling, perception, benchmarking, and manipulation of textiles. We conclude by identifying key open problems and outlining grand challenges that will drive future advancements in the field.

Keywords
deformable object manipulation, generalization, physical property variations, task variations, textiles
HSV category
Identifiers
urn:nbn:se:kth:diva-363740 (URN), 10.1146/annurev-control-022723-033252 (DOI), 001488650100012 (), 2-s2.0-105004918049 (Scopus ID)
Note

QC 20250528

Available from: 2025-05-21 Created: 2025-05-21 Last updated: 2025-07-01 Bibliographically approved
Longhini, A., Welle, M. C., Erickson, Z. & Kragic, D. (2024). AdaFold: Adapting Folding Trajectories of Cloths via Feedback-Loop Manipulation. IEEE Robotics and Automation Letters, 9(11), 9183-9190
2024 (English) In: IEEE Robotics and Automation Letters, E-ISSN 2377-3766, Vol. 9, no. 11, pp. 9183-9190. Article in journal (Refereed) Published
Abstract [en]

We present AdaFold, a model-based feedback-loop framework for optimizing folding trajectories. AdaFold extracts a particle-based representation of cloth from RGB-D images and feeds back the representation to a model predictive control to re-plan folding trajectory at every time-step. A key component of AdaFold that enables feedback-loop manipulation is the use of semantic descriptors extracted from geometric features. These descriptors enhance the particle representation of the cloth to distinguish between ambiguous point clouds of differently folded cloths. Our experiments demonstrate AdaFold's ability to adapt folding trajectories of cloths with varying physical properties and generalize from simulated training to real-world execution.
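The feedback loop described above (observe the cloth state, re-plan with the model, execute the first action, repeat) is the standard model-predictive-control pattern. Below is a minimal random-shooting sketch with a toy one-step model; the dynamics, cost, and all parameters are illustrative stand-ins, not AdaFold's learned components:

```python
import numpy as np

rng = np.random.default_rng(0)

def dynamics(state, action):
    # Toy stand-in for a learned cloth model: the state drifts toward the action.
    return state + 0.5 * (action - state)

def plan(state, goal, horizon=5, samples=128):
    """Random-shooting MPC: sample action sequences, roll out the model,
    return the first action of the lowest-cost sequence."""
    best_cost, best_action = np.inf, None
    for _ in range(samples):
        actions = rng.uniform(-1, 1, size=(horizon, state.shape[0]))
        s, cost = state, 0.0
        for a in actions:
            s = dynamics(s, a)
            cost += np.linalg.norm(s - goal)
        if cost < best_cost:
            best_cost, best_action = cost, actions[0]
    return best_action

# Closed loop: re-plan at every step from the latest observed state.
state, goal = np.zeros(2), np.ones(2)
for _ in range(10):
    state = dynamics(state, plan(state, goal))

print(np.linalg.norm(state - goal))  # residual distance shrinks over the loop
```

Re-planning from the latest observation at every step is what lets the loop absorb model error; in AdaFold this same mechanism absorbs variation in the cloth's physical properties.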

Place, publisher, year, edition, pages
Institute of Electrical and Electronics Engineers (IEEE), 2024
Keywords
Trajectory optimization, Shape, Manipulation planning, perception for grasping and manipulation, RGB-D perception, semantic scene understanding
HSV category
Identifiers
urn:nbn:se:kth:diva-354332 (URN), 10.1109/LRA.2024.3436329 (DOI), 001316209900014 (), 2-s2.0-85199779805 (Scopus ID)
Note

QC 20241004

Available from: 2024-10-04 Created: 2024-10-04 Last updated: 2025-02-09 Bibliographically approved
Longhini, A., Büsching, M., Duisterhof, B. P., Lundell, J., Ichnowski, J., Björkman, M. & Kragic, D. (2024). Cloth-Splatting: 3D Cloth State Estimation from RGB Supervision. In: Proceedings of the 8th Conference on Robot Learning, CoRL 2024: . Paper presented at 8th Annual Conference on Robot Learning, November 6-9, 2024, Munich, Germany (pp. 2845-2865). ML Research Press
2024 (English) In: Proceedings of the 8th Conference on Robot Learning, CoRL 2024, ML Research Press, 2024, pp. 2845-2865. Conference paper, Published paper (Refereed)
Abstract [en]

We introduce Cloth-Splatting, a method for estimating 3D states of cloth from RGB images through a prediction-update framework. Cloth-Splatting leverages an action-conditioned dynamics model for predicting future states and uses 3D Gaussian Splatting to update the predicted states. Our key insight is that coupling a 3D mesh-based representation with Gaussian Splatting allows us to define a differentiable map between the cloth's state space and the image space. This enables the use of gradient-based optimization techniques to refine inaccurate state estimates using only RGB supervision. Our experiments demonstrate that Cloth-Splatting not only improves state estimation accuracy over current baselines but also reduces convergence time by ∼85 %.
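The prediction-update structure can be shown in miniature: a dynamics model proposes a state, and gradients of an image-space error through a differentiable state-to-observation map refine it. Here the "renderer" is just a fixed linear projection and the optimizer plain gradient descent; both are purely illustrative stand-ins for the 3D Gaussian Splatting renderer:

```python
import numpy as np

# A fixed linear "renderer": maps a 3-D state to a 2-D observation.
P = np.array([[1.0, 0.0, 0.5],
              [0.0, 1.0, 0.5]])

true_state = np.array([0.2, -0.1, 0.4])
observation = P @ true_state  # what the camera "sees"

# Prediction step: a (deliberately wrong) dynamics model proposes a state.
state = true_state + np.array([0.3, -0.2, 0.1])

# Update step: refine the prediction by descending the image-space error.
lr = 0.3
for _ in range(200):
    residual = P @ state - observation  # error measured in image space
    grad = P.T @ residual               # gradient of 0.5 * ||residual||**2
    state -= lr * grad

print(np.linalg.norm(P @ state - observation))  # image-space error ~ 0
```

The key property mirrored here is that only the 2-D observation supervises the 3-D state: the differentiable map is what carries RGB-level error back into state space.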

Place, publisher, year, edition, pages
ML Research Press, 2024
Keywords
3D State Estimation, Gaussian Splatting, Vision-based Tracking, Deformable Objects
HSV category
Identifiers
urn:nbn:se:kth:diva-357192 (URN), 2-s2.0-86000735293 (Scopus ID)
Conference
8th Annual Conference on Robot Learning, November 6-9, 2024, Munich, Germany
Note

QC 20250328

Available from: 2024-12-04 Created: 2024-12-04 Last updated: 2025-03-28 Bibliographically approved
Garcia-Camacho, I., Longhini, A., Welle, M. C., Alenyà, G., Kragic Jensfelt, D. & Borràs, J. (2024). Standardization of Cloth Objects and its Relevance in Robotic Manipulation. In: 2024 IEEE International Conference on Robotics and Automation, ICRA 2024: . Paper presented at 2024 IEEE International Conference on Robotics and Automation, ICRA 2024, Yokohama, Japan, May 13 2024 - May 17 2024 (pp. 8298-8304). Institute of Electrical and Electronics Engineers (IEEE)
2024 (English) In: 2024 IEEE International Conference on Robotics and Automation, ICRA 2024, Institute of Electrical and Electronics Engineers (IEEE), 2024, pp. 8298-8304. Conference paper, Published paper (Refereed)
Abstract [en]

The field of robotics faces inherent challenges in manipulating deformable objects, particularly in understanding and standardising fabric properties like elasticity, stiffness, and friction. While the significance of these properties is evident in the realm of cloth manipulation, accurately categorising and comprehending them in real-world applications remains elusive. This study sets out to address two primary objectives: (1) to provide a framework suitable for robotics applications to characterise cloth objects, and (2) to study how these properties influence robotic manipulation tasks. Our preliminary results validate the framework's ability to characterise cloth properties and compare cloth sets, and reveal the influence that different properties have on the outcome of five manipulation primitives. We believe that, in general, results on the manipulation of clothes should be reported along with a better description of the garments used in the evaluation. This paper proposes a set of these measures.

Place, publisher, year, edition, pages
Institute of Electrical and Electronics Engineers (IEEE), 2024
HSV category
Identifiers
urn:nbn:se:kth:diva-367195 (URN), 10.1109/ICRA57147.2024.10610630 (DOI), 001294576206015 (), 2-s2.0-85199751950 (Scopus ID)
Conference
2024 IEEE International Conference on Robotics and Automation, ICRA 2024, Yokohama, Japan, May 13 2024 - May 17 2024
Note

Part of ISBN 9798350384574

QC 20250715

Available from: 2025-07-15 Created: 2025-07-15 Last updated: 2025-08-22 Bibliographically approved
Longhini, A., Moletta, M., Reichlin, A., Welle, M. C., Held, D., Erickson, Z. & Kragic, D. (2023). EDO-Net: Learning Elastic Properties of Deformable Objects from Graph Dynamics. In: Proceedings - ICRA 2023: IEEE International Conference on Robotics and Automation. Paper presented at 2023 IEEE International Conference on Robotics and Automation, ICRA 2023, London, United Kingdom of Great Britain and Northern Ireland, May 29 2023 - Jun 2 2023 (pp. 3875-3881). Institute of Electrical and Electronics Engineers (IEEE)
2023 (English) In: Proceedings - ICRA 2023: IEEE International Conference on Robotics and Automation, Institute of Electrical and Electronics Engineers (IEEE), 2023, pp. 3875-3881. Conference paper, Published paper (Refereed)
Abstract [en]

We study the problem of learning graph dynamics of deformable objects that generalizes to unknown physical properties. Our key insight is to leverage a latent representation of elastic physical properties of cloth-like deformable objects that can be extracted, for example, from a pulling interaction. In this paper we propose EDO-Net (Elastic Deformable Object - Net), a model of graph dynamics trained on a large variety of samples with different elastic properties that does not rely on ground-truth labels of the properties. EDO-Net jointly learns an adaptation module, and a forward-dynamics module. The former is responsible for extracting a latent representation of the physical properties of the object, while the latter leverages the latent representation to predict future states of cloth-like objects represented as graphs. We evaluate EDO-Net both in simulation and real world, assessing its capabilities of: 1) generalizing to unknown physical properties, 2) transferring the learned representation to new downstream tasks.
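The two-module split, an adaptation module that encodes a probing interaction into a latent property code and a forward-dynamics module conditioned on that code, can be sketched with toy stand-ins. EDO-Net's actual modules are learned graph networks; the slope-based encoder and scalar stiffness below are invented purely for illustration:

```python
import numpy as np

def adaptation_module(pull_forces):
    """Encode a pulling interaction into a latent property code.
    Here: the slope of force vs. displacement, a crude elasticity proxy."""
    displacements = np.arange(1, len(pull_forces) + 1, dtype=float)
    return np.array([np.polyfit(displacements, pull_forces, 1)[0]])

def forward_dynamics(state, action, latent):
    """Predict the next state, conditioned on the latent code:
    stiffer cloth (larger latent) deforms less under the same action."""
    stiffness = latent[0]
    return state + action / (1.0 + stiffness)

# Probing a stiff vs. a soft sample yields different latents ...
stiff_latent = adaptation_module(np.array([2.0, 4.0, 6.0, 8.0]))  # slope 2.0
soft_latent = adaptation_module(np.array([0.5, 1.0, 1.5, 2.0]))   # slope 0.5

# ... so the same action produces different predicted deformations.
state, action = np.zeros(2), np.ones(2)
print(forward_dynamics(state, action, stiff_latent))
print(forward_dynamics(state, action, soft_latent))
```

As in the paper, no ground-truth property label appears anywhere: the latent is read off from the interaction itself, which is what makes the representation transferable to objects with unknown properties.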

Place, publisher, year, edition, pages
Institute of Electrical and Electronics Engineers (IEEE), 2023
HSV category
Identifiers
urn:nbn:se:kth:diva-336773 (URN), 10.1109/ICRA48891.2023.10161234 (DOI), 001036713003039 (), 2-s2.0-85168652855 (Scopus ID)
Conference
2023 IEEE International Conference on Robotics and Automation, ICRA 2023, London, United Kingdom of Great Britain and Northern Ireland, May 29 2023 - Jun 2 2023
Note

Part of ISBN 9798350323658

QC 20230920

Available from: 2023-09-20 Created: 2023-09-20 Last updated: 2025-02-01 Bibliographically approved
Longhini, A., Moletta, M., Reichlin, A., Welle, M. C., Kravberg, A., Wang, Y., . . . Kragic, D. (2023). Elastic Context: Encoding Elasticity for Data-driven Models of Textiles. In: Proceedings - ICRA 2023: IEEE International Conference on Robotics and Automation. Paper presented at 2023 IEEE International Conference on Robotics and Automation, ICRA 2023, London, United Kingdom of Great Britain and Northern Ireland, May 29 2023 - Jun 2 2023 (pp. 1764-1770). Institute of Electrical and Electronics Engineers (IEEE)
2023 (English) In: Proceedings - ICRA 2023: IEEE International Conference on Robotics and Automation, Institute of Electrical and Electronics Engineers (IEEE), 2023, pp. 1764-1770. Conference paper, Published paper (Refereed)
Abstract [en]

Physical interaction with textiles, such as assistive dressing or household tasks, requires advanced dexterous skills. The complexity of textile behavior during stretching and pulling is influenced by the material properties of the yarn and by the textile's construction technique, which are often unknown in real-world settings. Moreover, identification of physical properties of textiles through sensing commonly available on robotic platforms remains an open problem. To address this, we introduce Elastic Context (EC), a method to encode the elasticity of textiles using stress-strain curves adapted from textile engineering for robotic applications. We employ EC to learn generalized elastic behaviors of textiles and examine the effect of EC dimension on accurate force modeling of real-world non-linear elastic behaviors.
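The Elastic Context idea, describing a textile's elasticity by a fixed-size sample of its stress-strain curve, can be sketched as follows. The cubic stress model and the strain grid are invented for illustration; the paper's question of how the EC dimension affects force modeling corresponds to the `dim` parameter here:

```python
import numpy as np

def stress(strain, k1, k3):
    # Toy non-linear elastic response: linear term plus cubic stiffening.
    return k1 * strain + k3 * strain**3

def elastic_context(k1, k3, dim):
    """Sample the stress-strain curve at `dim` fixed strain values,
    yielding a fixed-size elasticity descriptor."""
    strains = np.linspace(0.0, 0.5, dim)
    return stress(strains, k1, k3)

# Two fabrics with different stiffness get distinguishable descriptors;
# a larger `dim` captures more of the curve's non-linearity.
ec_a = elastic_context(k1=2.0, k3=10.0, dim=4)
ec_b = elastic_context(k1=0.5, k3=1.0, dim=4)
print(ec_a)
print(ec_b)
```

A fixed-size vector like this can be fed directly to a data-driven model as conditioning input, which is the role EC plays in the paper.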

Place, publisher, year, edition, pages
Institute of Electrical and Electronics Engineers (IEEE), 2023
HSV category
Identifiers
urn:nbn:se:kth:diva-328397 (URN), 10.1109/ICRA48891.2023.10160740 (DOI), 001036713001083 (), 2-s2.0-85168704167 (Scopus ID)
Conference
2023 IEEE International Conference on Robotics and Automation, ICRA 2023, London, United Kingdom of Great Britain and Northern Ireland, May 29 2023 - Jun 2 2023
Note

Part of ISBN 9798350323658

QC 20230615

Available from: 2023-06-08 Created: 2023-06-08 Last updated: 2025-02-09 Bibliographically approved
Longhini, A., Welle, M. C., Mitsioni, I. & Kragic, D. (2021). Textile Taxonomy and Classification Using Pulling and Twisting. In: 2021 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS): Prague/Online 27.09-01.10.2021. Paper presented at 2021 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Prague/Online 27.09-01.10.2021 (pp. 7541-7548). Institute of Electrical and Electronics Engineers (IEEE)
2021 (English) In: 2021 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS): Prague/Online 27.09-01.10.2021, Institute of Electrical and Electronics Engineers (IEEE), 2021, pp. 7541-7548. Conference paper, Published paper (Refereed)
Abstract [en]

Identification of textile properties is an important milestone toward advanced robotic manipulation tasks that involve interaction with clothing items, such as assisted dressing, laundry folding, automated sewing, and textile recycling and reusing. Despite the abundance of work considering this class of deformable objects, many open problems remain. These relate to the choice and modelling of the sensory feedback as well as the control and planning of the interaction and manipulation strategies. Most importantly, there is no structured approach for studying and assessing different approaches that may bridge the gap between the robotics community and the textile production industry. To this end, we outline a textile taxonomy considering fiber types and production methods commonly used in the textile industry. We devise datasets according to the taxonomy and study how robotic actions, such as pulling and twisting of the textile samples, can be used for classification. We also provide important insights from the perspective of visualization and interpretability of the gathered data.

Place, publisher, year, edition, pages
Institute of Electrical and Electronics Engineers (IEEE), 2021
Series
IEEE International Conference on Intelligent Robots and Systems, ISSN 2153-0858
HSV category
Identifiers
urn:nbn:se:kth:diva-304613 (URN), 10.1109/IROS51168.2021.9635992 (DOI), 000755125506011 (), 2-s2.0-85124364312 (Scopus ID)
Conference
2021 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Prague/Online 27.09-01.10.2021
Note

QC 20220324

Part of conference proceedings: ISBN 978-166541714-3

Available from: 2021-11-08 Created: 2021-11-08 Last updated: 2025-02-09 Bibliographically approved
Organisations
Identifiers
ORCID iD: orcid.org/0000-0001-9125-6615