Ambrus, Rares
Publications (8 of 8)
Brucker, M., Durner, M., Ambrus, R., Marton, Z. C., Wendt, A., Jensfelt, P., . . . Triebel, R. (2018). Semantic Labeling of Indoor Environments from 3D RGB Maps. In: 2018 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA). Paper presented at IEEE International Conference on Robotics and Automation (ICRA), MAY 21-25, 2018, Brisbane, AUSTRALIA (pp. 1871-1878). IEEE Computer Society
Semantic Labeling of Indoor Environments from 3D RGB Maps
2018 (English) In: 2018 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA), IEEE Computer Society, 2018, pp. 1871-1878. Conference paper, Published paper (Refereed)
Abstract [en]

We present an approach to automatically assign semantic labels to rooms reconstructed from 3D RGB maps of apartments. Evidence for the room types is generated using state-of-the-art deep-learning techniques for scene classification and object detection based on automatically generated virtual RGB views, as well as from a geometric analysis of the map's 3D structure. The evidence is merged in a conditional random field, using statistics mined from different datasets of indoor environments. We evaluate our approach qualitatively and quantitatively and compare it to related methods.
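
The conditional-random-field fusion mentioned in the abstract can be illustrated with a toy example. The sketch below is an assumption-laden illustration, not the authors' implementation: hypothetical per-room unary costs (standing in for scene-classification and object-detection evidence) and hypothetical pairwise co-occurrence costs are combined into one energy over a made-up three-room map, and the minimum-energy labeling is found by brute-force enumeration. All room names, costs, and edges are invented for illustration.

    # Minimal CRF-style fusion sketch (toy example, not the paper's code).
    import itertools
    import numpy as np

    LABELS = ["kitchen", "office", "corridor"]

    # Unary evidence: negative log-probabilities per room (hypothetical values).
    unary = {
        0: np.array([0.2, 1.8, 1.5]),   # room 0 looks like a kitchen
        1: np.array([1.6, 0.4, 1.2]),   # room 1 looks like an office
        2: np.array([1.4, 1.3, 0.3]),   # room 2 looks like a corridor
    }
    # Pairwise evidence: cost of label pairs on adjacent rooms, e.g. mined
    # from co-occurrence statistics (hypothetical values).
    pairwise = np.array([
        [0.8, 0.5, 0.1],
        [0.5, 0.8, 0.1],
        [0.1, 0.1, 0.6],
    ])
    edges = [(0, 2), (1, 2)]  # rooms connected through the corridor

    def energy(assignment):
        e = sum(unary[r][l] for r, l in enumerate(assignment))
        e += sum(pairwise[assignment[a], assignment[b]] for a, b in edges)
        return e

    best = min(itertools.product(range(len(LABELS)), repeat=len(unary)), key=energy)
    print([LABELS[l] for l in best])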

Place, publisher, year, edition, pages
IEEE Computer Society, 2018
Series
IEEE International Conference on Robotics and Automation ICRA, ISSN 1050-4729
National subject category
Computer Vision and Robotics (Autonomous Systems)
Identifiers
urn:nbn:se:kth:diva-237161 (URN), 000446394501066 (), 2-s2.0-85063131122 (Scopus ID), 978-1-5386-3081-5 (ISBN)
Conference
IEEE International Conference on Robotics and Automation (ICRA), MAY 21-25, 2018, Brisbane, AUSTRALIA
Research funder
Vetenskapsrådet, C0475401; Stiftelsen för strategisk forskning (SSF)
Note

QC 20181024

Available from: 2018-10-24 Created: 2018-10-24 Last updated: 2019-06-12 Bibliographically approved
Ambrus, R., Claici, S. & Wendt, A. (2017). Automatic Room Segmentation From Unstructured 3-D Data of Indoor Environments. IEEE Robotics and Automation Letters, 2(2), 749-756
Automatic Room Segmentation From Unstructured 3-D Data of Indoor Environments
2017 (English) In: IEEE Robotics and Automation Letters, ISSN 2377-3766, E-ISSN 1949-3045, Vol. 2, no. 2, pp. 749-756. Article in journal (Refereed) Published
Abstract [en]

We present an automatic approach for the task of reconstructing a 2-D floor plan from unstructured point clouds of building interiors. Our approach emphasizes accurate and robust detection of building structural elements and, unlike previous approaches, does not require prior knowledge of scanning device poses. The reconstruction task is formulated as a multiclass labeling problem that we approach using energy minimization. We use intuitive priors to define the costs for the energy minimization problem and rely on accurate wall and opening detection algorithms to ensure robustness. We provide detailed experimental evaluation results, both qualitative and quantitative, against state-of-the-art methods and labeled ground-truth data.
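
As a rough, hypothetical illustration of casting segmentation as multiclass labeling with energy minimization (not the paper's actual pipeline, which detects walls and openings in unstructured point clouds), the sketch below labels a small 2-D grid with a distance-based data term from two made-up room seeds plus a smoothness prior, optimised with simple iterated conditional modes. Grid size, seeds, and weights are invented.

    # Toy multiclass labeling via energy minimization (ICM sweeps).
    import numpy as np

    H, W, K = 20, 20, 2                       # grid size and number of rooms
    rng = np.random.default_rng(0)

    # Hypothetical data term: distance of every cell to each room "seed".
    seeds = np.array([[5, 5], [14, 14]])
    ys, xs = np.mgrid[0:H, 0:W]
    data_cost = np.stack(
        [np.hypot(ys - sy, xs - sx) for sy, sx in seeds], axis=-1)

    SMOOTH = 5.0                              # penalty for differing neighbours
    labels = rng.integers(0, K, size=(H, W))  # random initial labeling

    def local_energy(y, x, k):
        e = data_cost[y, x, k]
        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ny, nx = y + dy, x + dx
            if 0 <= ny < H and 0 <= nx < W and labels[ny, nx] != k:
                e += SMOOTH
        return e

    for _ in range(10):                       # ICM sweeps
        for y in range(H):
            for x in range(W):
                labels[y, x] = min(range(K), key=lambda k: local_energy(y, x, k))

    print(np.unique(labels, return_counts=True))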

Place, publisher, year, edition, pages
IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC, 2017
Keywords
Mapping, RGB-D perception, semantic scene understanding
National subject category
Computer Vision and Robotics (Autonomous Systems)
Identifiers
urn:nbn:se:kth:diva-217456 (URN), 10.1109/LRA.2017.2651939 (DOI), 000413736600049 ()
Note

QC 20171117

Available from: 2017-11-17 Created: 2017-11-17 Last updated: 2018-01-13 Bibliographically approved
Ambrus, R., Bore, N., Folkesson, J. & Jensfelt, P. (2017). Autonomous meshing, texturing and recognition of object models with a mobile robot. In: Bicchi, A. & Okamura, A. (Eds.), 2017 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS). Paper presented at IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), SEP 24-28, 2017, Vancouver, CANADA (pp. 5071-5078). IEEE
Autonomous meshing, texturing and recognition of object models with a mobile robot
2017 (English) In: 2017 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS) / [ed] Bicchi, A., Okamura, A., IEEE, 2017, pp. 5071-5078. Conference paper, Published paper (Refereed)
Abstract [en]

We present a system for creating object models from RGB-D views acquired autonomously by a mobile robot. We create high-quality textured meshes of the objects by approximating the underlying geometry with a Poisson surface. Our system employs two optimization steps, first registering the views spatially based on image features, and second aligning the RGB images to maximize photometric consistency with respect to the reconstructed mesh. We show that the resulting models can be used robustly for recognition by training a Convolutional Neural Network (CNN) on images rendered from the reconstructed meshes. We perform experiments on data collected autonomously by a mobile robot both in controlled and uncontrolled scenarios. We compare quantitatively and qualitatively to previous work to validate our approach.
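
The Poisson meshing step can be sketched with the open-source Open3D library; the library choice is an assumption of this note, this is not the authors' code, the input path is a placeholder, and the pose refinement and texture mapping described in the abstract are omitted.

    # Poisson surface reconstruction sketch with Open3D (illustrative only).
    import numpy as np
    import open3d as o3d

    # Fused RGB-D views of one object as a coloured point cloud (placeholder path).
    pcd = o3d.io.read_point_cloud("object_views_fused.pcd")

    # Poisson reconstruction needs consistent normals.
    pcd.estimate_normals(
        search_param=o3d.geometry.KDTreeSearchParamHybrid(radius=0.05, max_nn=30))
    pcd.orient_normals_consistent_tangent_plane(30)

    # Approximate the underlying geometry with a Poisson surface.
    mesh, densities = o3d.geometry.TriangleMesh.create_from_point_cloud_poisson(
        pcd, depth=9)

    # Trim vertices reconstructed from little supporting evidence.
    d = np.asarray(densities)
    mesh.remove_vertices_by_mask(d < np.quantile(d, 0.05))

    o3d.io.write_triangle_mesh("object_mesh.ply", mesh)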

Place, publisher, year, edition, pages
IEEE, 2017
Series
IEEE International Conference on Intelligent Robots and Systems, ISSN 2153-0858
National subject category
Computer and Information Sciences
Identifiers
urn:nbn:se:kth:diva-225806 (URN), 000426978204127 (), 2-s2.0-85041961210 (Scopus ID), 978-1-5386-2682-5 (ISBN)
Conference
IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), SEP 24-28, 2017, Vancouver, CANADA
Research funder
EU, FP7, Seventh Framework Programme, 600623; Stiftelsen för strategisk forskning (SSF); Vetenskapsrådet, C0475401
Note

QC 20180409

Available from: 2018-04-09 Created: 2018-04-09 Last updated: 2019-08-20 Bibliographically approved
Ambrus, R., Bore, N., Folkesson, J. & Jensfelt, P. (2017). Autonomous meshing, texturing and recognition of object models with a mobile robot. Paper presented at Intelligent Robots and Systems, IEEE/RSJ International Conference on, Vancouver, Canada
Autonomous meshing, texturing and recognition of object models with a mobile robot
2017 (English) Conference paper, Published paper (Refereed)
Abstract [en]

We present a system for creating object models from RGB-D views acquired autonomously by a mobile robot. We create high-quality textured meshes of the objects by approximating the underlying geometry with a Poisson surface. Our system employs two optimization steps, first registering the views spatially based on image features, and second aligning the RGB images to maximize photometric consistency with respect to the reconstructed mesh. We show that the resulting models can be used robustly for recognition by training a Convolutional Neural Network (CNN) on images rendered from the reconstructed meshes. We perform experiments on data collected autonomously by a mobile robot both in controlled and uncontrolled scenarios. We compare quantitatively and qualitatively to previous work to validate our approach.

Place, publisher, year, edition, pages
Vancouver, Canada, 2017
National subject category
Computer Vision and Robotics (Autonomous Systems)
Identifiers
urn:nbn:se:kth:diva-215232 (URN)
Conference
Intelligent Robots and Systems, IEEE/RSJ International Conference on
Note

QC 20171009

Available from: 2017-10-05 Created: 2017-10-05 Last updated: 2018-01-13 Bibliographically approved
Bore, N., Ambrus, R., Jensfelt, P. & Folkesson, J. (2017). Efficient retrieval of arbitrary objects from long-term robot observations. Robotics and Autonomous Systems, 91, 139-150
Efficient retrieval of arbitrary objects from long-term robot observations
2017 (English) In: Robotics and Autonomous Systems, ISSN 0921-8890, E-ISSN 1872-793X, Vol. 91, pp. 139-150. Article in journal (Refereed) Published
Abstract [en]

We present a novel method for efficient querying and retrieval of arbitrarily shaped objects from large amounts of unstructured 3D point cloud data. Our approach first performs a convex segmentation of the data after which local features are extracted and stored in a feature dictionary. We show that the representation allows efficient and reliable querying of the data. To handle arbitrarily shaped objects, we propose a scheme which allows incremental matching of segments based on similarity to the query object. Further, we adjust the feature metric based on the quality of the query results to improve results in a second round of querying. We perform extensive qualitative and quantitative experiments on two datasets for both segmentation and retrieval, validating the results using ground truth data. Comparison with other state-of-the-art methods further reinforces the validity of the proposed method. Finally, we also investigate how the density and distribution of the local features within the point clouds influence the quality of the results.
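
A minimal, hypothetical sketch of a feature dictionary for retrieval is given below (not the paper's method): local descriptors are quantised into a small visual vocabulary, every segment becomes a normalised word histogram, and a query is ranked against the database by cosine similarity. The descriptors here are random stand-ins for real local 3-D features, and all sizes are invented.

    # Feature-dictionary retrieval sketch (bag-of-words over local descriptors).
    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(0)
    N_WORDS = 32

    # Hypothetical local descriptors for a set of segments and a query object.
    segments = [rng.normal(size=(rng.integers(50, 200), 33)) for _ in range(10)]
    query = rng.normal(size=(120, 33))

    # Build the dictionary from all descriptors seen so far.
    vocab = KMeans(n_clusters=N_WORDS, n_init=4, random_state=0).fit(
        np.vstack(segments))

    def histogram(desc):
        # Quantise descriptors into words and return a unit-length histogram.
        words = vocab.predict(desc)
        h = np.bincount(words, minlength=N_WORDS).astype(float)
        return h / np.linalg.norm(h)

    db = np.stack([histogram(s) for s in segments])
    scores = db @ histogram(query)            # cosine similarity (unit vectors)
    print("best matching segments:", np.argsort(scores)[::-1][:3])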

Place, publisher, year, edition, pages
ELSEVIER SCIENCE BV, 2017
Keywords
Mapping, Mobile robotics, Point cloud, Segmentation, Retrieval
National subject category
Computer Vision and Robotics (Autonomous Systems)
Identifiers
urn:nbn:se:kth:diva-205426 (URN), 10.1016/j.robot.2016.12.013 (DOI), 000396949800012 (), 2-s2.0-85015091269 (Scopus ID)
Note

QC 20170522

Available from: 2017-05-22 Created: 2017-05-22 Last updated: 2018-01-13 Bibliographically approved
Almeida, D., Ambrus, R., Caccamo, S., Chen, X., Cruciani, S., Pinto Basto De Carvalho, J. F., . . . Kragic, D. (2017). Team KTH’s Picking Solution for the Amazon Picking Challenge 2016. In: Warehouse Picking Automation Workshop 2017: Solutions, Experience, Learnings and Outlook of the Amazon Robotics Challenge. Paper presented at ICRA 2017.
Team KTH’s Picking Solution for the Amazon Picking Challenge 2016
2017 (English) In: Warehouse Picking Automation Workshop 2017: Solutions, Experience, Learnings and Outlook of the Amazon Robotics Challenge, 2017. Conference paper, Oral presentation only (Other (popular science, debate, etc.))
Abstract [en]

In this work we summarize the solution developed by Team KTH for the Amazon Picking Challenge 2016 in Leipzig, Germany. The competition simulated a warehouse automation scenario and was divided into two tasks: a picking task, where a robot picks items from a shelf and places them in a tote, and a stowing task, the inverse, where the robot picks items from a tote and places them in a shelf. We describe our approach to the problem starting from a high-level overview of our system, then delving into the details of our perception pipeline and our strategy for manipulation and grasping. The solution was implemented using a Baxter robot equipped with additional sensors.

National subject category
Robotics and automation
Research subject
Computer Science
Identifiers
urn:nbn:se:kth:diva-215327 (URN)
Conference
ICRA 2017
Note

QC 20171009

Available from: 2017-10-07 Created: 2017-10-07 Last updated: 2018-05-24 Bibliographically approved
Hawes, N., Ambrus, R., Bore, N., Folkesson, J., Jensfelt, P., Hanheide, M., et al. (2017). The STRANDS Project: Long-Term Autonomy in Everyday Environments. IEEE Robotics & Automation Magazine, 24(3), 146-156
The STRANDS Project: Long-Term Autonomy in Everyday Environments
2017 (English) In: IEEE Robotics & Automation Magazine, ISSN 1070-9932, E-ISSN 1558-223X, Vol. 24, no. 3, pp. 146-156. Article in journal (Refereed) Published
Place, publisher, year, edition, pages
IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC, 2017
National subject category
Computer Vision and Robotics (Autonomous Systems)
Identifiers
urn:nbn:se:kth:diva-216406 (URN), 10.1109/MRA.2016.2636359 (DOI), 000411010400017 (), 2-s2.0-85007063656 (Scopus ID)
Note

QC 20171020

Available from: 2017-10-20 Created: 2017-10-20 Last updated: 2018-01-13 Bibliographically approved
Ambrus, R., Folkesson, J. & Jensfelt, P. (2016). Unsupervised object segmentation through change detection in a long term autonomy scenario. In: IEEE-RAS International Conference on Humanoid Robots. Paper presented at 16th IEEE-RAS International Conference on Humanoid Robots, Humanoids 2016, 15 November 2016 through 17 November 2016 (pp. 1181-1187). IEEE
Unsupervised object segmentation through change detection in a long term autonomy scenario
2016 (English) In: IEEE-RAS International Conference on Humanoid Robots, IEEE, 2016, pp. 1181-1187. Conference paper, Published paper (Refereed)
Abstract [en]

In this work we address the problem of dynamic object segmentation in office environments. We make no prior assumptions on what is dynamic and static, and our reasoning is based on change detection between sparse and non-uniform observations of the scene. We model the static part of the environment, and we focus on improving the accuracy and quality of the segmented dynamic objects over long periods of time. We address the issue of adapting the static structure over time and incorporating new elements, for which we train and use a classifier whose output gives an indication of the dynamic nature of the segmented elements. We show that the proposed algorithms improve the accuracy and the rate of detection of dynamic objects by comparing with a labelled dataset.
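
As an assumption-based illustration of change detection against a static model (not the paper's pipeline), the sketch below flags points of a new observation that fall into voxels unoccupied in the reference map and clusters them into candidate dynamic objects. The point clouds are synthetic stand-ins for registered RGB-D sweeps, and the voxel size and clustering parameters are invented.

    # Voxel-based change detection and clustering of candidate dynamic objects.
    import numpy as np
    from sklearn.cluster import DBSCAN

    VOXEL = 0.05  # 5 cm voxels

    def occupied_voxels(points):
        # Set of integer voxel indices touched by a point cloud.
        return set(map(tuple, np.floor(points / VOXEL).astype(int)))

    rng = np.random.default_rng(0)
    static_map = rng.uniform(0, 4, size=(20000, 3))              # reference sweep
    new_object = rng.normal([2.0, 2.0, 0.3], 0.1, size=(500, 3))  # appeared object
    observation = np.vstack([static_map + rng.normal(0, 0.005, static_map.shape),
                             new_object])

    static_voxels = occupied_voxels(static_map)
    changed = np.array([tuple(v) not in static_voxels
                        for v in np.floor(observation / VOXEL).astype(int)])
    dynamic_points = observation[changed]

    # Group changed points into candidate dynamic objects; sparse false
    # positives near voxel boundaries end up as DBSCAN noise (label -1).
    labels = DBSCAN(eps=0.1, min_samples=20).fit_predict(dynamic_points)
    print("candidate objects:", len(set(labels)) - (1 if -1 in labels else 0))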

Place, publisher, year, edition, pages
IEEE, 2016
Keywords
Anthropomorphic robots, Signal detection, Change detection, Dynamic nature, Dynamic objects, Non-uniform, Object segmentation, Office environments, Static structures, Object detection
National subject category
Computer Vision and Robotics (Autonomous Systems)
Identifiers
urn:nbn:se:kth:diva-202843 (URN), 10.1109/HUMANOIDS.2016.7803420 (DOI), 000403009300175 (), 2-s2.0-85010207172 (Scopus ID), 9781509047185 (ISBN)
Conference
16th IEEE-RAS International Conference on Humanoid Robots, Humanoids 2016, 15 November 2016 through 17 November 2016
Note

QC 20170317

Available from: 2017-03-17 Created: 2017-03-17 Last updated: 2018-01-13 Bibliographically approved