1 - 12 of 12
  • 1.
    Almeida, Diogo
    et al.
    KTH, School of Computer Science and Communication (CSC), Robotics, perception and learning, RPL. KTH.
    Ambrus, Rares
    KTH, School of Computer Science and Communication (CSC), Computer Vision and Active Perception, CVAP.
    Caccamo, Sergio
    KTH, School of Electrical Engineering and Computer Science (EECS), Robotics, perception and learning, RPL.
    Chen, Xi
    KTH.
    Cruciani, Silvia
    Pinto Basto De Carvalho, Joao F
    KTH, School of Electrical Engineering and Computer Science (EECS), Robotics, perception and learning, RPL.
    Haustein, Joshua
    KTH, School of Electrical Engineering and Computer Science (EECS), Robotics, perception and learning, RPL.
    Marzinotto, Alejandro
    KTH, School of Computer Science and Communication (CSC), Computer Vision and Active Perception, CVAP.
    Vina, Francisco
    KTH.
    Karayiannidis, Yannis
    KTH.
    Ögren, Petter
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Optimization and Systems Theory.
    Jensfelt, Patric
    KTH, School of Computer Science and Communication (CSC), Robotics, perception and learning, RPL.
    Kragic, Danica
    KTH, School of Computer Science and Communication (CSC), Robotics, perception and learning, RPL.
    Team KTH’s Picking Solution for the Amazon Picking Challenge 2016 (2017). In: Warehouse Picking Automation Workshop 2017: Solutions, Experience, Learnings and Outlook of the Amazon Robotics Challenge, 2017. Conference paper (Other (popular science, discussion, etc.))
    Abstract [en]

    In this work we summarize the solution developed by Team KTH for the Amazon Picking Challenge 2016 in Leipzig, Germany. The competition simulated a warehouse automation scenario and was divided into two tasks: a picking task, where a robot picks items from a shelf and places them in a tote, and the inverse stowing task, where the robot picks items from a tote and places them in a shelf. We describe our approach to the problem, starting from a high-level overview of our system and then delving into the details of our perception pipeline and our strategy for manipulation and grasping. The solution was implemented using a Baxter robot equipped with additional sensors.

  • 2.
    Cruciani, Silvia
    KTH, School of Electrical Engineering and Computer Science (EECS), Robotics, Perception and Learning, RPL.
    Vision-Based In-Hand Manipulation with Limited Dexterity (2019). Doctoral thesis, comprehensive summary (Other academic)
    Abstract [en]

    In-hand manipulation is an action that allows for changing the grasp on an object without the need to release it. This action is an important component in the manipulation process and helps solve many tasks. Human hands are dexterous instruments suitable for moving an object inside the hand. However, it is not common for robots to be equipped with dexterous hands due to the many challenges in control and mechanical design. In fact, robots are frequently equipped with simple parallel grippers, which are robust but lack dexterity. This thesis focuses on achieving in-hand manipulation with limited dexterity. The proposed solutions are based only on visual input, without the need for additional sensing capabilities in the robot's hand.

    Extrinsic dexterity allows simple grippers to execute in-hand manipulation by exploiting external supports. This thesis introduces new methods for solving in-hand manipulation using inertial forces, controlled friction and external pushes as additional supports to enhance the robot's manipulation capabilities. Pivoting is seen as a possible solution for simple grasp changes: two methods, which cope with inexact friction modeling, are reported, and pivoting is successfully integrated in an overall manipulation task. For large-scale in-hand manipulation, the Dexterous Manipulation Graph is introduced as a novel representation of the object. This graph is a useful tool for planning how to change a certain grasp via in-hand manipulation. It can also be exploited to combine in-hand manipulation and regrasping to augment the possibilities of adjusting the grasp. In addition, this method is extended to achieve in-hand manipulation even for objects with unknown shape. To execute the planned object motions within the gripper, dual-arm robots are exploited to compensate for the poor dexterity of parallel grippers: the second arm is seen as an additional support that helps in pushing and holding the object to successfully adjust the grasp configuration.

    This thesis presents examples of successful executions of tasks where in-hand manipulation is a fundamental step in the manipulation process, showing how the proposed methods are a viable solution for achieving in-hand manipulation with limited dexterity.

  • 3.
    Cruciani, Silvia
    et al.
    KTH, School of Electrical Engineering and Computer Science (EECS), Robotics, Perception and Learning, RPL.
    Almeida, Diogo
    KTH, School of Electrical Engineering and Computer Science (EECS), Robotics, Perception and Learning, RPL. KTH.
    Kragic, Danica
    KTH, School of Electrical Engineering and Computer Science (EECS), Robotics, Perception and Learning, RPL.
    Karayiannidis, Yiannis
    KTH, School of Electrical Engineering and Computer Science (EECS), Robotics, Perception and Learning, RPL.
    Discrete Bimanual Manipulation for Wrench Balancing. Manuscript (preprint) (Other academic)
    Abstract [en]

    Dual-arm robots can overcome the grasping force and payload limitations of a single arm by jointly grasping an object. However, if the distribution of mass of the grasped object is not even, each arm will experience different wrenches that can exceed its payload limits. In this work, we consider the problem of balancing the wrenches experienced by a dual-arm robot grasping a rigid tray. The distribution of wrenches among the robot arms changes due to objects being placed on the tray. We present an approach to reduce the wrench imbalance among arms through discrete bimanual manipulation. Our approach is based on sequential sliding motions of the grasp points on the surface of the object, to attain a more balanced configuration. This is achieved in a discrete manner, one arm at a time, to minimize the potential for undesirable object motion during execution. We validate our modeling approach and system design through a set of robot experiments.
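    The sequential-sliding idea above can be illustrated with a toy 1-D statics model (my sketch, not the authors' controller; all names are hypothetical): two grasp points support a rigid tray, loads on the tray create unequal support forces, and one grasp at a time slides toward a more balanced configuration.

```python
def support_forces(grasps, loads):
    """Static support forces for a massless 1-D tray held at two points.

    grasps: (x1, x2) grasp positions along the tray (x1 < x2)
    loads:  list of (position, weight) pairs resting on the tray
    """
    x1, x2 = grasps
    total = sum(w for _, w in loads)
    # Moment balance about x1 gives the force carried by the second grasp.
    f2 = sum(w * (x - x1) for x, w in loads) / (x2 - x1)
    return total - f2, f2  # (f1, f2)

def rebalance(grasps, loads, step=0.05, max_iters=100):
    """Slide one grasp at a time (discrete moves) to balance the forces.

    Toy model: may not converge for arbitrary inputs; illustrates the
    one-arm-at-a-time sliding strategy only.
    """
    x1, x2 = grasps
    for _ in range(max_iters):
        f1, f2 = support_forces((x1, x2), loads)
        if abs(f1 - f2) < 1e-6:
            break
        # Move the less-loaded grasp toward the load centroid to raise its share.
        centroid = sum(x * w for x, w in loads) / sum(w for _, w in loads)
        if f1 < f2:
            x1 += step if centroid > x1 else -step
        else:
            x2 += step if centroid > x2 else -step
    return (x1, x2)
```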

  • 4.
    Cruciani, Silvia
    et al.
    KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent systems, Robotics, Perception and Learning, RPL.
    Hang, Kaiyu
    Yale University.
    Smith, Christian
    KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent systems, Robotics, Perception and Learning, RPL.
    Kragic, Danica
    KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent systems, Robotics, Perception and Learning, RPL.
    Dual-Arm In-Hand Manipulation Using Visual Feedback (2019). Conference paper (Refereed)
    Abstract [en]

    In this work, we address the problem of executing in-hand manipulation based on visual input. Given an initial grasp, the robot has to change its grasp configuration without releasing the object. We propose a method for in-hand manipulation planning and execution based on information on the object’s shape using a dual-arm robot. From the available information on the object, which can be a complete point cloud but also partial data, our method plans a sequence of rotations and translations to reconfigure the object’s pose. This sequence is executed using non-prehensile pushes defined as relative motions between the two robot arms.

  • 5.
    Cruciani, Silvia
    et al.
    KTH, School of Electrical Engineering and Computer Science (EECS), Robotics, Perception and Learning, RPL.
    Yin, Hang
    KTH, School of Electrical Engineering and Computer Science (EECS), Robotics, Perception and Learning, RPL.
    Kragic, Danica
    KTH, School of Electrical Engineering and Computer Science (EECS), Robotics, Perception and Learning, RPL.
    In-Hand Manipulation of Objects with Unknown Shapes. Manuscript (preprint) (Other academic)
    Abstract [en]

    This work addresses the problem of changing grasp configurations on objects with an unknown shape through in-hand manipulation. Our approach leverages shape priors, learned as deep generative models, to infer novel object shapes from partial visual sensing. The Dexterous Manipulation Graph method is extended to build upon incremental data and account for estimation uncertainty in searching for a sequence of manipulation actions. We show that our approach successfully solves in-hand manipulation tasks with unknown objects, and demonstrate the validity of these solutions with robot experiments.

  • 6.
    Cruciani, Silvia
    et al.
    KTH, School of Computer Science and Communication (CSC), Robotics, perception and learning, RPL.
    Smith, Christian
    KTH, School of Computer Science and Communication (CSC), Robotics, perception and learning, RPL.
    In-Hand Manipulation Using Three-Stages Open Loop Pivoting (2017). In: 2017 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), ed. A. Bicchi and A. Okamura, IEEE, 2017, pp. 1244-1251. Conference paper (Refereed)
    Abstract [en]

    In this paper we propose a method for pivoting an object held by a parallel gripper, without requiring accurate dynamical models or advanced hardware. Our solution uses the motion of the robot arm for generating inertial forces to move the object. It also controls the rotational friction at the pivoting point by commanding a desired distance to the gripper's fingers. This method relies neither on fast and precise tracking systems to obtain the position of the tool, nor on real-time and high-frequency controllable robotic grippers to quickly adjust the finger distance. We demonstrate the efficacy of our method by applying it on a Baxter robot.
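    A minimal planar model can illustrate the interplay the abstract describes between arm acceleration and friction at the pivot (an illustrative sketch only; the paper's actual three-stage procedure and parameters are not reproduced here): the object is a point mass hanging from the grip point, the arm's horizontal acceleration creates an inertial torque, and a Coulomb friction torque, set by the finger distance, resists or permits rotation.

```python
import math

def simulate_pivot(accel_profile, tau_f, m=0.1, l=0.1, dt=1e-3):
    """Toy planar model of gravity/inertia pivoting about the grip point.

    The object is a point mass m at distance l below the pivot; theta is
    measured from the downward vertical. Each entry of accel_profile is the
    gripper's horizontal acceleration over one time step dt; tau_f is the
    Coulomb friction torque at the pivot (set by the finger distance).
    """
    g, inertia = 9.81, m * l * l
    theta, omega = 0.0, 0.0
    for a in accel_profile:
        # Inertial torque from arm acceleration plus the gravity torque.
        drive = -m * a * l * math.cos(theta) - m * g * l * math.sin(theta)
        if abs(omega) < 1e-9 and abs(drive) <= tau_f:
            continue  # static friction holds the object still
        # Kinetic friction opposes the rotation (or the impending rotation).
        friction = -math.copysign(tau_f, omega if abs(omega) >= 1e-9 else drive)
        omega += (drive + friction) / inertia * dt  # forward Euler step
        theta += omega * dt
    return theta
```

With a wide finger distance (large tau_f) the object stays fixed; loosening the grip (small tau_f) lets the same acceleration profile rotate it, which is the mechanism the abstract exploits.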

  • 7.
    Cruciani, Silvia
    et al.
    KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent systems, Robotics, Perception and Learning, RPL.
    Smith, Christian
    KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent systems, Robotics, Perception and Learning, RPL.
    Integrating Path Planning and Pivoting (2018). In: 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), ed. A. A. Maciejewski et al., IEEE, 2018, pp. 6601-6608. Conference paper (Refereed)
    Abstract [en]

    In this work we propose a method for integrating motion planning and in-hand manipulation. Commonly addressed as a step separate from the final execution, in-hand manipulation allows the robot to reorient an object within the end-effector for the successful outcome of the goal task. Jointly repositioning the object and moving the manipulator towards its desired final pose saves execution time and introduces more flexibility into the system. We address this problem using a pivoting strategy (i.e. in-hand rotation) for repositioning the object, and we integrate this strategy with a path planner for the execution of a complex task. The method is applied on a Baxter robot and its efficacy is shown by experimental results.

  • 8.
    Cruciani, Silvia
    et al.
    KTH, School of Electrical Engineering and Computer Science (EECS), Robotics, Perception and Learning, RPL.
    Smith, Christian
    KTH, School of Electrical Engineering and Computer Science (EECS), Robotics, Perception and Learning, RPL.
    Kragic, Danica
    KTH, School of Electrical Engineering and Computer Science (EECS), Robotics, Perception and Learning, RPL.
    Hang, Kaiyu
    Hong Kong Univ Sci & Technol, Dept Comp Sci & Engn, Hong Kong, Peoples R China.;Hong Kong Univ Sci & Technol, Inst Adv Study, Hong Kong, Peoples R China..
    Dexterous Manipulation Graphs (2018). In: 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), ed. A. A. Maciejewski et al., IEEE, 2018, pp. 2040-2047. Conference paper (Refereed)
    Abstract [en]

    We propose the Dexterous Manipulation Graph as a tool to address in-hand manipulation and reposition an object inside a robot's end-effector. This graph is used to plan a sequence of manipulation primitives so as to bring the object to the desired end pose. This sequence of primitives is translated into motions of the robot to move the object held by the end-effector. We use a dual-arm robot with parallel grippers to test our method on a real system and show successful planning and execution of in-hand manipulation.
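    The planning idea can be sketched as a plain graph search (a hypothetical, simplified rendering; the paper's graph construction from object geometry is not shown): nodes are discretized grasp configurations on the object's surface, edges are feasible in-hand manipulation primitives, and a shortest path gives the primitive sequence.

```python
from collections import deque

def plan_grasp_sequence(adjacency, start, goal):
    """Breadth-first search over a (hypothetical) Dexterous Manipulation
    Graph: each node is a grasp configuration, each edge is one feasible
    in-hand manipulation primitive. Returns the shortest node sequence."""
    queue = deque([[start]])
    visited = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path  # each consecutive pair is one primitive to execute
        for nxt in adjacency.get(path[-1], ()):
            if nxt not in visited:
                visited.add(nxt)
                queue.append(path + [nxt])
    return None  # goal grasp not reachable by in-hand primitives alone

# Toy graph: grasps A..E on an object's surface.
dmg = {"A": ["B"], "B": ["A", "C", "D"], "C": ["B"], "D": ["B", "E"], "E": ["D"]}
```

A `None` result is exactly the case where, per the thesis summary above, the planner would fall back to regrasping to reach the goal grasp.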

  • 9.
    Cruciani, Silvia
    et al.
    KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent systems, Robotics, Perception and Learning, RPL. KTH Royal Inst Technol, Div Robot Percept & Learning, EECS, S-11428 Stockholm, Sweden..
    Sundaralingam, Balakumar
    Univ Utah, Robot Ctr, Salt Lake City, UT 84112 USA.;Univ Utah, Sch Comp, Salt Lake City, UT 84112 USA..
    Hang, Kaiyu
    Yale Univ, Dept Mech Engn & Mat Sci, New Haven, CT 06520 USA..
    Kumar, Vikash
    Google AI, San Francisco, CA 94110 USA..
    Hermans, Tucker
    Univ Utah, Robot Ctr, Salt Lake City, UT 84112 USA.;Univ Utah, Sch Comp, Salt Lake City, UT 84112 USA.;NVIDIA Res, Santa Clara, CA USA..
    Kragic, Danica
    KTH, Superseded Departments (pre-2005), Numerical Analysis and Computer Science, NADA. KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent systems, Robotics, Perception and Learning, RPL. KTH, School of Electrical Engineering and Computer Science (EECS), Centres, Centre for Autonomous Systems, CAS. KTH Royal Inst Technol, Div Robot Percept & Learning, EECS, S-11428 Stockholm, Sweden..
    Benchmarking In-Hand Manipulation (2020). In: IEEE Robotics and Automation Letters, ISSN 2377-3766, E-ISSN 1949-3045, Vol. 5, no. 2, pp. 588-595. Article in journal (Refereed)
    Abstract [en]

    The purpose of this benchmark is to evaluate the planning and control aspects of robotic in-hand manipulation systems. The goal is to assess a system's ability to change the pose of a hand-held object using either the fingers, the environment, or a combination of both. Given an object surface mesh from the YCB data-set, we provide examples of initial and goal states (i.e. static object poses and fingertip locations) for various in-hand manipulation tasks. We further propose metrics that measure the error in reaching the goal state from a specific initial state, which, when aggregated across all tasks, also serve as a measure of the system's in-hand manipulation capability. We provide supporting software, task examples, and evaluation results associated with the benchmark.
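    A typical way to score such tasks is a translation/rotation error between the achieved and goal object poses; the sketch below is one common formulation (the benchmark's exact metrics may differ).

```python
import math

def pose_error(p1, q1, p2, q2):
    """Translation and rotation error between two object poses.

    p: (x, y, z) position; q: unit quaternion (w, x, y, z).
    Returns (Euclidean distance, angle of the relative rotation in radians).
    """
    trans_err = math.dist(p1, p2)
    # Angle between orientations; |dot| handles the q / -q double cover.
    dot = abs(sum(a * b for a, b in zip(q1, q2)))
    rot_err = 2.0 * math.acos(min(1.0, dot))  # clamp guards rounding error
    return trans_err, rot_err
```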

  • 10.
    Cruciani, Silvia
    et al.
    KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent systems, Robotics, Perception and Learning, RPL.
    Yin, Hang
    KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent systems, Robotics, Perception and Learning, RPL.
    Kragic, Danica
    KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent systems, Robotics, Perception and Learning, RPL.
    In-Hand Manipulation of Objects with Unknown Shapes. Manuscript (preprint) (Other academic)
    Abstract [en]

    This work addresses the problem of changing grasp configurations on objects with an unknown shape through in-hand manipulation. Our approach leverages shape priors, learned as deep generative models, to infer novel object shapes from partial visual sensing. The Dexterous Manipulation Graph method is extended to build upon incremental data and account for estimation uncertainty in searching for a sequence of manipulation actions. We show that our approach successfully solves in-hand manipulation tasks with unknown objects, and demonstrate the validity of these solutions with robot experiments.

  • 11.
    Haustein, Joshua A.
    et al.
    KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent systems, Robotics, Perception and Learning, RPL.
    Cruciani, Silvia
    KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent systems, Robotics, Perception and Learning, RPL.
    Asif, Rizwan
    KTH.
    Hang, Kaiyu
    KTH.
    Kragic, Danica
    KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent systems, Robotics, Perception and Learning, RPL.
    Placing Objects with prior In-Hand Manipulation using Dexterous Manipulation Graphs (2019). Conference paper (Refereed)
    Abstract [en]

    We address the problem of planning the placement of a grasped object with a robot manipulator. More specifically, the robot is tasked to place the grasped object such that a placement preference function is maximized. For this, we present an approach that uses in-hand manipulation to adjust the robot’s initial grasp and thereby extend the set of reachable placements. Given an initial grasp, the algorithm computes a set of grasps that can be reached by pushing and rotating the object in-hand. With this set of reachable grasps, it then searches for a stable placement that maximizes the preference function. If successful, it returns a sequence of in-hand pushes that adjust the initial grasp to a more advantageous one, together with a transport motion that carries the object to the placement. We evaluate our algorithm's performance in various placing scenarios and observe its effectiveness even in challenging scenes containing many obstacles. Our experiments demonstrate that re-grasping with in-hand manipulation increases the quality of placements the robot can reach. In particular, it enables the algorithm to find solutions in situations where safe placing with the initial grasp would not be possible.
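    The search the abstract describes can be summarized as a brute-force maximization over reachable grasps and candidate placements (an illustrative sketch with hypothetical names, not the authors' implementation):

```python
def best_placement(reachable_grasps, placements, feasible, preference):
    """Pick the (grasp, placement) pair that maximizes a preference function.

    reachable_grasps: grasps reachable from the initial grasp via in-hand pushes
    placements:       candidate stable placements
    feasible(g, p):   predicate, e.g. collision-free transport and stability
    preference(p):    user-supplied placement preference score
    """
    best, best_score = None, float("-inf")
    for grasp in reachable_grasps:
        for placement in placements:
            if not feasible(grasp, placement):
                continue  # skip colliding or unstable combinations
            score = preference(placement)
            if score > best_score:
                best, best_score = (grasp, placement), score
    return best  # None if no feasible pair exists
```

Enlarging `reachable_grasps` through in-hand manipulation is precisely what lets this search find high-preference placements the initial grasp alone could not reach.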

  • 12.
    Rakesh, Krishnan
    et al.
    KTH, School of Electrical Engineering and Computer Science (EECS), Robotics, perception and learning, RPL. Department of Electronics, Mathematics and Natural Sciences, University of Gävle, Gävle, Sweden.
    Cruciani, Silvia
    KTH, School of Electrical Engineering and Computer Science (EECS), Robotics, perception and learning, RPL.
    Gutierrez-Farewik, Elena
    KTH, School of Engineering Sciences (SCI), Mechanics.
    Björsell, Niclas
    Department of Electronics, Mathematics and Natural Sciences, University of Gävle, Gävle, Sweden.
    Smith, Christian
    KTH, School of Electrical Engineering and Computer Science (EECS), Robotics, perception and learning, RPL.
    Reliably Segmenting Motion Reversals of a Rigid-IMU Cluster Using Screw-Based Invariants (2018). Conference paper (Refereed)
    Abstract [en]

    Human-robot interaction (HRI) is moving towards the human-robot synchronization challenge. In robots like exoskeletons, this challenge translates to the reliable motion segmentation problem using wearable devices. Therefore, our paper explores the possibility of segmenting the motion reversals of a rigid-IMU cluster using screw-based invariants. Moreover, we evaluate the reliability of this framework with regard to sensor placement, speed and type of motion. Overall, our results show that screw-based invariants can reliably segment the motion reversals of a rigid-IMU cluster.
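    A minimal version of reversal segmentation on any scalar velocity invariant (an illustrative sketch; the paper's screw-based invariants are not computed here) simply reports the samples where the signal changes sign, with a small threshold to reject sensor noise:

```python
def reversal_indices(velocity, eps=1e-3):
    """Indices where a scalar velocity invariant reverses sign.

    velocity: sampled signal, e.g. speed along the instantaneous screw axis
    eps:      dead band; samples with |v| < eps are treated as noise
    """
    idx, prev = [], None
    for i, v in enumerate(velocity):
        if abs(v) < eps:
            continue  # ignore near-zero samples inside the dead band
        sign = v > 0
        if prev is not None and sign != prev:
            idx.append(i)  # motion reversal detected at this sample
        prev = sign
    return idx
```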
