Revisiting Proprioceptive Sensing for Articulated Object Manipulation

AI and Robotics Lab - Ghent University
Embracing Contacts workshop - ICRA 2023

We create a system that uses proprioceptive sensing to open articulated objects, such as this IKEA cabinet, with a position-controlled robot and a parallel gripper. By showing this is a viable approach, we hope to inspire future work to revisit proprioceptive sensing for articulated object manipulation.

Abstract

Assistive robots will need to interact with articulated objects such as cabinets or microwaves. Early work on creating systems for doing so used proprioceptive sensing to estimate joint mechanisms during contact. Nowadays, however, almost all systems use only vision and no longer consider proprioceptive information during contact. We believe that proprioceptive information during contact is a valuable source of information, and we found no clear motivation in the literature for abandoning it. Therefore, in this paper, we create a system that, starting from a given grasp, uses proprioceptive sensing to open cabinets with a position-controlled robot and a parallel gripper. We perform a qualitative evaluation of this system and find that slip between the gripper and the handle limits its performance. Nonetheless, the system already performs quite well. This poses the question: should we make more use of proprioceptive information during contact in articulated object manipulation systems, or is it not worth the added complexity, and can we manage with vision alone? We do not have an answer to this question, but we hope to spark some discussion on the matter.

System Overview


Under the assumption of a rigid transform between the gripper and the handle, we can use the gripper poses to estimate the joint twist. We modify a factor graph proposed by Heppert et al. to do so. An admittance controller is then used to deal with estimation errors while opening the articulated object, and each subsequent gripper pose allows us to refine the joint estimate.
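
To make this concrete, below is a minimal Python sketch of the two ingredients, under the rigid-grasp assumption stated above. It is not our implementation: the pose-log averaging is a simplified stand-in for the factor-graph estimator of Heppert et al., and the static compliance matrix is a stand-in for the admittance controller.

import numpy as np
from scipy.linalg import logm


def estimate_joint_twist(gripper_poses: list[np.ndarray]) -> np.ndarray:
    """Estimate the articulation twist from a sequence of 4x4 gripper poses (world frame).

    With a rigid gripper-handle transform, T_i @ inv(T_0) = expm(theta_i * [xi]),
    so the matrix log of each relative transform is a sample of the joint twist
    xi (in the world frame), scaled by the unknown joint displacement theta_i.
    """
    T0_inv = np.linalg.inv(gripper_poses[0])
    samples = []
    for T in gripper_poses[1:]:
        rel = T @ T0_inv                      # rigid motion induced by the joint
        xi_hat = np.real(logm(rel))           # 4x4 twist matrix, scaled by theta_i
        w = np.array([xi_hat[2, 1], xi_hat[0, 2], xi_hat[1, 0]])
        v = xi_hat[:3, 3]
        samples.append(np.concatenate([v, w]))
    xi = np.mean(samples, axis=0)             # crude averaging instead of a factor graph
    w_norm = np.linalg.norm(xi[3:])
    scale = w_norm if w_norm > 1e-6 else np.linalg.norm(xi[:3])  # revolute vs. prismatic
    return xi / scale                         # unit twist [v, w]


def admittance_offset(wrench: np.ndarray, compliance: np.ndarray) -> np.ndarray:
    """Map a measured wrench (6,) to a small Cartesian pose offset (6,).

    A purely static admittance law: residual forces caused by errors in the
    estimated joint make the commanded pose give way slightly, instead of
    being fought by the stiff position controller.
    """
    return compliance @ wrench

Re-running the estimation as new gripper poses come in is what lets subsequent poses refine the joint estimate.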

Results

We find that our system is able to open various articulated objects, irrespective of their relative pose or surroundings. This reconfirms the potential of proprioceptive sensing for articulated object manipulation. Nonetheless, we observe a few limitations, the first of which is directly related to the use of proprioceptive sensing:

Slip between the gripper and the handle can occur (see, for example, the video of the upside-down cabinet) and degrades the articulation estimation, since it violates the rigid-transform assumption. This does not necessarily result in a failure to open the object, yet it will need to be addressed to obtain more robust systems (a hypothetical consistency check for flagging such slip is sketched after these limitations).

Using fixed grasps, i.e. grasping the handle once and then attempting to open the object without regrasping, is not always possible. For some objects (such as the small oven in the videos above), it results in collisions with the environment. It also aggravates the robot's workspace limitations, so that the object cannot always be opened fully from the initial grasp. Both situations can be addressed by regrasping the object as needed.

Finally, it is important to note that our system takes about 2 minutes to open an articulated object. Even though this is comparable to other works, it is two orders of magnitude slower than a human.
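
Returning to the slip limitation above: because slip violates the rigid-transform assumption, the twist implied by a new gripper pose stops agreeing with the running estimate. A hypothetical consistency check (not part of the current system) could exploit this to flag slip, reusing the estimate_joint_twist sketch from the overview:

import numpy as np


def slip_score(xi_estimate: np.ndarray, xi_sample: np.ndarray) -> float:
    """Crude disagreement measure between the running twist estimate and the
    twist implied by the newest gripper pose (both 6-vectors [v, w]).

    Values near 0 are consistent with a rigid grasp; larger values suggest the
    handle slipped inside the gripper. Note that this mixes linear and angular
    units, which is acceptable for a sketch but not physically principled.
    """
    a = xi_estimate / np.linalg.norm(xi_estimate)
    b = xi_sample / np.linalg.norm(xi_sample)
    return 1.0 - abs(float(a @ b))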

Open Questions

We have seen that proprioceptive sensing can be used to open articulated objects. However, it introduces additional complexity, and slip needs to be handled. Furthermore, visual information is still required to find appropriate grasp poses. This brings us to the main question: should we use proprioceptive sensing for articulated object manipulation, or can we manage with vision alone? And if we do use proprioceptive sensing, how can we efficiently combine it with visual information?
We believe these are exciting questions and offer directions for future work.

Feel free to contact us if you have an opinion on the matter or if you have any other questions or comments!

BibTeX

@inproceedings{lips2023proprioceptive-articulated-manipulation,
  author    = {Lips, Thomas and wyffels, Francis},
  title     = {Revisiting Proprioceptive Sensing for Articulated Object Manipulation},
  booktitle = {Embracing Contacts workshop - IEEE International Conference on Robotics and Automation 2023 (ICRA)},
  year      = {2023},
}