Manipulation and Grasping

Robotic grasping has been studied for more than 30 years, as it is an essential skill for both industrial and service robots. Most grasp generation methods assume known objects and a static, clutter-free environment, assumptions that do not hold in more dynamic settings such as homes. In these environments, robots must rely on noisy sensory inputs to model their surroundings, which complicates the problem.

Current Projects

Deformable Object Manipulation

In this project we research how to manipulate deformable objects more efficiently, using dynamic manipulation and modeling deformable objects via graph structures. Our applications range from the manipulation of granular materials such as ground coffee to cloth manipulation.
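
As a toy illustration of the graph idea (not the project's actual model), the sketch below represents a small cloth patch as a graph of material points connected by structural edges; the relative-displacement edge features are the kind of input a learned graph dynamics model might consume. All names and parameters here are illustrative.

```python
import numpy as np

def cloth_graph(rows, cols, spacing=0.02):
    """Build node positions and an edge list for a rows x cols cloth patch."""
    # Node i * cols + j sits at grid coordinate (i, j) in the rest state.
    positions = np.array([[i * spacing, j * spacing, 0.0]
                          for i in range(rows) for j in range(cols)])
    edges = []
    for i in range(rows):
        for j in range(cols):
            n = i * cols + j
            if j + 1 < cols:             # horizontal structural edge
                edges.append((n, n + 1))
            if i + 1 < rows:             # vertical structural edge
                edges.append((n, n + cols))
    return positions, np.array(edges)

positions, edges = cloth_graph(4, 4)
# Edge features a learned dynamics model might use: the relative
# displacement between the two endpoints plus its norm.
rel = positions[edges[:, 1]] - positions[edges[:, 0]]
edge_features = np.hstack([rel, np.linalg.norm(rel, axis=1, keepdims=True)])
print(positions.shape, edges.shape, edge_features.shape)  # (16, 3) (24, 2) (24, 4)
```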

Exploiting Object Physical Properties for Grasping

In robotic manipulation, robots are required to interact with, and adapt to, unknown environments and objects. To accomplish these tasks successfully, robots need to identify various properties of the objects to be handled. For this reason, identifying object models that can represent the properties of objects has become a crucial issue in robotics. […]
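
As one hypothetical example of such a property estimate (not necessarily a method used in this project), the sketch below recovers an object's mass and the in-plane offset of its centre of mass from a single static wrist force/torque reading, assuming gravity acts along -z in the sensor frame:

```python
import numpy as np

G = 9.81  # gravitational acceleration [m/s^2]

def estimate_mass_and_com(force, torque):
    """force, torque: 3-vectors from a wrist F/T sensor while the object is
    held still. The force balance with gravity w = (0, 0, -m*G) gives m;
    torque = r x w then yields the x/y offset r of the centre of mass
    (the z offset is unobservable from a single pose)."""
    m = -force[2] / G
    # r x (0, 0, -m*G) = (-r_y * m * G, r_x * m * G, 0)
    com_x = torque[1] / (m * G)
    com_y = -torque[0] / (m * G)
    return m, np.array([com_x, com_y])

m, com = estimate_mass_and_com(np.array([0.0, 0.0, -4.905]),
                               np.array([0.10, 0.05, 0.0]))
print(m, com)  # ~0.5 kg, centre-of-mass offset ~(0.010, -0.020) m
```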

Learning geometry-based robot-manipulation skills

In many robot control problems, the data that define a skill have an intrinsic geometric structure: stiffness, damping, and manipulability ellipsoids are naturally represented as symmetric positive definite (SPD) matrices, orientations as unit quaternions, and sensory data as spatial covariances, each capturing the specific geometric characteristics of the skill. Typical learned skill models such as dynamic movement primitives (DMPs), probabilistic […]
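
A minimal sketch of the geometric machinery such models build on, using the standard affine-invariant log and exp maps on the SPD manifold: SPD data are mapped to a tangent space where Euclidean learning tools apply, and matrices are interpolated along geodesics. The stiffness values below are made up for illustration.

```python
import numpy as np

def _sym_fn(M, f):
    """Apply a scalar function f to the eigenvalues of a symmetric matrix M."""
    w, Q = np.linalg.eigh(M)
    return (Q * f(w)) @ Q.T

def spd_log(S, X):
    """Log map on the SPD manifold: tangent vector at S pointing towards X."""
    S_h = _sym_fn(S, np.sqrt)
    S_ih = _sym_fn(S, lambda w: 1.0 / np.sqrt(w))
    return S_h @ _sym_fn(S_ih @ X @ S_ih, np.log) @ S_h

def spd_exp(S, V):
    """Exp map on the SPD manifold, the inverse of spd_log."""
    S_h = _sym_fn(S, np.sqrt)
    S_ih = _sym_fn(S, lambda w: 1.0 / np.sqrt(w))
    return S_h @ _sym_fn(S_ih @ V @ S_ih, np.exp) @ S_h

# Geodesic interpolation between two stiffness ellipsoids: the path follows
# the manifold's metric and stays SPD even when extrapolating (t > 1),
# which linear element-wise blending does not guarantee.
K0 = np.diag([100.0, 400.0])                 # hypothetical stiffness [N/m]
K1 = np.array([[300.0, 50.0],
               [50.0, 200.0]])
K_half = spd_exp(K0, 0.5 * spd_log(K0, K1))  # geodesic midpoint
print(np.linalg.eigvalsh(K_half))            # strictly positive eigenvalues
```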

Past Projects

Shape-based grasping

In robotic grasping, knowing the object's shape allows for better grasp planning. However, in many environments it is impossible to know a priori the shapes of all possible objects, so the object to be grasped is usually perceived through some sensory input, commonly vision. One of the main problems with this approach […]
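
As a deliberately crude stand-in for shape perception (not the project's method), the sketch below fits a PCA-aligned bounding box to a partial point cloud; its smallest extent suggests a closing direction for a parallel-jaw gripper. All values are synthetic.

```python
import numpy as np

def pca_bounding_box(points):
    """points: (N, 3) partial point cloud. Returns centre, axes (rows), extents."""
    centre = points.mean(axis=0)
    # Principal axes of the cloud define an object-aligned frame.
    _, _, Vt = np.linalg.svd(points - centre, full_matrices=False)
    local = (points - centre) @ Vt.T       # points expressed in that frame
    extents = local.max(axis=0) - local.min(axis=0)
    return centre, Vt, extents

# Synthetic elongated "object": a parallel-jaw grasp could close along the
# smallest extent, i.e. across the object's thinnest dimension.
rng = np.random.default_rng(0)
points = rng.normal(size=(500, 3)) * np.array([0.10, 0.03, 0.02])
centre, axes, extents = pca_bounding_box(points)
print(np.round(extents, 3))  # extents sorted from largest to smallest axis
```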