

CLOTH manIpulation Learning from DEmonstrations

CLOTHILDE

Textile objects pervade human environments, and their versatile manipulation by robots would open up a whole range of possibilities, from increasing the autonomy of elderly and disabled people, through housekeeping and hospital logistics, to novel automation in the online clothing business and in upholstered product manufacturing. Although efficient procedures exist for the robotic handling of rigid objects and for the virtual rendering of deformable objects, cloth manipulation in the real world has proven elusive: the vast number of degrees of freedom involved in non-rigid deformations leads to unmanageable uncertainty in perception and in action outcomes.

This proposal aims to develop a theory of cloth manipulation and to carry it all the way down to prototype implementation in our lab. By combining powerful recent tools from computational topology and machine learning, we plan to characterize the state of textile objects and their transformations under given actions in a compact, operational way (i.e., encoding task-relevant topological changes), which would permit probabilistic planning of actions (first one-handed, then bimanual) that ensure reaching a desired cloth configuration despite noisy perceptions and inaccurate actions.
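
To illustrate the kind of planning such a compact state description enables, the sketch below treats a handful of hypothetical topology-based cloth classes as states of a small Markov decision process and runs value iteration to choose actions that reach a goal configuration despite stochastic outcomes. The state names, actions and transition probabilities are invented placeholders for illustration only, not results or data from the project.

```python
import numpy as np

# Toy probabilistic planner over a discrete, compact cloth-state description.
# States, actions and outcome probabilities are illustrative placeholders.
states = ["crumpled", "spread", "half_folded", "folded"]   # hypothetical topological classes
actions = ["flatten", "fold"]
goal = "folded"

# P[action][state] -> list of (next_state, probability): noisy action outcomes.
P = {
    "flatten": {
        "crumpled":    [("spread", 0.7), ("crumpled", 0.3)],
        "spread":      [("spread", 1.0)],
        "half_folded": [("spread", 0.8), ("half_folded", 0.2)],
        "folded":      [("folded", 1.0)],
    },
    "fold": {
        "crumpled":    [("crumpled", 0.9), ("half_folded", 0.1)],
        "spread":      [("half_folded", 0.7), ("crumpled", 0.3)],
        "half_folded": [("folded", 0.6), ("half_folded", 0.3), ("crumpled", 0.1)],
        "folded":      [("folded", 1.0)],
    },
}

def value_iteration(gamma=0.95, step_cost=-1.0, iters=200):
    """Compute a policy that reaches the goal as reliably/cheaply as possible under noisy outcomes."""
    V = {s: 0.0 for s in states}
    for _ in range(iters):
        for s in states:
            if s == goal:
                V[s] = 0.0
                continue
            V[s] = max(step_cost + gamma * sum(p * V[s2] for s2, p in P[a][s]) for a in actions)
    return {s: max(actions, key=lambda a: sum(p * V[s2] for s2, p in P[a][s]))
            for s in states if s != goal}

print(value_iteration())   # e.g. {'crumpled': 'flatten', 'spread': 'fold', 'half_folded': 'fold'}
```

The same machinery extends to richer state spaces and bimanual actions; the point of the toy example is only that, once cloth configurations are summarised by task-relevant topological classes, planning under action uncertainty becomes tractable.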

In our approach, the robot will learn manipulation skills from an initial human demonstration, subsequently refined through reinforcement learning plus occasional requests for user advice. The skills will be encoded as parameterised dynamical systems, and safe interaction with humans will be guaranteed by a predictive controller based on a model of the robot dynamics. Prototypes will be developed for three envisaged applications: recognizing and folding clothes, putting an elastic cover on a mattress or a car seat, and helping elderly and disabled people to dress. The broad Robotics and AI background of the PI and the project's narrow focus on clothing seem most appropriate for achieving a breakthrough in this hard fundamental research topic.
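
One common family of parameterised dynamical systems used to encode demonstrated skills is the Dynamic Movement Primitive (DMP). The sketch below learns a 1-D DMP from a synthetic demonstration and replays it towards a new goal; the formulation, gains and demonstration signal are generic textbook-style choices, not the project's actual skill encoding.

```python
import numpy as np

class DMP1D:
    """Minimal 1-D Dynamic Movement Primitive: a parameterised dynamical system that
    reproduces a demonstrated trajectory and generalises it to new goals."""

    def __init__(self, n_basis=20, alpha=25.0, beta=6.25, alpha_x=4.0):
        self.n_basis, self.alpha, self.beta, self.alpha_x = n_basis, alpha, beta, alpha_x
        self.centers = np.exp(-alpha_x * np.linspace(0, 1, n_basis))   # basis centres in phase space
        d = np.diff(self.centers)
        self.widths = 1.0 / np.concatenate([d, d[-1:]]) ** 2           # widths from centre spacing
        self.weights = np.zeros(n_basis)

    def _features(self, x):
        psi = np.exp(-self.widths * (x - self.centers) ** 2)
        return psi / (psi.sum() + 1e-10)

    def fit(self, y_demo, dt):
        """Learn forcing-term weights from one demonstration via locally weighted regression."""
        T = len(y_demo)
        self.y0, self.g = y_demo[0], y_demo[-1]
        yd = np.gradient(y_demo, dt)
        ydd = np.gradient(yd, dt)
        x = np.exp(-self.alpha_x * np.linspace(0, 1, T))               # canonical phase variable
        f_target = ydd - self.alpha * (self.beta * (self.g - y_demo) - yd)
        Psi = np.array([self._features(xi) for xi in x])               # (T, n_basis) basis activations
        scale = x * (self.g - self.y0)
        for i in range(self.n_basis):
            self.weights[i] = np.sum(scale * Psi[:, i] * f_target) / (np.sum(scale**2 * Psi[:, i]) + 1e-10)

    def rollout(self, duration, dt, goal=None):
        """Integrate the learned system; a different `goal` re-parameterises the same skill."""
        g = self.g if goal is None else goal
        y, yd, x = self.y0, 0.0, 1.0
        traj = []
        for _ in range(int(duration / dt)):
            f = (self._features(x) @ self.weights) * x * (g - self.y0)
            ydd = self.alpha * (self.beta * (g - y) - yd) + f
            yd += ydd * dt
            y += yd * dt
            x += -self.alpha_x * x * dt / duration
            traj.append(y)
        return np.array(traj)

# Example: learn from a synthetic "demonstration" and replay towards a new goal.
dt = 0.01
demo = np.sin(np.linspace(0, np.pi / 2, 100))   # stand-in for one demonstrated end-effector coordinate
dmp = DMP1D()
dmp.fit(demo, dt)
replay = dmp.rollout(duration=1.0, dt=dt, goal=1.5)
```

Because the goal and time scaling enter as explicit parameters, such encodings can be adjusted after the demonstration, which is what makes them convenient starting points for refinement by reinforcement learning or by user advice.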


ERC-2016-ADG