Filled

Project reference

2024_P17_Gionfrida_Bergeles

Start date

10/1/24

First Academic supervisor

Dr Letizia Gionfrida

Second Academic supervisor

Prof Christos Bergeles

First Clinical supervisor

Prof Prokar Dasgupta

Supervised autonomy in minimally invasive surgery for manipulation of highly deformable objects

Recent developments in supervised autonomy for surgical robots aim to merge human manipulation abilities with the high-precision capabilities of robots. In the initial stage of laparoscopic surgery, known as docking, the procedure involves manipulating a rigid object, the endoscope, through percutaneous soft tissue. Understanding hand kinematics in this setting can enhance shared autonomy during docking. The proposed system aims to collect hand joint angles during co-manipulation, together with the global position and orientation of the hands, of the surgical instrument, and of the trocar. The docking system leverages RGB cameras to extract hand and wrist kinematics (e.g., using MediaPipe) as well as the poses of the robot’s end-effector and of the trocar. A deep neural network will then be trained to map hand and wrist kinematics to the pose of the trocar through which the laparoscopic tube should be inserted. The aim is to test the hypothesis that tracking hand kinematics can improve shared autonomy in docking and thereby enhance safety.
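
As an illustration of the first step, the sketch below shows how per-frame hand landmarks and joint angles might be extracted from an RGB stream with MediaPipe Hands. The camera index, confidence thresholds, landmark indices, and the joint_angle helper are illustrative assumptions, not details specified by the project.

```python
# Sketch: per-frame hand landmark and joint-angle extraction with MediaPipe Hands.
import cv2
import mediapipe as mp
import numpy as np

def joint_angle(a, b, c):
    """Flexion angle (radians) at landmark b, formed by segments b->a and b->c."""
    v1, v2 = np.asarray(a) - np.asarray(b), np.asarray(c) - np.asarray(b)
    cos_ang = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2) + 1e-8)
    return float(np.arccos(np.clip(cos_ang, -1.0, 1.0)))

cap = cv2.VideoCapture(0)  # RGB camera observing the co-manipulation (index assumed)
with mp.solutions.hands.Hands(max_num_hands=2,
                              min_detection_confidence=0.5,
                              min_tracking_confidence=0.5) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB input; OpenCV captures BGR
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            for hand in results.multi_hand_landmarks:
                # 21 landmarks per hand: normalised image coordinates (x, y) plus relative depth z
                pts = np.array([(lm.x, lm.y, lm.z) for lm in hand.landmark])
                # Example: index-finger PIP flexion from landmarks 5 (MCP), 6 (PIP), 7 (DIP)
                index_pip = joint_angle(pts[5], pts[6], pts[7])
cap.release()
```

Similarly, one possible shape for the learned mapping from hand and wrist kinematics to the trocar pose is a small regression network with separate position and orientation outputs. The sketch below is in PyTorch; the input dimensionality, layer sizes, and the position-plus-quaternion output split are assumptions, since the project does not specify an architecture.

```python
# Sketch: regression network mapping hand/wrist kinematic features to a 6-DoF trocar pose.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TrocarPoseRegressor(nn.Module):
    def __init__(self, in_dim=80, hidden=256):
        # in_dim: e.g. flattened joint angles, wrist pose, and end-effector pose (assumed size)
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
        )
        self.position_head = nn.Linear(hidden, 3)   # trocar position (x, y, z)
        self.rotation_head = nn.Linear(hidden, 4)   # trocar orientation as a quaternion

    def forward(self, x):
        h = self.backbone(x)
        position = self.position_head(h)
        quaternion = F.normalize(self.rotation_head(h), dim=-1)  # enforce unit norm
        return position, quaternion

# Usage: predict trocar poses for a batch of kinematic feature vectors
model = TrocarPoseRegressor()
features = torch.randn(8, 80)            # placeholder batch of hand/wrist features
pred_pos, pred_quat = model(features)    # (8, 3) positions, (8, 4) unit quaternions
```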