Tactile-aided Visual Navigation and Localisation for Robot-assisted Minimally Invasive Surgeries

Minimally invasive surgery (MIS) benefits over 20 million patients annually worldwide, offering reduced recovery times, less scarring, and lower risk of infection compared with open surgery. However, its adoption is limited by challenges such as restricted visibility, constrained access, and the absence of tactile feedback, which increase technical demands and the risk of procedural errors, particularly in delicate tasks such as tumour resection. This PhD project aims to develop a tactile sensing approach that enhances localisation, navigation, and tissue property identification during robotic MIS. Specifically, the project will focus on developing a vision-based tactile sensor integrated into the tip of a laparoscope. The sensor will enable real-time detection of cavity-wall contact and estimation of tissue stiffness with sub-millimetre accuracy, combining tactile and visual data for precise tumour boundary detection. This compact, integrated approach will enable robots to navigate complex anatomy intelligently, constituting a paradigm shift that improves surgical precision, safety, and outcomes in robotic MIS. The project builds on extensive background work carried out in the Robot Perception Lab, the Robotics and Vision in Medicine Lab, and through international collaborations.
