
 Robot-assisted ultrasonic microsurgery

Project reference: SIE_03_22
First supervisor:    Antonios Pouliopoulos
Second supervisor: Kawal Rhode

Start date:   

Project summary: The aim of the project is to develop robot-assisted, ultrasound-mediated microsurgery in the brain. Neuronavigation-guided focused ultrasound (FUS) is being tested as a method to non-invasively increase the delivered drug dose across the blood-brain barrier (BBB). However, there is currently no link between the guiding system and the positioning of the FUS transducer, necessitating manual targeting, which is inaccurate and time-consuming. In this project, we will develop an automated system to achieve accurate FUS targeting and microsurgery using a 7 degrees-of-freedom robotic arm. The FUS transducer will be attached to the robotic arm, and its location will be continuously monitored using fiducial markers and an infrared camera. Treatment planning will dictate the required motion of the arm towards the targeted location. Finally, a feedback loop based on both ultrasound and optical data will reposition the FUS transducer in real time, to adjust for potential movement of the subject's head.

 

Project description: 

Brain pathologies such as Alzheimer's disease and brain tumours are notoriously difficult to treat, in part due to the presence of the blood-brain barrier (BBB), which blocks most drugs with a molecular weight higher than 400 Da. Focused ultrasound (FUS) in combination with pre-formed circulating microbubbles is the only method that allows both non-invasive and localized opening of the BBB. This technology has been tested for over two decades at the pre-clinical level and is rapidly translating into the clinic. One approach to FUS-mediated BBB opening uses neuronavigation systems to optically guide the placement of the focal volume at the desired location. Current clinical systems use a manually positioned arm for targeting, which is time-consuming and operator-dependent. Additionally, subject motion reduces the targeting accuracy and increases the off-target brain volume being treated. Finally, the FUS transducer has a defined focal volume, restricting the size of the brain area that can be treated in a single session.

 

In this project, we will develop a fully automated targeting and motion compensation system using a 7 degrees-of-freedom (7-DOF) robotic arm. Real-time feedback based on optics (i.e., an infrared camera) and acoustics (i.e., B-mode imaging) will be implemented to account for subject motion and skull-induced beam aberrations, in order to achieve micrometre precision in the ultrasonic intervention. Finally, the robotic arm will be programmed to move along pre-defined trajectories, in order to treat arbitrary brain volumes.
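As a minimal illustration of such a pre-defined trajectory, the sketch below generates a serpentine raster of focal-spot waypoints covering a rectangular target volume. The coordinate frame, spacing, and volume dimensions are illustrative assumptions, not project specifications.

```python
import numpy as np

def raster_waypoints(center, extent, spacing):
    """Generate a serpentine (boustrophedon) raster of focal-spot positions
    covering a rectangular volume.

    center  : (x, y, z) centre of the target volume in the treatment frame [mm]
    extent  : (dx, dy, dz) size of the volume along each axis [mm]
    spacing : step between adjacent focal spots [mm], assumed smaller than the focal width
    """
    cx, cy, cz = center
    dx, dy, dz = extent
    xs = np.arange(cx - dx / 2, cx + dx / 2 + 1e-9, spacing)
    ys = np.arange(cy - dy / 2, cy + dy / 2 + 1e-9, spacing)
    zs = np.arange(cz - dz / 2, cz + dz / 2 + 1e-9, spacing)

    waypoints = []
    for k, z in enumerate(zs):
        for j, y in enumerate(ys):
            row = xs if (j + k) % 2 == 0 else xs[::-1]  # reverse every other row
            waypoints.extend((x, y, z) for x in row)
    return np.asarray(waypoints)

# Example: a 10 x 10 x 6 mm volume sampled every 2 mm
points = raster_waypoints(center=(0.0, 0.0, 60.0), extent=(10.0, 10.0, 6.0), spacing=2.0)
print(points.shape)  # (N, 3) list of focal positions for the arm to visit
```

In practice, the arm controller would interpolate between such waypoints at the specified translational and rotational speeds.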

 

The primary aims of this project are the following:

1) Link the neuronavigation system (e.g., BrainSight) and the 7-DOF robotic arm for automated FUS targeting. The robotic arm will be programmed to position the FUS transducer at a specified location on the scalp. Furthermore, for volumetric treatments, the robotic arm will relocate at specified translational and rotational speeds to achieve non-invasive microsurgery along arbitrary trajectories (a coordinate-transform sketch follows this list).

2) Develop routines for real-time, feedback-based motion compensation, by continuously tracking the current position of the FUS transducer and the subject's head. We will first use optical data from an infrared camera, typically provided with neuronavigation systems, as input for the motion compensation loop, followed by B-mode ultrasound data acquired with a co-aligned ultrasound imaging array. Machine learning algorithms will track the skull reflection in real time and adjust the position of the robotic arm with micrometre precision (a motion-compensation sketch follows this list).

3) Develop the concept of twisting acoustic holography. 3D-printed holographic lenses are currently used to bend the acoustic focus into arbitrary shapes. A common limitation of existing acoustic lenses is that they can only bend the acoustic field along a single 2D plane. However, most treatment targets (e.g., brain tumours or amyloid-rich brain structures) have irregular 3D shapes. The 7-DOF robotic arm will enable rotation of the FUS transducer around its axial axis. We will perform numerical simulations to estimate the holographic deterioration induced by twisting holography, as a result of the skull differences along each acoustic path. Mock treatments with tissue-mimicking phantoms will establish the feasibility and efficacy of twisting acoustic holography for treating complex 3D brain volumes (a field-simulation sketch follows this list).
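For aim 1, linking the neuronavigation output to the arm amounts to chaining rigid-body transforms so that the planned target, reported in the tracking camera's frame, is expressed as a pose command in the robot base frame. The sketch below illustrates that chain only; the frame names, numerical values, and the assumption of a fiducial array on the robot base are placeholders, and no specific BrainSight or robot-controller API is implied.

```python
import numpy as np

def to_homogeneous(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a 3-vector translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Hypothetical calibration / tracking inputs (all 4x4 homogeneous transforms):
#   T_cam_target  : planned focal target pose, reported in the tracking camera frame
#   T_cam_robot   : pose of a fiducial array on the robot base, seen by the camera
#   T_flange_tool : hand-eye calibration from the robot flange to the FUS focus
T_cam_target = to_homogeneous(np.eye(3), np.array([0.10, -0.02, 0.45]))
T_cam_robot = to_homogeneous(np.eye(3), np.array([-0.30, 0.00, 0.40]))
T_flange_tool = to_homogeneous(np.eye(3), np.array([0.0, 0.0, 0.12]))

# Target pose expressed in the robot base frame
T_robot_target = np.linalg.inv(T_cam_robot) @ T_cam_target

# Flange pose to command so that the FUS focus lands on the target
T_robot_flange_cmd = T_robot_target @ np.linalg.inv(T_flange_tool)

print(T_robot_flange_cmd[:3, 3])  # commanded flange position [m]
```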
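For aim 2, the optical leg of the motion-compensation loop can be written as a rigid correction: compare the currently tracked head-fiducial pose with its pose at planning time, and move the transducer by the same rigid displacement so that the focus follows the head. The sketch below covers only this geometric update; the tracker interface, the B-mode refinement based on the skull reflection, and any safety limits are omitted, and all names are illustrative.

```python
import numpy as np

def motion_corrected_command(T_cam_head_ref, T_cam_head_now, T_robot_flange_ref, T_robot_cam):
    """Return an updated flange command that follows the tracked head motion.

    All arguments are 4x4 homogeneous transforms:
      T_cam_head_ref     : head-fiducial pose at planning time (camera frame)
      T_cam_head_now     : current head-fiducial pose (camera frame)
      T_robot_flange_ref : flange pose commanded at planning time (robot base frame)
      T_robot_cam        : camera pose expressed in the robot base frame
    """
    # Rigid head displacement since planning, expressed in the camera frame
    T_motion_cam = T_cam_head_now @ np.linalg.inv(T_cam_head_ref)
    # The same displacement expressed in the robot base frame
    T_motion_robot = T_robot_cam @ T_motion_cam @ np.linalg.inv(T_robot_cam)
    # Move the transducer with the head so the focus stays on the planned target
    return T_motion_robot @ T_robot_flange_ref

# Toy check: a 2 mm head shift along x should shift the command by the same 2 mm
I = np.eye(4)
T_now = np.eye(4)
T_now[0, 3] = 0.002
print(motion_corrected_command(I, T_now, I, I)[:3, 3])  # -> [0.002 0. 0.]
```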
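For aim 3, a first-pass estimate of how twisting distorts the focus can be obtained with a simple free-field Rayleigh-style summation: discretise the aperture into point sources, apply the conjugate-phase delays a holographic lens would impose, rotate the source grid about the axial axis, and compare the resulting fields. The parameters below (frequency, aperture, focus position, twist angle) are illustrative assumptions; quantifying the skull-induced deterioration described above would require a full-wave solver (e.g., k-Wave) with a registered skull model rather than this free-field sketch.

```python
import numpy as np

# Illustrative parameters (not project specifications)
f0 = 0.5e6             # driving frequency [Hz]
c = 1500.0             # speed of sound in water [m/s]
k = 2 * np.pi * f0 / c

# Discretise a flat circular aperture into point sources
aperture_radius = 0.04                     # 40 mm radius
n = 61
xs = np.linspace(-aperture_radius, aperture_radius, n)
X, Y = np.meshgrid(xs, xs)
mask = X**2 + Y**2 <= aperture_radius**2
src = np.stack([X[mask], Y[mask], np.zeros(mask.sum())], axis=1)

def lens_phase(points, focus):
    """Conjugate-phase delays a holographic lens would impose to place the focus at `focus`."""
    return -k * np.linalg.norm(points - focus, axis=1)

def rotate_z(points, angle):
    """Rotate source points about the transducer's axial (z) axis: the 'twist'."""
    c_, s_ = np.cos(angle), np.sin(angle)
    R = np.array([[c_, -s_, 0.0], [s_, c_, 0.0], [0.0, 0.0, 1.0]])
    return points @ R.T

def pressure(field_points, src_points, phases):
    """Rayleigh-style summation of monopole contributions (arbitrary amplitude units)."""
    p = np.zeros(len(field_points), dtype=complex)
    for s, phi in zip(src_points, phases):
        r = np.linalg.norm(field_points - s, axis=1)
        p += np.exp(1j * (k * r + phi)) / r
    return p

# Off-axis focus; compare the field before and after a 30-degree twist of the lensed aperture
focus = np.array([0.01, 0.0, 0.08])
phases = lens_phase(src, focus)
grid_x = np.linspace(-0.02, 0.02, 81)
field = np.stack([grid_x, np.zeros_like(grid_x), np.full_like(grid_x, 0.08)], axis=1)

p_ref = np.abs(pressure(field, src, phases))
p_twist = np.abs(pressure(field, rotate_z(src, np.deg2rad(30)), phases))
print(grid_x[np.argmax(p_ref)], grid_x[np.argmax(p_twist)])  # focal-peak position along x [m]
```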

 

 
