Integrating haptic and visual data for low latency transmission

Project summary: Advances in haptic sensing, together with telecommunication technologies such as 5G, make ultra-low-latency wireless communication of haptic information possible. Low-latency haptic wireless communication would allow users to feel and efficiently interact with a remote environment in real time, opening up new applications ranging from healthcare services to new consumer devices. Haptic data are generally small in size; the technological bottleneck is how to integrate and synchronize them with visual information, which has increasingly large data sizes and complex compression pipelines. This problem is exacerbated by the requirement for low-latency transmission of the combined haptic and visual data for real-time operation.

Project description: In this PhD research, we will investigate methods to address the above challenge. We will first investigate approaches to synchronizing the haptic data with the visual data. Research will be carried out on how to reliably and automatically identify, within image frames, cues about when and where the device is in contact with the environment, how to continuously align these cues with the corresponding haptic signal, and thus how to synchronize the two modalities. Furthermore, we will investigate robotic middleware design to embed the software in hardware with guaranteed real-time performance, achieving time-critical, fail-safe integration and synchronization of haptic and visual information.
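As an illustration of the timestamp-based alignment such synchronization could build on, the sketch below matches each video frame to its nearest haptic sample and flags candidate contact frames by thresholding force magnitude. The function names, sample rates, and the 0.5 threshold are illustrative assumptions, not part of the project specification.

```python
import numpy as np

def align_haptic_to_frames(frame_ts, haptic_ts, haptic_vals):
    """For each video frame timestamp, pick the nearest haptic sample.

    frame_ts:    (F,) frame capture times in seconds
    haptic_ts:   (H,) haptic sample times in seconds (much higher rate)
    haptic_vals: (H,) force values, one per haptic sample
    Returns (F,) force values aligned to the frames.
    """
    # Index of first haptic sample at or after each frame time
    idx = np.searchsorted(haptic_ts, frame_ts)
    idx = np.clip(idx, 1, len(haptic_ts) - 1)
    # Choose whichever neighbouring sample is closer in time
    left_closer = (frame_ts - haptic_ts[idx - 1]) < (haptic_ts[idx] - frame_ts)
    idx = np.where(left_closer, idx - 1, idx)
    return haptic_vals[idx]

# Example: 30 Hz video, 1 kHz haptics (illustrative rates)
frame_ts = np.arange(0, 1, 1 / 30)
haptic_ts = np.arange(0, 1, 1 / 1000)
forces = np.sin(2 * np.pi * haptic_ts)  # synthetic force signal
aligned = align_haptic_to_frames(frame_ts, haptic_ts, forces)

# Crude contact cue: frames where force magnitude exceeds a threshold
contact_frames = np.flatnonzero(np.abs(aligned) > 0.5)
```

In a real system the two clocks would first need a common time base (e.g. hardware or NTP/PTP synchronization); the nearest-sample lookup above only handles the per-frame association step.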

To facilitate low-latency data transmission over wireless networks, this research will further investigate methods of reducing the size of image data. In conjunction with the haptic/visual synchronization method, the student will implement state-of-the-art image compression techniques and evaluate data transmission performance in tele-operation via a haptic console. A range of experiments with low- to high-speed device-environment interactions, together with simulated network latency, will be carried out to evaluate the performance of haptic/visual communication and identify the optimal standards.
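One simple way to inject simulated latency into such an evaluation loop is a software delay line that holds timestamped packets until their scheduled arrival time. This is a hypothetical sketch under simplifying assumptions (fixed one-way delay, no loss or jitter), not the project's actual test harness.

```python
from collections import deque

class DelayLine:
    """Simulate a fixed one-way network delay on a packet stream."""

    def __init__(self, delay_s):
        self.delay_s = delay_s
        self.queue = deque()  # (scheduled arrival time, payload)

    def send(self, t_now, payload):
        # Packets are enqueued in send order, so arrival times stay sorted
        self.queue.append((t_now + self.delay_s, payload))

    def receive(self, t_now):
        """Return all packets whose simulated delay has elapsed by t_now."""
        out = []
        while self.queue and self.queue[0][0] <= t_now:
            out.append(self.queue.popleft()[1])
        return out
```

For example, a packet sent at t = 0 through a 50 ms delay line is not delivered at t = 40 ms but is delivered at t = 60 ms; driving both the haptic and video streams through separate delay lines lets the experiments vary each modality's latency independently.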

In many scenarios involving haptic interaction, the region of interest is a localised area where the tool-environment interaction occurs. To this end, we would like to investigate how to use haptic information to identify the regions of the image where device-environment interactions are occurring, and to distinguish regions with high-frequency changes from those with low-frequency changes. Different image compression techniques can then be applied based on these criteria to further reduce the data size for transmission while maintaining a high-quality display for the user.
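A toy sketch of the region-of-interest idea, assuming the contact point has already been localised in the image: full resolution is kept in a window around the contact, while the background is coarsely subsampled (standing in for a stronger compressor applied at lower quality). All function names and parameters below are illustrative assumptions.

```python
import numpy as np

def roi_compress(img, contact_xy, roi_half=32, bg_step=8):
    """Spatially varying compression around a haptic contact point.

    img:        (H, W) grayscale image
    contact_xy: (x, y) contact pixel, e.g. inferred from the force signal
    Returns a reconstruction illustrating the quality trade-off, plus the
    approximate number of samples that would be transmitted.
    """
    H, W = img.shape
    x, y = contact_xy
    # Background: transmit every bg_step-th pixel, reconstruct by repetition
    bg = img[::bg_step, ::bg_step]
    recon = np.repeat(np.repeat(bg, bg_step, axis=0), bg_step, axis=1)[:H, :W]
    # ROI: copy the full-resolution pixels back in around the contact
    y0, y1 = max(0, y - roi_half), min(H, y + roi_half)
    x0, x1 = max(0, x - roi_half), min(W, x + roi_half)
    recon[y0:y1, x0:x1] = img[y0:y1, x0:x1]
    n_samples = bg.size + (y1 - y0) * (x1 - x0)
    return recon, n_samples

img = np.arange(256 * 256, dtype=float).reshape(256, 256)
recon, n_samples = roi_compress(img, (128, 128))
```

With these parameters the sketch transmits roughly 8% of the original pixels while the contact region remains lossless; in practice the subsampling would be replaced by a standard codec run at a lower quality setting for the background.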

Project reference: SIE_26

First supervisor: Sebastien Ourselin

Second supervisor: Hongbin Liu

Start date: October 2020