The Digital Twin
Researchers at King’s College London School of Biomedical Engineering & Imaging Sciences are harnessing the ability of computers to reason with data and build the digital heart twin of the patient.
Have you ever considered the concept of the digital twin? If you have, did you associate it with healthcare?

A digital twin is a computational model of a real entity: a cell, an organ or even a whole person. The model is built from verified mathematical expressions that describe the various processes of that cell, organ or person, and the parameters of those expressions are optimized to best describe the specific entity. In this way, the digital twin can be used to predict responses that cannot be obtained experimentally (e.g. the response of a patient to treatment A versus treatment B) or that would be too expensive to obtain experimentally (e.g. the response of a cell under a million different hypotheses). Our School is a world leader in this field of research.

The academic team, with Dr Pablo Lamata, Dr Anastasia Nasopoulou, Prof Steven Niederer, Dr Adelaide de Vecchi, Dr Martin Bishop, Prof Alistair Young, Dr Oleg Aslanidi, Dr Jack Lee, Dr David Nordsletten and Dr Jordi Alastruey, is ultimately trying to better understand the current and future status of the patient by integrating the different pieces of data they have (images, sensor readings and clinical records) into a cohesive picture of the patient.

“We try to predict how the patient is going to be through a statistical inference: we have seen many cases like you, and we infer this is the way people will behave,” Dr Lamata said. “It can also be done through mechanistic simulations, meaning that we have the in-silico [via computer] reproduction of the heart to predict what would happen if I have a pacemaker there. 
How is this heart going to respond to this therapy?”

Analyzing the beating heart in three dimensions does indeed provide insights into its anatomy and function that can inform clinical decisions. The focus of our research, explains Dr Lamata, is to pull out accurate and precise numbers describing changes in the heart. “We human beings are reasonably good at looking at 2D shapes on a plane, but when we start thinking about 3D shapes or motion, we are not so good at telling differences – but computers are. We can then start to describe and tell the story of how the heart grows from birth in order to detect early the deviations caused by disease, or we can characterize the specific signature that each cardiac condition causes in order to tailor treatments,” Dr Lamata said.

Simulating the beating heart allows us to design therapies based not on the current status of the patient, but on their future evolution. Computers first reproduce the current baseline condition, the one observed in the available clinical images and data, and then simulate a change, such as a new pacemaker configuration or the intake of a drug; the drug changes some cellular behavior that is embedded in the physics of the simulations. As a result, we then have the heart beating in a slightly different way because of the pacemaker or drug, and we can design the optimal therapy by changing choices in the virtual model.

Technology to assess the best reconstruction
This technology can thus improve decisions about the optimal surgical strategy when operating on the heart. One example is the choice of the optimal timing to replace a faulty cardiac valve, a decision based on assessing the extra burden the heart endures when there is an obstruction in that valve. Computer models can predict this extra burden without the need for invasive sensors placed inside the heart: they work with the patient’s 3D medical images and knowledge of the physics of flow dynamics.

For instance, Dr Lamata describes the project of a surgical planning system for aortic reconstruction in children born with a congenital heart condition called hypoplastic left heart syndrome. In these cases, babies are born with tiny left ventricles and tiny aortas that need to be widened. “It’s about planning the geometry, planning the anatomy, planning the best reconstruction and then simulating what the best reconstruction is. In the surgical arena, modelling can help both at the planning stage and at the post-operative stage, by measuring how close the surgical team came to the planned optimal strategy,” Dr Lamata said.

But what limitations exist with using algorithms to predict heart health? Unsurprisingly, getting the data is a huge limitation: the right amount of data and, of course, high-quality data. There are also obstacles to overcome regarding the societal perception of the use of healthcare data. Dr Nasopoulou, however, says the technology is still evolving in all aspects. “We don’t need to simulate the whole heart in exquisite detail. In these models we simply need to capture the specific features that are relevant to the clinical question,” Dr Nasopoulou said. “The trick is also to simplify the model to fit the amount of information that we have available from the clinics. 
Data is limited and sometimes even contradictory.” “We need to personalize the digital twin to each patient, to build that coherent and comprehensive picture of their current health status, in order to reach better accuracy in the parameters we extract to inform clinical decisions.”

Dr Lamata and his team are taking the first steps towards the vision of the digital twin. Said Dr Lamata: “It’s very exciting to see a future where computers provide an accurate diagnosis and predict the best possible strategy, comparing medical and surgical interventions.”
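The personalize-then-predict loop described throughout this piece can be caricatured in a few lines of code. This is a deliberately toy sketch, not one of the team's actual models: the exponential pressure-decay model, its single parameter tau, and the "measured" samples are all invented for illustration.

```python
import math

P0 = 80.0  # initial pressure in mmHg (assumed known for this toy example)

def model_pressure(t, tau):
    """Toy model: diastolic pressure decays as P(t) = P0 * exp(-t / tau),
    where tau (seconds) is the parameter we personalize to the patient."""
    return P0 * math.exp(-t / tau)

# Hypothetical measurements for one patient: (time in s, pressure in mmHg).
data = [(0.00, 80.0), (0.05, 64.9), (0.10, 52.7)]

def misfit(tau):
    """Sum of squared differences between model output and measurements."""
    return sum((model_pressure(t, tau) - p) ** 2 for t, p in data)

# Personalization step: search candidate parameter values and keep the one
# whose simulated pressures best match this patient's data (least squares).
candidates = [0.05 + 0.005 * k for k in range(60)]
tau_patient = min(candidates, key=misfit)
print(f"personalized tau: {tau_patient:.3f} s")

# Prediction step: with the personalized parameter, the model can answer
# questions the measurements alone cannot, e.g. pressure at an unmeasured time.
print(f"predicted pressure at t = 0.2 s: {model_pressure(0.2, tau_patient):.1f} mmHg")
```

In a real cardiac twin the same idea scales up enormously: many parameters of biophysical heart models are tuned until simulations match a patient's images and signals, and the tuned model is then interrogated with what-if questions (a pacemaker setting, a drug's cellular effect) that cannot be measured directly.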