Teaching cardiovascular medicine to machines

Engineers and researchers are boosting human induction and deduction skills with the assistance of computers, and this is one of the main drivers of progress in cardiovascular science and health care. This claim builds on two core ideas: machine learning as the vehicle for inductive reasoning, and computational cardiac models as the tool for deductive reasoning, both illustrated in Figure 1. Machine learning refers to the ability of computers to gain knowledge without being explicitly programmed for it: the computer extracts patterns from the data and thus ‘learns’ a statistical model that performs a given task (e.g. classifying or predicting disease, or segmenting the myocardium in an image). The logical process followed here is inductive, since the machine makes broad generalizations from the specific observations it has learned from. The availability of new sources of data (e.g. omics or continuous monitoring systems), the digitization of health records, and recent advances in machine learning technology now offer the opportunity to reveal new patterns and new signatures of cardiovascular health and disease.

Computational cardiac models, on the other hand, are representations of our knowledge of the physiology of the heart pump and the circulatory system, governed by the fundamental laws of physics and biochemistry. Computers are explicitly programmed to encode this knowledge in mechanistic models and to use them to perform a given task (e.g. estimating myocardial stiffness, identifying the fibrosis patterns that lead to persistent re-entrant drivers in atrial fibrillation, or computing the risk of drug toxicity). The logical process followed here is deductive, since the machine reaches conclusions based on the concordance of multiple premises and assumptions. The availability of rich anatomical and functional data and recent advances in computational cardiology now offer, through the process of model personalization, two opportunities: the ability to present an integral and cohesive diagnostic picture of the patient, and the possibility of simulating and predicting the evolution of a condition or the impact of a treatment (a minimal lumped-parameter example is sketched below).

Both statistical and mechanistic models thus offer very encouraging prospects, but these should be handled with caution. Machine learning, as a tool for inductive reasoning, does not provide conclusive proof of causal connections. Accordingly, the patterns observed in the data will not necessarily hold in the future (or simply in other studies or individual cases). This translates into the specific problem of the generalizability of the learned model, which trades off against its complexity (in essence, the number of features or parameters to be learned from the data). As a consequence, if the problem to be solved requires a large number of features in order to make accurate predictions, then many more training examples are needed to ensure valid generalization (a toy illustration of this trade-off is sketched below).

Computational cardiac models, as tools for deductive reasoning, rely on the validity of the premises and assumptions they are built on. Models are always a simplification of reality; we simply cannot capture the entire complexity of the natural world. Consequently, the biomarkers and predictions extracted from models will always have a degree of …

Dr. Pablo Lamata is a Wellcome Trust Senior Fellow in Basic Biomedical Science and a Reader in Computational Cardiology at King’s College London.
His research interests focus on the combination of imaging and computational modelling technologies to improve the management of cardiovascular diseases. His team (http://cmib.website/) develops solutions to stratify subjects according to the remodelling of cardiac anatomy, to characterise the performance of the heart during diastole, and to assess non-invasively the pressure driving blood flow in the central circulatory system. He is the coordinator of the EU consortium “Personalised In-Silico Cardiology”, which develops modelling methodologies to optimise clinical protocols, from data acquisition to device parameters and intervention choices. Dr. Lamata has more than 15 years of experience in the development and clinical adoption of image analysis, physiological modelling, and surgical simulation and navigation solutions. He was a Marie Curie Fellow at Siemens, obtained his PhD from the Universidad Politécnica de Madrid, and received his MSc degree from the Universidad de Zaragoza. You can follow him on Twitter: @pablolamata.
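To give a concrete flavour of the mechanistic, deductive side described in the abstract, the sketch below simulates a two-element Windkessel model, a classic lumped-parameter description of arterial pressure. It is a generic textbook example, not the specific models discussed in the editorial; the resistance R, compliance C, inflow waveform, and initial pressure are assumed toy values. Personalization would mean fitting R and C to a patient’s measured pressure or flow data.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative (non patient-specific) parameter values
R = 1.0   # peripheral resistance [mmHg*s/mL]
C = 1.5   # arterial compliance   [mL/mmHg]
T = 0.8   # cardiac cycle length  [s]

def inflow(t):
    """Idealised aortic inflow: a half-sine ejection during the first 0.3 s of each beat."""
    tc = t % T
    return 300.0 * np.sin(np.pi * tc / 0.3) if tc < 0.3 else 0.0

def dp_dt(t, p):
    # Conservation of volume in the arterial compartment: C dP/dt = Q_in(t) - P/R
    return [(inflow(t) - p[0] / R) / C]

# Integrate a few beats, starting from an assumed pressure of 80 mmHg
sol = solve_ivp(dp_dt, t_span=(0.0, 8.0), y0=[80.0], max_step=0.005)

last_beat = sol.t > 8.0 - T
print(f"Pressure over the last beat: "
      f"{sol.y[0][last_beat].min():.0f}-{sol.y[0][last_beat].max():.0f} mmHg")
```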
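As a toy illustration of the complexity-versus-generalizability trade-off mentioned in the abstract, the sketch below trains a logistic regression on synthetic ‘omics-like’ data with many candidate features but only a few informative ones; the data generator and all numbers are invented for illustration. Typically, the held-out accuracy improves as the number of training examples grows, reflecting the point that more complex models need more data to generalize.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

def synthetic_cohort(n_patients, n_features=200, n_informative=5):
    """Toy 'omics-like' data: many candidate features, only a handful truly predictive."""
    X = rng.normal(size=(n_patients, n_features))
    w = np.zeros(n_features)
    w[:n_informative] = 1.0                    # the few genuinely informative features
    y = (X @ w + rng.normal(scale=0.5, size=n_patients)) > 0
    return X, y.astype(int)

X, y = synthetic_cohort(n_patients=2000)
X_pool, X_test, y_pool, y_test = train_test_split(X, y, test_size=0.5, random_state=0)

# The same model family, trained on increasingly large subsets of the pool
for n in (50, 200, 1000):
    clf = LogisticRegression(max_iter=1000).fit(X_pool[:n], y_pool[:n])
    print(f"training cases: {n:4d}   held-out accuracy: {clf.score(X_test, y_test):.2f}")
```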

Keywords: cardiac models; medicine; machine learning; computational cardiac; cardiology

Journal Title: Cardiovascular Research
Year Published: 2018
