T. Zheng, L. Azzolin, J. Sánchez, O. Dössel, and A. Loewe. An automated pipeline for generating fiber orientation and region annotation in patient-specific atrial models. In Current Directions in Biomedical Engineering, vol. 7(2), pp. 136-139, 2021
BACKGROUND AND OBJECTIVE: Cardiac electrophysiology is a medical specialty with a long and rich tradition of computational modeling. Nevertheless, no community standard for cardiac electrophysiology simulation software has evolved yet. Here, we present the openCARP simulation environment as one solution that could serve the needs of large parts of this community. METHODS AND RESULTS: openCARP and the Python-based carputils framework allow developing and sharing simulation pipelines which automate in silico experiments including all modeling and simulation steps to increase reproducibility and productivity. The continuously expanding openCARP user community is supported by tailored infrastructure. Documentation and training material facilitate access to this complementary research tool for new users. After a brief historic review, this paper summarizes requirements for a high-usability electrophysiology simulator and describes how openCARP fulfills them. We introduce the openCARP modeling workflow in a multi-scale example of atrial fibrillation simulations at the single-cell, tissue, organ, and body levels and finally outline future development potential. CONCLUSION: As an open simulator, openCARP can advance the computational cardiac electrophysiology field by making state-of-the-art simulations accessible. In combination with the carputils framework, it offers a tailored software solution for the scientific community and contributes towards increasing use, transparency, standardization and reproducibility of in silico experiments.
H. Anzt and A. Loewe. Forschungssoftware – Nachhaltige Entwicklung und Bereitstellung [Research software – sustainable development and provision]. In Forschung & Lehre, vol. 28(5), pp. 380-381, 2021
The arrhythmogenesis of atrial fibrillation is associated with the presence of fibrotic atrial tissue. Not only fibrosis but also physiological anatomical variability of the atria and the thorax are reflected in altered morphology of the P wave in the 12-lead electrocardiogram (ECG). Distinguishing between the effects on the P wave induced by local atrial substrate changes and those caused by healthy anatomical variations is important to gauge the potential of the 12-lead ECG as a non-invasive and cost-effective tool for the early detection of fibrotic atrial cardiomyopathy to stratify atrial fibrillation propensity. In this work, we realized 54,000 combinations of different atria and thorax geometries from statistical shape models capturing anatomical variability in the general population. For each atrial model, 10 different volume fractions (0-45%) were defined as fibrotic. Electrophysiological simulations in sinus rhythm were conducted for each model combination and the respective 12-lead ECGs were computed. P wave features (duration, amplitude, dispersion, terminal force in V1) were extracted and compared between the healthy and the diseased model cohorts. All investigated feature values systematically increased or decreased with the left atrial volume fraction covered by fibrotic tissue; however, the value ranges overlapped between the healthy and the diseased cohorts. Using all extracted P wave features as input values, the fibrotic left atrial volume fraction was estimated by a neural network with an absolute root mean square error of 8.78%. Our simulation results suggest that although all investigated P wave features highly vary for different anatomical properties, the combination of these features can contribute to non-invasively estimating the volume fraction of atrial fibrosis using ECG-based machine learning approaches.
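The reported 8.78% figure is an absolute root mean square error (RMSE) between the true and the network-estimated fibrotic volume fraction. As a minimal sketch of the metric only (the feature extraction and the neural network are omitted, and the arrays below are hypothetical, not the study data), it could be computed as:

```python
import numpy as np

def rmse(y_true, y_pred):
    """Root mean square error between true and estimated values."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

# Hypothetical true vs. estimated fibrotic volume fractions (%)
true_fraction = [0.0, 15.0, 30.0, 45.0]
estimated_fraction = [5.0, 10.0, 35.0, 40.0]
print(rmse(true_fraction, estimated_fraction))  # 5.0
```

Any regression model estimating the fibrotic fraction from P wave features would be scored this way against the ground-truth fractions of the simulated cohort.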
AIMS: The treatment of atrial fibrillation beyond pulmonary vein isolation has remained an unsolved challenge. Targeting regions identified by different substrate mapping approaches for ablation resulted in ambiguous outcomes. With the effective refractory period being a fundamental prerequisite for the maintenance of fibrillatory conduction, this study aims at estimating the effective refractory period with clinically available measurements. METHODS AND RESULTS: A set of 240 simulations in a spherical model of the left atrium with varying model initialization, combination of cellular refractory properties, and size of a region of lowered effective refractory period was implemented to analyse the capabilities and limitations of cycle length mapping. The minimum observed cycle length and the 25% quantile were compared to the underlying effective refractory period. The density of phase singularities was used as a measure for the complexity of the excitation pattern. Finally, we employed the method in a clinical test of concept including five patients. Areas of lowered effective refractory period could be distinguished from their surroundings in simulated scenarios with successfully induced multi-wavelet re-entry. Larger areas and higher gradients in effective refractory period as well as complex activation patterns favour the method. The 25% quantile of cycle lengths in patients with persistent atrial fibrillation was found to range from 85 to 190 ms. CONCLUSION: Cycle length mapping is capable of highlighting regions of pathologic refractory properties. In combination with complementary substrate mapping approaches, the method fosters confidence to enhance the treatment of atrial fibrillation beyond pulmonary vein isolation particularly in patients with complex activation patterns.
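The 25% quantile of observed cycle lengths serves as a noise-robust surrogate for the shortest cycle length, and thereby for the local effective refractory period. A minimal sketch of this statistic, with a hypothetical per-site cycle-length series (not patient data):

```python
import numpy as np

def cl_quantile(cycle_lengths_ms, q=0.25):
    """Robust lower estimate of the local cycle length: the q-quantile
    instead of the outlier-sensitive minimum."""
    return float(np.quantile(np.asarray(cycle_lengths_ms, dtype=float), q))

# Hypothetical cycle-length series (ms) observed at one mapping site
cls = [180, 150, 160, 140, 200, 155, 145, 170]
print(min(cls), cl_quantile(cls))
```

Comparing the minimum against the quantile illustrates why the latter is preferred: a single spuriously short interval shifts the minimum but barely moves the 25% quantile.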
Research software has become a central asset in academic research. It optimizes existing and enables new research methods, implements and embeds research knowledge, and constitutes an essential research product in itself. Research software must be sustainable in order to understand, replicate, reproduce, and build upon existing research or conduct new research effectively. In other words, software must be available, discoverable, usable, and adaptable to new needs, both now and in the future. Research software therefore requires an environment that supports sustainability. Hence, a change is needed in the way research software development and maintenance are currently motivated, incentivized, funded, structurally and infrastructurally supported, and legally treated. Failing to do so will threaten the quality and validity of research. In this paper, we identify challenges for research software sustainability in Germany and beyond, in terms of motivation, selection, research software engineering personnel, funding, infrastructure, and legal aspects. Besides researchers, we specifically address political and academic decision-makers to increase awareness of the importance and needs of sustainable research software practices. In particular, we recommend strategies and measures to create an environment for sustainable research software, with the ultimate goal to ensure that software-driven research is valid, reproducible and sustainable, and that software is recognized as a first class citizen in research. This paper is the outcome of two workshops run in Germany in 2019, at deRSE19 - the first International Conference of Research Software Engineers in Germany - and a dedicated DFG-supported follow-up workshop in Berlin.
OBJECTIVE: Atrial flutter (AFl) is a common arrhythmia that can be categorized according to different self-sustained electrophysiological mechanisms. The non-invasive discrimination of such mechanisms would greatly benefit ablative methods for AFl therapy as the driving mechanisms would be described prior to the invasive procedure, helping to guide ablation. In the present work, we sought to implement recurrence quantification analysis (RQA) on 12-lead ECG signals from a computational framework to discriminate different electrophysiological mechanisms sustaining AFl. METHODS: 20 different AFl mechanisms were generated in 8 atrial models and were propagated into 8 torso models via forward solution, resulting in 1,256 sets of 12-lead ECG signals. Principal component analysis was applied on the 12-lead ECGs, and six RQA-based features were extracted from the most significant principal component scores in two different approaches: individual component RQA and spatial reduced RQA. RESULTS: In both approaches, RQA-based features were significantly sensitive to the dynamic structures underlying different AFl mechanisms. A hit rate as high as 67.7% was achieved when discriminating the 20 AFl mechanisms. RQA-based features estimated for a clinical sample suggested high agreement with the results found in the computational framework. CONCLUSION: RQA has been shown to be an effective method to distinguish different AFl electrophysiological mechanisms in a non-invasive computational framework. A clinical 12-lead ECG used as proof of concept showed the value of both the simulations and the methods. SIGNIFICANCE: The non-invasive discrimination of AFl mechanisms helps to delineate the ablation strategy, reducing the time and resources required to conduct invasive cardiac mapping and ablation procedures.
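The simplest of the RQA-based features is the recurrence rate: the fraction of sample pairs of a series (here, a principal component score series) that lie within a distance threshold of each other. A reduced scalar-series sketch, not the study's implementation (which extracts six features from embedded multichannel data), might look like:

```python
import numpy as np

def recurrence_rate(x, eps):
    """Fraction of point pairs of a scalar series whose distance is
    below eps -- the simplest RQA feature (recurrence rate)."""
    x = np.asarray(x, dtype=float)
    d = np.abs(x[:, None] - x[None, :])  # pairwise distance matrix
    r = d < eps                          # recurrence matrix
    return float(r.mean())

# A strongly periodic series recurs far more often than a drifting one
periodic = np.sin(0.5 * np.arange(200))
drifting = 0.05 * np.arange(200)
print(recurrence_rate(periodic, 0.1), recurrence_rate(drifting, 0.1))
```

Features of this kind quantify how regular the underlying dynamics are, which is what separates organized macro-reentry from more complex propagation patterns.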
Acute ischemic stroke is a major health problem with a high mortality rate and a high risk for permanent disabilities. Selective brain hypothermia has the neuroprotective potential to possibly lower cerebral harm. A recently developed catheter system makes it possible to combine endovascular blood cooling and thrombectomy using the same endovascular access. By using the penumbral perfusion via leptomeningeal collaterals, the catheter aims at enabling a cold reperfusion, which mitigates the risk of a reperfusion injury. However, cerebral circulation is highly patient-specific and can vary greatly. Since direct measurement of the remaining perfusion and the temperature decrease induced by the catheter is not possible without additional harm to the patient, computational modeling provides an alternative to gain knowledge about the resulting cerebral temperature decrease. In this work, we present a brain temperature model with a realistic division into gray and white matter and consideration of spatially resolved perfusion. Furthermore, it includes a detailed anatomy of the cerebral circulation that can be personalized based on real patient anatomy. To evaluate the catheter's general performance and its ability to achieve a cold reperfusion, we calculated the decrease in brain temperature in case of a large vessel occlusion in the middle cerebral artery (MCA) for different scenarios of cerebral arterial anatomy. Congenital arterial variations in the circle of Willis had a distinct influence on the cooling effect and the resulting spatial temperature distribution before vessel recanalization. Independent of the branching configurations, the model predicted a cold reperfusion due to a strong temperature decrease after recanalization (1.4-2.2 °C after 25 min of cooling, recanalization after 20 min of cooling).
Our model illustrates the effectiveness of endovascular cooling in combination with mechanical thrombectomy and its results serve as an adequate substitute for temperature measurement in a clinical setting in the absence of direct intraparenchymal temperature probes.
The contraction of the human heart is a complex process as a consequence of the interaction of internal and external forces. In current clinical routine, the resulting deformation can be imaged during an entire heart beat. However, the active tension development cannot be measured in vivo but may provide valuable diagnostic information. In this work, we present a novel numerical method for solving an inverse problem of cardiac biomechanics: estimating the dynamic active tension field, provided that the motion of the myocardial wall is known. This ill-posed non-linear problem is solved using second-order Tikhonov regularization in space and time. We conducted a sensitivity analysis by varying the fiber orientation in the range of measurement accuracy. To achieve an RMSE <20% of the maximal tension, the fiber orientation needs to be provided with an accuracy of 10°. Also, variation was added to the deformation data in the range of segmentation accuracy. Here, imposing temporal regularization led to an eightfold decrease in the error down to 12%. Furthermore, non-contracting regions representing myocardial infarct scars were introduced in the left ventricle and could be identified accurately in the inverse solution (sensitivity >0.95). The results obtained with non-matching input data are promising and indicate directions for further improvement of the method. In the future, this method will be extended to estimate the active tension field based on motion data from clinical images, which could provide important insights as a new diagnostic tool for the identification and treatment of diseased heart tissue.
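Second-order Tikhonov regularization penalizes the curvature (second derivative) of the estimated field, stabilizing the ill-posed inversion. A toy one-dimensional sketch of the regularized least-squares solve, with a hypothetical forward matrix `A` and data `b` (the paper's spatio-temporal operators are of course far larger and time-dependent):

```python
import numpy as np

def tikhonov_2nd_order(A, b, lam):
    """Solve min ||A x - b||^2 + lam^2 ||L x||^2 via the normal
    equations, where L is the second-difference operator that
    penalizes curvature of x (second-order Tikhonov)."""
    n = A.shape[1]
    # Interior rows of the second-difference stencil [1, -2, 1]
    L = (np.diag(np.full(n, -2.0))
         + np.diag(np.ones(n - 1), 1)
         + np.diag(np.ones(n - 1), -1))[1:-1]
    lhs = A.T @ A + lam ** 2 * (L.T @ L)
    rhs = A.T @ b
    return np.linalg.solve(lhs, rhs)
```

With `lam = 0` this reduces to an ordinary least-squares solve; the temporal regularization reported in the paper would use an analogous difference operator along the time axis.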
Background: Hypertrophic cardiomyopathy (HCM) is typically caused by mutations in sarcomeric genes leading to cardiomyocyte disarray, replacement fibrosis, impaired contractility, and elevated filling pressures. These varying tissue properties are associated with certain strain patterns that may allow a diagnosis to be established by means of non-invasive imaging without the necessity of harmful myocardial biopsies or contrast agent application. With a numerical study, we aim to answer how the variability in each of these mechanisms contributes to altered mechanics of the left ventricle (LV) and whether the deformation obtained in in-silico experiments is comparable to values reported from clinical measurements. Methods: We conducted an in-silico sensitivity study on physiological and pathological mechanisms potentially underlying the clinical HCM phenotype. The deformation of the four-chamber heart models was simulated using a finite-element mechanical solver with a sliding boundary condition to mimic the tissue surrounding the heart. Furthermore, a closed-loop circulatory model delivered the pressure values acting on the endocardium. Deformation measures and mechanical behavior of the heart models were evaluated globally and regionally. Results: Hypertrophy of the LV affected the course of strain, strain rate, and wall thickening: the root-mean-squared difference of the wall thickening between control (mean thickness 10 mm) and hypertrophic geometries (17 mm) was >10%. A reduction of active force development by 40% led to less overall deformation: maximal radial strain reduced from 26 to 21%. A fivefold increase in tissue stiffness caused a more homogeneous distribution of the strain values among 17 heart segments. Fiber disarray led to minor changes in the circumferential and radial strain. A combination of pathological mechanisms led to reduced and slower deformation of the LV and halved the longitudinal shortening of the LA.
Conclusions: This study uses a computer model to determine the changes in LV deformation caused by pathological mechanisms that are presumed to underlie HCM. This knowledge can complement imaging-derived information to obtain a more accurate diagnosis of HCM.
The term "In Silico Trial" indicates the use of computer modelling and simulation to evaluate the safety and efficacy of a medical product, whether a drug, a medical device, a diagnostic product or an advanced therapy medicinal product. Predictive models are positioned as new methodologies for the development and the regulatory evaluation of medical products. New methodologies are qualified by regulators such as FDA and EMA through formal processes, where a first step is the definition of the Context of Use (CoU), which is a concise description of how the new methodology is intended to be used in the development and regulatory assessment process. As In Silico Trials are a disruptively innovative class of new methodologies, it is important to have a list of possible CoUs highlighting potential applications for the development of the relative regulatory science. This review paper presents the result of a consensus process that took place in the InSilicoWorld Community of Practice, an online forum for experts in in silico medicine. The experts involved identified 46 descriptions of possible CoUs which were organised into a candidate taxonomy of nine CoU categories. Examples of 31 CoUs were identified in the available literature; the remaining 15 should, for now, be considered speculative.
L. Azzolin, S. Schuler, O. Dössel, and A. Loewe. A Reproducible Protocol to Assess Arrhythmia Vulnerability: Pacing at the End of the Effective Refractory Period. In Frontiers in Physiology, vol. 12, pp. 656411, 2021
In both clinical and computational studies, different pacing protocols are used to induce arrhythmia and non-inducibility is often considered as the endpoint of treatment. The need for a standardized methodology is urgent since the choice of the protocol used to induce arrhythmia could lead to contrasting results, e.g., in assessing atrial fibrillation (AF) vulnerability. Therefore, we propose a novel method, pacing at the end of the effective refractory period (PEERP), and compare it to state-of-the-art protocols, such as phase singularity distribution (PSD) and rapid pacing (RP), in a computational study. All methods were tested by pacing from evenly distributed endocardial points at 1 cm inter-point distance in two bi-atrial geometries. Seven different atrial models were implemented: five cases without specific AF-induced remodeling but with decreasing global conduction velocity and two persistent AF cases with an increasing amount of fibrosis resembling different substrate remodeling stages. Compared with PSD and RP, PEERP induced a larger variety of arrhythmia complexity requiring, on average, only 2.7 extra-stimuli and 3 s of simulation time to initiate reentry. Moreover, PEERP and PSD were the protocols which unveiled a larger number of areas vulnerable to sustaining stable long-living reentries compared to RP. Finally, PEERP can foster standardization and reproducibility, since, in contrast to the other protocols, it is a parameter-free method. Furthermore, we discuss its clinical applicability. We conclude that the choice of the inducing protocol has an influence on both initiation and maintenance of AF and we propose and provide PEERP as a reproducible method to assess arrhythmia vulnerability.
Background: Rate-varying S1S2 stimulation protocols can be used for restitution studies to characterize atrial substrate, ionic remodeling, and atrial fibrillation risk. Clinical restitution studies with numerous patients create large amounts of these data. Thus, an automated pipeline to evaluate clinically acquired S1S2 stimulation protocol data necessitates consistent, robust, reproducible, and precise evaluation of local activation times, electrogram amplitude, and conduction velocity. Here, we present the CVAR-Seg pipeline, developed focusing on three challenges: (i) No previous knowledge of the stimulation parameters is available, thus, arbitrary protocols are supported. (ii) The pipeline remains robust under different noise conditions. (iii) The pipeline supports segmentation of atrial activities in close temporal proximity to the stimulation artifact, which is challenging due to the larger amplitude and slope of the stimulus compared to the atrial activity. Methods and Results: The S1 basic cycle length was estimated by time interval detection. Stimulation time windows were segmented by detecting synchronous peaks in different channels surpassing an amplitude threshold and identifying time intervals between detected stimuli. Elimination of the stimulation artifact by a matched filter allowed detection of local activation times in temporal proximity. A non-linear signal energy operator was used to segment periods of atrial activity. Geodesic and Euclidean inter-electrode distances allowed approximation of conduction velocity. The automatic segmentation performance of the CVAR-Seg pipeline was evaluated on 37 synthetic datasets with decreasing signal-to-noise ratios. Noise was modeled by reconstructing the frequency spectrum of clinical noise. The pipeline retained a median local activation time error below a single sample (1 ms) for signal-to-noise ratios as low as 0 dB representing a high clinical noise level.
As a proof of concept, the pipeline was tested on a CARTO case of a paroxysmal atrial fibrillation patient and yielded plausible restitution curves for conduction speed and amplitude. Conclusion: The proposed openly available CVAR-Seg pipeline promises fast, fully automated, robust, and accurate evaluations of atrial signals even with low signal-to-noise ratios. This is achieved by solving the proximity problem of stimulation and atrial activity to enable standardized evaluation without introducing human bias for large data sets.
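The non-linear signal energy operator used to segment periods of atrial activity is commonly the Teager-Kaiser operator; assuming that variant (the pipeline's exact operator may differ), a minimal version is:

```python
import numpy as np

def teager_energy(x):
    """Teager-Kaiser energy operator:
    psi[n] = x[n]^2 - x[n-1] * x[n+1].
    Emphasizes segments that are simultaneously high in amplitude and
    frequency, e.g. sharp atrial deflections in an electrogram.
    Boundary samples are left at zero."""
    x = np.asarray(x, dtype=float)
    psi = np.zeros_like(x)
    psi[1:-1] = x[1:-1] ** 2 - x[:-2] * x[2:]
    return psi
```

For a pure sinusoid A·sin(ωn) the operator returns the constant A²·sin²(ω), i.e., it responds to amplitude and frequency jointly, which is what makes brief high-slope activity stand out against slow baseline wander.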
Mathematical models of the human heart are evolving to become a cornerstone of precision medicine and support clinical decision making by providing a powerful tool to understand the mechanisms underlying pathophysiological conditions. In this study, we present a detailed mathematical description of a fully coupled multi-scale model of the human heart, including electrophysiology, mechanics, and a closed-loop model of circulation. State-of-the-art models based on human physiology are used to describe membrane kinetics, excitation-contraction coupling and active tension generation in the atria and the ventricles. Furthermore, we highlight ways to adapt this framework to patient specific measurements to build digital twins. The validity of the model is demonstrated through simulations on a personalized whole heart geometry based on magnetic resonance imaging data of a healthy volunteer. Additionally, the fully coupled model was employed to evaluate the effects of a typical atrial ablation scar on the cardiovascular system. With this work, we provide an adaptable multi-scale model that allows a comprehensive personalization from ion channels to the organ level enabling digital twin modeling.
In patients with atrial fibrillation, intracardiac electrogram signal amplitude is known to decrease with increased structural tissue remodeling, referred to as fibrosis. In addition to the isolation of the pulmonary veins, fibrotic sites are considered a suitable target for catheter ablation. However, it remains an open challenge to find fibrotic areas and to differentiate their density and transmurality. This study aims to identify the volume fraction and transmurality of fibrosis in the atrial substrate. Simulated cardiac electrograms, combined with a generalized model of clinical noise, reproduce clinically measured signals. Our hybrid dataset approach combines simulated and clinical electrograms to train a decision tree classifier to characterize the fibrotic atrial substrate. This approach captures different dynamics of the electrical propagation reflected in healthy electrogram morphology and synergistically combines it with synthetic fibrotic electrograms from in silico experiments. The machine learning algorithm was tested on five patients and compared against clinical voltage maps as a proof of concept, distinguishing non-fibrotic from fibrotic tissue and characterizing the patient's fibrotic tissue in terms of density and transmurality. The proposed approach can be used to overcome a single voltage cut-off value to identify fibrotic tissue and guide ablation targeting fibrotic areas.
Aims: Atrial cardiomyopathy (ACM) is associated with new-onset atrial fibrillation, arrhythmia recurrence after pulmonary vein isolation (PVI) and increased risk for stroke. At present, diagnosis of ACM is feasible by endocardial contact mapping of left atrial (LA) low-voltage substrate (LVS) or late gadolinium-enhanced magnetic resonance imaging, but their complexity limits a widespread use. The aim of this study was to assess non-invasive body surface electrocardiographic imaging (ECGI) as a novel clinical tool for diagnosis of ACM compared with endocardial mapping. Methods and results: Thirty-nine consecutive patients (66 ± 9 years, 85% male) presenting for their first PVI for persistent atrial fibrillation underwent ECGI in sinus rhythm using a 252-electrode-array mapping system. Subsequently, high-density LA voltage and biatrial activation maps (mean 2090 ± 488 sites) were acquired in sinus rhythm prior to PVI. Freedom from arrhythmia recurrence was assessed within 12 months follow-up. Increased duration of total atrial conduction time (TACT) in ECGI was associated with both increased atrial activation time and extent of LA-LVS in endocardial contact mapping (r = 0.77 and r = 0.66, P < 0.0001, respectively). Atrial cardiomyopathy was found in 23 (59%) patients. A TACT value of 148 ms identified ACM with 91.3% sensitivity and 93.7% specificity. Arrhythmia recurrence occurred in 15 (38%) patients during a follow-up of 389 ± 55 days. Freedom from arrhythmia was significantly higher in patients with a TACT <148 ms compared with patients with a TACT ≥148 ms (82.4% vs. 45.5%, P = 0.019). Conclusion: Analysis of TACT in non-invasive ECGI allows diagnosis of patients with ACM, which is associated with a significantly increased risk for arrhythmia recurrence following PVI.
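The reported 91.3% sensitivity and 93.7% specificity follow the standard definitions for a "TACT ≥ 148 ms indicates ACM" test. A small sketch with hypothetical TACT values and ACM labels (not the study data):

```python
import numpy as np

def sens_spec(values, labels, threshold):
    """Sensitivity and specificity of a 'value >= threshold' test.
    labels: 1 = condition present (here: ACM), 0 = absent.
    Assumes both classes occur at least once."""
    values = np.asarray(values, dtype=float)
    labels = np.asarray(labels, dtype=int)
    pred = values >= threshold
    tp = np.sum(pred & (labels == 1))
    fn = np.sum(~pred & (labels == 1))
    tn = np.sum(~pred & (labels == 0))
    fp = np.sum(pred & (labels == 0))
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical TACT measurements (ms) and endocardial-mapping labels
print(sens_spec([155, 162, 139, 131], [1, 1, 0, 0], 148))
```

Sweeping the threshold over the observed TACT range and evaluating these two numbers is how a cut-off such as 148 ms would typically be selected.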
S. Schuler, N. Pilia, D. Potyagaylo, and A. Loewe. Cobiveco: Consistent biventricular coordinates for precise and intuitive description of position in the heart – with MATLAB implementation. In Medical Image Analysis, vol. 74, pp. 102247, 2021
Ventricular coordinates are widely used as a versatile tool for various applications that benefit from a description of local position within the heart. However, the practical usefulness of ventricular coordinates is determined by their ability to meet application-specific requirements. For regression-based estimation of biventricular position, for example, a symmetric definition of coordinate directions in both ventricles is important. For the transfer of data between different hearts as another use case, the consistency of coordinate values across different geometries is particularly relevant. To meet these requirements, we compare different approaches to compute coordinates and present Cobiveco, a symmetric, consistent and intuitive biventricular coordinate system that builds upon existing coordinate systems, but overcomes some of their limitations. A novel one-way transfer error is introduced to assess the consistency of the coordinates. Normalized distances along bijective trajectories between two boundaries were found to be superior to solutions of Laplace’s equation for defining coordinate values, as they show better linearity in space. Evaluation of transfer and linearity errors on 36 patient geometries revealed a more than 4-fold improvement compared to a state-of-the-art method. Finally, we show two application examples underlining the relevance for cardiac data processing. Cobiveco MATLAB code is available under a permissive open-source license.
The ECG is one of the most commonly used non-invasive tools to gain insights into the electrical functioning of the heart. It has been crucial as a foundation in the creation and validation of in silico models describing the underlying electrophysiological processes. However, so far, the contraction of the heart and its influences on the ECG have mainly been overlooked in in silico models. As the heart contracts and moves, so do the electrical sources within the heart responsible for the signal on the body surface, thus potentially altering the ECG. To illuminate these aspects, we developed a human 4-chamber electro-mechanically coupled whole heart in silico model and embedded it within a torso model. Our model faithfully reproduces measured 12-lead ECG traces, circulatory characteristics, as well as physiological ventricular rotation and atrioventricular valve plane displacement. We compare our dynamic model to three non-deforming ones in terms of standard clinically used ECG leads (Einthoven and Wilson) and body surface potential maps (BSPM). The non-deforming models consider the heart at its ventricular end-diastatic, end-diastolic and end-systolic states. The standard leads show negligible differences during the P wave and QRS complex, yet during the T wave the leads closest to the heart show prominent differences in amplitude. When looking at the BSPM, there are no notable differences during the P wave, but effects of cardiac motion can be observed already during the QRS complex, increasing further during the T wave. We conclude that for the modeling of activation (P wave/QRS complex), the associated effort of simulating a complete electro-mechanical approach is not worth the computational cost. But when looking at ventricular repolarization (T wave) in standard leads as well as BSPM, there are areas where the signal can be influenced by cardiac motion to an extent that should not be ignored.
Atrial flutter (AFL) is a common atrial arrhythmia typically characterized by electrical activity propagating around specific anatomical regions. It is usually treated with catheter ablation. However, the identification of rotational activities is not straightforward, and requires an intense effort during the first phase of the electrophysiological (EP) study, i.e., the mapping phase, in which an anatomical 3D model is built and electrograms (EGMs) are recorded. In this study, we modeled the electrical propagation pattern of AFL (measured during mapping) using network theory (NT), a well-known field of research from the computer science domain. The main advantage of NT is the large number of available algorithms that can efficiently analyze the network. Using directed network mapping, we employed a cycle-finding algorithm to detect all cycles in the network, resembling the main propagation pattern of AFL. The method was tested on two subjects in sinus rhythm, six in an experimental model of in-silico simulations, and 10 subjects diagnosed with AFL who underwent a catheter ablation. The algorithm correctly detected the electrical propagation of both sinus rhythm cases and in-silico simulations. Regarding the AFL cases, arrhythmia mechanisms were either totally or partially identified in most of the cases (8 out of 10), i.e., cycles around the mitral valve, tricuspid valve and figure-of-eight reentries. The other two cases presented a poor mapping quality or a major complexity related to previous ablations, large areas of fibrotic tissue, etc. Directed network mapping represents an innovative tool that showed promising results in identifying AFL mechanisms in an automatic fashion. Further investigations are needed to assess the reliability of the method in different clinical scenarios.
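The cycle-finding step on the directed network can be illustrated with a plain depth-first search that enumerates elementary cycles. This is a toy stand-in: real directed network mapping builds the graph from electrogram-derived activation directions and relies on more scalable cycle algorithms.

```python
def find_cycles(adj):
    """Enumerate elementary cycles in a small directed graph given as an
    adjacency dict {node: [successors]}. Each cycle is reported once,
    rotated so it starts at its smallest node. Exponential in the worst
    case -- fine only for toy graphs like this illustration."""
    cycles = set()

    def dfs(node, path):
        for nxt in adj.get(node, []):
            if nxt in path:
                cycle = path[path.index(nxt):]
                k = cycle.index(min(cycle))          # canonical rotation
                cycles.add(tuple(cycle[k:] + cycle[:k]))
            else:
                dfs(nxt, path + [nxt])

    for start in adj:
        dfs(start, [start])
    return sorted(cycles)

# Hypothetical 3-node propagation graph with two re-entrant loops
print(find_cycles({'A': ['B'], 'B': ['C'], 'C': ['A', 'B']}))
```

In the mapped atrium, the nodes would be electrode sites and a detected cycle around, e.g., the mitral or tricuspid annulus corresponds to the macro-reentrant AFL mechanism.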
The human heart is a masterpiece of the highest complexity coordinating multi-physics aspects on a multi-scale range. Thus, modeling the cardiac function to reproduce physiological characteristics and diseases remains challenging. Especially the complex simulation of the blood's hemodynamics and its interaction with the myocardial tissue requires a high accuracy of the underlying computational models and solvers. These demanding aspects make whole-heart fully-coupled simulations computationally highly expensive and call for simpler but still accurate models. While the mechanical deformation during the heart cycle drives the blood flow, less is known about the feedback of the blood flow onto the myocardial tissue. To solve the fluid-structure interaction problem, we suggest a cycle-to-cycle coupling of the structural deformation and the fluid dynamics. In a first step, the displacement of the endocardial wall in the mechanical simulation serves as a unidirectional boundary condition for the fluid simulation. After a complete heart cycle of fluid simulation, a spatially resolved pressure factor (PF) is extracted and returned to the next iteration of the solid mechanical simulation, closing the loop of the iterative coupling procedure. All simulations were performed on an individualized whole heart geometry. The effect of the sequential coupling was assessed by global measures such as the change in deformation and, as an example of diagnostically relevant information, the particle residence time. The mechanical displacement was up to 2 mm after the first iteration. In the second iteration, the deviation was in the sub-millimeter range, implying that already one iteration of the proposed cycle-to-cycle coupling is sufficient to converge to a coupled limit cycle. Cycle-to-cycle coupling between cardiac mechanics and fluid dynamics can be a promising approach to account for fluid-structure interaction with low computational effort.
In an individualized healthy whole-heart model, one iteration sufficed to obtain converged and physiologically plausible results.
The electrocardiogram (ECG) is a standard cost-efficient and non-invasive tool for the early detection of various cardiac diseases. Quantifying different timing and amplitude features of and in between the single ECG waveforms can reveal important information about the underlying (dys-)function of the heart. Determining these features requires the detection of fiducial points that mark the on- and offset as well as the peak of each ECG waveform (P wave, QRS complex, T wave). Manually setting these points is time-consuming and requires a physician's expert knowledge. Therefore, the highly modular ECGdeli toolbox for MATLAB was developed, which is capable of filtering clinically recorded 12-lead ECG signals and detecting the fiducial points, also called delineation. It is one of the few open toolboxes offering ECG delineation for P waves, T waves and QRS complexes. The algorithms provided were evaluated with the QT database, an ECG database comprising 105 signals with fiducial points annotated by clinicians. The median difference between the fiducial points set by the boundary detection algorithm and the clinical annotations serving as a ground truth is less than 4 samples (16 ms) for the P wave and the QRS complex markers.
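The evaluation metric above (median fiducial-point error in samples and milliseconds) can be sketched as follows. ECGdeli itself is a MATLAB toolbox; this is an illustrative Python analogue with made-up sample indices, assuming the QT database sampling rate of 250 Hz (so 4 samples correspond to 16 ms).

```python
FS = 250  # Hz; QT database sampling rate, so 1 sample = 4 ms

detected  = [102, 215, 340, 467, 590]   # hypothetical algorithm output (sample indices)
reference = [100, 214, 344, 466, 588]   # hypothetical expert annotations

# Absolute per-point errors in samples, then the median as summary statistic.
errors_samples = sorted(abs(d - r) for d, r in zip(detected, reference))
median_err = errors_samples[len(errors_samples) // 2]
print(f"median error: {median_err} samples = {median_err * 1000 // FS} ms")
```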
Computer modeling of the electrophysiology of the heart has undergone significant progress. A healthy heart can be modeled starting from the ion channels via the spread of a depolarization wave on a realistic geometry of the human heart up to the potentials on the body surface and the ECG. Research is advancing regarding modeling diseases of the heart. This article reviews progress in calculating and analyzing the corresponding electrocardiogram (ECG) from simulated depolarization and repolarization waves. First, we describe modeling of the P-wave, the QRS complex and the T-wave of a healthy heart. Then, both the modeling and the corresponding ECGs of several important diseases and arrhythmias are delineated: ischemia and infarction, ectopic beats and extrasystoles, ventricular tachycardia, bundle branch blocks, atrial tachycardia, flutter and fibrillation, genetic diseases and channelopathies, imbalance of electrolytes and drug-induced changes. Finally, we outline the potential impact of computer modeling on ECG interpretation. Computer modeling can contribute to a better comprehension of the relation between features in the ECG and the underlying cardiac condition and disease. It can pave the way for a quantitative analysis of the ECG and can support the cardiologist in identifying events or non-invasively localizing diseased areas. Finally, it can deliver very large databases of reliably labeled ECGs as training data for machine learning.
C. Nagel, S. Schuler, O. Dössel, and A. Loewe. A bi-atrial statistical shape model for large-scale in silico studies of human atria: model development and application to ECG simulations. In Medical Image Analysis, vol. 74, pp. 102210, 2021
Large-scale electrophysiological simulations to obtain electrocardiograms (ECG) carry the potential to produce extensive datasets for training of machine learning classifiers to, e.g., discriminate between different cardiac pathologies. The adoption of simulations for these purposes is limited due to a lack of ready-to-use models covering atrial anatomical variability. We built a bi-atrial statistical shape model (SSM) of the endocardial wall based on 47 segmented human CT and MRI datasets using Gaussian process morphable models. Generalization, specificity, and compactness metrics were evaluated. The SSM was applied to simulate atrial ECGs in 100 random volumetric instances. The first eigenmode of our SSM reflects a change of the total volume of both atria, the second the asymmetry between left vs. right atrial volume, the third a change in the prominence of the atrial appendages. The SSM is capable of generalizing well to unseen geometries and 95% of the total shape variance is covered by its first 24 eigenvectors. The P waves in the 12-lead ECG of 100 random instances showed a duration of 109.7 ± 12.2 ms in accordance with large cohort studies. The novel bi-atrial SSM itself as well as 100 exemplary instances with rule-based augmentation of atrial wall thickness, fiber orientation, inter-atrial bridges and tags for anatomical structures have been made publicly available. This novel, openly available bi-atrial SSM can in future be employed to generate large sets of realistic atrial geometries as a basis for in silico big data approaches.
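Two standard SSM operations mentioned above, determining how many eigenmodes cover 95% of the shape variance and drawing random shape instances, can be sketched with synthetic PCA data (the mean shape, eigenvectors, and eigenvalues below are random stand-ins, not the published model):

```python
import numpy as np

rng = np.random.default_rng(0)
n_points, n_modes = 500, 40
mean_shape = rng.normal(size=3 * n_points)                    # stacked x,y,z coordinates
eigvecs = np.linalg.qr(rng.normal(size=(3 * n_points, n_modes)))[0]  # orthonormal modes
eigvals = np.sort(rng.uniform(0.1, 5.0, n_modes))[::-1]       # variances, descending

# 1) Number of leading modes needed to cover 95% of the total shape variance.
cum_var = np.cumsum(eigvals) / eigvals.sum()
n_95 = int(np.searchsorted(cum_var, 0.95)) + 1

# 2) Random instance: mean plus modes scaled by sqrt(eigenvalue) times
#    standard-normal coefficients.
coeffs = rng.standard_normal(n_modes)
instance = mean_shape + eigvecs @ (coeffs * np.sqrt(eigvals))

print(n_95, instance.shape)
```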
J. Sánchez, B. Trenor, J. Saiz, O. Dössel, and A. Loewe. Fibrotic Remodeling during Persistent Atrial Fibrillation: In Silico Investigation of the Role of Calcium for Human Atrial Myofibroblast Electrophysiology. In Cells, vol. 10(11), pp. 2852, 2021
During atrial fibrillation, cardiac tissue undergoes different remodeling processes at different scales from the molecular level to the tissue level. One central player that contributes to both electrical and structural remodeling is the myofibroblast. Based on recent experimental evidence on myofibroblasts’ ability to contract, we extended a biophysical myofibroblast model with Ca2+ handling components and studied the effect on cellular and tissue electrophysiology. Using genetic algorithms, we fitted the myofibroblast model parameters to the existing in vitro data. In silico experiments showed that Ca2+ currents can explain the experimentally observed variability regarding the myofibroblast resting membrane potential. The presence of an L-type Ca2+ current can trigger automaticity in the myofibroblast with a cycle length of 799.9 ms. Myocyte action potentials were prolonged when coupled to myofibroblasts with Ca2+ handling machinery. Different spatial myofibroblast distribution patterns increased the vulnerable window to induce arrhythmia from 12 ms in non-fibrotic tissue to 22 ± 2.5 ms and altered the reentry dynamics. Our findings suggest that Ca2+ handling can considerably affect myofibroblast electrophysiology and alter the electrical propagation in atrial tissue composed of myocytes coupled with myofibroblasts. These findings can inform experimental validation experiments to further elucidate the role of myofibroblast Ca2+ handling in atrial arrhythmogenesis.
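The genetic-algorithm fitting step can be illustrated with a minimal, self-contained sketch. The toy linear model, parameter ranges, and GA settings below are illustrative assumptions, not the study's setup; the study fitted biophysical myofibroblast model parameters to in vitro data.

```python
import random

random.seed(1)
xs = [0.0, 0.5, 1.0, 1.5, 2.0]
data = [2 * x - 1 for x in xs]          # synthetic "measurements" from a = 2, b = -1

def cost(params):
    """Sum of squared residuals between toy model and data."""
    a, b = params
    return sum((a * x + b - y) ** 2 for x, y in zip(xs, data))

# Initial population of random parameter pairs.
pop = [(random.uniform(-5, 5), random.uniform(-5, 5)) for _ in range(40)]
for _ in range(60):
    pop.sort(key=cost)
    parents = pop[:10]                                       # selection: keep fittest
    children = []
    while len(children) < 30:
        pa, pb = random.sample(parents, 2)
        child = [(x + y) / 2 for x, y in zip(pa, pb)]        # crossover: averaging
        child = [g + random.gauss(0, 0.1) for g in child]    # mutation
        children.append(tuple(child))
    pop = parents + children

best = min(pop, key=cost)
print(best)  # should lie close to the true parameters (2, -1)
```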
Clinical and computational studies highlighted the role of atrial anatomy for atrial fibrillation vulnerability. However, personalized computational models are often generated from electroanatomical maps, which might lack important anatomical structures like the appendages, or from imaging data which are potentially affected by segmentation uncertainty. A bi-atrial statistical shape model (SSM) covering relevant structures for electrophysiological simulations was shown to cover atrial shape variability. We hypothesized that it could, therefore, also be used to infer the shape of missing structures and deliver ready-to-use models to assess atrial fibrillation vulnerability in silico. We implemented a highly automatized pipeline to generate a personalized computational model by fitting the SSM to the clinically acquired geometries. We applied our framework to a geometry coming from an electroanatomical map and one derived from magnetic resonance images (MRI). Only landmarks belonging to the left atrium and no information from the right atrium were used in the fitting process. The left atrium surface-to-surface distance between electroanatomical map and a fitted instance of the SSM was 2.26 ± 1.95 mm. The distance between MRI segmentation and SSM was 2.07 ± 1.56 mm and 3.59 ± 2.84 mm in the left and right atrium, respectively. Our semi-automatic pipeline provides ready-to-use personalized computational models representing the original anatomy well by fitting a SSM. We were able to infer the shape of the right atrium even in the case of using information only from the left atrium.
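A one-directional surface-to-surface distance of the kind reported above (mean ± standard deviation of nearest-neighbor distances between two surfaces) can be sketched on tiny synthetic point clouds; the points below are invented, not clinical data:

```python
import math

def nearest_dist(p, cloud):
    """Distance from point p to its nearest neighbor in the other cloud."""
    return min(math.dist(p, q) for q in cloud)

def surface_to_surface(cloud_a, cloud_b):
    """Mean and standard deviation of nearest-neighbor distances a -> b."""
    d = [nearest_dist(p, cloud_b) for p in cloud_a]
    mean = sum(d) / len(d)
    std = (sum((x - mean) ** 2 for x in d) / len(d)) ** 0.5
    return mean, std

map_points = [(0, 0, 0), (1, 0, 0), (0, 1, 0)]        # e.g. electroanatomical map
ssm_points = [(0, 0, 0.2), (1, 0, 0.1), (0, 1, 0.3)]  # e.g. fitted SSM instance
print(surface_to_surface(map_points, ssm_points))
```

Real meshes would use a spatial index (e.g. a k-d tree) instead of the brute-force nearest-neighbor search, but the reported statistic is the same.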
Individualized computer models of the geometry of the human heart are often based on magnetic resonance images (MRI) or computed tomography (CT) scans. The stress distribution in the imaged state cannot be measured but needs to be estimated from the segmented geometry, e.g. by an iterative algorithm. As the convergence of this algorithm depends on different geometrical conditions, we systematically studied their influence. Beside various shape alterations, we investigated the chamber volume, as well as the effect of material parameters. We found a marked influence of passive material parameters: increasing the model stiffness by a factor of ten halved the residual norm in the first iteration. Flat and concave areas led to a reduced robustness and convergence rate of the unloading algorithm. With this study, the geometric effects and modeling aspects governing the unloading algorithm's convergence are identified and can be used as a basis for further improvement.
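The iterative unloading idea can be illustrated with a toy fixed-point sketch: given the imaged (loaded) geometry, repeatedly correct a guess of the unloaded geometry until re-loading it reproduces the image. A single scalar "radius" stands in for the full geometry, and the linear load model and its parameters are invented for illustration:

```python
def load(radius, pressure=1.0, stiffness=5.0):
    """Hypothetical forward model: passive inflation enlarges the radius."""
    return radius * (1.0 + pressure / stiffness)

imaged = 25.0          # radius seen in MRI/CT (loaded state)
guess = imaged         # start the unloaded guess from the imaged geometry
for it in range(50):
    residual = load(guess) - imaged
    if abs(residual) < 1e-9:
        break
    guess -= residual  # fixed-point update toward the unloaded state

print(round(guess, 4))  # 20.8333; re-loading this radius reproduces the image
```

In this toy model a stiffer material shrinks the load increment and hence the residual of the first iteration, mirroring the stiffness effect reported above; the geometrical effects (flat and concave regions) have no counterpart in a scalar sketch.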
Mitral regurgitation alters the flow conditions in the left ventricle. To account for quantitative changes and to investigate the behavior of different flow components, a realistic computational model of the whole human heart was employed in this study. While performing fluid dynamics simulations, a scalar transport equation was solved to analyze vortex formation and ventricular wash-out for different regurgitation severities. Additionally, a particle tracking algorithm was implemented to visualize single components of the blood flow. We confirmed a significantly lowered volume of the direct flow component as well as a higher vorticity in the diseased case.
S. Appel, T. Gerach, O. Dössel, and A. Loewe. Adaptation of the Calcium-dependent Tension Development in Ventricular Cardiomyocytes. In Current Directions in Biomedical Engineering, vol. 7(2), pp. 251-254, 2021
Today a variety of models describe the physiological behavior of the heart on a cellular level. The intracellular calcium concentration plays an important role, since it is the main driver for the active contraction of the heart. Due to different implementations of the calcium dynamics, simulating cardiac electromechanics can lead to severely different behaviors of the active tension when coupling the same tension model with different electrophysiological models. To handle these variations, we present an optimization tool that adapts the parameters of the most recent, human-based tension model. The goal is to generate a physiologically valid tension development when coupled to an electrophysiological cellular model independent of the specifics of that model's calcium transient. In this work, we focus on a ventricular cell model. In order to identify the calcium-sensitive parameters, a sensitivity analysis of the tension model was carried out. In a further step, the cell model was adapted to reproduce the sarcomere length-dependent behavior of troponin C. With a maximum relative deviation of 20.3% per defined characteristic of the tension development, satisfactory results could be obtained for isometric twitch tension. Considering the length-dependent troponin handling, physiological behavior could be reproduced. In conclusion, we propose an algorithm to adapt the tension development model to any calcium transient input to achieve a physiologically valid active contraction on a cellular level. As a proof of concept, the algorithm is successfully applied to one of the most recent human ventricular cell models. This is an important step towards fully coupled electromechanical heart models, which are a valuable tool in personalized health care.
C. Nagel, O. Dössel, and A. Loewe. Sensitivity and Generalization of a Neural Network for Estimating Left Atrial Fibrotic Volume Fractions from the 12-lead ECG. In Current Directions in Biomedical Engineering, vol. 7(2), pp. 307-310, 2021
In order to be used in a clinical context, numerical simulation tools have to strike a balance between accuracy and low computational effort. For reproducing the pumping function of the human heart numerically, the physical domains of cardiac continuum mechanics and fluid dynamics have a significant relevance. In this context, fluid-structure interaction between the heart muscle and the blood flow is particularly important: Myocardial tension development and wall deformation drive the blood flow. However, the degree to which the blood flow has a retrograde effect on the cardiac mechanics in this multi-physics problem remains unclear up to now. To address this question, we implemented a cycle-to-cycle coupling based on a finite element model of a patient-specific whole heart geometry. The deformation of the cardiac wall over one heart cycle was computed using our mechanical simulation framework. A closed loop circulatory system model as part of the simulation delivered the chamber pressures. The displacement of the endocardial surfaces and the pressure courses of one cycle were used as boundary conditions for the fluid solver. After solving the Navier-Stokes equations, the relative pressure was extracted for all endocardial wall elements from the three-dimensional pressure field. These local pressure deviations were subsequently returned to the next iteration of the continuum mechanical simulation, thus closing the loop of the iterative coupling procedure. Following this sequential coupling approach, we simulated three iterations of mechanical and fluid simulations. To characterize the convergence, we evaluated the time course of the normalized pressure field as well as the Euclidean distance between nodes of the mechanical simulation in subsequent iterations. For the left ventricle (LV), the maximal Euclidean distance of all endocardial wall nodes was smaller than 2 mm between the first and second iteration.
The maximal distance between the second and third iteration was 70 μm, thus the limit of necessary cycles was already reached after two iterations. In future work, this iterative coupling approach will have to prove its ability to deliver physiologically accurate results also for diseased heart models. Altogether, the sequential coupling approach with its low computational effort delivered promising results for modeling fluid-structure interaction in cardiac simulations.
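The convergence criterion used above, the maximum Euclidean distance between corresponding mesh nodes of two successive coupling iterations, can be sketched as follows; the node coordinates (in mm) are tiny synthetic stand-ins for the endocardial wall mesh:

```python
import math

def max_node_distance(nodes_a, nodes_b):
    """Maximum displacement between corresponding nodes of two iterations."""
    return max(math.dist(p, q) for p, q in zip(nodes_a, nodes_b))

iter1 = [(0.0, 0.0, 0.0), (10.0, 0.0, 0.0), (0.0, 10.0, 0.0)]
iter2 = [(0.1, 0.0, 0.0), (10.0, 0.05, 0.0), (0.0, 10.0, 0.02)]

d = max_node_distance(iter1, iter2)
converged = d < 0.07   # e.g. stop once below 70 μm (coordinates in mm)
print(d, converged)
```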