Abstract:
Background: Computer models for simulating cardiac electrophysiology are valuable tools for research and clinical applications. Traditional reaction-diffusion (RD) models used for these purposes are computationally expensive. While eikonal models offer a faster alternative, they are not well-suited to study cardiac arrhythmias driven by reentrant activity. The present work extends the diffusion-reaction eikonal alternant model (DREAM), incorporating conduction velocity (CV) restitution for simulating complex cardiac arrhythmias. Methods: The DREAM modifies the fast iterative method to model cyclical behavior, dynamic boundary conditions, and frequency-dependent anisotropic CV. Additionally, the model alternates with an approximated RD model, using a detailed ionic model for the reaction term and a triple-Gaussian to approximate the diffusion term. The DREAM and monodomain models were compared, simulating reentries in 2D manifolds with different resolutions. Results: The DREAM produced similar results across all resolutions, while experiments with the monodomain model failed at lower resolutions. CV restitution curves obtained using the DREAM closely approximated those produced by the monodomain simulations. Reentry in 2D slabs yielded similar results in vulnerable window and mean reentry duration for low CV in both models. In the left atrium, most inducing points identified by the DREAM were also present in the high-resolution monodomain model. DREAM's reentry simulations on meshes with an average edge length of 1600$\mu$m were 40x faster than monodomain simulations at 200$\mu$m. Conclusion: This work establishes the mathematical foundation for using the accelerated DREAM simulation method for cardiac electrophysiology. Cardiac research applications are enabled by a publicly available implementation in the openCARP simulator.
Abstract:
Background and Aims: The effective refractory period is one of the main electrophysiological properties governing arrhythmia maintenance, yet effective refractory period personalisation is rarely performed when creating patient-specific computer models of the atria to inform clinical decision-making. The aim of this study is to evaluate the impact of incorporating clinical effective refractory period measurements on arrhythmia vulnerability when creating personalised in silico models. Methods: Clinical effective refractory period measurements were obtained in seven patients from multiple locations in the atria. The atrial geometries from the electroanatomical mapping system were used to generate personalised anatomical atrial models. To reproduce patient-specific refractory period measurements, the Courtemanche cellular model was gradually reparameterised from control conditions to a setup representing atrial fibrillation-induced remodelling. Four different modelling approaches were compared: homogeneous (A), heterogeneous (B), regional (C), and continuous (D) distribution of the effective refractory period. The first two configurations were non-personalised, based on literature data; the latter two were personalised, based on patient measurements. We evaluated the effect of each modelling approach by quantifying arrhythmia vulnerability and tachycardia cycle length. We performed a sensitivity analysis to assess the influence of effective refractory period measurement uncertainty on arrhythmia vulnerability. Results: The mean vulnerability was 3.4±4.0%, 7.7±3.4%, 9.0±5.1%, and 7.0±3.6% for scenarios A to D, respectively. The mean tachycardia cycle length was 167.1±12.6ms, 158.4±27.5ms, 265.2±39.9ms, and 285.9±77.3ms for scenarios A to D, respectively. Incorporating perturbations of 2, 5, 10, and 20ms to the measured effective refractory period resulted in model vulnerabilities of 5.8±2.7%, 6.1±3.5%, 6.9±3.7%, and 5.2±3.5%, respectively.
Conclusion: Increased dispersion of the effective refractory period had a greater effect on reentry dynamics than on mean vulnerability values. Incorporating personalised effective refractory period gradients had a greater impact on vulnerability than a homogeneously reduced effective refractory period. Effective refractory period measurement uncertainty of up to 20ms only slightly influenced arrhythmia vulnerability. Electrophysiological personalisation of atrial in silico models appears essential and warrants confirmation in larger cohorts.
Abstract:
Background and Aims: Patients with persistent atrial fibrillation (AF) experience 50% recurrence despite pulmonary vein isolation (PVI), and no consensus is established for second treatments. The aim of our i-STRATIFICATION study is to provide evidence for stratifying patients with AF recurrence after PVI to optimal pharmacological and ablation therapies through in-silico trials. Methods: A cohort of 800 virtual patients, with variability in atrial anatomy, electrophysiology, and tissue structure (low voltage areas, LVA), was developed and validated against clinical data ranging from ionic currents to the ECG. Virtual patients presenting AF post-PVI underwent 13 secondary treatments. Results: Sustained AF developed in 522 virtual patients after PVI. Second ablation procedures involving left atrial ablation alone showed 55% efficacy, only succeeding in small right atria (<60mL). When additional cavo-tricuspid isthmus ablation was considered, the Marshall-Plan sufficed (66% efficacy) for small left atria (<90mL). For larger left atria, a more aggressive ablation approach was required, such as an anterior mitral line (75% efficacy) or posterior wall isolation plus mitral isthmus ablation (77% efficacy). Virtual patients with LVA greatly benefited from LVA ablation in the left and right atria (100% efficacy). Conversely, in the absence of LVA, synergistic ablation and pharmacotherapy could terminate AF. In the absence of ablation, the patient's ionic current substrate modulated the response to antiarrhythmic drugs, with the inward currents being critical for optimal stratification to amiodarone or vernakalant. Conclusion: In-silico trials identify optimal strategies for AF treatment based on virtual patient characteristics, evidencing the power of human modelling and simulation as a clinical assisting tool.
Abstract:
Simulation models and artificial intelligence (AI) are widely used to address healthcare and biomedical engineering problems. Both approaches have shown promising results in the analysis and optimization of healthcare processes. Therefore, combining simulation models and AI could provide a strategy to further boost the quality of health services. In this work, a systematic review of studies applying a hybrid approach of simulation models and AI to address healthcare management challenges was carried out. The Scopus, Web of Science, and PubMed databases were screened by independent reviewers. The main strategies to combine simulation and AI, as well as the major healthcare application scenarios, were identified and discussed. Moreover, tools and algorithms to implement the proposed approaches were described. Results showed that machine learning appears to be the most employed AI strategy in combination with simulation models, which mainly rely on agent-based and discrete-event systems. The scarcity and heterogeneity of the included studies suggest that a standardized framework to implement hybrid machine learning-simulation approaches in healthcare management is yet to be defined. Future efforts should aim to use these approaches to design novel intelligent in-silico models of healthcare processes and to provide effective translation to the clinics.
Abstract:
Purpose: Handheld gamma cameras with coded aperture collimators are under investigation for intraoperative imaging in nuclear medicine. Coded apertures are a promising collimation technique for applications such as lymph node localization due to their high sensitivity and the possibility of 3D imaging. We evaluated the axial resolution and computational performance of two reconstruction methods. Methods: An experimental gamma camera was set up consisting of the pixelated semiconductor detector Timepix3 and a MURA mask of rank 31 with round holes of 0.08 mm in diameter in a 0.11 mm thick tungsten sheet. A set of measurements was taken where a point-like gamma source was placed centrally at 21 different positions within the range of 12–100 mm. For each source position, the detector image was reconstructed in 0.5 mm steps around the true source position, resulting in an image stack. The axial resolution was assessed by the full width at half maximum (FWHM) of the contrast-to-noise ratio (CNR) profile along the z-axis of the stack. Two reconstruction methods were compared: MURA Decoding and a 3D maximum likelihood expectation maximization algorithm (3D-MLEM). Results: While taking 4400 times longer in computation, 3D-MLEM yielded a smaller axial FWHM and a higher CNR. The axial resolution degraded from 5.3 mm and 1.8 mm at 12 mm to 42.2 mm and 13.5 mm at 100 mm for MURA Decoding and 3D-MLEM, respectively. Conclusion: Our results show that the coded aperture enables the depth estimation of single point-like sources in the near field. Here, 3D-MLEM offered a better axial resolution but was computationally much slower than MURA Decoding, whose reconstruction time is compatible with real-time imaging.
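The axial resolution metric described above, the FWHM of the CNR profile along the z-axis of the reconstructed stack, can be computed generically. A minimal sketch, assuming a single-peaked profile sampled on a uniform z-grid (function and variable names are illustrative, not taken from the original work):

```python
import numpy as np

def fwhm(z, profile):
    """Full width at half maximum of a single-peaked 1D profile,
    with linear interpolation of the half-maximum crossings."""
    z, p = np.asarray(z, float), np.asarray(profile, float)
    half = p.max() / 2.0
    i = int(p.argmax())
    li = np.where(p[:i] < half)[0][-1]        # last sample below half on the left flank
    zl = np.interp(half, [p[li], p[li + 1]], [z[li], z[li + 1]])
    ri = i + np.where(p[i:] < half)[0][0]     # first sample below half on the right flank
    zr = np.interp(half, [p[ri], p[ri - 1]], [z[ri], z[ri - 1]])
    return zr - zl

# example: Gaussian CNR profile sampled in 0.5 mm steps
z = np.arange(0.0, 20.0, 0.5)
cnr = np.exp(-0.5 * ((z - 10.0) / 2.0) ** 2)
print(fwhm(z, cnr))   # close to the analytic value 2*sqrt(2*ln 2)*2 ≈ 4.71 mm
```

The same function applies regardless of whether the profile comes from MURA decoding or MLEM; only the sampled CNR values differ.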
Abstract:
Background: The global coronavirus disease 2019 (COVID-19) pandemic has posed substantial challenges for healthcare systems, notably the increased demand for chest computed tomography (CT) scans, which lack automated analysis. Our study addresses this by utilizing artificial intelligence-supported automated computer analysis to investigate the distribution and extent of lung involvement in COVID-19 patients. Additionally, we explore the association between lung involvement and intensive care unit (ICU) admission, while also comparing the performance of the computer analysis with expert radiologists' assessments. Methods: A total of 81 patients from an open-source COVID database with confirmed COVID-19 infection were included in the study. Three patients were excluded. Lung involvement was assessed in 78 patients using CT scans, and the extent of infiltration and collapse was quantified across various lung lobes and regions. The associations between lung involvement and ICU admission were analysed. Additionally, the computer analysis of COVID-19 involvement was compared against a human rating provided by radiological experts. Results: The results showed a higher degree of infiltration and collapse in the lower lobes compared to the upper lobes (P<0.05). No significant difference was detected in the COVID-19-related involvement of the left and right lower lobes. The right middle lobe demonstrated lower involvement compared to the right lower lobe (P<0.05). When examining the regions, significantly more COVID-19 involvement was found when comparing the posterior vs. the anterior halves and the lower vs. the upper half of the lungs. Patients who required ICU admission during their treatment exhibited significantly higher COVID-19 involvement in their lung parenchyma according to the computer analysis, compared to patients who remained in general wards. Patients with more than 40% COVID-19 involvement were almost exclusively treated in intensive care.
A high correlation was observed between the computer-detected COVID-19 involvement and the rating by radiological experts. Conclusions: The findings suggest that the extent of lung involvement, particularly in the lower lobes, dorsal lungs, and lower half of the lungs, may be associated with the need for ICU admission in patients with COVID-19. The computer analysis showed a high correlation with the expert rating, highlighting its potential utility in clinical settings for assessing lung involvement. This information may help guide clinical decision-making and resource allocation during ongoing or future pandemics. Further studies with larger sample sizes are warranted to validate these findings.
Abstract:
Feature importance methods promise to provide a ranking of features according to their importance for a given classification task. A wide range of methods exists, but their rankings often disagree, and they are inherently difficult to evaluate due to a lack of ground truth beyond synthetic datasets. In this work, we put feature importance methods to the test on real-world data in the domain of cardiology, where we try to distinguish three specific pathologies from healthy subjects based on ECG features, using the features in cardiologists' decision rules as ground truth. We found that the SHAP and LIME methods and the Chi-squared test all agreed well with the native random forest and logistic regression feature rankings. Some methods gave inconsistent results, including the Maximum Relevance Minimum Redundancy and Neighbourhood Component Analysis methods. The permutation-based methods generally performed quite poorly. A surprising result was found in the case of left bundle branch block, where T-wave morphology features were consistently identified as important for diagnosis, although they are not used by clinicians.
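The idea behind the permutation-based methods mentioned above can be illustrated with a toy example. This is a minimal sketch with synthetic data and a hard-coded stand-in "model" (not the classifiers or ECG features from the study): permuting an informative feature destroys accuracy, while permuting a noise feature does not.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 2))
y = (X[:, 0] > 0).astype(int)          # only feature 0 carries the label

def predict(X):                        # stand-in "model": threshold on feature 0
    return (X[:, 0] > 0).astype(int)

def accuracy(X, y):
    return float((predict(X) == y).mean())

def permutation_importance(X, y, n_repeats=10):
    """Mean accuracy drop when each feature column is shuffled in turn."""
    base = accuracy(X, y)
    drops = []
    for j in range(X.shape[1]):
        d = []
        for _ in range(n_repeats):
            Xp = X.copy()
            Xp[:, j] = rng.permutation(Xp[:, j])
            d.append(base - accuracy(Xp, y))
        drops.append(float(np.mean(d)))
    return drops

imp = permutation_importance(X, y)
print(imp)   # feature 0 gets a large accuracy drop (~0.5), feature 1 none
```

On correlated real-world features such as ECG measurements, this estimator can misattribute importance, which is one plausible reason for the poor performance reported above.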
Abstract:
The human heart is subject to highly variable amounts of strain during day-to-day activities and needs to adapt to a wide range of physiological demands. This adaptation is driven by an autoregulatory loop that includes both electrical and mechanical components. In particular, mechanical forces are known to feed back into the cardiac electrophysiology system, which can result in pro- and anti-arrhythmic effects. Despite the widespread use of computational modelling and simulation for cardiac electrophysiology research, the majority of in silico experiments ignore this mechano-electric feedback entirely due to the high computational cost associated with solving cardiac mechanics. In this study, we therefore use an electromechanically coupled whole-heart model to investigate the differential and combined effects of electromechanical feedback mechanisms, with a focus on their physiological relevance during sinus rhythm. In particular, we consider troponin-bound calcium, the effect of deformation on the tissue diffusion tensor, and stretch-activated channels. We found that activation of the myocardium was only significantly affected when deformation was included in the diffusion term of the monodomain equation. Repolarization, on the other hand, was influenced by both troponin-bound calcium and stretch-activated channels, resulting in steeper repolarization gradients in the atria. The latter also caused afterdepolarizations in the atria. Due to its central role in tension development, calcium bound to troponin affected stroke volume and pressure. In conclusion, we found that mechano-electric feedback changes activation and repolarization patterns throughout the heart during sinus rhythm and leads to a markedly more heterogeneous electrophysiological substrate.
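Stretch-activated channels are often represented by a simple linear current that switches on above unit stretch. A minimal sketch of one such common formulation (the parameter values are illustrative and this is not necessarily the formulation used in the whole-heart model above):

```python
def i_sac(v_m, stretch, g_sac=1.0, e_sac=-20.0):
    """Linear stretch-activated channel current (sketch):
    active only for stretch ratio lambda > 1, reversal potential e_sac in mV."""
    lam = max(stretch, 1.0)            # no current under compression
    return g_sac * (lam - 1.0) * (v_m - e_sac)

# at resting potential the current is inward (depolarizing) under stretch,
# which is consistent with stretch promoting afterdepolarizations
print(i_sac(-80.0, 1.1))
```

Because the reversal potential lies between the resting and plateau potentials, the same current depolarizes resting tissue but repolarizes tissue at the plateau, which is how such channels can be both pro- and anti-arrhythmic.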
Abstract:
We investigate the properties of static mechanical and dynamic electro-mechanical models for the deformation of the human heart. Numerically, this is realized by a staggered scheme for the coupled partial/ordinary differential equation (PDE-ODE) system. First, we consider a static and purely mechanical benchmark configuration on a realistic geometry of the human ventricles. Using a penalty term for quasi-incompressibility, we test different parameters and mesh sizes and observe that this approach is not sufficient for lowest-order conforming finite elements. Then, we compare the active stress and active strain approaches for cardiac muscle contraction. Finally, in a coupled, anatomically realistic electro-mechanical model, we compare numerical Newmark damping with a visco-elastic model using Rayleigh damping. Nonphysiological oscillations can be better mitigated using viscosity.
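Rayleigh damping, as used in the comparison above, builds the damping matrix as C = αM + βK, which yields frequency-dependent modal damping ratios ζ(ω) = α/(2ω) + βω/2. A minimal sketch for fitting α and β to target damping ratios at two angular frequencies (the numbers are illustrative, not the parameters of the cardiac model):

```python
import numpy as np

def rayleigh_coefficients(omega1, omega2, zeta1, zeta2):
    """Solve zeta_i = alpha/(2*omega_i) + beta*omega_i/2 for (alpha, beta)."""
    A = np.array([[1.0 / (2.0 * omega1), omega1 / 2.0],
                  [1.0 / (2.0 * omega2), omega2 / 2.0]])
    alpha, beta = np.linalg.solve(A, np.array([zeta1, zeta2]))
    return alpha, beta

def modal_damping(alpha, beta, omega):
    """Damping ratio of the mode with angular frequency omega."""
    return alpha / (2.0 * omega) + beta * omega / 2.0

# 2% damping at 10 and 100 rad/s; between the fit points the ratio dips below 2%
alpha, beta = rayleigh_coefficients(10.0, 100.0, 0.02, 0.02)
```

The mass-proportional term damps low frequencies and the stiffness-proportional term damps high frequencies, which is why Rayleigh damping is effective against high-frequency nonphysiological oscillations.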
Abstract:
Introduction: The role of the right atrium (RA) in atrial fibrillation (AF) has long been overlooked. Computer models of the atria can aid in assessing how the RA influences arrhythmia vulnerability and in studying the role of RA drivers in the induction of AF, both aspects challenging to assess in living patients until now. It remains unclear whether the incorporation of the RA influences the propensity of the model to reentry induction. As personalized ablation strategies rely on non-inducibility criteria, the adequacy of left atrium (LA)-only models for developing such ablation tools is uncertain. Aim: To evaluate the effect of incorporating the RA in 3D patient-specific computer models on arrhythmia vulnerability. Methods: Imaging data from 8 subjects were obtained to generate patient-specific computer models. For each subject, we created 2 models: one monoatrial model consisting only of the LA, and one biatrial model consisting of both the RA and LA. We considered 3 different states of substrate remodeling: healthy (H), mild (M), and severe (S). The Courtemanche et al. cellular model was modified from control conditions to a setup representing AF-induced remodeling with 0%, 50%, and 100% changes for H, M, and S, respectively. Conduction velocity along the myocyte preferential direction was set to 1.2, 1.0, and 0.8m/s for each remodeling level. To incorporate fibrotic substrate, we manually placed six seeds on each biatrial model, 3 in the LA and 3 in the RA, corresponding to the regions with the most frequent enhancement (IIR>1.2) in LGE-MRI. The extent of the fibrotic substrate corresponded to the Utah 2 (5-20%) and Utah 4 (>35%) stages for M and S, respectively, while the H state was modeled without fibrosis. Electrical propagation in the atria was modeled using the monodomain equation solved with openCARP. Arrhythmia vulnerability was assessed by virtual S1S2 pacing from different points separated by 2cm.
A point was classified as inducing arrhythmia if reentry was initiated and maintained for at least 1s. The vulnerability ratio was defined as the number of inducing points divided by the number of stimulation points. The mean tachycardia cycle length (TCL) of the induced arrhythmia was assessed at the stimulation site. We compared the vulnerability ratio of the LA in monoatrial and biatrial configurations. Results: The incorporation of the RA increased the mean LA vulnerability ratio by 115.79% (0.19±0.13 to 0.41±0.22, p=0.033) in state M, and by 29.03% in state S (0.31±0.14 to 0.40±0.15, p=0.219), as illustrated in Figure 1. No arrhythmia was induced in the H models. RA inclusion increased the TCL of LA reentries by 5.51% (186.9±13.3ms to 197.2±18.3ms, p=0.006) in scenario M, and decreased it by 7.17% (224.3±27.6ms to 208.2±34.8ms, p=0.010) in scenario S. RA inclusion resulted in an elevated LA inducibility, revealing 4.9±3.3 additional points per patient in the LA for the biatrial model that did not induce reentry in the monoatrial model. Conclusions: The LA vulnerability in a biatrial model differs from the LA vulnerability in a monoatrial model. Incorporating the RA in patient-specific computational models unmasked potential inducing points in the LA. The RA had a substrate-dependent effect on reentry dynamics, altering the TCL of LA-induced reentries. Our results provide evidence for an important role of the RA in the maintenance and induction of arrhythmia in patient-specific computational models, thus suggesting the use of biatrial models.
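The vulnerability metric defined above reduces to a simple count over the pacing sites. A minimal sketch (the 1 s maintenance criterion follows the text; the example durations are made up):

```python
def vulnerability_ratio(reentry_durations_s, threshold_s=1.0):
    """Fraction of stimulation points classified as inducing: a point induces
    arrhythmia if the triggered reentry is maintained for at least threshold_s."""
    inducing = sum(1 for d in reentry_durations_s if d >= threshold_s)
    return inducing / len(reentry_durations_s)

# one illustrative case: 5 stimulation points, 3 sustained reentries
print(vulnerability_ratio([0.2, 1.5, 3.0, 0.0, 2.2]))   # 0.6
```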
Abstract:
Atrial fibrillation (AF) is a heart condition that causes an irregular and abnormally fast heart rate; it is a multifactorial and progressive cardiac disease with different manifestations in each patient. The treatment of this illness remains a challenge, especially in patients showing severe remodelling of the cardiac substrate. Regions of scar and fibrotic tissue have been identified as potential driving regions of arrhythmic activity during AF, so the ablation of these areas represents a standard treatment. High-density mapping techniques can provide important information about low-voltage and slow-conduction zones, both characteristics of arrhythmogenic areas. However, these modalities still show discordances in the location and extent of arrhythmogenic areas. Recently, local impedance (LI) measurements have gained attention as they are expected to distinguish between healthy and scar tissue independently of the atrial rhythm, which can improve the understanding of underlying substrate modifications. A new generation of ablation catheters incorporates the option of LI recordings as a novel technique to characterize the process of lesion formation. To extend this technology towards a mapping system implementation, a better understanding of how different factors influence LI measurements is needed. To this end, two approaches were followed in this work: in silico investigations with different catheters and tissue settings, and in vitro experiments to support the simulated studies. By performing in silico studies that relate to commonly seen clinical scenarios, we were able to predict and understand how different factors contribute to measured LI values. A 3D model of ablation catheters with the DirectSense™ technology was employed in different scenarios reported in the clinics, such as the introduction of the catheter into a steerable sheath, or the variation of the catheter-tissue distance, angle, and force.
LI data from patients recruited at the Städtisches Klinikum Karlsruhe allowed the validation of the simulation setting. Subsequently, this in silico setting was extended to multielectrode catheters. By simulating the impact of several design parameters on LI, such as stimulation and measurement bipolar pairs, inter-electrode distance, or electrode shape and size, tissue conductivities were reconstructed to account for scarred tissue patterns. Finally, in vitro experiments with a mapping catheter were performed, building on the previous simulation findings. Contact impedance recordings in a tissue phantom showed statistically significant differences between measurements from electrodes in direct catheter-tissue contact and those floating in saline. In this work, the potential capabilities of LI measurements were demonstrated, paving the way towards their use as a surrogate for the detection of fibrotic areas in cardiac mapping, complementing commonly used techniques based on electrogram (EGM) analysis.
Abstract:
In this work, we registered retinal diagnostic Optical Coherence Tomography (OCT) to instrument-integrated OCT (iiOCT). High registration accuracy and distortion correction of the OCT were demonstrated, which has the potential to support advanced tasks in robotic eye surgery. Robotic eye surgery has been investigated in recent years to assist in challenging operations, but it requires comprehensive anatomical information. The robot-mounted distance sensor, namely the iiOCT, cannot provide this on its own, but diagnostic OCT can supply this essential data. However, the integration of diagnostic OCT with the robotic iiOCT had not been previously achieved, which is why this registration was investigated in this work. We designed an accurate, fast, and automatable pipeline that registered real OCT to real iiOCT data. The pipeline was composed of three stages: feature extraction, coarse registration, and fine registration. During the feature extraction phase, both modalities were transformed into point clouds. In the coarse registration stage, point features, including the locations of the fovea and Optic Nerve Head (ONH), were utilized to perform a coarse alignment. The final stage, fine registration, utilized a nonrigid transformation based on Coherent Point Drift (CPD) to refine the coarse alignment. Extensive experiments evaluated different pipeline combinations with real data acquired from a realistic eye model. The influence of nonrigidity, efficiency, and first experiments with porcine eyes were assessed in detail. Our results showed an accuracy of 131 μm in the macular region with short computation time. Furthermore, the nonrigid deformation significantly improved accuracy by 21% by correcting the curvature in the OCT data. This research underscores that precise OCT-iiOCT registration is possible, which enables advanced tasks in robotic eye surgery.
This could enhance efficiency and patient outcomes in eye care, marking a significant stride for both surgeons and patients.
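A coarse landmark-based alignment like the pipeline's second stage is typically a least-squares rigid fit between corresponding point features (e.g. fovea and ONH locations plus additional points). A minimal sketch using the Kabsch/SVD method; this is a generic illustration under those assumptions, not the thesis implementation:

```python
import numpy as np

def rigid_align(P, Q):
    """Least-squares rotation R and translation t with Q ≈ P @ R.T + t (Kabsch)."""
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)                   # cross-covariance of centered sets
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against reflections
    D = np.diag([1.0] * (P.shape[1] - 1) + [d])
    R = Vt.T @ D @ U.T
    t = cq - R @ cp
    return R, t

# recover a known rotation about z plus a translation from 5 landmark pairs
rng = np.random.default_rng(1)
P = rng.normal(size=(5, 3))
c, s = np.cos(0.3), np.sin(0.3)
R_true = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
t_true = np.array([1.0, -2.0, 0.5])
Q = P @ R_true.T + t_true
R, t = rigid_align(P, Q)
```

A nonrigid method such as CPD is then needed on top of this, because a rigid transform cannot correct the curvature distortion mentioned above.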
Abstract:
Accurately predicting the difficulty of wisdom tooth extraction is paramount in oral surgery, guiding treatment planning and ensuring patient safety. This study focuses on developing a predictive model for classifying the difficulty of wisdom tooth extraction using advanced deep learning techniques. Leveraging the insights from previous research and clinical expertise, we integrate Cone Beam Computed Tomography (CBCT) imaging and deep learning methodologies to automate the assessment of extraction complexity. Through a series of experimental stages, we refine our model by evaluating different annotation techniques, the incorporation of edge information, feature fusion methods, backbone architectures, and projection-based classification approaches. Our findings demonstrate the efficacy of incorporating segmented data from CBCT imaging, utilizing Principal Component Analysis (PCA) for feature extraction, and optimizing projection-based classification methods. By automating the identification of tooth positions and surrounding structures, our model provides valuable decision-making support to clinicians, enhancing surgical planning and risk assessment. This interdisciplinary approach to predicting wisdom tooth extraction difficulty represents a significant advancement in oral surgery, promising to improve surgical outcomes and patient care in clinical practice.
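PCA-based feature extraction, as mentioned above, projects high-dimensional sample vectors onto the directions of largest variance. A minimal sketch via SVD; this is a generic illustration with synthetic data, not the study's actual pipeline:

```python
import numpy as np

def pca_features(X, n_components=2):
    """Project samples (rows of X) onto the top principal components."""
    Xc = X - X.mean(axis=0)                      # center each feature
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T              # scores on the leading components

# correlated synthetic data: the first component captures the most variance
rng = np.random.default_rng(0)
base = rng.normal(size=(100, 1))
X = np.hstack([base + 0.1 * rng.normal(size=(100, 1)) for _ in range(4)])
scores = pca_features(X, n_components=2)
```

The reduced score vectors can then be fed to a downstream classifier in place of the raw features.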
Abstract:
This master's thesis investigates the complexities of transit time measurement in quantitative fluorescence angiography (QFA), a critical factor affecting the precision of blood flow assessment during neurosurgical procedures. Motivated by the clinical need for accurate intraoperative blood flow measurement to improve surgical outcomes and patient safety, the study delves into the factors contributing to the variability of transit time measurements, essential for quantifying cerebral blood flow. Utilizing a laboratory setup that simulates cerebral blood flow dynamics, ICG fluorescence angiography was employed to explore the impact of different parameters, such as vessel diameter, flow pulsatility, and the selection of the region of interest (ROI), on transit time accuracy. Through a series of carefully designed experiments using silicone tubes of varying diameters to model cerebrovascular structures, the relationship between transit time measurements and factors such as flow rate, pulsatility, and geometric characteristics of the blood vessels was examined. Advanced imaging techniques and software analysis tools were utilized to capture and quantify the fluorescent signal of ICG, allowing for precise measurement of transit times under different flow conditions. The findings reveal significant insights into the factors influencing transit time measurements. Notably, the study demonstrates that vessel diameter and the choice of ROI substantially affect transit time accuracy, with smaller diameters and strategically selected ROIs yielding more reliable measurements. Furthermore, the research highlights the critical role of flow pulsatility, underscoring the importance of accounting for the dynamic nature of blood flow in the accuracy of transit time measurements. This work contributes to the advancement of QFA by providing a deeper understanding of the variables that affect transit time measurement.
The insights gained from this research are expected to contribute to more accurate and reliable QFA techniques, ultimately enhancing the efficacy of intraoperative blood flow assessment and supporting improved surgical decision-making and patient outcomes.
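Transit time between two ROIs is commonly estimated from the lag between their fluorescence intensity-time curves. A minimal sketch using the cross-correlation peak, a generic estimator that is not necessarily the method used in the thesis (the synthetic bolus curves are made up):

```python
import numpy as np

def transit_time(proximal, distal, fs):
    """Transit time (s) as the lag maximizing the cross-correlation between
    two ROI intensity-time curves sampled at fs Hz."""
    a = proximal - np.mean(proximal)
    b = distal - np.mean(distal)
    corr = np.correlate(b, a, mode="full")
    lag = int(np.argmax(corr)) - (len(a) - 1)    # samples by which 'distal' trails
    return lag / fs

# synthetic ICG bolus curves: the distal ROI sees the bolus 1.2 s later
fs = 10.0
t = np.arange(100) / fs
proximal = np.exp(-0.5 * ((t - 2.0) / 0.3) ** 2)
distal = np.exp(-0.5 * ((t - 3.2) / 0.3) ** 2)
print(transit_time(proximal, distal, fs))   # 1.2
```

With dispersion the distal curve is not just a shifted copy of the proximal one, which is one source of the transit time variability examined above.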
Abstract:
In recent decades, surgical navigation systems have undergone significant change, evolving from two-dimensional (2D) imaging techniques to more complex three-dimensional (3D) imaging modalities. While these advances have improved the precision and effectiveness of surgical procedures, they have also brought with them a challenge: increased radiation exposure to patients. To address this issue, this research explores the potential of integrating 2D fan-beam scout scans from the AIRO intraoperative CT scanner into surgical navigation systems. This approach aims to minimize radiation exposure while maintaining and potentially improving the functionality and versatility of the AIRO scanner through the addition of 2D imaging capabilities. This work presents a novel approach to developing a navigation model specifically designed for scout scans. An in-depth analysis of the scout scan characteristics is performed, which forms the basis for a novel calibration method using the linear pushbroom camera model. This model is essential for accurately describing the scout scan acquisition process and is further refined to approximate a pinhole camera model. The adaptation aims to seamlessly integrate scout scans into the surgical navigation scene. The results demonstrate that the linear pushbroom camera model provides a robust mathematical foundation for understanding and implementing scout scans in surgical navigation. The effectiveness of the proposed approximation approach was evaluated through its application in 2D navigation and 2D/3D registration tasks and revealed error margins exceeding current requirements. Despite these initial findings, there is still considerable potential for improvement. By incorporating the linear pushbroom model more directly, it is expected that the deviations associated with approximating a pinhole camera model can be significantly reduced.
This method offers a promising way to improve the accuracy of the system and suggests that, with further refinement, the approach could meet the precision requirements of surgical navigation.
Abstract:
This Bachelor’s thesis investigates the use of Convolutional Neural Networks (CNNs) for the segmentation of the Foveal Avascular Zone (FAZ) in Optical Coherence Tomography Angiography (OCTA) images. Given the challenge of directly comparing results across different research groups, this work focuses on analyzing three key factors that could influence comparability: dataset size, model complexity, and the evaluation metrics used. By generating four datasets of varying sizes through data augmentation and training state-of-the-art segmentation architectures with uniform hyperparameters, these factors were evaluated using a 6-fold cross-validation approach. Initial findings highlighted performance disparities among networks across different datasets, suggesting that factors beyond dataset size and model complexity might play significant roles. Despite these disparities, it was observed that neither larger dataset sizes nor increased model complexity necessarily lead to better segmentation performance. Moreover, reliance solely on technical metrics like the Dice coefficient was found inadequate for a comprehensive assessment of segmentation outcomes. In response, a new benchmark design was proposed to provide a consistent and transparent basis for evaluating and comparing future research findings, aiming to account for the observed performance variations. This work offers insights into the challenges of comparability in FAZ segmentation with CNNs and proposes ways to address these challenges. It aims to stimulate discussion on standardized evaluation methods in this field.
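The Dice coefficient criticized above as insufficient on its own is nonetheless the standard overlap metric for segmentation. A minimal sketch for binary FAZ masks:

```python
import numpy as np

def dice(pred, gt):
    """Dice coefficient between two binary segmentation masks:
    2*|A ∩ B| / (|A| + |B|), in [0, 1]."""
    pred, gt = np.asarray(pred, bool), np.asarray(gt, bool)
    total = pred.sum() + gt.sum()
    if total == 0:
        return 1.0                      # both masks empty: perfect agreement
    return 2.0 * np.logical_and(pred, gt).sum() / total

# toy 2x2 masks: one overlapping pixel, |pred| = 2, |gt| = 1
print(dice([[1, 1], [0, 0]], [[1, 0], [0, 0]]))   # 2/3
```

Because the metric is insensitive to where the disagreeing pixels lie, two segmentations with the same Dice score can differ substantially in clinically relevant boundary shape, which motivates the multi-metric benchmark proposed above.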
Abstract:
Atrial fibrillation (AF) is one of the most common arrhythmias and is associated with high mortality and morbidity. Due to its age-dependent incidence rate and the aging population, the incidence and thus the financial costs to the health system are likely to rise significantly in the future. Furthermore, the current treatment of AF patients is suboptimal, with success rates of only 20% to 60% for patients in advanced stages of AF. In the last decades, various factors have been investigated to enhance the understanding of AF pathophysiology, including anatomical, electrophysiological, and tissue-related factors. Conducting in-vivo and clinical studies to investigate the complex interplay of these factors during AF is challenging due to patient variability, data collection complexity, and ethical constraints, among others. Therefore, studies comparing these factors against each other are still sparse. Computer models of the atria serve as a valuable tool to address some of these challenges, offering a controlled and reproducible environment. This thesis aimed to provide additional quantitative insights into the influence of biatrial anatomical factors, such as volume and sphericity, and tissue-related factors, including fibrosis extent and fibrosis density, on arrhythmia vulnerability. Through in-silico experiments, a vulnerability assessment to reentries was conducted by applying an S1S2 protocol to pacing points distributed around both atria. The vulnerability was defined as the number of pacing points inducing reentries divided by the total number of pacing points on both atria. In total, sixteen biatrial bilayer models, derived from instances of a statistical shape model (SSM), were generated. To investigate the influence of fibrosis on arrhythmia vulnerability, a fibrosis atlas was generated from late gadolinium enhancement magnetic resonance imaging (LGE-MRI) data obtained from fifty-four patients from two independent institutions.
Fourteen fibrotic distributions were generated and mapped to the biatrial models. In addition, the mean fibrosis density in the boundary zone between fibrotic and non-fibrotic tissue was varied across five different values, ranging from 26% to 75%. Using the Utah classification score, the models were further categorized into one of four Utah stages. To compare the influence of anatomical factors and fibrosis factors, the vulnerability assessment was run with the same biatrial models with and without fibrosis. This comparison allowed for the calculation of a vulnerability factor between both scenarios. In the sixteen biatrial simulations without fibrosis and with varying sphericity and volume values, sphericity exhibited a higher correlation with the resulting arrhythmia vulnerability than volume, with 0.37 being the highest correlation coefficient for the combined sphericity of both atria. The RA volume obtained the highest correlation coefficient of 0.18 among all volume factors. With the addition of fibrosis, Utah stage 3 (20-30% fibrosis) models showed the highest vulnerability (0.76). Further, the Utah stage 3 models attained the highest vulnerability factor between simulations with and without fibrosis, equal to 3.28. Apart from Utah stage 2 (10-20% fibrosis), these factors ranged from around 2 up to the highest value of 3.28. Moreover, using the same fibrosis maps of Utah 3 and Utah 4 on four different biatrial models revealed a variation of the factor from 1.73 to 3.28 (Utah 3) and 2.0 to 2.62 (Utah 4). The change in fibrosis density led to a 20% vulnerability increase, exhibiting a steady rise in vulnerability from less dense to more dense fibrotic tissue in the boundary zone. The results obtained in this thesis suggest a higher influence of sphericity compared to volume on arrhythmia vulnerability. After the addition of fibrosis, the vulnerability doubled or, in one case, even tripled (3.28).
Thus, the influence of fibrosis extent is at least twice that of the anatomical factors. However, the geometry was observed to strongly modulate the influence of the fibrosis extent.
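The vulnerability metric and the vulnerability factor defined above reduce to simple ratios; a minimal sketch (the pacing-point counts are illustrative, not taken from the thesis results):

```python
def vulnerability(inducing_points: int, total_points: int) -> float:
    """Fraction of pacing points that induced a reentry."""
    return inducing_points / total_points

def vulnerability_factor(vuln_fibrosis: float, vuln_baseline: float) -> float:
    """Ratio of vulnerability with fibrosis to vulnerability without."""
    return vuln_fibrosis / vuln_baseline

# Illustrative numbers: 50 pacing points per scenario.
v_no_fib = vulnerability(12, 50)  # 0.24
v_fib = vulnerability(38, 50)     # 0.76
print(round(vulnerability_factor(v_fib, v_no_fib), 2))  # → 3.17
```

A factor above 1 indicates that the fibrotic substrate increased inducibility; a value near 3 corresponds to the tripling observed in the most vulnerable model.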
Abstract:
Cardiovascular diseases are responsible for an estimated 17.9 million deaths each year, according to the WHO. The electrocardiogram (ECG) stands out as the most widely employed method for assessing the heart’s condition, given its non-invasive nature and ease of recording. While millions of signals are recorded every day, ECG data available for research and development of new analysis tools is scarce due to patient protection and the limited time of healthcare professionals. Therefore, much of today’s software development relies on a few dozen publicly available data sets, offering only a limited representation of the ECG signals recorded daily in clinics. In an effort to address the challenge of limited ECG data for research, this thesis explores a method for encoding and augmenting ECG data using machine learning. An autoencoder is trained on existing ECG data to acquire a compressed representation of ECG signals and extract essential components from them. These extracted elements serve as a condensed yet informative summary of the original signals. Subsequently, the aim is to adjust and augment given ECG signals by modifying the extracted features. To simplify the data, the vectorcardiographic (VCG) representation of ECG signals is employed, reducing the required leads from twelve to three. The generated signals are heartbeats aligned at the R-peak with a fixed time window size. The intention is for these individual heartbeats to be stitched together to create a longer signal in a subsequent step. The first aim of this project was to find a Variational Autoencoder (VAE) model of low complexity that achieves an acceptable reconstruction loss. Consequently, the VAE was trained on a large data set to learn a compressed representation in the latent space, capable of reconstructing the signal with high accuracy.
Afterwards, transfer learning was employed to retrain the VAE on signals from a single patient, in order to incorporate morphologies learned from other patients into this specific patient’s signal. I achieved a latent space representation in which each variable represents specific morphologies of the VCG, with little correlation between variables. The research has shown that this machine learning approach has improved capabilities compared to principal component analysis (PCA) regarding the ability to construct and augment unseen VCG signals. Transfer learning proved helpful for achieving better generalization and faster convergence of the model, but the transfer of morphologies between patients did not meet the desired expectations. Lastly, potential improvements and further methods are suggested, aiming to improve various aspects of the latent space.
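The PCA baseline against which the VAE was compared can be sketched in a few lines of NumPy; the random data stands in for R-peak-aligned beats, and all shapes and the component count are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy stand-in for R-peak-aligned VCG beats: 200 beats x 3 leads x 100
# samples, flattened to one feature vector per beat.
beats = rng.standard_normal((200, 3 * 100))

# PCA via SVD of the mean-centred data matrix.
mean = beats.mean(axis=0)
centred = beats - mean
_, _, vt = np.linalg.svd(centred, full_matrices=False)
n_components = 16
components = vt[:n_components]          # top principal directions

# Encode each beat into the low-dimensional space, then decode back.
codes = centred @ components.T          # (200, 16) compressed representation
reconstructed = codes @ components + mean
print(codes.shape, reconstructed.shape)
```

Unlike the VAE latent space, these linear components are not encouraged to disentangle individual morphologies, which is one reason the nonlinear model can augment unseen VCG signals more flexibly.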
Abstract:
The most common arrhythmia worldwide is atrial fibrillation (AF), recognized as a substantial public health burden due to its rising incidence. Patients affected by AF face elevated risks for stroke, myocardial infarction, and mortality. Moreover, current treatment approaches often prove ineffective, resulting in a high recurrence rate. Hence, there is an urgent need for further investigation into the mechanisms underlying AF to advance treatment strategies. The objective of this study was to assess the impact of the morphology of the conduction velocity (CV) restitution curve on reentry events. We evaluated this influence using metrics such as the vulnerability window, the average reentry duration, and the dominant frequency. By conducting this vulnerability assessment, the aim was to establish correlations between the morphology of the CV restitution curve and these key features. We investigated the impact of using the pacing cycle length (PCL) and the diastolic interval (DI) on the restitution curve through simulations in the monodomain model. Additionally, the influence of the maximum longitudinal CV on the CV restitution curve was analyzed. Clinical CV restitution curves of 13 patients with persistent AF, measured at various atrial locations, were employed in simulations on a 2D tissue slab utilizing the diffusion reaction eikonal alternant model (DREAM) to simulate electrical wave propagation with a personalized ionic model (Courtemanche) for the action potential (AP). The vulnerability assessment was done using an S1-S2 protocol. The experiments encompassed diverse morphologies of restitution curves and varying maximal longitudinal CV values. Moreover, experiments with heterogeneous meshes using two different restitution curve morphologies were conducted. No notable influence of the maximal longitudinal CV on the morphology of the CV restitution curve was identified.
Moreover, the ionic model was successfully personalized using a function that interpolated conductance values between healthy and AF tissue. Additionally, a correlation between the steepness of the CV restitution curve and the vulnerability window, average reentry duration, and dominant frequency was established. Nevertheless, this work has limitations regarding the data acquisition and the model used for the electrophysiological simulations, although it was shown that tissue with a shallow CV restitution curve is more vulnerable to AF and maintains it longer. In summary, the CV restitution curve proved to be a crucial factor for reentry events, promising to improve vulnerability assessment and treatment outcomes of AF.
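CV restitution curves of the kind discussed above are commonly parameterized with a saturating exponential in the diastolic interval; the sketch below uses this generic form with illustrative parameters, not the fitting procedure or patient data from this thesis:

```python
import numpy as np

def cv_restitution(di_ms: np.ndarray, cv_max: float = 0.8,
                   a: float = 0.5, tau_ms: float = 80.0) -> np.ndarray:
    """Saturating-exponential CV restitution: conduction velocity (m/s)
    approaches cv_max as the diastolic interval (DI, ms) grows; a and
    tau_ms control how steep the curve is at short DIs. All parameter
    values here are illustrative assumptions."""
    return cv_max * (1.0 - a * np.exp(-di_ms / tau_ms))

di = np.array([20.0, 100.0, 400.0])
print(cv_restitution(di).round(3))
```

A smaller `tau_ms` produces a steeper curve; comparing such steepness variants against the vulnerability window is the kind of correlation the study investigates.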
Abstract:
Premature ventricular contractions (PVC) are a common heart arrhythmia that can be observed in 40\% to 75\% of the general population under 24- or 48-hour Holter monitoring \cite{Ahn2013-mf}. In severe cases, PVCs can reduce the quality of life or even result in fatal complications. A possible treatment is ablation therapy, for which the exact site of origin of the extrasystoles needs to be known. In this thesis, it was investigated whether the class of conditional invertible neural networks can be used to predict the locations of the \acp{SOO} and whether the additional information provided by this special network architecture through a predicted likelihood is beneficial. Only body surface potential maps were used as input. This means that the process of acquiring all necessary data from a real patient is noninvasive and does not require imaging. The network was trained on a simulated dataset consisting of 1.8 million different simulated extrasystoles. On the test set, a median geodesic error of 1.76 mm was measured. The network predicts the location as patient-agnostic cobiveco coordinates, so no additional medical imaging data is necessary. In addition, several necessary additions to the network that enable stable and reliable training were documented. The effects of several tunable hyperparameters on the prediction and the calibration are also shown. This is important because only a small amount of work using conditional invertible neural networks has been published so far. Several methods to summarize and visualize the information gained by a sampling-based inference are presented, one of which visualizes the predicted negative log-likelihood on the heart surface to enable an intuitive understanding of potentially affected areas of the heart.
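Summarizing a sampling-based inference as described above typically means collapsing many posterior samples into a point estimate plus an uncertainty measure; a minimal sketch with synthetic Gaussian samples standing in for cINN draws (the coordinate values are illustrative, not cobiveco data):

```python
import numpy as np

rng = np.random.default_rng(1)
# Toy stand-in for repeated posterior samples of a predicted site of
# origin in a 3-D coordinate system.
samples = rng.normal(loc=[0.3, 0.5, 0.1], scale=0.02, size=(1000, 3))

# Point estimate: component-wise median of the samples.
point_estimate = np.median(samples, axis=0)

# Spread: mean Euclidean distance of samples to the point estimate,
# a simple proxy for prediction uncertainty.
spread = np.linalg.norm(samples - point_estimate, axis=1).mean()
print(point_estimate.round(2), round(float(spread), 3))
```

Mapping such a per-location spread, or the predicted negative log-likelihood, onto the heart surface gives the intuitive visualization of potentially affected areas mentioned in the abstract.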