Atrial fibrillation (AF) is the most prevalent form of cardiac arrhythmia. Atrial wall thickness (AWT) can potentially improve our understanding of the structural mechanisms in the atria that drive AF, and it provides important clinical information. However, most existing studies estimate AWT with ruler-based measurements, performed with digital calipers at only a few selected locations in 2D or 3D. Only a few studies have developed automatic approaches to estimate the AWT of the left atrium, and there is currently no method to robustly estimate the AWT of both atrial chambers. We therefore developed a computational pipeline to automatically calculate the 3D AWT across both atrial chambers and extensively validated it on both ex vivo and in vivo human atrial data. The atrial geometry was first obtained by segmenting the atrial wall from MRI using a novel machine learning approach. The epicardial and endocardial surfaces were then separated using a multi-planar convex hull approach to define boundary conditions, from which a Laplace equation was solved numerically to automatically separate the two atrial chambers. To robustly estimate the AWT in each atrial chamber, coupled partial differential equations, formed by coupling the Laplace solution with two surface trajectory functions, were formulated and solved. Our pipeline enabled the reconstruction and visualization of the 3D AWT for both atrial chambers with a relative error of 8% and outperformed existing algorithms by more than 7%. Our approach can potentially lead to improved clinical diagnosis, patient stratification, and clinical guidance during ablation treatment for patients with AF.
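The core of the thickness step, solving a Laplace equation between the endocardial and epicardial surfaces and measuring length along its gradient streamlines, can be illustrated on a synthetic 2D annulus whose true wall thickness is known. This is a simplified sketch of the general Laplace/streamline idea, using explicit streamline tracing rather than the coupled-PDE formulation of the paper; grid size, radii, and iteration counts are illustrative.

```python
import numpy as np

def annulus_wall_thickness(n=200, r_in=0.25, r_out=0.45, iters=5000):
    """Laplace-based thickness of a synthetic annular 'wall'.

    Solves the Laplace equation between the two surfaces (Dirichlet
    values 0 and 1), then measures arc length along a gradient
    streamline crossing the wall. True thickness is r_out - r_in.
    """
    xs = np.linspace(-0.5, 0.5, n)
    h = xs[1] - xs[0]
    X, Y = np.meshgrid(xs, xs, indexing="ij")
    R = np.hypot(X, Y)
    wall = (R > r_in) & (R < r_out)

    # Fixed boundary values: 0 on/inside the inner surface, 1 on/outside the outer
    phi = np.where(R >= r_out, 1.0, 0.0)
    for _ in range(iters):  # Jacobi relaxation on wall points only
        avg = 0.25 * (np.roll(phi, 1, 0) + np.roll(phi, -1, 0) +
                      np.roll(phi, 1, 1) + np.roll(phi, -1, 1))
        phi = np.where(wall, avg, phi)

    gx, gy = np.gradient(phi, h)  # axis 0 is x because of indexing="ij"

    def trace(start, sign):
        """Accumulate arc length along +/-grad(phi) until leaving the wall."""
        pos, length = np.array(start, float), 0.0
        for _ in range(10 * n):
            i = int(round((pos[0] + 0.5) / h))
            j = int(round((pos[1] + 0.5) / h))
            g = np.array([gx[i, j], gy[i, j]])  # nearest-neighbour sampling
            norm = np.linalg.norm(g)
            if norm < 1e-12:
                break
            pos = pos + sign * 0.5 * h * g / norm
            length += 0.5 * h
            r = np.hypot(*pos)
            if r >= r_out or r <= r_in:  # crossed a surface
                break
        return length

    mid = ((r_in + r_out) / 2.0, 0.0)  # a point inside the wall
    # Thickness = epicardium-ward length + endocardium-ward length
    return trace(mid, +1.0) + trace(mid, -1.0)
```

On this radially symmetric test case the streamlines are radial, so the returned value should be close to r_out - r_in = 0.2, up to discretization error.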
Electrocardiographic imaging (ECGI) reconstructs the electrical activity of the heart from a dense array of body-surface electrocardiograms and a patient-specific heart-torso geometry. Depending on how it is formulated, ECGI allows the reconstruction of the activation and recovery sequence of the heart, the origin of premature beats or tachycardia, the anchors/hotspots of re-entrant arrhythmias, and other electrophysiological quantities of interest. Importantly, these quantities are reconstructed directly and noninvasively in a digitized model of the patient’s three-dimensional heart, which has led to clinical interest in ECGI’s ability to personalize diagnosis and guide therapy. Despite considerable development over the last decades, validation of ECGI remains challenging. Firstly, results depend considerably on implementation choices, which are necessary to deal with ECGI’s ill-posed character. Secondly, it is difficult to obtain (invasive) ground-truth data of high quality. In this review, we discuss the current status of ECGI validation as well as the major challenges remaining for complete adoption of ECGI in clinical practice. Specifically, demonstrating clinical benefit is essential for the adoption of ECGI. Such benefit may lie in improved patient outcomes, improved workflow, or reduced cost. Future studies should focus on these aspects to achieve broad adoption of ECGI, but only after the technical challenges have been solved for the specific application or pathology. We propose best practices for technical validation and highlight collaborative efforts recently organized in this field. Continued interaction between engineers, basic scientists, and physicians remains essential to combine technical achievement, insight into pathological mechanisms, and clinical benefit, and to evolve this powerful technique towards a useful role in clinical practice.
INTRODUCTION: The "Experimental Data and Geometric Analysis Repository" (EDGAR) is an Internet-based archive of curated data that are freely distributed to the international research community for the application and validation of electrocardiographic imaging (ECGI) techniques. The EDGAR project is a collaborative effort by the Consortium for ECG Imaging (CEI, ecg-imaging.org) and focuses on two specific aims: the first is to host an online repository that provides access to a wide spectrum of data, and the second is to provide a standard information format for the exchange of these diverse datasets. METHODS: The EDGAR system is composed of two interrelated components: 1) a metadata model, which includes a set of descriptive parameters, time signals from both the cardiac source and the body surface, and extensive geometric information, including images, geometric models, and measurement locations used during data acquisition/generation; and 2) a web interface, which provides efficient search, browsing, and retrieval of data from the repository. RESULTS: An aggregation of experimental, clinical, and simulation data from various centers is being made available through the EDGAR project, including experimental data from animal studies provided by the University of Utah (USA), clinical data from multiple human subjects provided by the Charles University Hospital (Czech Republic), and computer simulation data provided by the Karlsruhe Institute of Technology (Germany). CONCLUSIONS: It is our hope that EDGAR will serve as a communal forum for the sharing and distribution of cardiac electrophysiology data and geometric models for use in ECGI research.
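As a rough illustration of the kind of metadata record the EDGAR model describes (descriptive parameters, cardiac and body-surface signals, and geometric information), an entry might be structured as follows. The field names and types here are hypothetical and do not reflect EDGAR's actual schema.

```python
from dataclasses import dataclass, field

@dataclass
class DatasetRecord:
    """Hypothetical sketch of a dataset metadata record in the spirit of
    the EDGAR model; names are illustrative, not EDGAR's real format."""
    source: str                                          # contributing centre
    modality: str                                        # experimental / clinical / simulation
    cardiac_signals: dict = field(default_factory=dict)  # cardiac-source time signals
    surface_signals: dict = field(default_factory=dict)  # body-surface ECG time signals
    geometry: dict = field(default_factory=dict)         # images, meshes, measurement locations

# Example entry for an experimental animal study
rec = DatasetRecord(source="University of Utah", modality="experimental")
```

Grouping the cardiac source, the body-surface signals, and the geometry as separate components mirrors the split described in the METHODS section above.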
Models of cardiac mechanics are increasingly used to investigate cardiac physiology. These models are characterized by a high level of complexity, including the anisotropic material properties of biological tissue and the actively contracting material. A large number of independent simulation codes have been developed, but a consistent way of verifying the accuracy and replicability of simulations is lacking. To aid in the verification of current and future cardiac mechanics solvers, this study provides three benchmark problems for cardiac mechanics. These benchmark problems test the ability to accurately simulate pressure-type forces that depend on the deformed object's geometry, anisotropic and spatially varying material properties similar to those seen in the left ventricle, and active contractile forces. The benchmark problems were solved by 11 different groups to generate consensus solutions, with typical differences between higher-resolution solutions of approximately 0.5%, and consistent results between linear, quadratic, and cubic finite elements as well as between different approaches to simulating incompressible materials. Online tools and solutions are made available to allow these tests to be used effectively in the verification of future cardiac mechanics software.
Conference Contributions (4)
B. Wang, W. H. W. Schulze, and O. Dössel. Non-invasive reconstruction of myocardial activation: a wavefront-based Tikhonov approach with tolerance operator. In Biomedizinische Technik / Biomedical Engineering (Proc. BMT 2011), vol. 56(s1), 2011.
Background: Noninvasive localization of premature ventricular complexes (PVCs) to guide ablation therapy is one of the emerging applications of electrocardiographic imaging (ECGI). Because of its increasing clinical use, it is essential to compare the many existing implementations of ECGI to understand the specific characteristics of each approach. Objective: Our consortium is a community of researchers aiming to collaborate in the field of ECGI and to objectively compare and improve methods. Here, we compare methods to localize the origin of PVCs with ECGI. Methods: Our consortium hosts a repository of ECGI data on its website. For the current study, participants...
W. H. W. Schulze, B. Wang, D. Potyagaylo, and O. Dössel. Use of a tolerance operator in wavefront-based ECG imaging of transmembrane voltages. In IFMBE Proceedings, World Congress on Medical Physics and Biomedical Engineering, vol. 39, 2012.
A computer-implemented method for the reconstruction of a magnetic resonance image includes acquiring a first incomplete k-space data set comprising a plurality of first k-space lines, spaced according to an acceleration factor, and one or more calibration lines. A parallel imaging reconstruction technique is applied to the first incomplete k-space data set to determine a plurality of second k-space lines not included in the first incomplete k-space data set, thereby yielding a second incomplete k-space data set. Then, the parallel imaging reconstruction technique is applied to the second incomplete k-space data set to determine a plurality of third k-space lines not included in the second incomplete k-space data set, thereby yielding a complete k-space data set.
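The staged filling of k-space described above can be sketched as follows: acquire every fourth line plus a few calibration lines, run one reconstruction pass to halve the acceleration factor, then run a second pass on the result to complete k-space. A simple distance-weighted neighbour interpolation stands in for the actual parallel-imaging reconstruction technique (e.g. a GRAPPA-style kernel); the sampling layout and helper names are illustrative.

```python
import numpy as np

def neighbour_fill(data, known, targets):
    """Estimate each target line from its nearest known lines above and
    below (a stand-in for one application of a parallel-imaging kernel).
    Only lines known *before* this pass serve as sources."""
    out, now_known = data.copy(), known.copy()
    idx = np.flatnonzero(known)
    for l in targets:
        below, above = idx[idx < l], idx[idx > l]
        if len(below) and len(above):
            b, a = below[-1], above[0]
            w = (a - l) / (a - b)          # linear interpolation weight
            out[l] = w * out[b] + (1 - w) * out[a]
        elif len(below):
            out[l] = out[below[-1]]        # one-sided: copy nearest line
        elif len(above):
            out[l] = out[above[0]]
        now_known[l] = True
    return out, now_known

def two_stage_recon(kspace, accel=4, calib=4):
    """Two successive passes: R=accel -> R=accel//2 -> fully sampled."""
    n = kspace.shape[0]
    known = np.zeros(n, bool)
    known[::accel] = True                          # undersampled acquisition
    c = n // 2
    known[c - calib // 2 : c + calib // 2] = True  # calibration lines at centre
    data = np.where(known[:, None], kspace, 0.0)   # first incomplete data set

    lines = np.arange(n)
    # Pass 1: fill the lines that halve the acceleration factor
    mid = lines[(lines % (accel // 2) == 0) & ~known]
    data, known = neighbour_fill(data, known, mid)  # second incomplete data set
    # Pass 2: fill everything still missing
    data, known = neighbour_fill(data, known, lines[~known])
    return data, known                              # complete k-space data set
```

With k-space values that vary linearly across lines, the interpolation stand-in recovers the interior lines exactly, which makes the two-stage indexing easy to verify; a real implementation would replace `neighbour_fill` with the trained parallel-imaging kernel.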
Student Theses (1)
B. Wang. Ein Wellenfront-basierter Ansatz zur nicht-invasiven Rekonstruktion myokardialer Aktivität (A wavefront-based approach for the non-invasive reconstruction of myocardial activity). Institut für Biomedizinische Technik, Karlsruher Institut für Technologie (KIT). Diplomarbeit (diploma thesis), 2011.