Electrocardiographic imaging (ECGI) reconstructs the electrical activity of the heart from a dense array of body-surface electrocardiograms and a patient-specific heart-torso geometry. Depending on how it is formulated, ECGI allows the reconstruction of the activation and recovery sequence of the heart, the origin of premature beats or tachycardia, the anchors/hotspots of re-entrant arrhythmias, and other electrophysiological quantities of interest. Importantly, these quantities are directly and noninvasively reconstructed in a digitized model of the patient’s three-dimensional heart, which has led to clinical interest in ECGI’s ability to personalize diagnosis and guide therapy. Despite considerable development over the last decades, validation of ECGI remains challenging. Firstly, results depend considerably on implementation choices, which are necessary to deal with ECGI’s ill-posed character. Secondly, it is difficult to obtain high-quality (invasive) ground-truth data. In this review, we discuss the current status of ECGI validation as well as the major challenges remaining for complete adoption of ECGI in clinical practice. Specifically, showing clinical benefit is essential for the adoption of ECGI. Such benefit may lie in improved patient outcomes, improved workflow, or reduced cost. Future studies should focus on these aspects to achieve broad adoption of ECGI, but only after the technical challenges have been solved for the specific application or pathology. We propose ‘best’ practices for technical validation and highlight collaborative efforts recently organized in this field. Continued interaction between engineers, basic scientists, and physicians remains essential to combine technical achievements, insights into pathological mechanisms, and clinical benefit, and thus to evolve this powerful technique towards a useful role in clinical practice.
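The "implementation choices" mentioned above typically amount to a choice of regularization scheme for the ill-posed inverse problem. As a minimal, illustrative sketch (not any specific group's implementation), the following Python snippet solves a Tikhonov-regularized ECGI inverse; the transfer matrix, the data, and all names are stand-ins for illustration.

```python
import numpy as np

def tikhonov_inverse(A, y, lam, L=None):
    """Tikhonov-regularized solution of the ECGI inverse problem.

    Minimizes ||A x - y||^2 + lam * ||L x||^2, where A is the forward
    transfer matrix (body-surface potentials per unit source strength),
    y the measured body-surface potentials, and L a regularization
    operator (identity for zero-order Tikhonov; a surface Laplacian
    would give the second-order variant used in some studies below).
    """
    if L is None:
        L = np.eye(A.shape[1])  # zero-order Tikhonov
    # Normal equations: (A^T A + lam * L^T L) x = A^T y
    lhs = A.T @ A + lam * (L.T @ L)
    return np.linalg.solve(lhs, A.T @ y)

# Illustrative usage with random stand-in data (no physiological meaning)
rng = np.random.default_rng(0)
A = rng.standard_normal((120, 500))   # 120 electrodes, 500 source nodes
x_true = rng.standard_normal(500)
y = A @ x_true + 0.05 * rng.standard_normal(120)  # noisy measurements
x_hat = tikhonov_inverse(A, y, lam=1e-2)
```

The regularization parameter lam controls the trade-off between fitting the measurements and suppressing unstable solution components; its choice is itself one of the implementation decisions on which ECGI results depend.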
Electrocardiographic imaging (ECGI) has recently gained attention as a viable diagnostic tool for reconstructing cardiac electrical activity in normal hearts as well as in cardiac arrhythmias. However, progress has been limited by the lack of standards and of unbiased comparisons of approaches and techniques across the community, and by the consequent difficulty of effective collaboration across research groups. To address these limitations, we created the Consortium for Electrocardiographic Imaging (CEI), with the objective of facilitating collaboration across the ECGI research community and creating standards for comparison and reproducibility. Here we introduce CEI and describe its two main efforts: the creation of EDGAR, a public data repository, and the organization of three collaborative workgroups that address key components and applications of ECGI. Both EDGAR and the workgroups will facilitate the sharing of ideas, data, and methods across the ECGI community and thus address the current lack of reproducibility, broad collaboration, and unbiased comparison.
INTRODUCTION: The "Experimental Data and Geometric Analysis Repository", or EDGAR, is an Internet-based archive of curated data that are freely distributed to the international research community for the application and validation of electrocardiographic imaging (ECGI) techniques. The EDGAR project is a collaborative effort by the Consortium for ECG Imaging (CEI, ecg-imaging.org) and is focused on two specific aims. One aim is to host an online repository that provides access to a wide spectrum of data; the second aim is to provide a standard information format for the exchange of these diverse datasets. METHODS: The EDGAR system is composed of two interrelated components: 1) a metadata model, which includes a set of descriptive parameters, time signals from both the cardiac source and the body surface, and extensive geometric information, including images, geometric models, and the measurement locations used during data acquisition/generation; and 2) a web interface that provides efficient search, browsing, and retrieval of data from the repository. RESULTS: An aggregation of experimental, clinical, and simulation data from various centers is being made available through the EDGAR project, including experimental data from animal studies provided by the University of Utah (USA), clinical data from multiple human subjects provided by the Charles University Hospital (Czech Republic), and computer simulation data provided by the Karlsruhe Institute of Technology (Germany). CONCLUSIONS: It is our hope that EDGAR will serve as a communal forum for the sharing and distribution of cardiac electrophysiology data and geometric models for use in ECGI research.
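As a rough sketch of how the two-component EDGAR record described above could be represented in code, consider the following Python dataclass. The field names are assumptions for illustration only and do not reflect the official EDGAR schema.

```python
from dataclasses import dataclass, field
from typing import Optional
import numpy as np

@dataclass
class EDGARDataset:
    """Illustrative stand-in for an EDGAR record: descriptive metadata,
    time signals from cardiac source and body surface, and geometry."""
    subject_id: str
    modality: str                     # e.g. "experimental", "clinical", "simulation"
    sampling_rate_hz: float
    source_signals: np.ndarray        # cardiac source potentials, (nodes x samples)
    body_surface_signals: np.ndarray  # torso ECGs, (electrodes x samples)
    heart_geometry: dict = field(default_factory=dict)  # e.g. vertices, faces
    torso_geometry: dict = field(default_factory=dict)
    electrode_locations: Optional[np.ndarray] = None    # measurement sites (n x 3)
```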
Activation times (ATs) describe the sequence of cardiac depolarization and represent one of the most important parameters for the analysis of cardiac electrical activity. However, estimation of ATs can be challenging due to confounding signal features such as fractionation or baseline wander. If ATs are estimated from signals reconstructed using electrocardiographic imaging (ECGI), additional problems can arise from over-smoothing or from ambiguities in the inverse problem. Often, the resulting AT maps show falsely homogeneous regions or artificial lines of block. As ATs are not only clinically important but are also commonly used for the evaluation of ECGI methods, it is important to understand where these errors come from. We present results from a community effort to compare methods for AT estimation on a common dataset of simulated ventricular pacings. ECGI reconstructions were performed using three different surface source models: transmembrane voltages, epi-endo potentials, and pericardial potentials, all using second-order Tikhonov regularization with six different regularization parameters. ATs were then estimated by the community participants and compared to the ground truth. While the pacing site had the largest effect on AT correlation coefficients (CCs were larger for lateral than for septal pacings), there were also differences between methods and source models that were poorly reflected in the CCs. The results indicate that artificial lines of block are most severe for purely temporal methods. Compared to the other source models, ATs estimated from transmembrane voltages are more precise and less prone to artifacts.
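As an illustration of the kind of "purely temporal" AT estimation the abstract refers to, the following Python sketch implements the classic steepest-downslope estimator on reconstructed electrograms; names and conventions are illustrative assumptions.

```python
import numpy as np

def activation_times(egms, fs):
    """Estimate activation times as the instant of steepest negative
    deflection (min dV/dt) in each reconstructed electrogram.

    egms: array (n_nodes, n_samples) of reconstructed potentials
    fs:   sampling frequency in Hz
    Returns one AT per node, in milliseconds. Note: for transmembrane
    voltages, activation corresponds to the upstroke instead, i.e.
    the maximum of dV/dt.
    """
    dvdt = np.diff(egms, axis=1) * fs  # temporal derivative
    at_idx = np.argmin(dvdt, axis=1)   # steepest downstroke per node
    return 1000.0 * at_idx / fs
```

Because each node is processed independently, noise or over-smoothing in the reconstruction can place neighboring nodes at very different ATs, which is one way the artificial lines of block mentioned above can arise; spatiotemporal estimators mitigate this by also penalizing spatial AT discontinuities.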
Background: Noninvasive localization of premature ventricular complexes (PVCs) to guide ablation therapy is one of the emerging applications of electrocardiographic imaging (ECGI). Because of its increasing clinical use, it is essential to compare the many existing implementations of ECGI and to understand the specific characteristics of each approach. Objective: Our consortium is a community of researchers aiming to collaborate in the field of ECGI and to objectively compare and improve methods. Here, we compare methods to localize the origin of PVCs with ECGI. Methods: Our consortium hosts a repository of ECGI data on its website. For the current study, participants...
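Although the abstract above is truncated, studies of this kind typically quantify performance as a localization error: the distance between the reconstructed earliest-activation site and the known PVC or pacing origin. A minimal sketch, assuming Euclidean distance and given node coordinates; names are illustrative.

```python
import numpy as np

def localization_error(at_ms, node_xyz, true_origin_xyz):
    """Distance (in the units of node_xyz, e.g. mm) between the node
    with the earliest reconstructed activation time and the ground-truth
    origin of the PVC or pacing stimulus."""
    estimated_origin = node_xyz[np.argmin(at_ms)]
    return float(np.linalg.norm(estimated_origin - true_origin_xyz))
```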
Cardiac electrical imaging, that is, reconstructing cardiac electrical activity from body surface measurements, is a technology with great potential. However, the ill-posedness of this problem hinders its routine use in the clinical environment and continues to motivate the search for improvements on current methods. Messnarz et al. introduced an algorithm that constrains the reconstructed transmembrane potential (TMP) to be non-decreasing over time during the QRS complex. This physiologically meaningful constraint reduces the solution space of the problem and regularizes the solution. However, this approach is computationally expensive and can become prohibitive as the spatial and temporal resolution of the problem increase. Here we compare three distinct options to reduce the computational load: downsampling the measurements in time, downsampling the measurements after filtering with an algorithm based on principal component analysis, and non-linearly interpolating the potentials with a spline-based method. The data used were simulated TMPs that were forward-propagated to the body surface in a densely sampled geometry. The resulting body surface potential simulations were corrupted with noise, and the inverse was computed using a much coarser mesh to take geometry errors into account. The results indicate that reducing the temporal dimension of the signal does not reduce the quality of the solutions obtained, while the computational requirements decrease considerably, especially for the spline method.
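A minimal sketch of the downsample-then-interpolate strategy compared in this study, assuming cubic splines via SciPy; the constrained inverse solve itself is elided, and all names are illustrative.

```python
import numpy as np
from scipy.interpolate import CubicSpline

def downsample_then_spline(bsp, t, keep_every=4):
    """Reduce the temporal dimension of body-surface potentials (BSPs)
    before the expensive constrained inverse computation, then recover
    full temporal resolution with cubic-spline interpolation.

    bsp: array (n_leads, n_samples); t: sample times in ms.
    The PCA-based variant would instead project the signals onto their
    leading temporal principal components before downsampling.
    """
    t_coarse = t[::keep_every]
    coarse = bsp[:, ::keep_every]   # far fewer time steps to invert
    # ... the constrained TMP inverse would be solved on `coarse` here ...
    spline = CubicSpline(t_coarse, coarse, axis=1)
    return spline(t)                # back to the original time grid
```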