Objective: To investigate cardiac activation maps estimated using electrocardiographic imaging and to identify methods that reduce line-of-block (LoB) artifacts while preserving real LoBs. Methods: Body surface potentials were computed for 137 simulated ventricular excitations. Subsequently, the inverse problem was solved to obtain extracellular potentials (EP) and transmembrane voltages (TMV). From these, activation times (AT) were estimated using four methods and compared to the ground truth. This process was evaluated with two cardiac mesh resolutions. Factors contributing to LoB artifacts were identified by analyzing the impact of spatial and temporal smoothing on the morphology of source signals. Results: AT estimation using a spatiotemporal derivative performed better than using a temporal derivative. Compared to deflection-based AT estimation, correlation-based methods were less prone to LoB artifacts but performed worse in identifying real LoBs. Temporal smoothing could eliminate artifacts for TMVs but not for EPs, which could be linked to their temporal morphology. TMVs led to more accurate ATs on the septum than EPs. Mesh resolution had a negligible effect on inverse reconstructions, but small distances were important for cross-correlation-based estimation of AT delays. Conclusion: LoB artifacts are mainly caused by the inherent spatial smoothing effect of the inverse reconstruction. Among the configurations evaluated, only deflection-based AT estimation in combination with TMVs and strong temporal smoothing can prevent LoB artifacts while preserving real LoBs. Significance: Regions of slow conduction are of considerable clinical interest, and LoB artifacts observed in non-invasive ATs can lead to misinterpretations. We addressed this problem by identifying factors causing such artifacts and methods to reduce them.
Electrocardiographic imaging (ECGI) reconstructs the electrical activity of the heart from a dense array of body-surface electrocardiograms and a patient-specific heart-torso geometry. Depending on how it is formulated, ECGI allows the reconstruction of the activation and recovery sequence of the heart, the origin of premature beats or tachycardia, the anchors/hotspots of re-entrant arrhythmias and other electrophysiological quantities of interest. Importantly, these quantities are directly and noninvasively reconstructed in a digitized model of the patient’s three-dimensional heart, which has led to clinical interest in ECGI’s ability to personalize diagnosis and guide therapy. Despite considerable development over the last decades, validation of ECGI is challenging. Firstly, results depend considerably on implementation choices, which are necessary to deal with ECGI’s ill-posed character. Secondly, it is challenging to obtain (invasive) ground truth data of high quality. In this review, we discuss the current status of ECGI validation as well as the major challenges remaining for complete adoption of ECGI in clinical practice. Specifically, showing clinical benefit is essential for the adoption of ECGI. Such benefit may lie in patient outcome improvement, workflow improvement, or cost reduction. Future studies should focus on these aspects to achieve broad adoption of ECGI, but only after the technical challenges have been solved for that specific application/pathology. We propose ‘best’ practices for technical validation and highlight collaborative efforts recently organized in this field. Continued interaction between engineers, basic scientists and physicians remains essential to balance technical achievements, insights into pathological mechanisms, and clinical benefit, and to evolve this powerful technique towards a useful role in clinical practice.
Electrocardiographic imaging (ECGI) has recently gained attention as a viable diagnostic tool for reconstructing cardiac electrical activity in normal hearts as well as in cardiac arrhythmias. However, progress has been limited by the lack of both standards and unbiased comparisons of approaches and techniques across the community, as well as the consequent difficulty of effective collaboration across research groups. To address these limitations, we created the Consortium for Electrocardiographic Imaging (CEI), with the objective of facilitating collaboration across the research community in ECGI and creating standards for comparisons and reproducibility. Here we introduce CEI and describe its two main efforts, the creation of EDGAR, a public data repository, and the organization of three collaborative workgroups that address key components and applications in ECGI. Both EDGAR and the workgroups will facilitate the sharing of ideas, data and methods across the ECGI community and thus address the current lack of reproducibility, broad collaboration, and unbiased comparisons.
INTRODUCTION: The "Experimental Data and Geometric Analysis Repository", or EDGAR, is an Internet-based archive of curated data that are freely distributed to the international research community for the application and validation of electrocardiographic imaging (ECGI) techniques. The EDGAR project is a collaborative effort by the Consortium for ECG Imaging (CEI, ecg-imaging.org) and focuses on two specific aims. One aim is to host an online repository that provides access to a wide spectrum of data, and the second aim is to provide a standard information format for the exchange of these diverse datasets. METHODS: The EDGAR system is composed of two interrelated components: 1) a metadata model, which includes a set of descriptive parameters and information, time signals from both the cardiac source and body surface, and extensive geometric information, including images, geometric models, and measurement locations used during the data acquisition/generation; and 2) a web interface. This web interface provides efficient search, browsing, and retrieval of data from the repository. RESULTS: An aggregation of experimental, clinical and simulation data from various centers is being made available through the EDGAR project, including experimental data from animal studies provided by the University of Utah (USA), clinical data from multiple human subjects provided by the Charles University Hospital (Czech Republic), and computer simulation data provided by the Karlsruhe Institute of Technology (Germany). CONCLUSIONS: It is our hope that EDGAR will serve as a communal forum for sharing and distribution of cardiac electrophysiology data and geometric models for use in ECGI research.
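The metadata model described above groups each dataset into descriptive parameters, source and body-surface time signals, and geometric information. A minimal sketch of such a structure is given below; all field names are illustrative assumptions for exposition, not the actual EDGAR schema.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class Geometry:
    # Node coordinates (e.g. in mm) and triangulated surface connectivity
    points: List[Tuple[float, float, float]]
    triangles: List[Tuple[int, int, int]]

@dataclass
class EcgiDataset:
    # Hypothetical container mirroring the metadata model's three parts:
    # descriptive parameters, time signals, and geometric information.
    subject: str
    modality: str                # "experimental", "clinical", or "simulation"
    sampling_rate_hz: float
    cardiac_signals: List[List[float]]       # per-node source time signals
    body_surface_signals: List[List[float]]  # per-electrode time signals
    heart_geometry: Geometry
    torso_geometry: Geometry
    electrode_locations: List[Tuple[float, float, float]] = field(default_factory=list)
```

Grouping signals with the geometry they were recorded on is what makes such datasets exchangeable between groups: a forward or inverse solver needs both to be interpretable.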
Electrocardiographic Imaging (ECGI) requires robust ECG forward simulations to accurately calculate cardiac activity. However, many questions remain regarding ECG forward simulations; for instance, there are no common guidelines for the required cardiac source sampling. In this study we test equivalent double layer (EDL) forward simulations with differing cardiac source resolutions and different spatial interpolation techniques. The goal is to reduce error caused by undersampling of cardiac sources and provide guidelines to reduce said source undersampling in ECG forward simulations. Using a simulated dataset sampled at 5 spatial resolutions, we computed body surface potentials using an EDL forward simulation pipeline. We tested two spatial interpolation methods to reduce error due to undersampling: triangle weighting and triangle splitting. This forward modeling pipeline showed high frequency artifacts in the predicted ECG time signals when the cardiac source resolution was too low. These low resolutions could also cause shifts in extrema location on the body surface maps. However, these errors in predicted potentials can be mitigated by using a spatial interpolation method. Using spatial interpolation can reduce the number of nodes required for accurate body surface potentials from 9,218 to 2,306. Spatial interpolation in this forward model could also help improve accuracy and reduce computational cost in subsequent ECGI applications.
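To illustrate the triangle-splitting idea named above, here is one step of midpoint subdivision with linear interpolation of nodal source values onto the new midpoints. This is a generic sketch of the technique, not the authors' exact pipeline; the function name and data layout are assumptions.

```python
import numpy as np

def split_triangles(points, triangles, values):
    """One level of midpoint triangle subdivision: each triangle is
    split into four, and source values are linearly interpolated at
    the new edge midpoints. Illustrative sketch."""
    points = [tuple(p) for p in points]
    values = list(values)
    edge_mid = {}   # (i, j) -> index of midpoint node, shared across triangles
    new_tris = []

    def midpoint(i, j):
        key = (min(i, j), max(i, j))
        if key not in edge_mid:
            edge_mid[key] = len(points)
            points.append(tuple((np.array(points[i]) + np.array(points[j])) / 2.0))
            values.append(0.5 * (values[i] + values[j]))
        return edge_mid[key]

    for a, b, c in triangles:
        ab, bc, ca = midpoint(a, b), midpoint(b, c), midpoint(c, a)
        new_tris += [(a, ab, ca), (ab, b, bc), (ca, bc, c), (ab, bc, ca)]
    return points, new_tris, values
```

Refining the source mesh this way densifies the wavefront representation without re-running the electrophysiological simulation, which is how interpolation can substitute for a finer native source sampling.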
The boundary element method is widely used to solve the forward problem of electrocardiography, i.e. to calculate the body surface potentials (BSP) caused by the heart’s electrical activity. This requires discretization of boundary surfaces between compartments of a torso model. Often, the resolution of the surface bounding the heart is chosen above 1 mm, which can lead to spikes in resulting BSPs. We demonstrate that this artifact is caused by discontinuous propagation of the wavefront on coarse meshes and can be avoided by blurring cardiac sources before spatial downsampling. We evaluate different blurring methods and show that Laplacian blurring reduces the BSP error 5-fold for both transmembrane voltages and extracellular potentials downsampled to 3 different resolutions. We suggest a method to find the optimal blurring parameter without having to compute BSPs using a fine mesh.
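The blurring step described above can be sketched as iterative graph-Laplacian smoothing of the nodal source values on the fine mesh, applied before downsampling. This is a minimal illustration of the idea; the weighting scheme, `alpha`, and the iteration count stand in for the tuned blurring parameter and are assumptions, not the evaluated method.

```python
import numpy as np

def laplacian_blur(values, neighbors, alpha=0.5, iterations=10):
    """Smooth nodal source values by repeatedly blending each node
    with the mean of its mesh neighbors. `neighbors[i]` lists the
    node indices adjacent to node i. Illustrative sketch."""
    v = np.asarray(values, dtype=float).copy()
    for _ in range(iterations):
        avg = np.array([v[list(nb)].mean() for nb in neighbors])
        v = (1.0 - alpha) * v + alpha * avg
    return v
```

Smoothing the sharp wavefront before downsampling is what removes the discontinuous, node-by-node propagation on the coarse mesh, and with it the spikes in the computed body surface potentials.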
Background: Noninvasive localization of premature ventricular complexes (PVCs) to guide ablation therapy is one of the emerging applications of electrocardiographic imaging (ECGI). Because of its increasing clinical use, it is essential to compare the many implementations of ECGI that exist to understand the specific characteristics of each approach. Objective: Our consortium is a community of researchers aiming to collaborate in the field of ECGI, and to objectively compare and improve methods. Here, we will compare methods to localize the origin of PVCs with ECGI. Methods: Our consortium hosts a repository of ECGI data on its website. For the current study, participants...