Background: Multiple wavelets and rotors are thought to maintain atrial fibrillation (AF). However, snake-like excitation patterns have recently been observed in AF. So far, computer models have investigated AF in simplified anatomical models. In this work, pulmonary vein firing is simulated to investigate the initiation and maintenance of AF in a realistic anatomical model. Methods and Results: Thirty-five ectopic foci situated around all pulmonary veins were simulated by a unidirectional conduction block. The excitation propagation was simulated by an adaptive cellular automaton on a realistic 3-dimensional atrial anatomy. Atrial fibrillation was initiated in 65.7% of the simulations. Stable excitation patterns were broken up in anatomically heterogeneous regions, creating a streak-like excitation pattern resembling snakes. Multiple wavelets and rotors could be observed in anatomically smooth areas at the atrial roof. Conclusions: Macroscopic anatomical structures appear to play an important role in the excitation propagation during AF. The computer simulations indicate that multiple mechanisms contribute to the maintenance of AF.
Ablation strategies to prevent episodes of paroxysmal atrial fibrillation (AF) have been the subject of many clinical studies. The issues mainly concern the pattern and transmurality of the lesions. This paper investigates ten different ablation strategies on a multilayered 3-D anatomical model of the atria with respect to 23 different setups of AF initiation in a biophysical computer model. A total of 495 simulations were carried out, showing that circumferential lesions around the pulmonary veins (PVs) yield the highest success rate if at least two additional linear lesions are carried out. The findings are consistent with clinical studies as well as with other computer simulations. The anatomy and the setup of ectopic beats play an important role in the initiation and maintenance of AF as well as in the resulting therapy. The computer model presented in this paper is a suitable tool to investigate different ablation strategies. By including individual patient anatomy and electrophysiological measurements, the model could be parameterized to yield an effective tool for future investigation of tailored ablation strategies and their effects on atrial fibrillation.
An optimal electrode position and optimal atrio-ventricular (AV) and interventricular (VV) delays improve the success of cardiac resynchronization therapy (CRT). An optimization strategy does not yet exist. Computer models of the Visible Man and of a patient heart were used to simulate an atrio-ventricular block and a left bundle branch block with 0%, 20% and 40% reduction in interventricular conduction velocity, respectively. The minimum error between physiological excitation and pathology/therapy was computed automatically for 12 different electrode positions. AV and VV delay timing was adjusted accordingly. The results show the importance of individually adjusting the electrode position as well as the timing delays to the patient's anatomy and pathology, which is in accordance with current clinical studies. The presented methods and strategy offer the opportunity to carry out non-invasive, automatic optimization of CRT preoperatively. The model is subject to validation in future clinical studies.
M. Reumann, B. Osswald, and O. Doessel. Noninvasive, automatic optimization strategy in cardiac resynchronization therapy. In Anadolu Kardiyoloji Dergisi (The Anatolian Journal of Cardiology), vol. 7 Suppl 1, pp. 209-212, 2007
OBJECTIVE: Optimization of cardiac resynchronization therapy (CRT) is still unsolved. It has been shown that an optimal electrode position and optimal atrioventricular (AV) and interventricular (VV) delays improve the success of CRT and reduce the number of non-responders. However, no automatic, noninvasive optimization strategy exists to date. METHODS: Cardiac resynchronization therapy was simulated on the Visible Man and a patient data-set including fiber orientation and ventricular heterogeneity. A cellular automaton was used for fast computation of ventricular excitation. An AV block and a left bundle branch block were simulated with 100%, 80% and 60% interventricular conduction velocity. A right apical and 12 left ventricular lead positions were set. Sequential optimization and optimization with the downhill simplex algorithm (DSA) were carried out. The minimal error between isochrones of the physiologic excitation and the therapy was computed automatically and leads to an optimal lead position and timing. RESULTS: Up to 1512 simulations were carried out per pathology per patient. One simulation took 4 minutes on an Apple Macintosh 2 GHz PowerPC G5. For each electrode pair an optimal pacemaker delay was found. The DSA reduced the number of simulations by an order of magnitude, and the AV and VV delays were determined with a much higher resolution. The findings compare well with clinical studies. CONCLUSION: The presented computer model of CRT automatically evaluates an optimal lead position and AV and VV delays, which can be used to noninvasively plan an optimal therapy for an individual patient. The application of the DSA reduces the simulation time so that the strategy is suitable for pre-operative planning in clinical routine. Future work will focus on clinical evaluation of the computer models and integration of patient data for individualized therapy planning and optimization.
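The sequential optimization mentioned above amounts to an exhaustive sweep over candidate AV/VV delay pairs, keeping the pair with the lowest simulated error. A minimal sketch of this idea; the quadratic error surface and the delay grids are illustrative assumptions, not the model's actual error function:

```python
import itertools

def sequential_delay_search(error_fn, av_delays, vv_delays):
    """Exhaustively evaluate every AV/VV delay pair; return the pair
    minimizing the (simulated) error between physiology and therapy."""
    return min(itertools.product(av_delays, vv_delays),
               key=lambda pair: error_fn(*pair))

# Toy error surface (hypothetical): minimum at AV = 120 ms, VV = 20 ms.
err = lambda av, vv: (av - 120) ** 2 + (vv - 20) ** 2
av_grid = range(80, 201, 10)   # candidate AV delays in ms
vv_grid = range(-40, 41, 10)   # candidate VV delays in ms
print(sequential_delay_search(err, av_grid, vv_grid))  # -> (120, 20)
```

The downhill simplex algorithm replaces this full sweep with a direct search that needs roughly an order of magnitude fewer error evaluations, which is why it made the strategy fast enough for pre-operative planning.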
Cardiac arrhythmia is currently investigated from two different points of view. One considers ECG bio-signal analysis and investigates heart rate variability, baroreflex control, heart rate turbulence, alternans phenomena, etc. The other involves building computer models of the heart based on ion channels, bio-domain models and forward calculations to finally reach ECG and body surface potential maps. Both approaches aim to support the cardiologist in better understanding of arrhythmia, improving diagnosis and reliable risk stratification, and optimizing therapy. This article summarizes recent results and aims to trigger new research to bridge the different views.
R. Miri, M. Reumann, D. Farina, and O. Dössel. Concurrent optimization of timing delays and electrode positioning in biventricular pacing based on a computer heart model assuming 17 left ventricular segments. In Biomedizinische Technik. Biomedical Engineering, vol. 54(2) , pp. 55-65, 2009
BACKGROUND: The efficacy of cardiac resynchronization therapy through biventricular pacing (BVP) has been demonstrated by numerous studies in patients suffering from congestive heart failure. In order to achieve a guideline for optimal treatment with BVP devices, an automated non-invasive strategy based on a computer model of the heart is presented. MATERIALS AND METHODS: The presented research investigates an off-line optimization algorithm regarding electrode positioning and timing delays. The efficacy of the algorithm is demonstrated in four patients suffering from left bundle branch block (LBBB) and myocardial infarction (MI). The computer model of the heart was used to simulate the LBBB in addition to several MI locations according to the different left ventricular subdivisions introduced by the American Heart Association. Furthermore, simulations with reduced interventricular conduction velocity were performed in order to model interventricular excitation conduction delay. More than 800,000 simulations were carried out by adjusting 121 pairs of atrioventricular and interventricular delays and 36 different electrode positioning set-ups. Additionally, three different conduction velocities were examined. The optimization measures included the minimum root mean square error (E(RMS)) between physiological, pathological and therapeutic excitation, and also the difference of QRS-complex duration. Both of these measures were computed automatically. RESULTS: Depending on the patient's pathology and conduction velocity, a reduction of E(RMS) between physiological and therapeutic excitation could be reached. For each patient and pathology, an optimal pacing electrode pair was determined. The results demonstrated the importance of an individual adjustment of BVP parameters to the patient's anatomy and pathology.
CONCLUSION: This work proposes a novel non-invasive optimization algorithm to find the best electrode positioning sites and timing delays for BVP in patients with LBBB and MI. This algorithm can be used to plan an optimal therapy for an individual patient.
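The minimum-RMS-error criterion used in these optimization studies can be written down compactly. A minimal sketch, assuming activation times are sampled at the same myocyte locations in both the physiological and the therapeutic simulation (the function name and sample values are illustrative):

```python
import math

def e_rms(physiological, therapy):
    """Root mean square error between two activation-time maps (in ms),
    sampled at the same myocyte locations. Smaller is better: the therapy
    setting that minimizes this error best restores physiological timing."""
    n = len(physiological)
    return math.sqrt(sum((p - t) ** 2 for p, t in zip(physiological, therapy)) / n)

print(e_rms([0.0, 10.0, 20.0], [0.0, 10.0, 20.0]))  # -> 0.0 (perfect match)
```

In the papers above this scalar is evaluated once per simulated electrode/delay configuration, turning therapy planning into a standard minimization problem.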
R. Miri, M. Reumann, D. Farina, B. Osswald, and O. Dössel. Computer assisted optimization of biventricular pacing assuming ventricular heterogeneity. In 11th Mediterranean Conference on Medical and Biomedical Engineering and Computing, vol. 16(15) , pp. 541-544, 2007
Reduced cardiac output, dysfunction of the conduction system, atrio-ventricular block, bundle branch blocks and remodeling of the chambers are results of congestive heart failure (CHF). Biventricular pacing as Cardiac Resynchronization Therapy (CRT) is a recognized therapy for the treatment of heart failure. The present paper investigates an automated non-invasive strategy to optimize CRT with respect to electrode positioning and timing delays based on a complex three-dimensional computer model of the human heart. The anatomical models chosen for this study were the segmented data set of the Visible Man and a set of patient data with dilated ventricles and left bundle branch block. The excitation propagation and intra-ventricular conduction were simulated with the ten Tusscher electrophysiological cell model and an adaptive cellular automaton. The pathologies simulated were a total atrioventricular (AV) block and a left bundle branch block (LBBB) in conjunction with reduced interventricular conduction velocities. The simulated activation times of different myocytes in the healthy and diseased heart model are compared in terms of root mean square error. The outcomes of the investigation show that the positioning of the electrodes, together with proper timing delays, influences the efficiency of the resynchronization therapy. The proposed method may assist the surgeon in therapy planning.
M. Reumann, M. Mohr, O. Dössel, and A. Diez. Vorlesung, Übung und Tutorium im koordinierten Zusammenspiel. Ein Lehr-/Lernpaket schnüren - Grundlagenveranstaltung. Berendt, Brigitte, 2006.
A method is described for linking medical images of the heart with ECG data in a model-based way, in order to arrive at a specific diagnosis and better therapy planning in cardiology. First, the geometry of the patient's heart is determined from MRI or CT images. Electrocardiographic measurements at the body surface (ECG or body surface potential mapping) and from inside the heart (intracardiac mapping) are recorded, and the measurement sites are registered to the image data set. An electrophysiological computer model of the patient's heart is then iteratively adapted using the electrophysiological measurement data. The result is a virtual heart of the patient in the computer that reproduces both the geometry and the electrophysiology. A model of the atria, for example, has the potential to identify the causes of atrial fibrillation and to optimize the radiofrequency ablation strategy. A model of the ventricles can help to better understand genetically caused arrhythmias or to optimize the parameters of cardiac resynchronization therapy. Modeling a heart with an infarcted region could describe the electrophysiological effects of the infarction and support risk stratification for dangerous ventricular arrhythmias, or increase the success rate of ventricular ablations.
Conference Contributions (33)
M. Reumann, J. Bohnert, and O. Dössel. Simulating pulmonary vein activity leading to atrial fibrillation using a rule-based approach on realistic anatomical data. In Conf Proc IEEE Eng Med Biol Soc., vol. 1, pp. 3943-3946, 2006
Atrial fibrillation (AF) is the most common cardiac arrhythmia, leading to a high rate of stroke. The underlying mechanisms of initiation and maintenance of AF are not fully understood. Several findings suggest a multitude of factors that leave the atria vulnerable to AF. In this work, a rule-based approach is taken to simulate the initiation of AF in a computer model, for the purpose of generating a model with which the influence of anatomical structures, electrophysiological properties of the atria and arrhythmogenic activity can be evaluated. Pulmonary vein firing has been simulated, leading to AF in 65.7 % of all simulations. The excitation patterns generated resemble the chaotic excitation behavior characteristic of AF, as well as the stable reentrant circuits responsible for atrial flutter. The findings compare well with the literature. In the future, the presented computer model of AF can be used in therapy planning, such as ablation therapy or overdrive pacing.
Question: The mechanisms responsible for atrial fibrillation (AF) are not completely understood. Various conduction velocities and realistic anatomical structures of the atria are implemented in a computer model showing the influence of complex anatomical structures on the initiation and maintenance of AF. Method: In a computer model of the Visible Female heart (National Library of Medicine, Bethesda, Maryland, USA), the initiation of AF was simulated by pulmonary vein (PV) firing. The anatomical model had a resolution of 1,696,740 tissue voxels with 0.33 mm voxel side length. 32 foci around all pulmonary veins were set. The excitation propagation was simulated using an adaptive cellular automaton. Electrophysiological parameters can be set depending on the tissue type; in this work, only the conduction velocity was reduced compared to physiological data. Results: The initiation of AF through ectopic foci creates re-entrant circuits and quasi-chaotic excitation patterns in the computer model. 8 of 16 foci in the left superior, 3 of 4 foci in the left inferior, 5 of 8 foci in the right superior and 4 of 4 foci in the right inferior PV created AF after only 1.5 s. The excitation patterns show stable re-entrant circuits as well as chaotic behavior. A breakup of stable re-entrant circuits was also observed when simulating the pathology for 17.5 s. The other foci caused self-terminating rotors. Conclusion: Computer models of the excitation propagation of the heart can be used to simulate AF initiated by triggers in the PVs. A reduction in conduction velocity caused the establishment of re-entrant circuits and quasi-chaotic behavior. The complex model of the Visible Female heart showed the importance of anatomical structures in the maintenance of AF. Future work will include an improvement of the computer model by incorporating heterogeneities of atrial tissue and an implementation of individual patient models for therapy planning.
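The adaptive cellular automaton used in these studies is not reproduced here, but the basic idea of rule-based excitation propagation can be illustrated with a Greenberg-Hastings-style excitable-medium automaton. The state encoding, 4-neighbourhood and refractory length below are illustrative assumptions, not the parameters of the published model:

```python
# Minimal excitable-medium cellular automaton (Greenberg-Hastings style).
# States: 0 = resting, 1 = excited, 2..REFRACTORY = refractory.
REFRACTORY = 3  # number of non-resting steps per cell (hypothetical value)

def step(grid):
    """Advance the automaton by one time step on a 2-D grid."""
    rows, cols = len(grid), len(grid[0])
    nxt = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            s = grid[r][c]
            if s == 0:  # resting: fire if any 4-neighbour is excited
                nbrs = [grid[r2][c2] for r2, c2 in
                        ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1))
                        if 0 <= r2 < rows and 0 <= c2 < cols]
                nxt[r][c] = 1 if 1 in nbrs else 0
            elif s < REFRACTORY:  # excited/refractory: age by one step
                nxt[r][c] = s + 1
            else:  # refractory period over: back to rest
                nxt[r][c] = 0
    return nxt
```

A single excited cell then spreads as an expanding ring, and the refractory tail behind the wavefront is what allows re-entrant circuits to form once the geometry or conduction velocity is perturbed.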
High performance computing is required to make simulations of whole organ models of the heart with biophysically detailed cellular models feasible in a clinical setting. Increasing model detail by simulating both electrophysiology and mechanics increases computational demands. We present scaling results of an electro-mechanical cardiac model of two ventricles and compare them to our previously published results using an electrophysiological model only. The anatomical data-set was given by both ventricles of the Visible Female data-set in a 0.2 mm resolution. Fiber orientation was included. Data decomposition for the distribution onto the distributed memory system was carried out by orthogonal recursive bisection. Load weight ratios for non-tissue vs. tissue elements used in the data decomposition were 1:1, 1:2, 1:5, 1:10, 1:25, 1:38.85, 1:50 and 1:100. The ten Tusscher et al. (2004) electrophysiological cell model was used, and the Rice et al. (1999) model for the computation of the calcium-transient-dependent force. Scaling results for 512, 1024, 2048, 4096, 8192 and 16,384 processors were obtained for 1 ms simulation time. The simulations were carried out on an IBM Blue Gene/L supercomputer. The results show linear scaling from 512 to 16,384 processors with speedup factors between 1.82 and 2.14 between partitions. The optimal load ratio was 1:25 on all partitions. However, a shift towards load ratios with higher weight for the tissue elements can be recognized, as can be expected when adding computational complexity to the model while keeping the same communication setup. This work demonstrates that it is potentially possible to run simulations of 0.5 s using the presented electro-mechanical cardiac model within 1.5 hours.
Multi-scale, multi-physical heart models have not yet been able to include a high degree of accuracy and resolution with respect to model detail and spatial resolution due to computational limitations of current systems. We propose a framework to compute large scale cardiac models. Decomposition of anatomical data into segments to be distributed on a parallel computer is carried out by orthogonal recursive bisection (ORB). The algorithm takes into account a computational load parameter which has to be adjusted according to the cell models used. The diffusion term is realized by the monodomain equations. The anatomical data-set was given by both ventricles of the Visible Female data-set in a 0.2 mm resolution. Heterogeneous anisotropy was included in the computation. Model weights as input for the decomposition and load balancing were set to (a) 1 for tissue and 0 for non-tissue elements; (b) 10 for tissue and 1 for non-tissue elements. Scaling results for 512, 1024, 2048, 4096 and 8192 computational nodes were obtained for 10 ms simulation time. The simulations were carried out on an IBM Blue Gene/L parallel computer. A 1 s simulation was then carried out on 2048 nodes for the optimal model load. Load balances did not differ significantly across computational nodes even if the number of data elements distributed to each node differed greatly. Since the ORB algorithm did not take into account computational load due to communication cycles, the speedup is close to optimal for the computation time but not optimal overall due to the communication overhead. However, the simulation times were reduced from 87 minutes on 512 to 11 minutes on 8192 nodes. This work demonstrates that it is possible to run simulations of the presented detailed cardiac model within hours for the simulation of a heart beat.
Increasing biophysical detail in multi-physical, multi-scale cardiac models will demand higher levels of parallelism in multi-core approaches to obtain fast simulation times. As an example of such a highly parallel multi-core approach, we develop a completely distributed bidomain cardiac model implemented on the IBM Blue Gene/L architecture. A tissue block of size 50 x 50 x 100 cubic elements based on the ten Tusscher et al. (2004) cell model is distributed on 512 computational nodes. The extracellular potential is calculated by the Gauss-Seidel (GS) iterative method, which typically requires high levels of inter-processor communication. Specifically, the GS method requires knowledge of all cellular potentials at each of its iterative steps. In the absence of shared memory, the values are communicated with substantial overhead. We attempted to reduce communication overhead by computing the extracellular potential only every 5th time step of the cell model integration. We also investigated the effects of reducing inter-processor communication to every 5th, 10th or 50th iteration, or no communication, within the GS iteration. While technically incorrect, these approximations had little impact on numerical convergence or accuracy for the simulations tested. The results suggest some heuristic approaches may further reduce the inter-processor communication to improve the execution time of large-scale simulations.
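A plain serial Gauss-Seidel sweep shows why the solver is communication-bound when distributed: each update of x[i] consumes the newest values of the other unknowns, so a distributed implementation must exchange fresh potentials at every iteration. A toy sketch on a small diagonally dominant system, not the actual bidomain matrix:

```python
def gauss_seidel(A, b, iters=50):
    """Solve A x = b by Gauss-Seidel sweeps. Within one sweep, x[i] is
    updated using the *newest* values of the other unknowns -- the data
    dependence that forces inter-processor communication when the
    unknowns are distributed across nodes."""
    n = len(b)
    x = [0.0] * n
    for _ in range(iters):
        for i in range(n):
            s = sum(A[i][j] * x[j] for j in range(n) if j != i)
            x[i] = (b[i] - s) / A[i][i]
    return x

A = [[4.0, 1.0], [1.0, 3.0]]  # small diagonally dominant example
b = [1.0, 2.0]
x = gauss_seidel(A, b)        # converges to (1/11, 7/11)
```

Skipping the exchange for several iterations, as investigated above, lets each node iterate on stale neighbour values; for diagonally dominant systems the iteration is robust enough that convergence often survives this approximation.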
The orthogonal recursive bisection (ORB) algorithm can be used as a data decomposition strategy to distribute a large data set of a cardiac model to a distributed memory supercomputer. It has been shown previously that good scaling results can be achieved using the ORB algorithm for data decomposition. However, the ORB algorithm depends on the distribution of computational load of each element in the data set. In this work we investigated the dependence of data decomposition and load balancing on different rotations of the anatomical data set to achieve optimization in load balancing. The anatomical data set was given by both ventricles of the Visible Female data set in a 0.2 mm resolution. Fiber orientation was included. The data set was rotated by 90 degrees around the x, y and z axes, respectively. By either translating or simply taking the magnitude of the resulting negative coordinates, we were able to create 14 data sets of the same anatomy with different orientation and position in the overall volume. Computational load ratios for non-tissue vs. tissue elements used in the data decomposition were 1:1, 1:2, 1:5, 1:10, 1:25, 1:38.85, 1:50 and 1:100, to investigate the effect of different load ratios on the data decomposition. The ten Tusscher et al. (2004) electrophysiological cell model was used in monodomain simulations of 1 ms simulation time to compare performance using the different data sets and orientations. The simulations were carried out for load ratios 1:10, 1:25 and 1:38.85 on a 512 processor partition of the IBM Blue Gene/L supercomputer. The results show that the data decomposition does depend on the orientation and position of the anatomy in the global volume. The difference in total run time between the data sets is 10 s for a simulation time of 1 ms. This yields a difference of about 28 h for a simulation of 10 s simulation time. However, given larger processor partitions, the difference in run time decreases and becomes less significant.
Depending on the processor partition size, future work will have to consider the orientation of the anatomy in the global volume for longer simulation runs.
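The ORB idea, recursively halving the domain along its longest axis so that each half carries roughly equal computational load, can be sketched in a few lines. Coordinates and weights below are illustrative; the production decomposition additionally accounts for communication cost and the tissue/non-tissue weight ratios discussed above:

```python
def orb(cells, parts):
    """Orthogonal recursive bisection sketch. `cells` is a list of
    (coords, weight) pairs; `parts` must be a power of two. Each
    recursion splits along the axis of largest spatial extent at the
    point where the cumulative load reaches half the total."""
    if parts == 1:
        return [cells]
    dims = len(cells[0][0])
    axis = max(range(dims),
               key=lambda a: max(c[0][a] for c in cells) - min(c[0][a] for c in cells))
    cells = sorted(cells, key=lambda c: c[0][axis])
    total = sum(w for _, w in cells)
    acc, split = 0.0, len(cells)
    for i, (_, w) in enumerate(cells):
        acc += w
        if acc >= total / 2:
            split = i + 1
            break
    return orb(cells[:split], parts // 2) + orb(cells[split:], parts // 2)

# Uniform 8x8 sheet of unit-weight "voxels" split into 4 partitions.
sheet = [((x, y, 0), 1.0) for x in range(8) for y in range(8)]
partitions = orb(sheet, 4)  # 4 partitions of 16 cells each
```

Because splits follow the load rather than the element count, rotating the anatomy changes which axis gets cut first and therefore the resulting partition shapes, which is exactly the orientation dependence reported above.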
M. Reumann, M. Mohr, A. Dietz, and O. Dössel. Assessing learning progress and teaching quality in large groups of students. In Engineering in Medicine and Biology Society, 2008. EMBS 2008. 30th Annual International Conference of the IEEE, pp. 2877-2880, 2008
The classic tools for assessing learning progress are written tests and assignments. In large groups of students, the workload often does not allow in-depth evaluation during the course. Thus our aim was to modify the course to include active learning methods and student-centered teaching. We changed the course structure only slightly and established new assessment methods like minute papers, short tests, mini-projects and a group project at the end of the semester. The focus was to monitor the learning progress during the course so that problematic issues could be addressed immediately. The year before the changes, 26.76 % of the class failed the course with a grade average of 3.66 (pass grade is 4.0, i.e. 30 % of achievable marks). After introducing student-centered teaching, only 14 % of students failed the course and the average grade was 3.01. Grades were also distributed more evenly, with more students achieving better results. We have shown that even in large groups of more than 100 students, student-centered and active learning is possible. Although it requires considerable extra work on the part of the teaching staff, the quality of teaching and the motivation of the students are increased, leading to a better learning environment.
The output data generated in whole heart simulations are usually single or multiple parameters at each point in the simulation space. Visualizing data sets of gigabyte size puts great stress on the hardware and can be slow and tedious. Creating animated movies to analyze the excitation propagation can take hours on standard systems. We present two parallel visualization techniques to improve rendering of large datasets from cardiac simulations. The Scalable Parallel Visualization Networking (SPVN) toolkit provides the ability to assist in optimizing the utility and functionality of the aggregate resources in visualization clusters. Run time visualization offers the opportunity to visualize the results of cardiac simulations on the fly on High Performance Computers. Parallel visualization techniques enable fast manipulation of high resolution whole heart data sets and simulation results. The SPVN system has the potential to be linked with the simulation environment similar to the run time visualization described. Future efforts will focus on creating a simulation and visualization environment with appropriate characteristics for the clinical setting. Specifically, speed, intuitive control and the ability to render diverse signals will likely be critical to drive adoption in the clinical setting.
M. Reumann, B. Osswald, S. Hagl, and O. Dössel. Computer aided evaluation of preventive atrial antitachycardial pacing. In 15th World Congress in Cardiac Electrophysiology and Cardiac Techniques - Cardiostim 2006. Europace, vol. 8(Supplement 1) , pp. 213-216, 2006
M. Reumann, B. Osswald, S. Hagl, and O. Dössel. Computer-based Evaluation of Atrial Antitachycardial Pacing to Prevent Atrial Fibrillation on Realistic Anatomical Data. In Gemeinsame Jahrestagung der Deutschen, der Österreichischen und der Schweizerischen Gesellschaft für Biomedizinische Technik, 2006
Heart failure is the most common cardiac disease worldwide; supraventricular arrhythmia is the most common cardiac arrhythmia. The understanding of these diseases advances treatment options. Ablation therapy is a well-accepted non-pharmacological option in the treatment of atrial fibrillation. Cardiac resynchronization therapy with biventricular pacing devices has been shown to be successful in patients with severe heart failure. However, optimization or even individual therapy planning is not standard, or not even carried out, today. These non-pharmacological treatments can be investigated and optimized with the help of computer models of the heart. Different ablation strategies are applied to terminate the arrhythmia in the virtual environment, and a comparison of strategies can be carried out. With respect to cardiac resynchronization therapy, the computer model allows for automatic and non-invasive optimization of electrode positions and timing delays. With clinical validation, the presented computer models and methods have the potential to contribute to individualized therapy planning.
J. Bohnert, M. Reumann, T. Faber, and O. Dössel. Investigation of curative ablation techniques for atrial fibrillation in a computer model. In Gemeinsame Jahrestagung der Deutschen, der Österreichischen und der Schweizerischen Gesellschaft für Biomedizinische Technik, 2006
A computer model of the human heart is presented, that starts with the electrophysiology of single myocardial cells including all relevant ion channels, spans the de- and repolarization of the heart including the generation of the Electrocardiogram (ECG) and ends with the contraction of the heart that can be measured using 4D Magnetic Resonance Imaging (MRI). The model can be used to better understand physiology and pathophysiology of the heart, to improve diagnostics of infarction and arrhythmia and to enable quantitative therapy planning. It can also be used as a regularization tool to gain better solutions of the ill-posed inverse problem of ECG. Movies of the evolution of electrophysiology of the heart can be reconstructed from Body Surface Potential Maps (BSPM) and MRI, leading to a new non-invasive medical imaging technique.
R. Miri, M. Reumann, D. Farina, B. Osswald, and O. Dössel. Optimizing A-V and V-V delay in cardiac resynchronization therapy in simulations including ventricle heterogeneity. In 5th IASTED International Conference on Biomedical Engineering BioMED 2007, 2007
Congestive heart failure (CHF) affects more than 15 million people in the Western population, and the number is increasing. Biventricular pacing as Cardiac Resynchronization Therapy (CRT) is a recognized therapy for the treatment of heart failure. The present paper investigates the optimal pacing sites and stimulation delays based on a complex three-dimensional computer model of the human heart. The anatomical features were derived from the Visible Man data set. The excitation propagation and intraventricular conduction were simulated with the ten Tusscher electrophysiological cell model and an adaptive cellular automaton. Biventricular pacing in AV block III and LBBB with different interventricular conduction delays was investigated. The simulated activation times of different myocytes in the healthy and diseased heart model are compared in terms of root mean square error (ERMS). The outcomes of the investigation underline that the positioning of the electrodes, together with proper atrioventricular and interventricular delays, influences the efficiency of the resynchronization therapy. The results of this optimization strategy may assist the surgeon in therapy planning.
R. Miri, M. Reumann, D. U. J. Keller, D. Farina, and O. Dössel. Comparison of the electrophysiologically based optimization methods with different pacing parameters in patient undergoing resynchronization treatment. In Engineering in Medicine and Biology Society, 2008. EMBS 2008. 30th Annual International Conference of the IEEE, vol. 2008, pp. 1741-1744, 2008
Many studies conducted on patients suffering from congestive heart failure have shown the efficacy of cardiac resynchronization therapy (CRT). The presented research investigates an off-line optimization algorithm based on different electrode positions and timing delays. A computer model of the heart was used to simulate left bundle branch block (LBBB), myocardial infarction (MI) and reduction of intraventricular conduction velocity in order to match the patient's pathology. The optimization method evaluates the error between the healthy heart and the pathology with/without pacing in terms of activation time and QRS length. Additionally, a torso model of the patient is extracted to compute the body surface potential map (BSPM) and to simulate the ECG with Wilson leads, in order to validate the results obtained by the electrophysiological heart model optimization.
R. Miri, M. Reumann, D. U. J. Keller, D. Farina, and O. Dössel. A non-invasive computer based optimization strategy of biventricular pacing. In Tagungsband 6. Jahrestagung der Deutschen Gesellschaft für Computer- und Roboterassistierte Chirurgie e. V., pp. 133-136, 2007
R. Miri, M. Reumann, D. Keller, D. Farina, and O. Dössel. Computer based optimization of biventricular pacing according to the left ventricular 17 myocardial segments. In Proceedings of the 29th Annual International Conference of the IEEE EMBS, pp. 1418-1421, 2007
J. Qin, M. Reumann, S. H. Osswald, and O. Dössel. Developing Algorithms for The Optimization of Overpacing Strategies in Patients with Atrial Fibrillation. In Biomedizinische Technik, vol. 50(S1) , pp. 1426-1427, 2005
Atrial fibrillation (AF) is the most common cardiac arrhythmia, with an overall prevalence of almost 1%. Treatment options range from pharmacological to surgical. While ablation strategies are the most common therapies, it has been shown that overdrive stimulation can prevent AF successfully. The presented work suggests an approach to simulate the excitation propagation with various overdrive pacing frequencies based on a detailed cell model in a simplified anatomical structure. Through simulations, a relationship between the overdrive pacing position and its frequency with respect to the onset of AF was shown. The method applied in this work can be used to further develop optimization strategies for overdrive pacing. In the long run, the application transferred to individual patient data might be used to increase the success rate of overdrive pacing in terminating atrial fibrillation.
The purpose of this study is to develop a computer model-based planning environment for therapeutic cardiac interventions, i.e. surgical or catheter ablation procedures for atrial arrhythmias and the placement of pacemaker electrodes in biventricular pacing. Existing mathematical models are used to simulate the electrophysiology on an anatomical pig model during a heart cycle. The results of these models were validated in multiple animal experiments in domestic pigs. We found that the models created enable us to simulate the electrical behaviour of the heart nearly in real time, and that they reproduce the properties of the heart in atrial flutter and in ventricular pacing with different pacing locations. The results of computer-based simulations may lead to a better understanding of cardiac rhythm disorders and the development of new, less invasive operative techniques.
M. Reumann. Computer assisted optimisation of non-pharmacological treatment of congestive heart failure and supraventricular arrhythmia. KIT Scientific Publishing. Dissertation. 2007
Heart failure is the most common cardiac disease worldwide; supraventricular arrhythmia is the most common cardiac arrhythmia. The understanding of these diseases advances treatment options. Ablation therapy and atrial antitachycardial pacing are non-pharmacological options in the treatment of atrial fibrillation. Cardiac resynchronization therapy with biventricular pacing devices has been shown to be successful in patients with severe heart failure. However, optimization or even individual therapy planning is not standard, or not even carried out, today.