366 research outputs found

    High-frequency Electrocardiogram Analysis in the Ability to Predict Reversible Perfusion Defects during Adenosine Myocardial Perfusion Imaging

Background: A previous study showed that analysis of high-frequency QRS components (HF-QRS) is highly sensitive and reasonably specific for detecting reversible perfusion defects on myocardial perfusion imaging (MPI) scans during adenosine infusion. The purpose of the present study was to try to reproduce those findings. Methods: Twelve-lead high-resolution electrocardiogram recordings were obtained from 100 patients before (baseline) and during adenosine Tc-99m-tetrofosmin MPI tests. HF-QRS were analyzed for morphology and for changes in root-mean-square (RMS) voltage from before the adenosine infusion to peak infusion. Results: The best area under the curve (AUC) was found in supine patients (AUC=0.736) for a combination of morphology and RMS changes. None of the measurements, however, was statistically better than tossing a coin (AUC=0.5). Conclusion: Analysis of HF-QRS was not significantly better than tossing a coin for determining reversible perfusion defects on MPI scans.
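The study's measurements reduce to a per-segment RMS voltage, its change from baseline to peak infusion, and an ROC area for the resulting classifier. A minimal sketch of those quantities (function names are illustrative, not taken from the paper):

```python
import numpy as np

def rms_voltage(hf_qrs):
    """Root-mean-square voltage of a high-frequency QRS segment."""
    x = np.asarray(hf_qrs, dtype=float)
    return float(np.sqrt(np.mean(x ** 2)))

def rms_change(baseline, peak):
    """Relative RMS change from baseline to peak adenosine infusion."""
    b = rms_voltage(baseline)
    return (rms_voltage(peak) - b) / b

def auc(scores_pos, scores_neg):
    """Area under the ROC curve via the Mann-Whitney U statistic:
    the probability that a randomly chosen positive case scores
    higher than a randomly chosen negative one (ties count half)."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            wins += 1.0 if p > n else (0.5 if p == n else 0.0)
    return wins / (len(scores_pos) * len(scores_neg))
```

An AUC of 0.5, the paper's "tossing a coin" benchmark, is exactly what this statistic returns when positive and negative scores are interchangeable.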

    New techniques to integrate blockchain in Internet of Things scenarios for massive data management

Nowadays, regardless of the use case, most IoT data is processed by workflows executed on different infrastructures (edge-fog-cloud), producing dataflows that travel from the IoT through the edge to the fog/cloud. In many cases these workflows also involve several actors (organizations and users), which makes it challenging for organizations to verify the transactions performed by the participants in the dataflows built by workflow engines and pipeline frameworks. Organizations must verify not only that applications execute in the strict sequence previously established in a DAG by authenticated participants, but also that the incoming and outgoing IoT data of each stage of a workflow/pipeline have not been altered by third parties or by users associated with the participating organizations. Blockchain technology, with its mechanism for recording immutable transactions in a distributed and decentralized manner, is well suited to these challenges, since it allows the generated records to be verified securely. However, integrating blockchain with workflows for IoT data processing is not trivial: the challenge is to avoid losing the generality of the workflow and/or pipeline engines, which must be modified to include the embedded blockchain module. The main objective of this doctoral research was therefore to develop new techniques to integrate blockchain in Internet of Things (IoT) scenarios for massive data management in edge-fog-cloud environments.
To fulfill this objective, we designed a content delivery model for processing big IoT data in edge-fog-cloud computing using micro/nanoservice composition; a continuous verification model based on blockchain that registers significant events from the continuous delivery model; and techniques for integrating blockchain into quasi-real-time systems that ensure the traceability and non-repudiation of data obtained from devices and sensors. The proposed models have been thoroughly evaluated, showing their feasibility and good performance.
This work has been partially supported by the project "CABAHLA-CM: Convergencia Big data-Hpc: de los sensores a las Aplicaciones" (S2018/TCS-4423) of the Madrid Regional Government. Doctoral Programme in Computer Science and Technology, Universidad Carlos III de Madrid (International Mention). Committee: President: Paolo Trunfio; Secretary: David Exposito Singh; Member: Rafael Mayo Garcí
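The continuous verification model hinges on recording stage-level events immutably, so that later tampering with the data flowing between workflow stages is detectable. A toy hash-chain ledger illustrating the idea (a stand-in for a real blockchain, not the thesis's implementation; all names are hypothetical):

```python
import hashlib
import json

class VerificationLedger:
    """Minimal append-only hash chain recording the digest of each
    workflow stage's data, so audits can detect tampering with the
    dataflow between stages or with the ledger itself."""

    def __init__(self):
        self.blocks = []

    def record(self, stage, actor, payload):
        """Append an entry binding a stage, an actor, and the
        SHA-256 digest of the stage's data to the previous block."""
        prev = self.blocks[-1]["hash"] if self.blocks else "0" * 64
        entry = {
            "stage": stage,
            "actor": actor,
            "payload_digest": hashlib.sha256(payload).hexdigest(),
            "prev": prev,
        }
        entry["hash"] = hashlib.sha256(
            json.dumps(entry, sort_keys=True).encode()).hexdigest()
        self.blocks.append(entry)
        return entry["hash"]

    def verify(self):
        """Recompute every block hash and link; any altered entry
        breaks the chain and the check fails."""
        prev = "0" * 64
        for b in self.blocks:
            body = {k: v for k, v in b.items() if k != "hash"}
            digest = hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()).hexdigest()
            if b["prev"] != prev or digest != b["hash"]:
                return False
            prev = b["hash"]
        return True
```

A real deployment would distribute and replicate the chain across the participating organizations; the point here is only that each stage's input/output digest becomes non-repudiable once recorded.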

    Dense agent-based HPC simulation of cell physics and signaling with real-time user interactions

Introduction: Distributed simulations of complex systems have to date focused on scalability and correctness rather than interactive visualization. Interactive visual simulations have particular advantages for exploring the emergent behaviors of complex systems. Interpreting simulations of complex systems such as cancer cell tumors is a challenge, and can be greatly assisted by "built-in" real-time user interaction and subsequent visualization. Methods: We explore this approach using a multi-scale model which couples a cell physics model with a cell signaling model. This paper presents a novel communication protocol for real-time user interaction and visualization with a large-scale distributed simulation, with minimal impact on performance. Specifically, we explore how optimistic synchronization can be used to enable real-time user interaction and visualization in a densely packed parallel agent-based simulation, whilst maintaining scalability and determinism. We also describe the software framework created and the distribution strategy for the models utilized. The key features of the High-Performance Computing (HPC) simulation that were evaluated are scalability, deterministic verification, speed of real-time user interactions, and deadlock avoidance. Results: We use two commodity HPC systems, ARCHER (118,080 CPU cores) and ARCHER2 (750,080 CPU cores), where we simulate up to 256 million agents (one million cells) using up to 21,953 computational cores and record a response-time overhead of ≃350 ms from the issued user events. Discussion: The approach is viable and can underpin transformative technologies offering immersive simulations such as Digital Twins. The framework explained in this paper is not limited to the models used and can be adapted to systems biology models that use similar standards (physics models using agent-based interactions, and signaling pathways using SBML) and to other interactive distributed simulations.
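The paper's protocol relies on optimistic synchronization; as a much simpler stand-in, the sketch below shows the underlying idea that keeps interaction deterministic: every user event is stamped with a future tick at which all processes apply it, so injection order never depends on message arrival timing. This is an illustrative conservative scheme with hypothetical names, not the authors' protocol:

```python
import heapq

class InteractiveSim:
    """Sketch of deterministic user-event injection: each event is
    stamped with a future tick agreed by all ranks (here: a fixed
    lookahead), so every process applies it at the same step and a
    distributed run stays reproducible."""

    LOOKAHEAD = 4  # ticks between event receipt and application

    def __init__(self, state=0):
        self.tick = 0
        self.state = state
        self.pending = []  # heap of (apply_tick, seq, delta)
        self.seq = 0       # tie-breaker for a stable ordering

    def inject(self, delta):
        """Schedule a user event for a fixed number of ticks ahead."""
        heapq.heappush(self.pending,
                       (self.tick + self.LOOKAHEAD, self.seq, delta))
        self.seq += 1

    def step(self):
        """Apply all events due at this tick, then advance the model."""
        while self.pending and self.pending[0][0] == self.tick:
            _, _, delta = heapq.heappop(self.pending)
            self.state += delta
        self.state += 1  # stand-in for the physics/signaling update
        self.tick += 1
```

An optimistic scheme instead applies events immediately and rolls back mis-ordered work; the fixed-lookahead version above trades that complexity for a small, bounded interaction latency.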

    Aerospace Medicine and Biology: A continuing bibliography with indexes (supplement 153)

This bibliography lists 175 reports, articles, and other documents introduced into the NASA scientific and technical information system in March 1976.

    Scalable and Accurate ECG Simulation for Reaction-Diffusion Models of the Human Heart

Realistic electrocardiogram (ECG) simulation with numerical models is important for research linking cellular and molecular physiology to clinically observable signals, and crucial for patient tailoring of numerical heart models. However, ECG simulation with a realistic torso model is computationally much harder than simulation of cardiac activity itself, so many studies with sophisticated heart models have resorted to crude approximations of the ECG. This paper shows how the classical concept of electrocardiographic lead fields can be used for an ECG simulation method that matches the realism of modern heart models. The accuracy and resource requirements were compared to those of a full-torso solution for the potential, and scaling was tested up to 14,336 cores with a heart model consisting of 11 million nodes. Reference ECGs were computed on a 3.3 billion-node heart-torso mesh at 0.2 mm resolution. The results show that the lead-field method is more efficient than a full-torso solution when the number of simulated samples is larger than the number of computed ECG leads. While the initial computation of the lead fields remains a hard and poorly scalable problem, the ECG computation itself scales almost perfectly and, even for several hundred ECG leads, takes much less time than the underlying simulation of cardiac activity.
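The lead-field trade-off can be stated compactly: once a lead field has been precomputed (one expensive torso solve per lead), each ECG sample is just an inner product of that field with the instantaneous cardiac sources, so per-sample cost no longer involves the torso at all. A minimal sketch, with illustrative names:

```python
import numpy as np

def simulate_ecg(lead_fields, sources):
    """Given precomputed lead fields (n_leads x n_nodes) and the
    time course of cardiac current sources (n_nodes x n_samples),
    each ECG lead is the inner product of its lead field with the
    sources: V = L @ S. The expensive torso problem is solved once
    per lead; every later time sample is a cheap matrix-vector
    product, which is why the method wins whenever the number of
    samples exceeds the number of leads."""
    L = np.asarray(lead_fields, dtype=float)
    S = np.asarray(sources, dtype=float)
    return L @ S  # shape (n_leads, n_samples)
```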

    Computer modeling and signal analysis of cardiovascular physiology

This dissertation aims to study cardiovascular physiology, from the cellular level through the whole heart to the body level, using numerical approaches. A mathematical model was developed to describe electromechanical interaction in the heart. The model integrates cardiac electrophysiology and cardiac mechanics through excitation-induced contraction and deformation-induced currents. A finite-element-based parallel simulation scheme was developed to investigate coupled electrical and mechanical functions. The developed model and numerical scheme were used to study cardiovascular dynamics at the cellular, tissue, and organ levels. The influence of ion channel blockade on cardiac alternans was investigated. It was found that a channel blocker may significantly change the critical pacing period corresponding to the onset of alternans, as well as the alternans amplitude. The influence of electromechanical coupling on cardiac alternans was also investigated. The study supported the earlier assumption that discordant alternans is induced by the interaction of conduction velocity and action potential duration restitution at high pacing rates; however, mechanical contraction may influence the spatial pattern and onset of discordant alternans. Computer algorithms were developed for analysis of human physiology. The 12-lead electrocardiogram (ECG) is the gold standard for diagnosis of various cardiac abnormalities; however, disturbances and mistakes may modify physiological waves in the ECG and lead to wrong diagnoses. This dissertation developed advanced signal analysis techniques and computer software to detect and suppress artifacts and errors in the ECG. These algorithms can help to improve the quality of health care when integrated into medical devices or services. Moreover, computer algorithms were developed to predict patient mortality in intensive care units using various physiological measures. The models and analysis techniques developed here may help to improve the quality of health care.
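As one concrete example of the kind of ECG artifact suppression described, baseline wander (a slow drift of the isoelectric line) can be removed by subtracting a moving-median estimate of the trend. This is an illustrative sketch, not the dissertation's specific algorithm:

```python
import numpy as np

def remove_baseline_wander(ecg, fs, window_s=0.6):
    """Suppress baseline wander by subtracting a moving-median
    estimate of the low-frequency trend. The window (~0.6 s here)
    is chosen wider than a QRS complex so the median tracks the
    drift without flattening the beat morphology."""
    ecg = np.asarray(ecg, dtype=float)
    half = max(1, int(window_s * fs / 2))
    baseline = np.empty_like(ecg)
    for i in range(len(ecg)):
        lo, hi = max(0, i - half), min(len(ecg), i + half + 1)
        baseline[i] = np.median(ecg[lo:hi])
    return ecg - baseline
```

On a flat signal with a constant offset, the estimated baseline equals the offset and the corrected output is zero everywhere.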

A Comprehensive Analysis of Literature-Reported MAC and PHY Enhancements of ZigBee and its Alliances

Wireless communication is one of the technologies most in demand among everyday users. The field is progressing rapidly in several novel directions for establishing personal wireless networks on low-power systems. Cutting-edge communication technologies such as Bluetooth, Wi-Fi, and ZigBee play a prime role in catering to the basic needs of individuals. ZigBee is one such technology, steadily gaining popularity for establishing personal wireless networks built on small, low-power digital radios. ZigBee defines the physical (PHY) and MAC layers built on the IEEE 802.15.4 standard. This paper presents a comprehensive survey of literature-reported MAC and PHY enhancements of ZigBee and its contemporary technologies with respect to performance, power consumption, scheduling, resource management, and timing and address binding. The work also discusses areas of the ZigBee MAC and PHY relevant to their design for specific applications.

    Modeling and simulation of the electric activity of the heart using graphic processing units

Mathematical modelling and simulation of the electrical activity of the heart (cardiac electrophysiology) offers an ideal framework for combining clinical and experimental data in order to help understand the underlying mechanisms behind the observed response under physiological and pathological conditions. In this regard, solving the electrical activity of the heart poses a big challenge, not only because of the structural complexities inherent to heart tissue, but also because of the complex electrical behaviour of cardiac cells. The multi-scale nature of the electrophysiology problem makes its numerical solution difficult, requiring temporal and spatial resolutions of 0.1 ms and 0.2 mm respectively for accurate simulations, and leading to models with millions of degrees of freedom that must be solved over thousands of time steps. Solving this problem requires algorithms with a high level of parallelism on multi-core platforms. In this regard, modern programmable graphics processing units (GPUs) have become a valid alternative due to their tremendous computational horsepower. This thesis develops around the implementation of electrophysiology simulation software written entirely in the Compute Unified Device Architecture (CUDA) for GPU computing. The software implements fully explicit and semi-implicit solvers for the monodomain model, using operator splitting and the finite element method for spatial discretization. Performance is compared with classical multi-core MPI-based solvers operating on dedicated high-performance computing clusters. Results obtained with the GPU-based solver show enormous potential for this technology, with accelerations over 50× for three-dimensional problems when using an implicit scheme for the parabolic equation, and accelerations of up to 100× for the explicit implementation. The implemented solver has been applied to study pro-arrhythmic mechanisms during acute ischemia.
In particular, we investigate how hyperkalemia affects the vulnerable window for reentry and the reentry patterns in the heterogeneous substrate caused by acute regional ischemia, using an anatomically and biophysically detailed human biventricular model. A three-dimensional, geometrically and anatomically accurate, regionally ischemic human heart model was created. The ischemic region was located in the inferolateral and posterior side of the left ventricle, mimicking occlusion of the circumflex artery, and a washed-out zone at the endocardium, not affected by ischemia, was incorporated. Realistic heterogeneity and fiber anisotropy were also considered in the model. A highly detailed human action potential model was adapted to make it suitable for modeling ischemic conditions (hyperkalemia, hypoxia, and acidic conditions) by introducing a formulation of the ATP-sensitive K+ current. The model predicts the generation of sustained re-entrant activity in the form of single and double circuits around a blocked area within the ischemic zone for K+ concentrations below 9 mM, with the reentrant activity associated with ventricular tachycardia in all cases. The results suggest the washed-out zone as a potential pro-arrhythmic substrate factor that helps establish sustained ventricular tachycardia.
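The explicit operator-splitting scheme this kind of monodomain solver uses alternates a pointwise reaction substep (the ionic model, trivially parallel across nodes and hence a natural fit for GPUs) with a diffusion substep. A minimal 1D sketch, using a FitzHugh-Nagumo stand-in for the detailed human ionic model of the thesis (parameters are illustrative):

```python
import numpy as np

def monodomain_step(v, w, dt, dx, D=1e-3, a=0.1, eps=0.01):
    """One explicit operator-splitting step of a 1D monodomain
    model. Reaction: a FitzHugh-Nagumo stand-in for the ionic
    model (the thesis uses a detailed human ventricular model).
    Diffusion: explicit finite differences with no-flux ends."""
    # reaction substep: pointwise ODEs, one independent update per node
    dv = v * (v - a) * (1.0 - v) - w
    dw = eps * (v - 0.5 * w)
    v = v + dt * dv
    w = w + dt * dw
    # diffusion substep: v_t = D * v_xx
    lap = np.zeros_like(v)
    lap[1:-1] = (v[2:] - 2.0 * v[1:-1] + v[:-2]) / dx**2
    lap[0] = (v[1] - v[0]) / dx**2        # no-flux boundary
    lap[-1] = (v[-2] - v[-1]) / dx**2     # no-flux boundary
    return v + dt * D * lap, w
```

The split matters for performance: the reaction substep has no spatial coupling, so every node can be updated by an independent GPU thread, while the diffusion substep only touches nearest neighbours.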

    Design and Implementation of a Stepped Frequency Continuous Wave Radar System for Biomedical Applications

There is a need to detect human vital signs (e.g., respiration and heart rate) by noncontact methods in a number of applications, such as search and rescue operations (e.g., earthquakes, fires), health monitoring of the elderly, and performance monitoring of athletes. Ultra-wideband radar systems can be used for noncontact vital-sign monitoring and for tracking the activities of more than one subject. Therefore, a stepped-frequency continuous-wave (SFCW) radar system with wideband performance was designed and implemented for vital-sign detection and fall-event monitoring. The SFCW radar system was first developed using off-the-shelf discrete components. Later, the system was implemented with surface-mount components to make it portable and low cost. The measured heart and respiration rates proved accurate to within ±5% when compared with contact measurements. Furthermore, an electromagnetic model based on a multi-layer dielectric model of the human subject was developed to validate the experimental results. Measured and simulated results agree well for distances up to 2 m and at various subject orientations with respect to the radar, even in the presence of more than one subject. A compressive sensing (CS) technique is used to reduce the size of the acquired data to levels significantly below the Nyquist threshold. In our demonstration, we use the phase information contained in the obtained complex high-resolution range profile (HRRP) to derive the motion characteristics of the subject. The acquired data have been successfully used for noncontact walk, fall, and limping detection and for healthcare monitoring. The effectiveness of the proposed method is validated using measured results.
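Phase-based motion extraction from the HRRP follows from the round-trip path: a target displacement of d meters shifts the received phase by 4πfd/c, so unwrapping the phase of one range bin recovers the chest-wall motion. A minimal sketch (function name illustrative, not from the paper):

```python
import numpy as np

C = 3e8  # speed of light, m/s

def displacement_from_phase(range_bin, f_center):
    """Recover target displacement from the phase of a complex
    high-resolution range-profile bin. A motion of d meters changes
    the round-trip phase by 4*pi*f*d/c, so
    d(t) = c * unwrap(phase(t)) / (4*pi*f), referenced to t=0."""
    phase = np.unwrap(np.angle(np.asarray(range_bin)))
    return C * (phase - phase[0]) / (4.0 * np.pi * f_center)
```

Respiration and heart rates then follow from the spectrum of the recovered displacement signal.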