
    Cardiac rehabilitation in heart failure patients with devices

    Cardiac rehabilitation (CR) for heart failure with systolic dysfunction carries a level IA recommendation from all the scientific societies. The main core components of CR are evaluation, personalized physical training, patient education, treatment optimization and psycho-social counselling. Heart failure patients are increasingly implanted with simple or sophisticated pacemakers, resynchronization devices and/or automatic defibrillators for rhythm and/or hemodynamic indications. Managing these patients requires knowledge of how each device works, of its constraints and limits, and of the risks associated with the disease and with the equipment in different situations. Cardiologic evaluation during exercise with a cardiopulmonary exercise test guides the training prescription, although adjustments of the device settings are sometimes required. The demonstrated benefits of exercise training are improved exercise capacity and, consequently, quality of life, as well as a reduction in re-hospitalizations. While education is mandatory for all heart failure patients, implanted patients also need specific knowledge and behavior modifications. Treatment optimization must take the patient's profile into account. Psychological consequences (particularly of implanted cardiac defibrillators) should be managed, and the possibilities of returning to as normal a life as possible (including return to work) evaluated. Accordingly, the management of implanted heart failure patients requires specific skills and cardiologic supervision adapted to the patient's situation.

    On P-H_v-Structures in a Two-Dimensional Real Vector Space

    In this paper we study P-Hv-structures in connection with Hv-structures, arising from a specific P-hope in a two-dimensional real vector space. The visualization of these P-Hv-structures is our priority, since visual thinking can be an alternative and powerful resource for people doing mathematics. Using position vectors in the plane, the abstract algebraic properties of these P-Hv-structures are gradually transformed into geometrical shapes, which operate not only as a translation of the algebraic concept but also as a teaching process.
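
    For orientation, the sketch below writes out the general form of a P-hope on the additive group of the plane; the specific set P used in the paper is not given in this abstract, so the construction shown is an assumption for illustration only.

        % A P-hope on (R^2, +): fix a nonempty subset P of R^2 (assumed here);
        % the hyperoperation sends each pair of vectors to a set of vectors.
        \[
          x \ast_P y \;=\; \{\, x + p + y \;:\; p \in P \,\}, \qquad x, y \in \mathbb{R}^2 .
        \]
        % Geometrically, x \ast_P y is the set P translated by the position vector
        % x + y, which is what makes such structures easy to draw in the plane.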

    DETECTION OF POST-TRAUMATIC STRESS DISORDER (PTSD) SYMPTOMS ASSOCIATED WITH CORONAVIRUS DISEASE 2019 (COVID-19) IN THE STUDENT POPULATION

    The pandemic has killed at least 670,000 people since it emerged in Wuhan, China, and 17 million cases have been diagnosed. The United States, Brazil, Mexico and Britain have been hit hard by COVID-19 in recent weeks (July 2020) as their governments try to find an effective response. A pandemic is a health crisis that occurs once in a hundred years, and its effects will be felt for decades (WHO). Amid the dramatic changes brought about by such a crisis, many people unfortunately experience a time of stress and sadness like no other in their lives, and the most common resulting diagnoses are post-traumatic stress disorder (PTSD) and anxiety disorder (World Health Organization [WHO], 2001). In this study we detect and categorize symptoms of post-traumatic stress disorder (PTSD) associated with coronavirus disease 2019 (COVID-19) in a student population.

    Pre-clinical pharmacokinetics of nuatigenoside and other radiolabelled biomolecules obtained from the semi-purified extract of Solanum sisymbriifolium Lam roots

    The general objective of this work is to determine the pre-clinical pharmacokinetics of nuatigenoside and other radiolabelled biomolecules obtained from the semi-purified extract of Solanum sisymbriifolium Lam roots. CONACYT – Consejo Nacional de Ciencia y Tecnología; PROCIENCI

    Estimation of electron density in the nighttime ionosphere based on remote sensing of the 135.6 nm far ultraviolet emission

    This thesis develops a method to accurately estimate electron density altitude profiles of the nighttime ionosphere, as well as key parameters such as the peak height and peak density, from nighttime far ultraviolet (FUV) measurements of the 135.6 nm emission. Specifically, we describe a method to obtain the electron density content of the ionosphere from brightness measurements of the nighttime 135.6 nm emission. The method is applied and tested on simulated measurements corresponding to those expected from the limb-viewing FUV instrument on board the Ionospheric Connection Explorer (ICON) satellite, scheduled to be launched in 2017. The OI 135.6 nm emission can be used as a proxy for the ionosphere's electron density and is related to the brightness measured by the FUV instrument through an integral equation over the volume emission rate. The instrument's observation geometry allows the problem to be discretized, connecting the ionosphere's electron density to the measured brightness through a matrix equation. Regularization methods are used to enforce smoothness and continuity constraints on the estimated volume emission rate and to compensate for noise amplification in the inversion process. Tikhonov regularization, generalized cross-validation, total variation, and Bayesian methods that assume prior knowledge of the ionosphere's electron density distribution are investigated. Comprehensive simulations explore the range of brightness intensities over all longitudes and over latitudes from -40 to 40 degrees, in order to characterize the effect of different SNR values on the electron density reconstruction accuracy. FUV measurements are simulated using the International Reference Ionosphere (IRI) and Mass Spectrometer and Incoherent Scatter (MSIS) models to create a forward model that can be inverted, validating the reconstructed altitude profile as well as the accuracy of the peak height and density. This allows us to assess the expected performance of the FUV instrument.
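
    As a rough illustration of the inversion step described above, the following Python sketch applies Tikhonov regularization to a generic discretized measurement model y = A x + noise. The toy path-length matrix, the Gaussian emission profile, the second-difference smoothness operator and the hand-picked regularization strength are all assumptions for illustration, not the thesis's actual forward model; in practice the strength would be chosen by, e.g., generalized cross-validation.

        import numpy as np

        # Hypothetical discretized forward model: brightness y = A @ x + noise, where x is
        # the 135.6 nm volume emission rate on an altitude grid and A encodes the
        # limb-viewing line-of-sight integration geometry (a toy matrix here).
        rng = np.random.default_rng(0)
        n_alt = 100                                          # altitude grid points
        alt = np.linspace(150.0, 500.0, n_alt)               # altitude in km
        A = np.tril(np.ones((n_alt, n_alt)))                 # toy path-length matrix (assumption)
        x_true = np.exp(-0.5 * ((alt - 350.0) / 40.0) ** 2)  # toy emission profile (assumption)
        y = A @ x_true + rng.normal(0.0, 0.5, n_alt)         # noisy simulated brightness

        # Second-difference operator enforcing smoothness of the reconstruction.
        L = np.diff(np.eye(n_alt), n=2, axis=0)

        # Tikhonov solution: minimize ||A x - y||^2 + lam * ||L x||^2.
        lam = 10.0                                           # regularization strength (assumption)
        x_hat = np.linalg.solve(A.T @ A + lam * (L.T @ L), A.T @ y)

        # The electron density profile would then be recovered from x_hat via the
        # 135.6 nm photochemistry (omitted here); the peak height follows directly.
        print("estimated peak altitude (km):", alt[np.argmax(x_hat)])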

    Epigenetic alterations involved in cancer stem cell reprogramming

    Current hypotheses suggest that tumors originate from cells that undergo a process of 'malignant reprogramming' driven by genetic and epigenetic alterations. Multiple studies have reported the existence of stem-cell-like cells that acquire the ability to self-renew and can generate the bulk of more differentiated cells that form the tumor. This population of cancer cells, called cancer stem cells (CSCs), is responsible for sustaining tumor growth and, under certain conditions, can disseminate and migrate to give rise to secondary tumors or metastases in distant organs. Furthermore, CSCs have been shown to be more resistant to anti-tumor treatments than non-stem cancer cells, suggesting that surviving CSCs could be responsible for tumor relapse after therapy. These important properties have raised interest in understanding the mechanisms that govern the generation and maintenance of this special population of cells, mechanisms considered to lie behind the on/off switches of gene expression patterns. In this review, we summarize the most relevant epigenetic alterations, from DNA methylation and histone modifications to the recently discovered miRNAs, that contribute to the regulation of cancer stem cell features in tumor progression, metastasis and response to chemotherapy.

    Machine Learning Based Detection and Evasion Techniques for Advanced Web Bots

    Web bots are programs that can be used to browse the web and perform different types of automated actions, both benign and malicious. Such web bots vary in sophistication based on their purpose, ranging from simple automated scripts to advanced web bots that have a browser fingerprint and exhibit humanlike behaviour. Advanced web bots are especially appealing to malicious web bot creators, due to their browserlike fingerprint and humanlike behaviour, which reduce their detectability. Several effective behaviour-based web bot detection techniques have been proposed in the literature. However, the performance of these detection techniques when targeting malicious web bots that try to evade detection has not been examined in depth. Such evasive web bot behaviour is achieved by different techniques, including simple heuristics and statistical distributions, or more advanced machine learning based techniques. Motivated by the above, in this thesis we research novel web bot detection techniques and how effective they are against evasive web bots that try to evade detection using, among others, recent advances in machine learning. To this end, we initially evaluate state-of-the-art web bot detection techniques against web bots of different sophistication levels and show that, while the existing approaches achieve very high performance in general, they are not very effective when faced only with advanced web bots that try to remain undetected. Thus, we propose a novel web bot detection framework that can effectively detect bots of varying levels of sophistication, including advanced web bots. This framework comprises and combines two detection modules: (i) a detection module that extracts several features from web logs and uses them as input to several well-known machine learning algorithms, and (ii) a detection module that uses mouse trajectories as input to Convolutional Neural Networks (CNNs). Moreover, we examine the case where advanced web bots themselves utilise recent advances in machine learning to evade detection. Specifically, we propose two novel evasive advanced web bot types: (i) web bots that use Reinforcement Learning (RL) to update their browsing behaviour based on whether they have been detected or not, and (ii) web bots that possess data from human behaviours and use them as input to Generative Adversarial Networks (GANs) to generate images of humanlike mouse trajectories. We show that both approaches increase the evasiveness of the web bots by reducing the performance of the detection framework utilised in each case. We conclude that malicious web bots can exhibit high sophistication levels and combine different techniques that increase their evasiveness. Even though web bot detection frameworks can combine different methods to effectively detect such bots, web bots can update their behaviours using, among others, recent advances in machine learning to increase their evasiveness. Thus, detection techniques should be continuously updated to keep up with new techniques introduced by malicious web bots to evade detection.
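
    To make the first detection module more concrete, the Python sketch below trains a well-known classifier on per-session features extracted from web logs. The feature set, the synthetic human/bot distributions and the random forest choice are illustrative assumptions rather than the thesis's exact pipeline; the second module would analogously feed mouse-trajectory images to a CNN.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import train_test_split
        from sklearn.metrics import classification_report

        # Hypothetical per-session features extracted from web logs (assumptions):
        # [requests per minute, mean inter-request time (s), fraction of image requests,
        #  fraction of HTTP errors, distinct pages visited, session duration (s)]
        rng = np.random.default_rng(42)
        n = 2000
        humans = np.column_stack([
            rng.normal(6, 2, n), rng.normal(8, 3, n), rng.uniform(0.2, 0.6, n),
            rng.uniform(0.0, 0.05, n), rng.poisson(12, n), rng.normal(600, 200, n),
        ])
        bots = np.column_stack([
            rng.normal(40, 10, n), rng.normal(1.5, 0.5, n), rng.uniform(0.0, 0.2, n),
            rng.uniform(0.0, 0.3, n), rng.poisson(80, n), rng.normal(300, 100, n),
        ])
        X = np.vstack([humans, bots])
        y = np.array([0] * n + [1] * n)   # 0 = human session, 1 = web bot session

        # Train and evaluate the log-feature detection module.
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
        clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
        print(classification_report(y_te, clf.predict(X_te), target_names=["human", "bot"]))

    An evasive advanced bot, as described in the abstract, would aim to push its feature vector back into the human region of this space, which is what degrades such a classifier's performance.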

    Hybrid focused crawling on the Surface and the Dark Web

    Focused crawlers enable the automatic discovery of Web resources about a given topic by automatically navigating the Web link structure and selecting which hyperlinks to follow by estimating their relevance to the topic of interest. This work proposes a generic focused crawling framework for discovering resources on any given topic that reside on the Surface or the Dark Web. The proposed crawler is able to seamlessly navigate the Surface Web and several darknets of the Dark Web (i.e., Tor, I2P and Freenet) during a single crawl by automatically adapting its crawling behavior and its classifier-guided hyperlink selection strategy based on the destination network type and the strength of the local evidence present in the vicinity of a hyperlink. It investigates 11 hyperlink selection methods, among them a novel strategy based on the dynamic linear combination of a link-based classifier and a parent Web page classifier. This hybrid focused crawler is demonstrated for the discovery of Web resources containing recipes for producing homemade explosives. The evaluation experiments indicate the effectiveness of the proposed focused crawler on both the Surface and the Dark Web.
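
    A rough sketch of the classifier-guided link selection idea follows: each hyperlink gets a score from a link-based classifier (anchor text / URL) and a score from a classifier applied to its parent page, and the two are combined with a weight driven by the strength of the local evidence around the link. The weighting rule, field names and example links are illustrative assumptions, not the framework's actual classifiers.

        from dataclasses import dataclass

        @dataclass
        class Hyperlink:
            url: str
            anchor_text: str
            parent_page_score: float   # topical relevance of the page containing the link
            link_score: float          # relevance estimated from anchor text / URL tokens

        def local_evidence(link: Hyperlink) -> float:
            """Crude proxy for the strength of local evidence around a hyperlink:
            more anchor-text tokens -> trust the link-based classifier more (assumption)."""
            tokens = len(link.anchor_text.split())
            return min(tokens / 10.0, 1.0)

        def combined_score(link: Hyperlink) -> float:
            """Dynamic linear combination of the two classifier scores."""
            w = local_evidence(link)
            return w * link.link_score + (1.0 - w) * link.parent_page_score

        # The crawler frontier is ranked by the combined score and the best links are
        # fetched first, whether they point to the Surface Web or a darknet (Tor, I2P, Freenet).
        frontier = [
            Hyperlink("http://example.onion/page", "detailed recipe tutorial step by step", 0.7, 0.9),
            Hyperlink("https://example.com/blog", "click here", 0.4, 0.2),
        ]
        for link in sorted(frontier, key=combined_score, reverse=True):
            print(f"{combined_score(link):.2f}  {link.url}")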