    Nanomaterial-based Sensors for the Study of DNA Interaction with Drugs

    The interaction of drugs with DNA has been studied thoroughly, yielding a wealth of findings of undoubted importance, such as early alerts to harmful substances, explanations of many biological mechanisms, and important clues for the targeted development of novel chemotherapeutics. Drugs that induce oxidative damage are a growing concern, since such damage can cause cell death and aging and is closely related to the development of many diseases. Because the response correlates directly with the strength and nature of the interaction and with the pharmaceutical action of DNA-targeted drugs, electrochemical analysis compares the signals of DNA before and after its interaction with the DNA-targeted drug. Nanoscale materials are now used extensively because they offer characteristics that can be exploited in designing new strategies for detecting drug-DNA interactions. This work presents a review of nanomaterials (NMs) for the study of drug-nucleic acid interactions. We summarize the types of drug-DNA interactions, the electroanalytical techniques used to evidence these interactions, and approaches for quantifying the drug and/or monitoring the DNA.
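
    As an illustration of the before/after signal comparison described above, the following minimal sketch computes the relative change in a DNA electrochemical signal (such as a guanine oxidation peak) after incubation with a drug. The peak-current values and the helper function are hypothetical placeholders, not data or processing from any study covered by the review.

    ```python
    # Hypothetical sketch: quantify a drug-DNA interaction from the change in a
    # DNA electrochemical signal (e.g., a guanine oxidation peak current)
    # measured before and after incubation with the drug.
    # All numbers are illustrative placeholders.

    def residual_signal_percent(i_before_uA: float, i_after_uA: float) -> float:
        """Return the residual DNA signal S% = 100 * I_after / I_before."""
        if i_before_uA <= 0:
            raise ValueError("baseline peak current must be positive")
        return 100.0 * i_after_uA / i_before_uA

    # Illustrative peak currents (microamperes) from a DNA-modified electrode.
    i_before = 1.85  # guanine oxidation peak before drug exposure
    i_after = 1.12   # same peak after incubation with the drug

    s_percent = residual_signal_percent(i_before, i_after)
    print(f"Residual guanine signal: {s_percent:.1f}% "
          f"(a drop of {100 - s_percent:.1f}% suggests binding)")
    ```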

    An experimental study of the intrinsic stability of random forest variable importance measures

    BACKGROUND: The stability of Variable Importance Measures (VIMs) based on random forest has recently received increased attention. Despite extensive attention to traditional stability under data perturbations or parameter variations, few studies consider the influence of the intrinsic randomness involved in generating VIMs, i.e., bagging, randomization and permutation. To address these influences, we introduce a new concept, the intrinsic stability of VIMs, defined as the self-consistency of feature rankings across repeated runs of a VIM without data perturbations or parameter variations. Two widely used VIMs, Mean Decrease Accuracy (MDA) and Mean Decrease Gini (MDG), are investigated comprehensively. The motivation of this study is two-fold. First, we empirically verify the prevalence of intrinsic stability of VIMs over many real-world datasets, highlighting that the instability of VIMs does not originate exclusively from data perturbations or parameter variations but also stems from the intrinsic randomness of VIMs. Second, through Spearman and Pearson tests we investigate comprehensively how different factors influence the intrinsic stability. RESULTS: The experiments were carried out on 19 benchmark datasets with diverse characteristics, including 10 high-dimensional, small-sample gene expression datasets. The experimental results demonstrate the prevalence of intrinsic stability of VIMs. Spearman and Pearson tests on the correlations between intrinsic stability and different factors show that #feature (number of features) and #sample (sample size) have a coupled effect on intrinsic stability. The synthetic indicator #feature/#sample shows both a negative monotonic and a negative linear correlation with intrinsic stability, while OOB accuracy shows monotonic correlations with intrinsic stability. This indicates that high-dimensional, small-sample and high-complexity datasets may suffer more from intrinsic instability of VIMs. Furthermore, with respect to the parameter settings of random forest, a large number of trees is preferred. No significant correlations were observed between intrinsic stability and the other factors. Finally, the magnitude of intrinsic stability is always smaller than that of traditional stability. CONCLUSION: First, the prevalence of intrinsic stability of VIMs demonstrates that the instability of VIMs comes not only from data perturbations or parameter variations but also from the intrinsic randomness of VIMs. This finding gives a better understanding of VIM stability and may help reduce the instability of VIMs. Second, by investigating the potential factors affecting intrinsic stability, users will be more aware of the risks and hence more careful when using VIMs, especially on high-dimensional, small-sample and high-complexity datasets.
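
    The notion of intrinsic stability described above can be illustrated with a small experiment: train the same random forest repeatedly on identical data with identical parameters, varying only the random seed, and correlate the resulting feature rankings. The sketch below does this for the Gini-based importance (an MDG analogue) using scikit-learn and Spearman correlation; the dataset, parameter values, and number of repetitions are illustrative assumptions, not the authors' exact protocol.

    ```python
    # Sketch: estimate the intrinsic stability of random-forest variable importance,
    # i.e. the agreement of feature rankings across repeated runs on the SAME data
    # with the SAME parameters, varying only the internal randomness (bagging and
    # feature sub-sampling). scikit-learn's Gini importance stands in for MDG;
    # the dataset and settings are illustrative assumptions.
    from itertools import combinations

    import numpy as np
    from scipy.stats import spearmanr
    from sklearn.datasets import load_breast_cancer
    from sklearn.ensemble import RandomForestClassifier

    X, y = load_breast_cancer(return_X_y=True)

    n_runs = 10
    importances = []
    for seed in range(n_runs):
        rf = RandomForestClassifier(n_estimators=500, random_state=seed)
        rf.fit(X, y)
        importances.append(rf.feature_importances_)  # Gini importance (MDG analogue)

    # Intrinsic stability: mean Spearman rank correlation over all pairs of runs.
    pairwise = [spearmanr(a, b)[0] for a, b in combinations(importances, 2)]
    print(f"Mean pairwise Spearman correlation over {n_runs} runs: {np.mean(pairwise):.3f}")
    ```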

    Implementierung einer strategischen Technologieplanung in der automobilen Zulieferindustrie [Implementation of strategic technology planning in the automotive supplier industry]


    Seismic Strain Rate and Flexure at the Hawaiian Islands Constrain the Frictional Coefficient

    Flexure occurs on intermediate geologic timescales (∼1 Myr) due to volcanic‐island building at the Island of Hawaii, and the deformational response of the lithosphere is simultaneously elastic, plastic, and ductile. At shallow depths and low temperatures, elastic deformation transitions to frictional failure on faults where stresses exceed a threshold value, and this complex rheology controls the rate of deformation manifested by earthquakes. In this study, we estimate the seismic strain rate from earthquakes recorded between 1960 and 2019 at Hawaii; the estimated strain rate, 10⁻¹⁸–10⁻¹⁵ s⁻¹ in magnitude, exhibits a local minimum, or neutral bending plane, at 15 km depth within the lithosphere. In comparison, flexure and internal deformation of the lithosphere are modeled in 3D viscoelastic loading models where deformation at shallow depths is accommodated by frictional sliding on faults and limited by the frictional coefficient (μf), and at larger depths by low‐temperature plasticity and high‐temperature creep. Observations of flexure and the seismic strain rate are best reproduced by models with μf = 0.3 ± 0.1 and modified laboratory‐derived low‐temperature plasticity. Results also suggest strong lateral variations in the frictional strength of faults beneath Hawaii. Our models predict a radial pattern of compressive stress axes relative to central Hawaii, consistent with observations of earthquake pressure (P) axes. We demonstrate that the dip angle of this radial axis is essential to discerning a change in the curvature of flexure and therefore has implications for constraining lateral variations in lithospheric strength.
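
    The abstract does not state the exact formula used to convert the 1960–2019 earthquake catalogue into a strain rate, but a common approach is a Kostrov-type summation of seismic moments over a deforming volume; the sketch below shows that calculation under stated assumptions. The catalogue magnitudes, shear modulus, volume, and time window are hypothetical placeholders, not the study's inputs.

    ```python
    # Sketch: Kostrov-style estimate of seismic strain rate from an earthquake
    # catalogue (an assumed approach; the paper's exact method may differ).
    # All catalogue magnitudes, the shear modulus, the volume and the time
    # window below are illustrative placeholders.
    import numpy as np

    SECONDS_PER_YEAR = 3.15576e7

    def moment_from_mw(mw):
        """Scalar seismic moment in N*m from moment magnitude (Hanks & Kanamori)."""
        return 10.0 ** (1.5 * np.asarray(mw) + 9.1)

    def kostrov_strain_rate(mw_catalog, volume_m3, years, shear_modulus_pa=3.3e10):
        """Strain rate ~ sum(M0) / (2 * mu * V * T) for a deforming volume."""
        total_moment = moment_from_mw(mw_catalog).sum()
        return total_moment / (2.0 * shear_modulus_pa * volume_m3 * years * SECONDS_PER_YEAR)

    # Hypothetical catalogue of moment magnitudes in one 5 km depth bin, 1960-2019.
    mw_catalog = [4.2, 4.5, 4.8, 5.0, 5.1, 4.3, 4.7]
    volume = 100e3 * 100e3 * 5e3        # 100 km x 100 km x 5 km, in m^3
    rate = kostrov_strain_rate(mw_catalog, volume, years=59)
    print(f"Seismic strain rate: {rate:.2e} 1/s")
    ```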

    Optimising use of rate-of-change trend arrows for insulin dosing decisions using the FreeStyle Libre flash glucose monitoring system

    Continuous glucose monitoring and flash glucose monitoring systems are increasingly used by people with diabetes on multiple daily injections of insulin or continuous subcutaneous insulin infusion. Along with real-time updates on current glucose levels, these technologies use trend arrows to provide information on the direction and rate of change of glucose. Two systems, the Dexcom G5 and the FreeStyle Libre, have recently been approved for use without adjunct capillary blood glucose testing, and there is a need for practical guidance on insulin dosing that incorporates the rate of change of glucose into the dosing decision. Here, we review the integration of rate-of-change trend arrow information into daily glucose management, including rapid-acting insulin dosing decisions. Based on the FreeStyle Libre flash glucose monitoring system, we also review a practical decision-support tool for actions to take when using trend arrows in conjunction with current glucose readings.
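
    Purely as a structural illustration (and emphatically not clinical guidance), a decision-support tool of the kind reviewed can be thought of as a lookup from the current trend arrow to an adjustment of the calculated rapid-acting insulin dose. The sketch below encodes that idea; the arrow categories mirror the FreeStyle Libre display, but the adjustment values are hypothetical placeholders and do not reproduce the published recommendations.

    ```python
    # Purely illustrative sketch of a trend-arrow dose-adjustment lookup.
    # The adjustment units below are hypothetical placeholders,
    # NOT clinical recommendations from the reviewed guidance.
    from enum import Enum

    class TrendArrow(Enum):
        RISING_QUICKLY = "rising quickly"    # vertical up arrow
        RISING = "rising"                    # diagonal up arrow
        STABLE = "stable"                    # horizontal arrow
        FALLING = "falling"                  # diagonal down arrow
        FALLING_QUICKLY = "falling quickly"  # vertical down arrow

    # Hypothetical correction applied on top of the calculated rapid-acting dose.
    DOSE_ADJUSTMENT_UNITS = {
        TrendArrow.RISING_QUICKLY: +2.0,
        TrendArrow.RISING: +1.0,
        TrendArrow.STABLE: 0.0,
        TrendArrow.FALLING: -1.0,
        TrendArrow.FALLING_QUICKLY: -2.0,
    }

    def adjusted_dose(calculated_dose_units: float, arrow: TrendArrow) -> float:
        """Apply a (hypothetical) trend-arrow correction, never going below zero."""
        return max(0.0, calculated_dose_units + DOSE_ADJUSTMENT_UNITS[arrow])

    print(adjusted_dose(6.0, TrendArrow.RISING))           # -> 7.0
    print(adjusted_dose(1.5, TrendArrow.FALLING_QUICKLY))  # -> 0.0
    ```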