
    Stability, resolution, and ultra-low wear amplitude modulation atomic force microscopy of DNA: Small amplitude small set-point imaging

    A way of operating fundamental-mode amplitude modulation atomic force microscopy is introduced that optimizes stability and resolution for a given tip size and shows negligible tip wear over extended time periods (∼24 h). In small amplitude small set-point (SASS) imaging, the cantilever oscillates with sub-nanometer amplitudes in the proximity of the sample without requiring large drive forces, as the dynamics smoothly lead the tip to the surface through the water layer. SASS is demonstrated on single molecules of double-stranded DNA in ambient conditions, where sharp silicon tips (R ∼ 2-5 nm) can resolve the right-handed double helix.

    Patchiness of ion-exchanged mica revealed by DNA binding dynamics at short length scales

    The binding of double-stranded (ds) DNA to mica can be controlled by ion-exchanging the mica with divalent cations. Measurements of the end-to-end distance of linear DNA molecules discriminate whether the binding mechanism proceeds through 2D surface equilibration or kinetic trapping. A range of linear dsDNA fragments has been used to investigate the length dependence of binding. Mica ion-exchanged with Ni(II) usually gives rise to kinetically trapped DNA molecules; however, short linear fragments (<800 bp) deviate from the expected behaviour. This indicates that ion-exchanged mica is heterogeneous and contains patches or domains of different ionic species. These results correlate with imaging of dsDNA under aqueous buffer on Ni(II)-mica and indicate that binding domains are of the order of 100 nm in diameter. Shorter DNA fragments behave intermediately between the two extreme cases of 2D equilibration and kinetic trapping. Increasing the incubation time of Ni(II) on mica from minutes to hours brings the conformations of the shorter DNA fragments closer to the theoretical value for kinetic trapping, indicating that long-timescale kinetics play a role in ion exchange. X-ray photoelectron spectroscopy (XPS) confirmed that the relative abundance of Ni(II) ions on the mica surface increases with time. These findings can be used to enhance spatial control of DNA binding to inorganic surfaces with a view to patterning high-density arrays.
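    The two binding regimes mentioned above can be compared quantitatively with the standard worm-like-chain expressions for the mean-square end-to-end distance (Rivetti, Guthold & Bustamante, 1996). The formulas, the persistence length P = 50 nm, and the 0.34 nm/bp rise are the usual textbook values, not parameters taken from this abstract:

```python
import math

def r2_equilibrated(L, P=50.0):
    """<R^2> (nm^2) for a chain equilibrated on the surface: the 2D WLC result."""
    return 4.0 * P * L * (1.0 - (2.0 * P / L) * (1.0 - math.exp(-L / (2.0 * P))))

def r2_trapped(L, P=50.0):
    """<R^2> (nm^2) for a kinetically trapped chain: the projection of a 3D
    conformation onto the plane, i.e. (2/3) of the 3D WLC result."""
    return (4.0 / 3.0) * P * L * (1.0 - (P / L) * (1.0 - math.exp(-L / P)))

# Contour length of an 800 bp fragment, the threshold mentioned in the abstract:
L_800bp = 800 * 0.34  # nm
```

For long chains the two predictions differ by a factor of 3, which is what makes the end-to-end distance a usable discriminator between the regimes.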

    Semi-Markov Graph Dynamics

    In this paper, we outline a model of graph (or network) dynamics based on two ingredients. The first is a Markov chain on the space of possible graphs; the second is a semi-Markov counting process of renewal type. The model consists in subordinating the Markov chain to the semi-Markov counting process: in simple words, the chain transitions occur at random time instants called epochs. The model is quite rich, and its possible connections with algebraic geometry are briefly discussed. For the sake of simplicity, we focus on the space of undirected graphs with a fixed number of nodes; however, in an example we present an interbank market model where it is meaningful to use directed or even weighted graphs.
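    The subordination construction described above can be sketched in a few lines: run a Markov chain on graphs, but let its jumps occur at the epochs of a renewal process. The edge-flip kernel and the Pareto waiting times below are illustrative choices to make the sketch concrete, not the kernel or waiting-time law used in the paper:

```python
import random

def simulate(n_nodes, t_max, waiting_time, seed=0):
    """Subordinate an edge-flip Markov chain on undirected graphs to a
    renewal process: at each epoch, a uniformly chosen node pair has its
    edge toggled. Returns the trajectory as a list of (epoch, edge_set)."""
    rng = random.Random(seed)
    edges = set()                        # start from the empty graph
    t, path = 0.0, [(0.0, frozenset())]
    while True:
        t += waiting_time(rng)           # renewal-type inter-epoch time
        if t > t_max:
            return path
        i, j = rng.sample(range(n_nodes), 2)
        e = (min(i, j), max(i, j))
        edges.symmetric_difference_update({e})   # toggle exactly one edge
        path.append((t, frozenset(edges)))

# Heavy-tailed waiting times make the counting process genuinely
# semi-Markov rather than Poisson:
pareto = lambda rng: rng.paretovariate(1.5)
```

With exponential waiting times the construction reduces to an ordinary continuous-time Markov chain; the heavy tail is what introduces memory into the epoch process.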

    Statistical modeling of ground motion relations for seismic hazard analysis

    We introduce a new approach to ground motion relations (GMR) in probabilistic seismic hazard analysis (PSHA), influenced by the extreme value theory of mathematical statistics. We understand a GMR as a random function and derive mathematically the principle of area equivalence: two alternative GMRs have an equivalent influence on the hazard if they have equivalent area functions. This includes local biases. Interpreting the difference between two such GMRs (an actual and a modelled one) as a random component leads to a general overestimation of residual variance and hazard. Besides this, we discuss important aspects of classical approaches and identify discrepancies with the state of the art in stochastics and statistics (model selection and significance, tests of distribution assumptions, extreme value statistics). In particular, we criticize the assumption of log-normally distributed residuals for maxima such as the peak ground acceleration (PGA). The natural distribution of its individual random component (equivalent to exp(epsilon_0) of Joyner and Boore 1993) is the generalized extreme value distribution. We show by numerical experiments that the actual distribution can be hidden and that a wrong distribution assumption can influence the PSHA as negatively as neglect of area equivalence does. Finally, we suggest an estimation concept for GMRs in PSHA with a regression-free variance estimation of the individual random component. We demonstrate the advantages of event-specific GMRs by analyzing data sets from the PEER strong motion database and estimating event-specific GMRs. The majority of the best models are based on an anisotropic point-source approach. The residual variance of logarithmized PGA is significantly smaller than in previous models. We validate the estimations for the event with the largest sample by empirical area functions.
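    The distributional claim above (block maxima follow a generalized extreme value law rather than a lognormal one) can be illustrated on synthetic data. The exponential parent distribution and block size below are arbitrary stand-ins, not the PEER data analyzed in the paper; the point is only that a GEV fit attains a higher likelihood than a lognormal fit on maxima:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Block maxima (a stand-in for peaks such as PGA): the max of 50
# exponential draws per block is approximately Gumbel, a GEV member.
maxima = rng.exponential(size=(5000, 50)).max(axis=1)

c, loc, scale = stats.genextreme.fit(maxima)
ll_gev = stats.genextreme.logpdf(maxima, c, loc, scale).sum()

s, _, sc = stats.lognorm.fit(maxima, floc=0.0)  # lognormal alternative
ll_logn = stats.lognorm.logpdf(maxima, s, 0.0, sc).sum()
```

On such data the GEV log-likelihood exceeds the lognormal one, mirroring the paper's criticism that a lognormal assumption for maxima can be a mis-specification even when it looks plausible.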

    The cost of large numbers of hypothesis tests on power, effect size and sample size

    Advances in high-throughput biology and computer science are driving an exponential increase in the number of hypothesis tests in genomics and other scientific disciplines. Studies using current genotyping platforms frequently include a million or more tests. In addition to the monetary cost, this increase imposes a statistical cost owing to the multiple testing corrections needed to avoid large numbers of false-positive results. To safeguard against the resulting loss of power, some have suggested sample sizes on the order of tens of thousands, which can be impractical for many diseases or may lower the quality of phenotypic measurements. This study examines the relationship between the number of tests on the one hand and power, detectable effect size or required sample size on the other. We show that once the number of tests is large, power can be maintained at a constant level with comparatively small increases in the effect size or sample size. For example, at the 0.05 significance level, a 13% increase in sample size is needed to maintain 80% power for ten million tests compared with one million tests, whereas a 70% increase in sample size is needed for 10 tests compared with a single test. Relative costs are less when measured by increases in the detectable effect size. We provide an interactive Excel calculator to compute power, effect size or sample size when comparing study designs or genome platforms involving different numbers of hypothesis tests. The results are reassuring in an era of extreme multiple testing.
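    The two headline numbers above (a 70% increase for 10 vs. 1 tests, 13% for ten million vs. one million) can be reproduced with a standard power calculation for a two-sided z-test under Bonferroni correction, where the required sample size is proportional to (z for the corrected alpha plus z for the power)^2. This is a sketch of the arithmetic, not the authors' Excel calculator:

```python
from scipy.stats import norm

def sample_size_ratio(m_new, m_old, alpha=0.05, power=0.80):
    """Ratio of required sample sizes when moving from m_old to m_new
    Bonferroni-corrected two-sided z-tests. n is proportional to
    (z_{1-alpha/(2m)} + z_{power})^2, so the effect size cancels."""
    z_beta = norm.isf(1 - power)            # quantile for the target power
    z_new = norm.isf(alpha / (2 * m_new))   # corrected critical value, new design
    z_old = norm.isf(alpha / (2 * m_old))   # corrected critical value, old design
    return ((z_new + z_beta) / (z_old + z_beta)) ** 2
```

Because the corrected critical value grows only like the square root of the log of the number of tests, the ratio shrinks as the baseline number of tests grows, which is exactly the reassuring pattern the abstract describes.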

    Mathematical model of a telomerase transcriptional regulatory network developed by cell-based screening: analysis of inhibitor effects and telomerase expression mechanisms

    Cancer cells depend on transcription of telomerase reverse transcriptase (TERT). Many transcription factors affect TERT, though regulation occurs in the context of a broader network. Network effects on telomerase regulation have not been investigated, though a deeper understanding of TERT transcription requires a systems view. However, control over individual interactions in complex networks is not easily achievable. Mathematical modelling provides an attractive approach for the analysis of complex systems, and some models may prove useful in systems pharmacology approaches to drug discovery. In this report, we used transfection screening to test interactions among 14 TERT regulatory transcription factors and their respective promoters in ovarian cancer cells. The results were used to generate a network model of TERT transcription and to implement a dynamic Boolean model whose steady states were analysed. Modelled effects of signal transduction inhibitors successfully predicted TERT repression by the Src-family inhibitor SU6656 and lack of repression by the ERK inhibitor FR180204, results confirmed by RT-qPCR analysis of endogenous TERT expression in treated cells. Modelled effects of the GSK3 inhibitor 6-bromoindirubin-3′-oxime (BIO) predicted unstable TERT repression dependent on noise and on expression of JUN, corresponding with observations from a previous study. MYC expression is critical for TERT activation in the model, consistent with its well-known function in endogenous TERT regulation. Loss of MYC caused complete TERT suppression in our model, substantially rescued only by co-suppression of AR. Interestingly, expression was easily rescued under modelled Ets-factor gain of function, as occurs in TERT promoter mutation. RNAi targeting AR, JUN, MXD1, SP3, or TP53 showed that AR suppression does rescue endogenous TERT expression following MYC knockdown in these cells, and SP3 or TP53 siRNA also causes partial recovery.
    The model therefore successfully predicted several aspects of TERT regulation, including previously unknown mechanisms. An extrapolation suggests that a dominant stimulatory system may programme TERT for transcriptional stability.
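    The kind of dynamic Boolean model described above can be sketched minimally. The generic fixed-point finder below is standard; the three-node rule set (TERT on when MYC is present or when a repressor AR is absent) is a hypothetical toy chosen only to mimic the rescue behaviour the abstract reports, and is in no way the paper's 14-factor model:

```python
def steady_state(rules, state, max_steps=100):
    """Synchronously update a Boolean network until a fixed point.

    rules: {node: function(state_dict) -> bool}.
    Returns the fixed point, or None if the trajectory cycles."""
    for _ in range(max_steps):
        new = {n: bool(f(state)) for n, f in rules.items()}
        if new == state:
            return state
        state = new
    return None

# Hypothetical toy rules (NOT the paper's model): MYC and AR hold their
# values; TERT is on when MYC is on, or when its repressor AR is absent.
rules = {
    "MYC": lambda s: s["MYC"],
    "AR": lambda s: s["AR"],
    "TERT": lambda s: s["MYC"] or not s["AR"],
}
```

Even this toy reproduces the qualitative pattern: knocking out MYC alone silences TERT, while co-suppressing AR restores it.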

    Genome-Wide Analysis of Structural Variants in Parkinson Disease

    OBJECTIVE: Identification of genetic risk factors for Parkinson disease (PD) has to date been primarily limited to the study of single nucleotide variants, which represent only a small fraction of the genetic variation in the human genome. Consequently, causal variants for most PD risk are not known. Here we focused on structural variants (SVs), which represent a major source of genetic variation in the human genome. We aimed to discover SVs associated with PD risk by performing the first large-scale characterization of SVs in PD. METHODS: We leveraged a recently developed computational pipeline to detect and genotype SVs from 7,772 Illumina short-read whole genome sequencing samples. Using this set of SVs, we performed a genome-wide association study using 2,585 cases and 2,779 controls and identified SVs associated with PD risk. Furthermore, to validate the presence of these variants, we generated a subset of matched whole-genome long-read sequencing data. RESULTS: We genotyped and tested 3,154 common SVs, representing over 412 million nucleotides of previously uncatalogued genetic variation. Using long-read sequencing data, we validated the presence of three novel deletion SVs associated with risk of PD from our initial association analysis, including a 2 kb intronic deletion within the gene LRRN4. INTERPRETATION: We identified three SVs associated with genetic risk of PD. This study represents the most comprehensive assessment of the contribution of SVs to the genetic risk of PD to date. ANN NEUROL 202
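    The case-control association step described above can be illustrated for a single biallelic SV with a generic allelic test. This is a textbook sketch (2x2 allele-count table plus Fisher's exact test), not the pipeline or test actually used in the study:

```python
from scipy.stats import fisher_exact

def sv_allele_test(case_genotypes, control_genotypes):
    """Allelic association test for one biallelic SV.

    Genotypes are 0/1/2 copies of the SV allele per diploid sample.
    Builds the 2x2 allele-count table (SV allele vs. reference allele,
    cases vs. controls) and returns (odds ratio, Fisher exact p-value)."""
    a_case = sum(case_genotypes)               # SV alleles among cases
    r_case = 2 * len(case_genotypes) - a_case  # reference alleles among cases
    a_ctrl = sum(control_genotypes)
    r_ctrl = 2 * len(control_genotypes) - a_ctrl
    return fisher_exact([[a_case, r_case], [a_ctrl, r_ctrl]])
```

A genome-wide scan would apply this per variant and then correct for the number of SVs tested (3,154 here), typically with logistic regression to adjust for covariates such as ancestry.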