
    Duel and sweep algorithm for order-preserving pattern matching

    Given a text T and a pattern P over an alphabet Σ, the classic exact matching problem searches for all occurrences of the pattern P in the text T. Unlike exact matching, order-preserving pattern matching (OPPM) considers the relative order of elements rather than their actual values. In this paper, we propose an efficient algorithm for the OPPM problem using the "duel-and-sweep" paradigm. Our algorithm runs in O(n + m log m) time in general, and in O(n + m) time under the assumption that the characters of a string can be sorted in time linear in the string length. We also perform experiments and show that our algorithm is faster than the KMP-based algorithm. Last, we introduce two-dimensional order-preserving pattern matching and give a duel-and-sweep algorithm that runs in O(n^2) time for the dueling stage and O(n^2 m) time for the sweeping stage, with O(m^3) preprocessing time. Comment: 13 pages, 5 figures
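
    To make the matching criterion concrete, the short sketch below checks windows of the text against the relative order of the pattern. It is a naive illustration of what an order-preserving occurrence is, not the duel-and-sweep algorithm of the paper; the example data are arbitrary.

        # Naive illustration of order-preserving pattern matching (OPPM): a window of
        # the text matches if its elements have the same relative order as the pattern.
        # This is not the paper's duel-and-sweep algorithm, only the matching criterion.

        def order_signature(s):
            """Rank of each element within s (ties broken by position)."""
            order = sorted(range(len(s)), key=lambda i: (s[i], i))
            rank = [0] * len(s)
            for r, i in enumerate(order):
                rank[i] = r
            return rank

        def oppm_naive(text, pattern):
            """Starting indices of all order-preserving occurrences of pattern in text."""
            m, sig = len(pattern), order_signature(pattern)
            return [i for i in range(len(text) - m + 1)
                    if order_signature(text[i:i + m]) == sig]

        # windows starting at 0, 2 and 4 have the same up-then-down shape as the pattern
        print(oppm_naive([10, 22, 15, 30, 25, 40, 33], [1, 3, 2]))  # [0, 2, 4]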

    Log-Poisson Cascade Description of Turbulent Velocity Gradient Statistics

    The Log-Poisson phenomenological description of the turbulent energy cascade is invoked to discuss high-order statistics of velocity derivatives and the mapping between their probability distribution functions at different Reynolds numbers. The striking confirmation of theoretical predictions suggests that numerical solutions of the flow, obtained at low/moderate Reynolds numbers, can play an important quantitative role in the analysis of experimental high-Reynolds-number phenomena, where small-scale fluctuations are in general inaccessible to direct numerical simulations.
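
    For context, the log-Poisson cascade framework is commonly associated with the She-Leveque scaling exponents; the minimal sketch below evaluates that standard formula, zeta_p = p/9 + 2(1 - (2/3)^(p/3)), as an illustration only. It does not reproduce the paper's mapping between velocity-gradient PDFs at different Reynolds numbers.

        # She-Leveque scaling exponents, the standard closed form attached to the
        # log-Poisson cascade picture. Shown for context only; the paper's PDF
        # mapping across Reynolds numbers is not reproduced here.

        def she_leveque_zeta(p):
            return p / 9.0 + 2.0 * (1.0 - (2.0 / 3.0) ** (p / 3.0))

        for p in (2, 3, 4, 6, 8):
            print(f"zeta_{p} = {she_leveque_zeta(p):.3f}")
        # zeta_3 = 1 exactly, consistent with Kolmogorov's 4/5 law.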

    The effect of protein mutations on drug binding suggests ensuing personalised drug selection

    The advent of personalised medicine promises a deeper understanding of mechanisms and therefore therapies. However, the connection between genomic sequences and clinical treatments is often unclear. We studied 50 breast cancer patients belonging to a population cohort in the state of Qatar. From Sanger sequencing, we identified several new deleterious mutations in the estrogen receptor 1 gene (ESR1). The effect of these mutations on drug binding to the protein target encoded by ESR1, namely the estrogen receptor, was assessed via rapid and accurate protein-ligand binding affinity studies performed for the selected drugs and the natural ligand estrogen. Four nonsynonymous mutations in the ligand-binding domain were subjected to molecular dynamics simulation using absolute and relative binding free energy methods, leading to a ranking of the efficacy of six selected drugs for patients with the mutations. Our study shows that a personalised clinical decision system can be created by integrating an individual patient's genomic data at the molecular level within a computational pipeline which ranks the efficacy of binding of particular drugs to variant proteins.
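
    As a minimal sketch of the final step of such a pipeline, the snippet below ranks candidate drugs for a given receptor variant by their computed binding free energies. The drug names and ΔG values are hypothetical placeholders, not results from the study; in practice they would come from the absolute or relative free-energy simulations described above.

        # Hypothetical example of the ranking step: more negative binding free energy
        # (kcal/mol) means tighter predicted binding. Names and values are placeholders,
        # not data from the study.

        delta_g_variant = {
            "drug_A": -9.1,
            "drug_B": -7.4,
            "drug_C": -10.2,
        }

        ranking = sorted(delta_g_variant, key=delta_g_variant.get)  # most favourable first
        for rank, drug in enumerate(ranking, start=1):
            print(f"{rank}. {drug}: {delta_g_variant[drug]:+.1f} kcal/mol")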

    Hodgkin's lymphoma in remission after first-line therapy: which patients need FDG-PET/CT for follow-up?

    Background: The purpose of the study was to evaluate the impact of 2-[fluorine-18]fluoro-2-deoxy-D-glucose-positron emission tomography (FDG-PET)/computed tomography (CT) during follow-up of patients with Hodgkin's lymphoma. Patients and methods: Patients in complete remission or an unconfirmed complete remission after first-line therapy who received FDG-PET/CT during their follow-up were analyzed retrospectively. Confirmatory biopsy was mandatory in case of recurrence. Results: Overall, 134 patients were analyzed. Forty-two (31.3%) patients had a recurrence. The positive predictive value of FDG-PET/CT was 0.98. Single-factor analysis identified morphological residual mass [P = 0.0005, hazard ratio (HR) 3.4, 95% confidence interval (CI) 1.7-6.6] and symptoms as risk factors for recurrence. Conclusions: Asymptomatic patients without morphological residues and an early stage of disease do not need routine FDG-PET/CT for follow-up. Asymptomatic patients with morphological residues should receive routine follow-up FDG-PET/CT for the first 24 months. Only patients with an advanced initial stage need routine follow-up FDG-PET/CT beyond 24 months.

    Risk-adapted FDG-PET/CT-based follow-up in patients with diffuse large B-cell lymphoma after first-line therapy

    Background: The purpose of this study was to evaluate the impact of 2-[fluorine-18]fluoro-2-deoxy-D-glucose-positron emission tomography/computed tomography (FDG-PET/CT) during follow-up of patients with diffuse large B-cell lymphoma (DLBCL) in complete remission or unconfirmed complete remission after first-line therapy. Patients and methods: DLBCL patients receiving FDG-PET/CT during follow-up were analyzed retrospectively. Confirmatory biopsy was mandatory in cases of suspected disease recurrence. Results: Seventy-five patients were analyzed and 23 (30%) had disease recurrence. The positive predictive value (PPV) of FDG-PET/CT was 0.85. Patients >60 years [P = 0.036, hazard ratio (HR) = 3.82, 95% confidence interval (CI) 1.02-7.77] and patients with symptoms indicative of a relapse (P = 0.015; HR = 4.1; 95% CI 1.20-14.03) had a significantly higher risk of relapse. A risk score based on signs of relapse, age >60 years, or a combination of these factors identified patients at high risk of recurrence (P = 0.041). Conclusions: FDG-PET/CT detects recurrent DLBCL after first-line therapy with high PPV. However, it should not be used routinely but only in selected high-risk patients, to reduce radiation burden and costs. On the basis of our retrospective data, FDG-PET/CT during follow-up is indicated for patients >60 years with and without clinical signs of relapse.
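
    The abstract does not spell out how the two factors are combined into the score; the snippet below is only a hypothetical illustration of a two-factor risk score of that kind (one point each for age >60 years and for clinical signs of relapse), not the scoring actually used in the study.

        # Hypothetical two-factor risk score (not the study's exact definition):
        # one point for age > 60 years, one point for clinical signs of relapse.

        def risk_score(age, signs_of_relapse):
            return int(age > 60) + int(signs_of_relapse)

        print(risk_score(age=67, signs_of_relapse=True))   # 2: high-risk follow-up
        print(risk_score(age=45, signs_of_relapse=False))  # 0: low-risk follow-up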

    Physical tests for Random Numbers in Simulations

    We propose three physical tests to measure correlations in random numbers used in Monte Carlo simulations. The first test uses autocorrelation times of certain physical quantities when the Ising model is simulated with the Wolff algorithm. The second test is based on random walks, and the third on blocks of n successive numbers. We apply the tests to show that recent errors in high-precision simulations using generalized feedback shift register algorithms are due to short-range correlations in random number sequences. We also determine the length of these correlations. Comment: 16 pages, PostScript file, HU-TFT-94-
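
    In the spirit of the second test, the sketch below runs many independent two-dimensional random walks with the generator under scrutiny and compares the quadrant of each walk's endpoint with the uniform 1/4 expectation via a chi-square statistic. It is a minimal illustration only; the exact walk lengths, statistics and pass/fail thresholds used in the paper may differ.

        # Minimal random-walk test for an RNG: the endpoint of an unbiased 2D walk
        # should fall in each quadrant with probability 1/4. A large chi-square value
        # (3 degrees of freedom) signals correlations. Parameters here are arbitrary.
        import random
        from collections import Counter

        def random_walk_quadrant_test(rng, n_steps=500, n_walks=4000):
            counts = Counter()
            for _ in range(n_walks):
                x = y = 0
                for _ in range(n_steps):
                    step = rng.randrange(4)      # 0:+x, 1:-x, 2:+y, 3:-y
                    if step == 0:
                        x += 1
                    elif step == 1:
                        x -= 1
                    elif step == 2:
                        y += 1
                    else:
                        y -= 1
                counts[(x > 0, y > 0)] += 1      # endpoints on an axis are assigned to a side
            expected = n_walks / 4.0
            return sum((c - expected) ** 2 / expected for c in counts.values())

        print(random_walk_quadrant_test(random.Random(12345)))  # compare with chi^2(3)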

    Approximate Quantum Fourier Transform and Decoherence

    We discuss the advantages of using the approximate quantum Fourier transform (AQFT) in algorithms which involve periodicity estimations. We analyse quantum networks performing AQFT in the presence of decoherence and show that extensive approximations can be made before the accuracy of the AQFT (as compared with the regular quantum Fourier transform) is compromised. We show that for some computations an approximation may imply a better performance. Comment: 14 pages, 10 figures (8 *.eps files). More information on http://eve.physics.ox.ac.uk/QChome.html http://www.physics.helsinki.fi/~kasuomin http://www.physics.helsinki.fi/~kira/group.htm
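
    A small numerical sketch of the idea: the exact QFT phase jk/2^n can be expanded over the binary digits of j and k, and the AQFT keeps only the contributions corresponding to controlled rotations of angle at least 2π/2^m. The construction below builds the resulting unitary directly from that truncated phase (bit-ordering conventions are an assumption); it illustrates how the approximation degrades with m but does not model the decoherence analysis of the paper.

        # AQFT sketch: truncate the QFT phase j*k/2^n = sum_{a,b} j_a k_b 2^(a+b-n)
        # (binary digits j_a, k_b), keeping only terms with rotation angle >= 2*pi/2^m,
        # i.e. n-m <= a+b <= n-1. With m = n this reproduces the exact QFT.
        import numpy as np

        def aqft_matrix(n, m):
            N = 2 ** n
            U = np.zeros((N, N), dtype=complex)
            for j in range(N):
                for k in range(N):
                    phase = 0.0
                    for a in range(n):
                        for b in range(n):
                            if n - m <= a + b <= n - 1:      # kept rotations only
                                phase += ((j >> a) & 1) * ((k >> b) & 1) * 2.0 ** (a + b - n)
                    U[k, j] = np.exp(2j * np.pi * phase) / np.sqrt(N)
            return U

        n = 5
        exact = aqft_matrix(n, n)                  # m = n: full QFT
        approx = aqft_matrix(n, 2)                 # keep only the two largest rotations
        print(np.allclose(exact.conj().T @ exact, np.eye(2 ** n)))  # True: unitary
        print(np.linalg.norm(exact - approx, 2))   # operator-norm error of the approximation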

    Rank Statistics in Biological Evolution

    We present a statistical analysis of biological evolution processes. Specifically, we study the stochastic replication-mutation-death model where the population of a species may grow or shrink by birth or death, respectively, and additionally, mutations lead to the creation of new species. We rank the various species by the chronological order in which they originate. The average population N_k of the kth species decays algebraically with rank, N_k ~ M^{mu} k^{-mu}, where M is the average total population. The characteristic exponent mu = (alpha-gamma)/(alpha+beta-gamma) depends on alpha, beta, and gamma, the replication, mutation, and death rates. Furthermore, the average population P_k of all descendants of the kth species has a universal algebraic behavior, P_k ~ M/k. Comment: 4 pages, 3 figures
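
    A minimal Monte Carlo sketch of this kind of model is given below: each individual replicates at rate alpha, dies at rate gamma, and at rate beta turns into a brand-new species. The event scheme and parameter values are assumptions for illustration, a single run is noisy, and the prediction N_k ~ M^mu k^{-mu} concerns averages over many realizations; nothing here reproduces the paper's simulations.

        # Illustrative simulation of a replication-mutation-death process, with species
        # ranked by the order in which they originate. Rates and the exact mutation
        # scheme are assumptions made for this sketch.
        import random

        def simulate(alpha=1.0, beta=0.3, gamma=0.5, steps=20000, seed=1):
            rng = random.Random(seed)
            pops = [1]                      # pops[k] = population of the (k+1)-th species
            total = alpha + beta + gamma
            for _ in range(steps):
                if sum(pops) == 0:
                    break
                # pick a random individual, weighted by species population
                k = rng.choices(range(len(pops)), weights=pops)[0]
                r = rng.random() * total
                if r < alpha:               # replication
                    pops[k] += 1
                elif r < alpha + beta:      # mutation: individual founds a new species
                    pops[k] -= 1
                    pops.append(1)
                else:                       # death
                    pops[k] -= 1
            return pops

        pops = simulate()
        for k in (1, 2, 4, 8, 16):
            if k <= len(pops):
                print(f"N_{k} = {pops[k - 1]}")   # averaged over runs, N_k decays with rank k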

    Anomalous scaling in random shell models for passive scalars

    A shell-model version of Kraichnan's (1994 Phys. Rev. Lett. 72, 1016) passive scalar problem is introduced, inspired by the model of Jensen, Paladin and Vulpiani (1992 Phys. Rev. A 45, 7214). As in the original problem, the prescribed random velocity field is Gaussian, delta-correlated in time, and has a power-law spectrum ∝ k_m^{-ξ}, where k_m is the wavenumber. Deterministic differential equations for second- and fourth-order moments are obtained and then solved numerically. The second-order structure function of the passive scalar has normal scaling, while the fourth-order structure function has anomalous scaling. For ξ = 2/3 the anomalous scaling exponents ζ_p are determined for structure functions up to p = 16 by Monte Carlo simulations of the random shell model, using a stochastic differential equation scheme, validated by comparison with the results obtained for the second- and fourth-order structure functions. Comment: Plain LaTeX, 15 pages, 4 figures available upon request to [email protected]
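
    To make the exponent-extraction step concrete, the sketch below computes structure functions S_p(k_m) = <|θ_m|^p> on shells k_m = k_0·2^m and reads off ζ_p from the slope of a log-log fit. The shell amplitudes are synthetic (log-normal factors with a prescribed k^{-1/3} scaling), used only to illustrate the procedure; they are not output of the random shell model studied in the paper.

        # Exponent extraction from shell structure functions. The amplitudes here are
        # synthetic placeholders with a built-in k^(-1/3) scaling, so the fit recovers
        # zeta_p ~ p/3 by construction; only the fitting procedure is being illustrated.
        import numpy as np

        rng = np.random.default_rng(0)
        n_shells, n_samples, k0 = 16, 4000, 1.0
        k = k0 * 2.0 ** np.arange(n_shells)

        # synthetic shell amplitudes: |theta_m| ~ k_m^(-1/3) times a log-normal factor
        theta = k[:, None] ** (-1.0 / 3.0) * rng.lognormal(0.0, 0.3, (n_shells, n_samples))

        for p in (2, 4, 6):
            S_p = np.mean(np.abs(theta) ** p, axis=1)           # structure function S_p(k_m)
            zeta_p = -np.polyfit(np.log(k), np.log(S_p), 1)[0]  # S_p ~ k^(-zeta_p)
            print(f"zeta_{p} = {zeta_p:.2f}")                   # ~ p/3 for these synthetic data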

    Random Numbers Certified by Bell's Theorem

    Randomness is a fundamental feature in nature and a valuable resource for applications ranging from cryptography and gambling to numerical simulation of physical and biological systems. Random numbers, however, are difficult to characterize mathematically, and their generation must rely on an unpredictable physical process. Inaccuracies in the theoretical modelling of such processes or failures of the devices, possibly due to adversarial attacks, limit the reliability of random number generators in ways that are difficult to control and detect. Here, inspired by earlier work on nonlocality-based and device-independent quantum information processing, we show that the nonlocal correlations of entangled quantum particles can be used to certify the presence of genuine randomness. It is thereby possible to design a new type of cryptographically secure random number generator which does not require any assumption about the internal working of the devices. This strong form of randomness generation is impossible classically and possible in quantum systems only if certified by a Bell inequality violation. We carry out a proof-of-concept demonstration of this proposal in a system of two entangled atoms separated by approximately 1 meter. The observed Bell inequality violation, featuring near-perfect detection efficiency, guarantees that 42 new random numbers are generated with 99% confidence. Our results lay the groundwork for future device-independent quantum information experiments and for addressing fundamental issues raised by the intrinsic randomness of quantum theory. Comment: 10 pages, 3 figures, 16-page appendix. Version as close as possible to the published version following the terms of the journal.
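
    As a small illustration of the first step in such a protocol, the sketch below estimates the CHSH value S = E(0,0) + E(0,1) + E(1,0) - E(1,1) from recorded ±1 outcomes; S > 2 certifies nonlocal correlations. The quantitative bound the paper uses to convert the violation into a number of certified random bits is not reproduced here, and the data shown are toy placeholders.

        # CHSH estimation from trial records (x, y, a, b): settings x, y in {0, 1},
        # outcomes a, b in {+1, -1}. S > 2 witnesses Bell-inequality violation; the
        # paper's randomness bound as a function of S is not reproduced here.
        from collections import defaultdict

        def chsh_value(records):
            sums = defaultdict(float)
            counts = defaultdict(int)
            for x, y, a, b in records:
                sums[(x, y)] += a * b
                counts[(x, y)] += 1
            E = {xy: sums[xy] / counts[xy] for xy in counts}
            return E[(0, 0)] + E[(0, 1)] + E[(1, 0)] - E[(1, 1)]

        # toy placeholder data, one trial per setting pair (real runs use many trials)
        print(chsh_value([(0, 0, 1, 1), (0, 1, 1, 1), (1, 0, -1, -1), (1, 1, 1, -1)]))  # 4.0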