Linear Algorithm for Conservative Degenerate Pattern Matching
A degenerate symbol x* over an alphabet A is a non-empty subset of A, and a
sequence of such symbols is a degenerate string. A degenerate string is said to
be conservative if its number of non-solid symbols is upper-bounded by a fixed
positive constant k. We consider here the matching problem of conservative
degenerate strings and present the first linear-time algorithm that can find,
for given degenerate strings P* and T* of total length n containing k non-solid
symbols in total, the occurrences of P* in T* in O(nk) time; since k is upper-bounded by a fixed constant, this is linear in n.
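For intuition, the matching semantics (two degenerate symbols match exactly when their subsets intersect) can be sketched as a brute-force matcher in Python. This naive sketch runs in O(nm) time and is not the paper's O(nk) algorithm, which additionally exploits the bound k on non-solid symbols.
```python
# Illustrative brute-force matcher for degenerate strings; a solid symbol
# is a singleton set. Not the paper's O(nk) algorithm.

def matches(a: frozenset, b: frozenset) -> bool:
    """Two degenerate symbols match iff their subsets intersect."""
    return not a.isdisjoint(b)

def occurrences(pattern, text):
    """Return all starting positions where pattern occurs in text.

    Both inputs are sequences of frozensets over the alphabet.
    Runs in O(|text| * |pattern|) time.
    """
    m, n = len(pattern), len(text)
    return [i for i in range(n - m + 1)
            if all(matches(pattern[j], text[i + j]) for j in range(m))]

# Example: P = a{b,c} occurring in T = a b {a,b} c
P = [frozenset("a"), frozenset("bc")]
T = [frozenset("a"), frozenset("b"), frozenset("ab"), frozenset("c")]
print(occurrences(P, T))  # [0, 2]
```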
Correcting curvature-density effects in the Hamilton-Jacobi skeleton
The Hamilton-Jacobi approach has proven to be a powerful and elegant method for extracting the skeleton of two-dimensional (2-D) shapes. The approach is based on the observation that the normalized flux associated with the inward evolution of the object boundary tends to zero at nonskeletal points as the size of the integration area tends to zero, while the flux is negative at the locations of skeletal points. Nonetheless, the error in calculating the flux on the image lattice is both limited by the pixel resolution and proportional to the curvature of the boundary evolution front and, hence, unbounded near endpoints. This makes endpoints difficult to localize exactly and renders the performance of the skeleton extraction algorithm dependent on a threshold parameter. This problem can be overcome by using interpolation techniques to calculate the flux with subpixel precision. Here, however, we develop a method for 2-D skeleton extraction that circumvents the problem by eliminating the curvature contribution to the error. This is done by taking into account variations of density due to boundary curvature. This yields a skeletonization algorithm that gives both better localization and less susceptibility to boundary noise and parameter choice than the Hamilton-Jacobi method.
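As a rough illustration of the baseline flux computation (not the curvature-density correction the paper develops), one can threshold the divergence of the unit gradient field of the distance transform. The sketch below assumes NumPy and SciPy, and its flux_threshold parameter is exactly the kind of threshold sensitivity the paper aims to reduce.
```python
# Minimal flux-based skeleton sketch in the spirit of the Hamilton-Jacobi
# method; the paper's curvature-density correction is NOT implemented here.
import numpy as np
from scipy.ndimage import distance_transform_edt

def hj_skeleton(mask: np.ndarray, flux_threshold: float = -0.3) -> np.ndarray:
    """Threshold the divergence of the distance-map gradient.

    mask: boolean 2-D array, True inside the shape. The divergence
    approximates the average outward flux, which is strongly negative at
    skeletal points and near zero elsewhere, so a threshold is still needed.
    """
    d = distance_transform_edt(mask)           # distance to the boundary
    gy, gx = np.gradient(d)                    # gradient of the distance map
    norm = np.hypot(gx, gy) + 1e-12
    gx, gy = gx / norm, gy / norm              # unit gradient field
    div = np.gradient(gy, axis=0) + np.gradient(gx, axis=1)  # divergence
    return mask & (div < flux_threshold)       # skeletal points: strong sinks

mask = np.zeros((64, 64), dtype=bool)
mask[16:48, 8:56] = True                       # a rectangle
skel = hj_skeleton(mask)
```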
Efficient pattern matching in degenerate strings with the Burrows–Wheeler transform
A degenerate or indeterminate string on an alphabet Σ is a sequence of non-empty subsets of Σ. Given a degenerate string t of length n, we present a new method based on the Burrows–Wheeler transform for searching for a degenerate pattern of length m in t, running in O(mn) time on a constant-size alphabet Σ. Furthermore, it is a hybrid pattern-matching technique that works on both regular and degenerate strings. A degenerate string is said to be conservative if its number of non-solid letters is upper-bounded by a fixed positive constant q; in this case we show that the search time complexity is O(qm^2). Experimental results show that our method performs well in practice.
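To make the idea concrete, here is a toy backward search over a naively built FM-index in which a degenerate position branches over every letter of its subset. This only sketches the generalization; the paper's actual data structures and its O(qm^2) conservative analysis are more involved.
```python
# Toy FM-index pieces; real indexes precompute occ() and build the suffix
# array in linear time.

def bwt_index(t: str):
    """Build suffix array, BWT, and C table for a small text t (adds '$')."""
    t += "$"
    sa = sorted(range(len(t)), key=lambda i: t[i:])   # suffix array
    bwt = "".join(t[i - 1] for i in sa)               # last column
    C, total = {}, 0
    for c in sorted(set(t)):                          # C[c] = # chars < c
        C[c] = total
        total += t.count(c)
    return sa, bwt, C

def occ(bwt: str, c: str, i: int) -> int:
    """Occurrences of c in bwt[:i] (naive count for the sketch)."""
    return bwt[:i].count(c)

def degenerate_backward_search(pattern, sa, bwt, C):
    """pattern: sequence of sets of letters. Returns match start positions."""
    ranges = [(0, len(bwt))]                  # half-open SA ranges
    for symbol in reversed(pattern):
        nxt = []
        for lo, hi in ranges:
            for c in symbol:                  # branch over the subset
                l = C[c] + occ(bwt, c, lo)
                h = C[c] + occ(bwt, c, hi)
                if l < h:
                    nxt.append((l, h))
        ranges = nxt
    return sorted({sa[i] for lo, hi in ranges for i in range(lo, hi)})

sa, bwt, C = bwt_index("abracadabra")
print(degenerate_backward_search([{"a"}, {"b", "c"}], sa, bwt, C))  # [0, 3, 7]
```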
Long-lived staus from strong production in a simplified model approach
We study the phenomenology of a supersymmetric scenario where the
next-to-lightest superparticle is the lighter stau and long-lived due to a very
weakly coupled lightest superparticle, such as the gravitino. We investigate
the LHC sensitivity and its dependence on the superparticle spectrum with an
emphasis on strong production and decay. We do not assume any high-scale model
for SUSY breaking but work along the lines of simplified models. Devising cuts
that yield a large detection efficiency in the whole parameter space, we
determine the LHC's discovery and exclusion potential. This allows us to derive
robust limits on m_stau, m_gluino, a common m_squark, and m_stop1. We briefly
discuss the prospects for observing stopped staus.
Towards a gauge-polyvalent Numerical Relativity code
The gauge polyvalence of a new numerical code is tested, both in
harmonic-coordinate simulations (gauge-waves testbed) and in
singularity-avoiding coordinates (simple Black-Hole simulations, either with or
without shift). The code is built upon an adjusted first-order
flux-conservative version of the Z4 formalism and a recently proposed family of
robust finite-difference high-resolution algorithms. An outstanding result is
the long-term evolution (up to 1000M) of a Black-Hole in normal coordinates
(zero shift) without excision.
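The Z4 system evolves many coupled fields with source terms, but the flavor of a first-order flux-conservative finite-difference update can be shown on a scalar conservation law u_t + F(u)_x = 0. The Lax-Friedrichs sketch below is a generic illustration only, not the adjusted Z4 scheme or the high-resolution algorithms the code employs.
```python
# Generic first-order flux-conservative update with a Lax-Friedrichs flux;
# purely illustrative, unrelated to the actual Z4 evolution system.
import numpy as np

def lax_friedrichs_step(u, flux, dx, dt):
    """One conservative step: u_i -> u_i - (dt/dx) * (F_{i+1/2} - F_{i-1/2})."""
    up = np.roll(u, -1)                        # u_{i+1} (periodic boundary)
    um = np.roll(u, 1)                         # u_{i-1}
    f_plus = 0.5 * (flux(u) + flux(up)) - 0.5 * (dx / dt) * (up - u)
    f_minus = 0.5 * (flux(um) + flux(u)) - 0.5 * (dx / dt) * (u - um)
    return u - (dt / dx) * (f_plus - f_minus)

# Advect a Gaussian pulse with unit speed, F(u) = u (CFL = dt/dx = 0.5)
x = np.linspace(0, 1, 200, endpoint=False)
u = np.exp(-200 * (x - 0.3) ** 2)
for _ in range(100):
    u = lax_friedrichs_step(u, lambda v: v, dx=1 / 200, dt=0.5 / 200)
```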
High-Throughput SNP Genotyping by SBE/SBH
Despite much progress over the past decade, current Single Nucleotide
Polymorphism (SNP) genotyping technologies still offer an insufficient degree
of multiplexing when required to handle user-selected sets of SNPs. In this
paper we propose a new genotyping assay architecture combining multiplexed
solution-phase single-base extension (SBE) reactions with sequencing by
hybridization (SBH) using universal DNA arrays such as all k-mer arrays. In
addition to PCR amplification of genomic DNA, SNP genotyping using SBE/SBH
assays involves the following steps: (1) Synthesizing primers complementing the
genomic sequence immediately preceding SNPs of interest; (2) Hybridizing these
primers with the genomic DNA; (3) Extending each primer by a single base using
polymerase enzyme and dideoxynucleotides labeled with 4 different fluorescent
dyes; and finally (4) Hybridizing extended primers to a universal DNA array and
determining the identity of the bases that extend each primer by hybridization
pattern analysis. Our contributions include a study of multiplexing algorithms
for SBE/SBH genotyping assays and preliminary experimental results showing the
achievable tradeoffs between the number of array probes and primer length on
one hand and the number of SNPs that can be assayed simultaneously on the
other. Simulation results on datasets both randomly generated and extracted
from the NCBI dbSNP database suggest that the SBE/SBH architecture provides a
flexible and cost-effective alternative to genotyping assays currently used in
the industry, enabling genotyping of up to hundreds of thousands of
user-specified SNPs per assay.
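As a simplified illustration of the multiplexing question (which primers can share one assay), the hypothetical check below models each of the four dye channels as the set of k-mers contributed by every primer extended with that channel's base, and requires the k-mer covering the extension site to be unique within its channel. The paper's multiplexing algorithms use a more refined decoding model; this sketch only conveys the combinatorial flavor.
```python
# Hypothetical, simplified decodability check for multiplexed SBE/SBH
# primers on an all k-mer array; not the paper's decoding model.

BASES = "ACGT"

def kmers(s: str, k: int) -> set:
    return {s[i:i + k] for i in range(len(s) - k + 1)}

def decodable(primers, k: int) -> bool:
    """True if every (primer, extending base) pair leaves a unique signal.

    Assumes all primers have length >= k - 1. For each dye channel b,
    collect the k-mers contributed by every primer extended with b; primer
    p is decodable if the k-mer ending in its extension (p[-(k-1):] + b)
    appears in no other primer's contribution to the same channel.
    """
    for b in BASES:                                   # one dye channel per base
        contrib = {p: kmers(p + b, k) for p in primers}
        for p in primers:
            signature = p[-(k - 1):] + b              # k-mer covering the extension
            if any(signature in contrib[q] for q in primers if q != p):
                return False
    return True

primers = ["ACGTAC", "GGATCC", "TTACGG"]
# False: in the G channel, ACGTAC+G's signature "TACG" also occurs in TTACGG+G
print(decodable(primers, k=4))
```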