
    Molecular solutions for double and partial digest problems in polynomial time

    A fundamental problem in computational biology is the construction of physical maps of chromosomes from hybridization experiments between unique probes and clones of chromosome fragments. The double and partial digest problems are two intractable problems used to construct physical maps of DNA molecules in bioinformatics. Several approaches, including exponential and heuristic algorithms, have been proposed to tackle these problems. In this paper we present polynomial-time molecular algorithms for both problems. To this end, a molecular model similar to the Adleman-Lipton model is presented. The presented operations are simple and performed in polynomial time. Our algorithms are computationally simulated.
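For context on what the partial digest problem asks, here is a minimal sketch of the classic backtracking solver (Skiena's algorithm) — this is the conventional exponential-worst-case approach the abstract alludes to, not the molecular algorithm proposed in the paper:

```python
from collections import Counter

def partial_digest(distances):
    """Recover point positions from the multiset of all pairwise
    distances (classic backtracking solver for the PDP)."""
    L = Counter(distances)
    width = max(L.elements())   # largest distance fixes the two endpoints
    L[width] -= 1
    X = {0, width}              # placed restriction-site positions
    solutions = []

    def fits(y):
        need = Counter(abs(y - x) for x in X)
        return all(L[d] >= c for d, c in need.items())

    def take(y):
        need = Counter(abs(y - x) for x in X)
        L.subtract(need)
        X.add(y)
        return need

    def untake(y, need):
        X.remove(y)
        L.update(need)

    def solve():
        if not +L:                        # all distances consumed
            solutions.append(sorted(X))
            return
        y = max((+L).elements())          # largest unexplained distance
        for cand in (y, width - y):       # it must touch an endpoint
            if fits(cand):
                need = take(cand)
                solve()
                untake(cand, need)

    solve()
    return solutions
```

For the distance multiset of the point set {0, 2, 4, 7, 10}, the solver recovers that set and its mirror image {0, 3, 6, 8, 10}.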

    Computational Molecular Biology

    Computational biology is a fairly new subject that arose in response to the computational problems posed by the analysis and processing of biomolecular sequence and structure data. The field was initiated in the late 1960s and early 1970s, largely by pioneers working in the life sciences. Physicists and mathematicians entered the field in the 1970s and 1980s, while computer science became involved with the new biological problems in the late 1980s. Computational problems have gained further importance in molecular biology through the various genome projects, which produce enormous amounts of data. This bibliography focuses on those areas of computational molecular biology that involve discrete algorithms or discrete optimization. It thus neglects several other areas of computational molecular biology, such as most of the literature on the protein folding problem, databases for molecular and genetic data, and genetic mapping algorithms.

    Phase Retrieval for Sparse Signals: Uniqueness Conditions

    In a variety of fields, in particular those involving imaging and optics, we often measure signals whose phase is missing or has been irremediably distorted. Phase retrieval attempts to recover the phase information of a signal from the magnitude of its Fourier transform, enabling reconstruction of the original signal. A fundamental question then is: "Under which conditions can we uniquely recover the signal of interest from its measured magnitudes?" In this paper, we assume the measured signal to be sparse. This is a natural assumption in many applications, such as X-ray crystallography, speckle imaging, and blind channel estimation. In this work, we derive a sufficient condition for the uniqueness of the solution of the phase retrieval (PR) problem for both discrete and continuous domains, and for one- and multi-dimensional domains. More precisely, we show that there is a strong connection between PR and the turnpike problem, a classic combinatorial problem. We also prove that the existence of collisions in the autocorrelation function of the signal may preclude the uniqueness of the solution of PR. Then, assuming the absence of collisions, we prove that the solution is almost surely unique on one-dimensional domains. Finally, we extend this result to multi-dimensional signals by solving a set of one-dimensional problems. We show that the solution of the multi-dimensional problem is unique when the autocorrelation function has no collisions, significantly improving upon a previously known result. Comment: submitted to IEEE TI
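The turnpike connection and the notion of a "collision" can be illustrated concretely: the support of a sparse signal's autocorrelation is the multiset of pairwise differences of its support, and a collision occurs when two distinct pairs of support points share the same difference. A small sketch (illustrative only; the paper's actual uniqueness conditions are more refined):

```python
from itertools import combinations
from collections import Counter

def difference_multiset(support):
    """Multiset of pairwise differences of a sparse signal's support --
    the turnpike data, i.e. where autocorrelation samples can appear."""
    return Counter(b - a for a, b in combinations(sorted(support), 2))

def has_collision(support):
    """True if two distinct pairs of support points share a difference,
    so their autocorrelation contributions overlap ('collide')."""
    return any(c > 1 for c in difference_multiset(support).values())
```

For example, the support {0, 1, 2} has a collision (the difference 1 occurs twice), while {0, 1, 4, 9, 11} is collision-free: all ten pairwise differences are distinct.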

    The application of artificial intelligence techniques to a sequencing problem in the biological domain

    SIGLE. Available from British Library Document Supply Centre - DSC:DXN002816 / BLDSC - British Library Document Supply Centre. GB, United Kingdom

    String Reconstruction from Substring Compositions

    Motivated by mass-spectrometry protein sequencing, we consider a simply stated problem of reconstructing a string from the multiset of its substring compositions. We show that all strings of length 7, one less than a prime, or one less than twice a prime, can be reconstructed uniquely up to reversal. For all other lengths we show that reconstruction is not always possible and provide sometimes-tight bounds on the largest number of strings with given substring compositions. The lower bounds are derived by combinatorial arguments and the upper bounds by algebraic considerations that precisely characterize the set of strings with the same substring compositions in terms of the factorization of bivariate polynomials. The problem can be viewed as a combinatorial simplification of the turnpike problem, and its solution may shed light on this long-standing problem as well. Using well-known results on transience of multi-dimensional random walks, we also provide a reconstruction algorithm that reconstructs random strings over alphabets of size ≥ 4 in optimal near-quadratic time.
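To make the problem statement concrete, a short sketch that computes the multiset of substring compositions (the composition of a substring is its character multiset, order forgotten). It also illustrates the "unique up to reversal" caveat: a string and its reversal always produce the same multiset. Helper names are my own, not from the paper:

```python
from collections import Counter

def substring_compositions(s):
    """Multiset of compositions of all contiguous substrings of s.
    Each composition is stored as a hashable frozenset of
    (character, count) pairs."""
    comps = Counter()
    for i in range(len(s)):
        for j in range(i + 1, len(s) + 1):
            comps[frozenset(Counter(s[i:j]).items())] += 1
    return comps
```

For example, "ACGT" and its reversal "TGCA" have identical composition multisets, whereas "AAB" and "ABB" do not (only one of them has a substring of composition {A: 2}).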

    Development and Characterization of In Situ Gel of Xanthan Gum for Ophthalmic Formulation Containing Brimonidine Tartrate

    Objective: The goal of this study was to develop and characterize ion-activated in situ gel-forming brimonidine tartrate solution eye drops containing xanthan gum as a mucoadhesive polymer.
    Method: The sol-gel formulation was prepared using gellan gum as an ion-activated gel-forming polymer, xanthan gum as a mucoadhesive agent, and hydroxypropyl methyl cellulose (HPMC E50LV) as a release-retardant polymer. Phenylethyl alcohol was used as a preservative in borate buffer. A 2³ factorial design was employed to optimize the formulation, with the concentrations of gelrite, xanthan gum, and HPMC as independent variables. Gelation time, gel strength, mucoadhesive force (N), viscosity (cP), and in vitro percentage drug release were chosen as dependent variables. The formulation was characterized for pH, clarity, isotonicity, sterility, rheological behavior, in vitro drug release, ocular irritation, and ocular visualization.
    Result: Based on the desirability index of the responses, the formulation containing gelrite (0.4%), xanthan gum (0.21%), and HPMC E50 (0.24%) was found to be the optimized formulation developed by the 2³ factorial design. The solution eye drops underwent an in situ phase change to the gel state when mixed with simulated tear fluid. Gel formation was also confirmed by viscoelastic measurements. Drug release from the gel followed a non-Fickian mechanism, with 88% of the drug released in 10 h, thus increasing the residence time of the drug.
    Conclusion: An in situ gelling system is a valuable alternative to the conventional system, with the added benefit of sustained drug release, which may ultimately result in improved patient compliance.
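A 2³ factorial design simply enumerates all eight low/high combinations of the three factors. A minimal sketch of that enumeration; the high levels below are the optimized values stated in the abstract, while the low levels are hypothetical placeholders, since the abstract does not report them:

```python
from itertools import product

# (low, high) levels in % w/v; lows are hypothetical, highs are the
# optimized concentrations reported in the abstract.
factors = {
    "gelrite":     (0.2, 0.4),
    "xanthan_gum": (0.1, 0.21),
    "HPMC_E50":    (0.12, 0.24),
}

# The 2^3 design: every combination of low/high for the three factors.
runs = [dict(zip(factors, levels))
        for levels in product(*factors.values())]
```

This yields the 8 experimental runs of the design; each run would then be measured for the dependent variables (gelation time, gel strength, mucoadhesive force, viscosity, drug release).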

    Great Bay Estuary Tidal Tributary Monitoring Program: Quality Assurance Project Plan, 2018


    An alternative method to crossing minimization on hierarchical graphs

    Get PDF
    A common method for drawing directed graphs is, as a first step, to partition the vertices into a set of k levels and then, as a second step, to permute the vertices within the levels such that the number of crossings is minimized. We suggest an alternative method for the second step, namely, removing the minimal number of edges such that the resulting graph is k-level planar. For the final diagram the removed edges are reinserted into a k-level planar drawing. Hence, instead of considering the k-level crossing minimization problem, we suggest solving the k-level planarization problem. In this paper we address the case k = 2. First, we give a motivation for our approach. Then, we address the problem of extracting a 2-level planar subgraph of maximum weight in a given 2-level graph. This problem is NP-hard. Based on a characterization of 2-level planar graphs, we give an integer linear programming formulation for the 2-level planarization problem. Moreover, we define and investigate the polytope 2LPS(G) associated with the set of all 2-level planar subgraphs of a given 2-level graph G. We will see that this polytope has full dimension and that the inequalities occurring in the integer linear description are facet-defining for 2LPS(G). The inequalities in the integer linear programming formulation can be separated in polynomial time, hence they can be used efficiently in a branch-and-cut method for solving practical instances of the 2-level planarization problem. Furthermore, we derive new inequalities that substantially improve the quality of the obtained solutions. We report on extensive computational results.
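For the crossing-minimization formulation that the paper takes as its point of departure, the objective is easy to state in code: in a 2-level drawing, two edges cross exactly when their endpoints appear in opposite relative orders on the two levels. A small sketch of that crossing count (not the paper's branch-and-cut method):

```python
def count_crossings(edges, top_order, bottom_order):
    """Number of edge crossings in a 2-level drawing given the vertex
    permutations of the two levels: edges (u1, v1) and (u2, v2) cross
    iff u1, u2 and v1, v2 appear in opposite relative orders."""
    t = {v: i for i, v in enumerate(top_order)}
    b = {v: i for i, v in enumerate(bottom_order)}
    pos = [(t[u], b[v]) for u, v in edges]
    return sum(1
               for i in range(len(pos))
               for j in range(i + 1, len(pos))
               if (pos[i][0] - pos[j][0]) * (pos[i][1] - pos[j][1]) < 0)
```

The planarization approach instead asks for the fewest edge deletions that make this count reducible to zero by permuting the levels, i.e. that leave a 2-level planar graph.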

    Function-specific schemes for verifiable computation

    Get PDF
    An integral component of modern computing is the ability to outsource data and computation to powerful remote servers, for instance, in the context of cloud computing or remote file storage. While participants can benefit from this interaction, a fundamental security issue that arises is that of integrity of computation: How can the end-user be certain that the result of a computation over the outsourced data has not been tampered with (not even by a compromised or adversarial server)? Cryptographic schemes for verifiable computation address this problem by accompanying each result with a proof that can be used to check the correctness of the performed computation. Recent advances in the field have led to the first implementations of schemes that can verify arbitrary computations. However, in practice the overhead of these general-purpose constructions remains prohibitive for most applications, with proof computation times (at the server) on the order of minutes or even hours for real-world problem instances. A different approach for designing such schemes targets specific types of computation and builds custom-made protocols, sacrificing generality for efficiency. An important representative of this function-specific approach is an authenticated data structure (ADS), where a specialized protocol is designed that supports query types associated with a particular outsourced dataset. This thesis presents three novel ADS constructions for the important query types of set operations, multi-dimensional range search, and pattern matching, and proves their security under cryptographic assumptions over bilinear groups. The scheme for set operations can support nested queries (e.g., two unions followed by an intersection of the results), extending previous works that only accommodate a single operation. The range search ADS provides an exponential (in the number of attributes in the dataset) asymptotic improvement over previous schemes for storage and computation costs.
    Finally, the pattern matching ADS supports text pattern and XML path queries with minimal cost, e.g., the overhead at the server is less than 4% compared to simply computing the result, for all our tested settings. The experimental evaluation of all three constructions shows significant improvements in proof-computation time over general-purpose schemes.
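The simplest classical example of an authenticated data structure is a Merkle hash tree, which authenticates membership queries over an outsourced list: the client keeps only the root hash, and the server returns each answer with a logarithmic-size path of sibling hashes as the proof. A minimal sketch (a far simpler primitive than the bilinear-group constructions in the thesis, shown only to illustrate the ADS proof-verification pattern; it assumes a power-of-two number of blocks):

```python
import hashlib

def h(x: bytes) -> bytes:
    return hashlib.sha256(x).digest()

def build(leaves):
    """Merkle tree as a bottom-up list of hash levels.
    Assumes len(leaves) is a power of two for simplicity."""
    levels = [[h(x) for x in leaves]]
    while len(levels[-1]) > 1:
        prev = levels[-1]
        levels.append([h(prev[i] + prev[i + 1])
                       for i in range(0, len(prev), 2)])
    return levels

def prove(levels, idx):
    """Membership proof for leaf idx: the sibling hash at every level,
    tagged with whether the sibling sits to the left."""
    path = []
    for level in levels[:-1]:
        sib = idx ^ 1
        path.append((level[sib], sib < idx))
        idx //= 2
    return path

def verify(root, leaf, path):
    """Recompute the root from the leaf and the proof; accept on match."""
    cur = h(leaf)
    for sib, sib_is_left in path:
        cur = h(sib + cur) if sib_is_left else h(cur + sib)
    return cur == root
```

The function-specific schemes in the thesis follow the same client/server contract — short digest at the client, answer plus proof from the server — but support far richer queries (set operations, range search, pattern matching) than simple membership.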