
    Internal Transcribed Spacer 2 (nu ITS2 rRNA) Sequence-Structure Phylogenetics: Towards an Automated Reconstruction of the Green Algal Tree of Life

    Some have advocated the use of the nuclear-encoded internal transcribed spacer two (ITS2) as an alternative to the traditional chloroplast markers. However, the ITS2 is broadly perceived to be insufficiently conserved or to be confounded by introgression or biparental inheritance patterns, precluding its broad use in phylogenetic reconstruction or as a DNA barcode. A growing body of evidence has shown that simultaneous analysis of nucleotide data with secondary structure information can overcome at least some of the limitations of ITS2. The goal of this investigation was to assess the feasibility of an automated, sequence-structure approach for analysis of ITS2 data from a large sampling of phylum Chlorophyta. Sequences and secondary structures from 591 chlorophycean, 741 trebouxiophycean and 938 ulvophycean algae, all obtained from the ITS2 Database, were aligned using a sequence-structure-specific scoring matrix. Phylogenetic relationships were reconstructed by Profile Neighbor-Joining coupled with a sequence-structure-specific, general time reversible substitution model. Results from analyses of the ITS2 data were robust at multiple nodes and showed considerable congruence with results from published phylogenetic analyses. Our observations on the power of automated, sequence-structure analyses of ITS2 to reconstruct phylum-level phylogenies of the green algae validate this approach to assessing diversity for large sets of chlorophytan taxa. Moreover, our results indicate that objections to the use of ITS2 for DNA barcoding should be weighed against the utility of an automated data-analysis approach with demonstrated power to reconstruct evolutionary patterns for highly divergent lineages.
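
    The reconstruction step described above is, at its core, a distance-based tree-building procedure. The minimal Python sketch below runs plain neighbor-joining on a toy distance matrix; it is illustrative only, with made-up taxon labels and distances, and does not reproduce the Profile Neighbor-Joining variant, the sequence-structure-specific scoring matrix, or the GTR substitution model used in the study.

        from itertools import combinations

        def neighbor_joining(dist):
            """Return an unrooted tree topology (nested tuples) from a distance matrix.

            dist: dict-of-dicts of pairwise distances between taxon labels.
            Branch-length assignment is omitted to keep the sketch short.
            """
            d = {a: dict(row) for a, row in dist.items()}
            nodes = list(d)
            while len(nodes) > 2:
                n = len(nodes)
                # Net divergence of every current node.
                r = {i: sum(d[i][k] for k in nodes if k != i) for i in nodes}
                # Join the pair minimising the Q-criterion.
                i, j = min(combinations(nodes, 2),
                           key=lambda p: (n - 2) * d[p[0]][p[1]] - r[p[0]] - r[p[1]])
                u = (i, j)  # new internal node
                d[u] = {}
                for k in nodes:
                    if k in (i, j):
                        continue
                    d[u][k] = d[k][u] = (d[i][k] + d[j][k] - d[i][j]) / 2
                nodes = [k for k in nodes if k not in (i, j)] + [u]
            return tuple(nodes)

        toy = {"A": {"B": 5, "C": 9, "D": 9},
               "B": {"A": 5, "C": 10, "D": 10},
               "C": {"A": 9, "B": 10, "D": 8},
               "D": {"A": 9, "B": 10, "C": 8}}
        print(neighbor_joining(toy))  # (('A', 'B'), ('C', 'D'))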

    Error bounds for monomial convexification in polynomial optimization

    Convex hulls of monomials have been widely studied in the literature, and monomial convexifications are implemented in global optimization software for relaxing polynomials. However, there has been no study of the error in the global optimum from such approaches. We give bounds on the worst-case error for convexifying a monomial over subsets of $[0,1]^n$. This implies additive error bounds for relaxing a polynomial optimization problem by convexifying each monomial separately. Our main error bounds depend primarily on the degree of the monomial, making them easy to compute. Since monomial convexification studies depend on the bounds on the associated variables, in the second part we conduct an error analysis for a multilinear monomial over two different types of box constraints. As part of this analysis, we also derive the convex hull of a multilinear monomial over $[-1,1]^n$. Comment: 33 pages, 2 figures, to appear in journal
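
    As a concrete instance of the kind of monomial convexification discussed above, the textbook McCormick result describes the convex hull of a single bilinear monomial over the unit box (a standard example, not one of the paper's bounds):

        \[
        \operatorname{conv}\bigl\{(x, y) \in [0,1]^2 \times \mathbb{R} : y = x_1 x_2\bigr\}
          = \bigl\{(x, y) : \max(0,\, x_1 + x_2 - 1) \le y \le \min(x_1, x_2)\bigr\}.
        \]

    At x = (1/2, 1/2) the monomial equals 1/4 while its convex envelope equals 0, so convexifying even a single bilinear term can already introduce an additive gap of 1/4 in a relaxation; bounding such gaps for general monomials and boxes is what the error analysis above quantifies.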

    Evolutionary distances in the twilight zone -- a rational kernel approach

    Phylogenetic tree reconstruction is traditionally based on multiple sequence alignments (MSAs) and heavily depends on the validity of this information bottleneck. With increasing sequence divergence, the quality of MSAs decays quickly. Alignment-free methods, on the other hand, are based on abstract string comparisons and avoid potential alignment problems. However, in general they are not biologically motivated and ignore our knowledge about the evolution of sequences. Thus, it is still a major open question how to define an evolutionary distance metric between divergent sequences that makes use of indel information and known substitution models without the need for a multiple alignment. Here we propose a new evolutionary distance metric to close this gap. It uses finite-state transducers to create a biologically motivated similarity score which models substitutions and indels, and does not depend on a multiple sequence alignment. The sequence similarity score is defined in analogy to pairwise alignments and additionally has the positive semi-definite property. We describe its derivation and show in simulation studies and real-world examples that it is more accurate in reconstructing phylogenies than competing methods. The result is a new and accurate way of determining evolutionary distances in and beyond the twilight zone of sequence alignments that is suitable for large datasets. Comment: to appear in PLoS ONE
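
    Any positive semi-definite similarity score induces a distance via d(x, y) = sqrt(k(x,x) - 2 k(x,y) + k(y,y)). The Python sketch below illustrates that conversion with a toy k-mer count kernel standing in for the transducer-based score; the finite-state-transducer scoring itself and the paper's exact distance transformation are not reproduced, and all sequences are made up.

        import itertools, math

        def kmer_profile(seq, k=3):
            """Toy stand-in for the transducer-based similarity: a k-mer count vector."""
            counts = {}
            for i in range(len(seq) - k + 1):
                counts[seq[i:i + k]] = counts.get(seq[i:i + k], 0) + 1
            return counts

        def kernel(a, b):
            """Positive semi-definite similarity (dot product of k-mer profiles)."""
            pa, pb = kmer_profile(a), kmer_profile(b)
            return sum(pa[w] * pb.get(w, 0) for w in pa)

        def kernel_distance(a, b):
            """Distance induced by a PSD kernel: sqrt(k(a,a) - 2 k(a,b) + k(b,b))."""
            return math.sqrt(kernel(a, a) - 2 * kernel(a, b) + kernel(b, b))

        seqs = {"s1": "ACGTACGTGA", "s2": "ACGTTCGTGA", "s3": "TTTTGGGGCC"}
        for x, y in itertools.combinations(seqs, 2):
            print(x, y, round(kernel_distance(seqs[x], seqs[y]), 3))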

    Engineering Branch-and-Cut Algorithms for the Equicut Problem

    A minimum equicut of an edge-weighted graph is a partition of the nodes of the graph into two sets of equal size such that the sum of the weights of edges joining nodes in different sets is minimum. We compare basic linear and semidefinite relaxations for the equicut problem, and find that linear bounds are competitive with the corresponding semidefinite ones but can be computed much faster. Motivated by an application of equicut in theoretical physics, we revisit an approach by Brunetta et al. and present an enhanced branch-and-cut algorithm. Our computational results suggest that the proposed branch-and-cut algorithm performs better than the algorithm of Brunetta et al. Further, it is able to solve to optimality in reasonable time several instances with more than 200 nodes from the physics application.
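
    For reference, a standard 0-1 quadratic formulation of minimum equicut (illustrative; the relaxations compared above are derived from models of this kind, not necessarily this exact one) is

        \[
        \min \sum_{(i,j) \in E} w_{ij}\,(x_i + x_j - 2 x_i x_j)
        \quad \text{s.t.} \quad \sum_{i=1}^{n} x_i = \tfrac{n}{2}, \qquad x \in \{0,1\}^n,
        \]

    where x_i + x_j - 2 x_i x_j equals 1 exactly when nodes i and j end up on different sides of the cut. Linearizing the products x_i x_j leads to linear relaxations, while switching to plus/minus-one variables and lifting to a matrix variable leads to semidefinite relaxations.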

    Discovery of an intermediate-luminosity red transient in M51 and its likely dust-obscured, infrared-variable progenitor

    We present the discovery of an optical transient (OT) in Messier 51, designated M51 OT2019-1 (also ZTF19aadyppr, AT 2019abn, ATLAS19bzl), by the Zwicky Transient Facility (ZTF). The OT rose over 15 days to an observed luminosity of $M_r = -13$ ($\nu L_\nu = 9\times10^6~L_\odot$), in the luminosity gap between novae and typical supernovae (SNe). Spectra during the outburst show a red continuum, Balmer emission with a velocity width of $\approx 400$ km s$^{-1}$, Ca II and [Ca II] emission, and absorption features characteristic of an F-type supergiant. The spectra and multiband light curves are similar to the so-called "SN impostors" and intermediate-luminosity red transients (ILRTs). We directly identify the likely progenitor in archival Spitzer Space Telescope imaging with a $4.5~\mu$m luminosity of $M_{[4.5]} \approx -12.2$ and a $[3.6]-[4.5]$ color redder than 0.74 mag, similar to those of the prototype ILRTs SN 2008S and NGC 300 OT2008-1. Intensive monitoring of M51 with Spitzer further reveals evidence for variability of the progenitor candidate at [4.5] in the years before the OT. The progenitor is not detected in pre-outburst Hubble Space Telescope optical and near-IR images. The optical colors during outburst combined with spectroscopic temperature constraints imply a higher reddening of $E(B-V) \approx 0.7$ mag and higher intrinsic luminosity of $M_r \approx -14.9$ ($\nu L_\nu = 5.3\times10^7~L_\odot$) near peak than seen in previous ILRT candidates. Moreover, the extinction estimate is higher on the rise than on the plateau, suggestive of an extended phase of circumstellar dust destruction. These results, enabled by the early discovery of M51 OT2019-1 and extensive pre-outburst archival coverage, offer new clues about the debated origins of ILRTs and may challenge the hypothesis that they arise from the electron-capture induced collapse of extreme asymptotic giant branch stars. Comment: 21 pages, 5 figures, published in ApJ
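
    The step from the observed $M_r = -13$ to the intrinsic $M_r \approx -14.9$ follows from the quoted reddening once an r-band extinction coefficient is assumed. As a back-of-the-envelope check, taking $A_r \approx 2.7\,E(B-V)$ (roughly an $R_V = 3.1$ Cardelli-type law; the coefficient actually adopted in the paper may differ):

        \[
        M_{r,\mathrm{intrinsic}} \approx M_{r,\mathrm{observed}} - A_r
          \approx -13 - 2.7 \times 0.7 \approx -14.9 .
        \]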

    Nonlinear Integer Programming

    Research efforts of the past fifty years have led to the development of linear integer programming as a mature discipline of mathematical optimization. Such a level of maturity has not been reached when one considers nonlinear systems subject to integrality requirements for the variables. This chapter is dedicated to this topic. The primary goal is a study of a simple version of general nonlinear integer problems, where all constraints are still linear. Our focus is on the computational complexity of the problem, which varies significantly with the type of nonlinear objective function in combination with the underlying combinatorial structure. Numerous boundary cases of complexity emerge, which sometimes surprisingly lead even to polynomial time algorithms. We also cover recent successful approaches for more general classes of problems. Though no positive theoretical efficiency results are available, nor are they likely to ever be available, these seem to be the currently most successful and interesting approaches for solving practical problems. It is our belief that the study of algorithms motivated by theoretical considerations and those motivated by our desire to solve practical instances should and do inform one another. So it is with this viewpoint that we present the subject, and it is in this direction that we hope to spark further research. Comment: 57 pages. To appear in: M. Jünger, T. Liebling, D. Naddef, G. Nemhauser, W. Pulleyblank, G. Reinelt, G. Rinaldi, and L. Wolsey (eds.), 50 Years of Integer Programming 1958--2008: The Early Years and State-of-the-Art Surveys, Springer-Verlag, 2009, ISBN 354068274
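
    In symbols, the "simple version" referred to above is the optimization of a possibly nonlinear objective over the integer points of a polyhedron:

        \[
        \max\; f(x) \quad \text{s.t.} \quad A x \le b, \quad x \in \mathbb{Z}^n,
        \]

    with all constraints linear; the computational complexity then hinges on the class of f (linear, convex, polynomial, ...) together with the combinatorial structure of the feasible integer points.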

    Catching-up and falling behind: knowledge spillover from American to German machine tool makers

    Today, German machine tool makers accuse their Chinese competitors of violating patent rights and illegally imitating German technology. A century ago, however, German machine tool makers used exactly the same methods to imitate American technology. To understand the dynamics of this catching-up process, we use patent statistics to analyze firms' activities between 1877 and 1932. We show that German machine tool makers successfully deployed imitating and counterfeiting activities in the late 19th century and the 1920s to catch up to their American competitors. The German administration supported this strategy by enacting a patent law that discriminated against foreign patent holders and probably also by delaying the granting of patents to foreign applicants. Parallel to the growing international competitiveness of German firms, however, the willingness to guarantee intellectual property rights of foreigners also increased, because German firms now had to fear retaliatory measures in their own export markets when violating foreign property rights within Germany.

    The Future of Personalized Medicine in Space: From Observations to Countermeasures

    The aim of personalized medicine is to detach from a “one-size-fits-all” approach and improve patient health by individualization, to achieve the best outcomes in disease prevention, diagnosis and treatment. Technological advances in sequencing, improved knowledge of omics, integration with bioinformatics and new in vitro testing formats have enabled personalized medicine to become a reality. Individual variation in response to environmental factors can affect susceptibility to disease and response to treatments. Space travel exposes humans to environmental stressors that lead to physiological adaptations, from altered cell behavior to abnormal tissue responses, including immune system impairment. In the context of human space flight research, human health studies have shown significant inter-individual variability in response to space analogue conditions. A substantial degree of variability has been noticed in response to medications (from both an efficacy and toxicity perspective) as well as in susceptibility to damage from radiation exposure and in physiological changes such as loss of bone mineral density and muscle mass in response to deconditioning. At present, personalized medicine for astronauts is limited. With the advent of longer duration missions beyond low Earth orbit, it is imperative that space agencies adopt a personalized strategy for each astronaut, starting from pre-emptive personalized pre-clinical approaches through to individualized countermeasures, to minimize harmful physiological changes and find targeted treatment for disease. Advances in space medicine can also be translated to terrestrial applications, and vice versa. This review places the astronaut at the center of personalized medicine, appraising existing evidence and future preclinical tools as well as clinical, ethical and legal considerations for future space travel.