3,918 research outputs found

    Approximation Algorithms for the Asymmetric Traveling Salesman Problem: Describing two recent methods

    The paper provides a description of two recent approximation algorithms for the Asymmetric Traveling Salesman Problem (ATSP), giving an intuitive account of the works of Feige-Singh [1] and Asadpour et al. [2].

    [1] improves the previous O(log n)-approximation algorithm by improving the constant from 0.84 to 0.66, modifying the work of Kaplan et al. [3], and also shows an efficient reduction from ATSPP (the path version of the problem) to ATSP. Combining both results, they establish an approximation ratio of (4/3 + ε) log n for ATSPP, for any small ε > 0, improving the work of Chekuri and Pal [4].

    Asadpour et al., in their seminal work [2], give an O(log n / log log n)-approximation randomized algorithm for ATSP: they symmetrize and modify the solution of the Held-Karp relaxation, sample a maximum-entropy spanning tree from an exponential-family distribution over the spanning tree polytope, and finally define the thinness property and transform a thin spanning tree into an Eulerian walk. The optimization methods used in [2] are quite elegant, and the approximation ratio could be further improved by manipulating the thinness of the cuts.

    Comment: 12 pages
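    For reference, a minimal sketch of the Held-Karp LP relaxation for ATSP that [2] starts from, written in our own notation (the digraph G = (V, A), arc costs c, and cut sets δ⁺/δ⁻ are our choices, not taken from the abstract):

    ```latex
    % Held-Karp LP relaxation for ATSP on a digraph G = (V, A) with arc costs c_a
    \begin{aligned}
    \min\; & \sum_{a \in A} c_a x_a \\
    \text{s.t.}\; & x(\delta^+(v)) = x(\delta^-(v)) = 1 && \forall v \in V \\
    & x(\delta^+(S)) \ge 1 && \forall\, \emptyset \ne S \subsetneq V \\
    & x_a \ge 0 && \forall a \in A
    \end{aligned}
    ```

    Asadpour et al. symmetrize an optimal solution x* of this LP (roughly, an undirected weight proportional to x*_{uv} + x*_{vu} on each edge) before sampling the spanning tree.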

    Matrix Recipes for Hard Thresholding Methods

    In this paper, we present and analyze a new set of low-rank recovery algorithms for linear inverse problems within the class of hard thresholding methods. We provide strategies on how to set up these algorithms via basic ingredients for different configurations to achieve complexity vs. accuracy tradeoffs. Moreover, we study acceleration schemes via memory-based techniques and randomized, ε-approximate matrix projections to decrease the computational costs in the recovery process. For most of the configurations, we present theoretical analysis that guarantees convergence under mild problem conditions. Simulation results demonstrate notable performance improvements as compared to state-of-the-art algorithms, both in terms of reconstruction accuracy and computational complexity.

    Comment: 26 pages
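    As a minimal sketch of the class of methods the abstract refers to (generic matrix iterative hard thresholding, not the paper's specific recipes), each iterate takes a gradient step on the least-squares objective and projects back onto the rank-r set via a truncated SVD; the operator names A/At and the fixed step size are illustrative assumptions:

    ```python
    import numpy as np

    def project_rank_r(X, r):
        """Hard-threshold the spectrum: keep only the top-r singular triplets."""
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        return (U[:, :r] * s[:r]) @ Vt[:r]

    def matrix_iht(A, At, y, shape, r, step=1.0, iters=200):
        """Generic matrix iterative hard thresholding.

        A  : callable, linear measurement operator (matrix -> vector)
        At : callable, its adjoint (vector -> matrix of the given shape)
        """
        X = np.zeros(shape)
        for _ in range(iters):
            grad = At(A(X) - y)                      # gradient of 0.5*||A(X) - y||^2
            X = project_rank_r(X - step * grad, r)   # gradient step + rank projection
        return X
    ```

    The randomized, ε-approximate matrix projections studied in the paper would stand in for the exact SVD inside project_rank_r.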

    Materials and Methods A. Instrument


    Quantifying precision and accuracy of measurements of dissolved inorganic carbon stable isotopic composition using continuous-flow isotope-ratio mass spectrometry

    RATIONALE: We describe an analytical procedure that allows sample collection and measurement of carbon isotopic composition (δ13C_V-PDB value) and dissolved inorganic carbon concentration, [DIC], in aqueous samples without further manipulation after field collection. By comparing outputs from two different mass spectrometers, we quantify with statistical rigour the uncertainty associated with the estimation of an unknown measurement. This is rarely undertaken, but it is needed to understand the significance of field data and to interpret quality assurance exercises.

    METHODS: Immediate acidification of field samples during collection in evacuated, pre-acidified vials removed the need for toxic chemicals to inhibit continued bacterial activity that might compromise isotopic and concentration measurements. Aqueous standards mimicked the sample matrix and avoided headspace fractionation corrections. Samples were analysed using continuous-flow isotope-ratio mass spectrometry, but at low DIC concentrations the mass spectrometer response could be non-linear, which had to be corrected for.

    RESULTS: Mass spectrometer non-linearity exists. Rather than estimating precision as the repeat analysis of an internal standard, we have adopted inverse linear calibrations to quantify the precision and 95% confidence intervals (CI) of the δ13C_DIC values. The response for [DIC] estimation was always linear. For 0.05–0.5 mM DIC internal standards, however, changes in mass spectrometer linearity resulted in estimates of the precision in the δ13C_V-PDB value of an unknown ranging from ±0.44‰ to ±1.33‰ (mean values), and a mean 95% CI half-width of ±1.1–3.1‰.

    CONCLUSIONS: Mass spectrometer non-linearity should be considered when estimating uncertainty in measurement. Similarly, statistically robust estimates of precision and accuracy should also be adopted. Such estimations do not inhibit research advances: our consideration of small-scale spatial variability at two points on a small-order river system demonstrates field data ranges larger than the precision and uncertainties. However, without such statistical quantification, exercises such as inter-lab calibrations are less meaningful.
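    A minimal sketch of the kind of inverse linear calibration the abstract describes, assuming an ordinary least-squares fit of instrument response against known standards and the classical single-observation inverse-prediction interval; the function and variable names are illustrative, not the authors' exact procedure:

    ```python
    import numpy as np
    from scipy import stats

    def inverse_calibration(x_std, y_std, y_unknown, alpha=0.05):
        """Estimate x for an unknown response from a linear calibration.

        Fits y = a + b*x on the standards, inverts for y_unknown, and
        returns an approximate (1 - alpha) CI via the classical
        inverse-prediction (calibration) formula.
        """
        x = np.asarray(x_std, dtype=float)
        y = np.asarray(y_std, dtype=float)
        n = x.size
        fit = stats.linregress(x, y)
        b, a = fit.slope, fit.intercept
        x_hat = (y_unknown - a) / b

        s = np.sqrt(np.sum((y - (a + b * x)) ** 2) / (n - 2))  # residual SE
        t = stats.t.ppf(1 - alpha / 2, n - 2)
        sxx = np.sum((x - x.mean()) ** 2)
        # standard error of the inverse prediction for a single new response
        se = (s / abs(b)) * np.sqrt(1 + 1 / n + (x_hat - x.mean()) ** 2 / sxx)
        return x_hat, (x_hat - t * se, x_hat + t * se)
    ```

    The half-width of the returned interval plays the role of the ±‰ figures quoted in the RESULTS section.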

    Uptake of systematic reviews and meta-analyses based on individual participant data in clinical practice guidelines: descriptive study.

    To establish the extent to which systematic reviews and meta-analyses of individual participant data (IPD) are being used to inform the recommendations included in published clinical guidelines.

    LARVA - safer monitoring of real-time Java programs (tool paper)

    The use of runtime verification, as a lightweight approach to guaranteeing properties of systems, has been increasingly employed on real-life software. In this paper, we present the tool LARVA, for the runtime verification of properties of Java programs, including real-time properties. Properties can be expressed in a number of notations, including timed automata enriched with stopwatches, Lustre, and a subset of the duration calculus. The tool has been successfully used on a number of case studies, including an industrial system handling financial transactions. LARVA also performs analysis of real-time properties to calculate, where possible, an upper bound on the memory and temporal overheads induced by monitoring. Moreover, through property analysis, LARVA assesses the impact that the slowdown induced by monitoring has on the satisfaction of the properties.
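    To illustrate the general idea of monitoring a real-time property (not LARVA's actual notation or API), here is a toy event-driven monitor that flags a violation when a response does not arrive within a deadline of the preceding request, in the spirit of a timed automaton with a single clock; all names and the deadline are hypothetical:

    ```python
    import time

    class TimedMonitor:
        """Toy runtime monitor for the property:
        every 'request' must be followed by a 'response' within deadline_s."""

        def __init__(self, deadline_s):
            self.deadline = deadline_s
            self.pending_since = None  # clock is reset on each request

        def on_event(self, name):
            now = time.monotonic()
            # check the outstanding deadline before processing the new event
            if self.pending_since is not None and now - self.pending_since > self.deadline:
                raise AssertionError("property violated: response deadline exceeded")
            if name == "request":
                self.pending_since = now
            elif name == "response":
                self.pending_since = None

    # monitor = TimedMonitor(deadline_s=2.0)
    # monitor.on_event("request")
    # ... instrumented system runs ...
    # monitor.on_event("response")
    ```

    An event-driven monitor like this only notices a missed deadline at the next event; tools built for real-time properties typically also arm timers so that a violation is caught as soon as the deadline passes.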

    Bringing Salary Transparency to the World: Computing Robust Compensation Insights via LinkedIn Salary

    The recently launched LinkedIn Salary product has been designed with the goal of providing compensation insights to the world's professionals and thereby helping them optimize their earning potential. We describe the overall design and architecture of the statistical modeling system underlying this product. We focus on the unique data mining challenges in designing and implementing the system, and describe the modeling components, such as Bayesian hierarchical smoothing, that help to compute and present robust compensation insights to users. We report on an extensive evaluation with nearly one year of de-identified compensation data collected from over one million LinkedIn users, thereby demonstrating the efficacy of the statistical models. We also highlight the lessons learned through the deployment of our system at LinkedIn.

    Comment: Conference information: ACM International Conference on Information and Knowledge Management (CIKM 2017)
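    A minimal sketch of what hierarchical smoothing of this kind can look like (shrinking a sparse cohort's mean toward its parent cohort in proportion to sample size, i.e. a conjugate-normal posterior mean); the cohort structure and prior strength are illustrative assumptions, not LinkedIn's model:

    ```python
    import numpy as np

    def smoothed_mean(cohort_values, parent_mean, prior_strength=10.0):
        """Posterior mean under a normal prior centred at parent_mean,
        where prior_strength acts as a pseudo-observation count."""
        v = np.asarray(cohort_values, dtype=float)
        n = v.size
        if n == 0:
            return parent_mean          # empty cohort falls back to its parent
        w = n / (n + prior_strength)    # more data -> less shrinkage
        return w * v.mean() + (1 - w) * parent_mean

    # A 5-sample "title x region" cohort is pulled strongly toward the broader
    # "title" cohort's mean; a 5,000-sample cohort is left almost unchanged.
    ```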