
    Approximation Algorithms for the Asymmetric Traveling Salesman Problem: Describing two recent methods

    The paper describes two recent approximation algorithms for the Asymmetric Traveling Salesman Problem, giving an intuitive account of the works of Feige and Singh [1] and Asadpour et al. [2]. [1] improves the previous $O(\log n)$ approximation algorithm, lowering the constant from 0.84 to 0.66 by modifying the work of Kaplan et al. [3], and also shows an efficient reduction from ATSPP to ATSP. Combining both results, they establish an approximation ratio of $\left(\frac{4}{3}+\epsilon\right)\log n$ for ATSPP, for any small $\epsilon > 0$, improving the work of Chekuri and Pal [4]. Asadpour et al., in their seminal work [2], give an $O\left(\frac{\log n}{\log\log n}\right)$ randomized algorithm for the ATSP: they symmetrize and modify the solution of the Held-Karp relaxation, sample a maximum-entropy spanning tree from an exponential-family distribution over the spanning tree polytope, and finally define a thin-ness property and transform a thin spanning tree into an Eulerian walk. The optimization methods used in [2] are quite elegant, and the approximation ratio could be improved further by manipulating the thin-ness of the cuts. Comment: 12 pages
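    For context, the Held-Karp relaxation that the algorithm of [2] starts from can be written as the following LP (standard textbook form, not quoted from [2]), where $c(a)$ is the cost of arc $a$ and $\delta^{+}(U)$, $\delta^{-}(U)$ denote the sets of arcs leaving and entering a vertex set $U$:

        \begin{align*}
        \min\quad & \sum_{a \in A} c(a)\, x_a \\
        \text{s.t.}\quad & x\big(\delta^{+}(U)\big) \ge 1 && \forall\, \emptyset \neq U \subsetneq V, \\
        & x\big(\delta^{+}(v)\big) = x\big(\delta^{-}(v)\big) = 1 && \forall\, v \in V, \\
        & x_a \ge 0 && \forall\, a \in A.
        \end{align*}

    Scaling an optimal solution by $\frac{n-1}{n}$ and symmetrizing it places it in the spanning tree polytope, which is what enables the maximum-entropy sampling step.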

    Methods

    Information assembled in this chapter will help the reader understand the basis for the preliminary conclusions of the Expedition 302 Scientists and will also enable the interested investigator to select samples for further analyses. This information concerns offshore and onshore operations and analyses described in the "Sites M0001–M0004" chapter. Methods used by various investigators for shore-based analyses of Expedition 302 samples will be described in the individual contributions published in the Expedition Research Results and in various professional journals.

    Matrix Recipes for Hard Thresholding Methods

    In this paper, we present and analyze a new set of low-rank recovery algorithms for linear inverse problems within the class of hard thresholding methods. We provide strategies for setting up these algorithms from basic ingredients, in different configurations, to achieve complexity versus accuracy tradeoffs. Moreover, we study acceleration schemes via memory-based techniques and randomized, $\epsilon$-approximate matrix projections to decrease the computational costs in the recovery process. For most of the configurations, we present theoretical analysis that guarantees convergence under mild problem conditions. Simulation results demonstrate notable performance improvements as compared to state-of-the-art algorithms, both in terms of reconstruction accuracy and computational complexity. Comment: 26 pages
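    To make the core projection step concrete, here is a minimal sketch (not the authors' code) of the simplest member of this class: iterative hard thresholding for low-rank recovery, alternating a gradient step on the least-squares objective with a truncated-SVD projection onto the rank-r set. The measurement operator Phi, the step size, and the iteration count below are illustrative assumptions.

        import numpy as np

        def rank_r_projection(X, r):
            # Hard thresholding step: keep only the r largest singular values.
            U, s, Vt = np.linalg.svd(X, full_matrices=False)
            return (U[:, :r] * s[:r]) @ Vt[:r, :]

        def iht_low_rank(Phi, b, shape, r, step=0.5, iters=500):
            # Recover X with rank(X) <= r from linear measurements b = Phi @ vec(X):
            # gradient step on ||b - Phi vec(X)||^2, then projection onto rank r.
            X = np.zeros(shape)
            for _ in range(iters):
                grad = -(Phi.T @ (b - Phi @ X.ravel())).reshape(shape)
                X = rank_r_projection(X - step * grad, r)
            return X

        # Toy demo: recover a random rank-2 matrix from Gaussian measurements.
        rng = np.random.default_rng(0)
        n1, n2, r = 8, 8, 2
        X_true = rng.standard_normal((n1, r)) @ rng.standard_normal((r, n2))
        m = 4 * r * (n1 + n2)                      # illustrative oversampling
        Phi = rng.standard_normal((m, n1 * n2)) / np.sqrt(m)
        b = Phi @ X_true.ravel()
        X_hat = iht_low_rank(Phi, b, (n1, n2), r)
        print(np.linalg.norm(X_hat - X_true) / np.linalg.norm(X_true))

    The paper's memory-based acceleration and randomized $\epsilon$-approximate projections would replace the plain gradient step and the exact SVD above, respectively.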

    Text analysis and reader interaction

    EFL teachers in the Brazilian public sector have often experienced difficulties in efficiently accessing relevant information from articles published in 'English Teaching Forum'. This study investigates these difficulties from both 'text-analytical' and 'reader-based' perspectives and begins with a brief profile of the teachers concerned. An analytical framework incorporating elements from several approaches, specifically those of Hoey (1973) and Swales (1990), is used to highlight the organisational features of a selection of 'Forum' articles. It is then hypothesised that certain clause-relational macropatterns will facilitate access and be focused upon by 'successful' readers; in contrast, writer 'justification' moves are seen as potential barriers to efficient comprehension. A sample of FL methods articles written by Brazilians and published in Portuguese is then analysed, and the same set of analytical parameters is found to be valid for describing their organisational features. Following the second, 'reader-based' perspective, processing models of text comprehension and related FL reading research are reviewed, and a set of criteria regarding the processing strategies of 'successful' and 'less-skilled' FL readers is established. Verbal report methodologies are argued to be a suitable means of testing both the text-analytical hypotheses and the reader-processing criteria. The various types of field work carried out to collect verbal report data from Brazilian EFL teachers reading 'Forum' articles are then described, and groups of 'successful' and 'problematic' readers are defined according to the processing strategies revealed in the verbal reports. Although there are substantial variations in the strategies of individual readers, and evidence of the influence of text informativity, 'successful' processing consistently included focusing on the clause-relational macro-signals; in contrast, there was little evidence of activation of the same text features by the 'problematic' readers. Finally, suggestions are made for including FL methods articles, text-analytical elements, and verbal reporting in INSED-TEFL courses in Brazil.

    Methods of Child study


    Materials and Methods: A. Instrument


    On the constructions of the skew Brownian motion

    This article summarizes the various ways to construct the skew Brownian motion and shows their connections. Recent applications of this process in modelling and numerical simulation motivate this survey. The article ends with a brief account of related results, extensions, and applications of the skew Brownian motion. Comment: Published at http://dx.doi.org/10.1214/154957807000000013 in Probability Surveys (http://www.i-journals.org/ps/) by the Institute of Mathematical Statistics (http://www.imstat.org).
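    One classical construction covered by such surveys (stated here from Harrison and Shepp's well-known result, as an aid to the reader rather than a quotation from the article) characterizes the skew Brownian motion with parameter $\alpha \in (0,1)$ as the unique strong solution of an SDE driven by its own local time at zero:

        X_t = X_0 + B_t + (2\alpha - 1)\, L_t^{0}(X),

    where $B$ is a standard Brownian motion and $L_t^{0}(X)$ is the symmetric local time of $X$ at $0$; for $\alpha = \tfrac{1}{2}$ this reduces to ordinary Brownian motion.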

    Quantifying precision and accuracy of measurements of dissolved inorganic carbon stable isotopic composition using continuous-flow isotope-ratio mass spectrometry

    RATIONALE: We describe an analytical procedure that allows sample collection and measurement of carbon isotopic composition (δ13CV-PDB value) and dissolved inorganic carbon concentration, [DIC], in aqueous samples without further manipulation after field collection. By comparing outputs from two different mass spectrometers, we quantify with statistical rigour the uncertainty associated with the estimation of an unknown measurement. This is rarely undertaken, but it is needed to understand the significance of field data and to interpret quality assurance exercises.

    METHODS: Immediate acidification of field samples during collection in evacuated, pre-acidified vials removed the need for toxic chemicals to inhibit continued bacterial activity that might compromise isotopic and concentration measurements. Aqueous standards mimicked the sample matrix and avoided headspace fractionation corrections. Samples were analysed using continuous-flow isotope-ratio mass spectrometry, but at low DIC concentrations the mass spectrometer response could be non-linear, and this had to be corrected for.

    RESULTS: Mass spectrometer non-linearity exists. Rather than estimating precision as the repeat analysis of an internal standard, we adopted inverse linear calibrations to quantify the precision and 95% confidence intervals (CI) of the δ13CDIC values. The response for [DIC] estimation was always linear. For 0.05–0.5 mM DIC internal standards, however, changes in mass spectrometer linearity resulted in estimates of the precision of the δ13CV-PDB value of an unknown ranging from ±0.44‰ to ±1.33‰ (mean values), with a mean 95% CI half-width of ±1.1–3.1‰.

    CONCLUSIONS: Mass spectrometer non-linearity should be considered when estimating uncertainty in measurement. Similarly, statistically robust estimates of precision and accuracy should be adopted. Such estimations do not inhibit research advances: our consideration of small-scale spatial variability at two points on a small-order river system demonstrates field data ranges larger than the precision and uncertainty estimates. However, without such statistical quantification, exercises such as inter-laboratory calibrations are less meaningful.
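    The inverse-calibration idea can be illustrated with a minimal sketch (an assumed, generic form of classical calibration, not the authors' code): regress the measured response of the standards on their certified δ13C values, invert the fit for an unknown, and attach an approximate 95% confidence half-width from the standard calibration formula. All numbers below are hypothetical.

        import numpy as np
        from scipy import stats

        def inverse_calibration(known, measured, y0, n_rep=1):
            # Fit measured = a + b * known for the DIC standards, then invert
            # to estimate the unknown from its mean response y0 (n_rep injections).
            known, measured = np.asarray(known, float), np.asarray(measured, float)
            n = len(known)
            fit = stats.linregress(known, measured)
            x0 = (y0 - fit.intercept) / fit.slope            # inverse prediction
            resid = measured - (fit.intercept + fit.slope * known)
            s = np.sqrt(np.sum(resid**2) / (n - 2))          # residual std. dev.
            Sxx = np.sum((known - known.mean())**2)
            se = (s / abs(fit.slope)) * np.sqrt(
                1.0 / n_rep + 1.0 / n + (x0 - known.mean())**2 / Sxx)
            half_width = stats.t.ppf(0.975, n - 2) * se      # approx. 95% CI
            return x0, half_width

        # Hypothetical standards bracketing the samples (illustrative values).
        known = [-30.0, -20.0, -10.0, 0.0, 10.0]
        measured = [-29.6, -19.9, -10.3, 0.2, 9.8]
        print(inverse_calibration(known, measured, y0=-15.1))

    Any non-linearity in the mass spectrometer response, as reported in the abstract, would require a correction before such a linear fit is justified.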

    X-ray micro-computed tomography for heritage building materials

    X-ray micro-computed tomography can provide information about the composition and internal structure of materials commonly found in heritage buildings, such as natural stone, mortar, brick, concrete and wood. As it is a non-destructive technique, samples can be scanned multiple times, which makes it particularly suitable for studying weathering processes and the efficacy of treatment methods.

    Uptake of systematic reviews and meta-analyses based on individual participant data in clinical practice guidelines: descriptive study.

    To establish the extent to which systematic reviews and meta-analyses of individual participant data (IPD) are being used to inform the recommendations included in published clinical guidelines.
